
Re: ZScript Discussion

Posted: Tue Mar 21, 2017 7:32 pm
by AFADoomer
Nash wrote:Man, I wish there was a way to only compile ZScript without actually running the game. I'm guessing the sum of the total accumulated time every time my ZScript fails to make the game start has reached hours by now, haha...
Not the fastest route, since you're still essentially loading the game, but a batch file that looks something like this works for me:

Code:

gzdoom.exe -file myfile.pk3 +log logfile.txt -stdout -norun
pause
This both dumps the output to a text file and echoes it to the command prompt window, so I can review the errors immediately once it's done. '-norun' causes the engine to load only up to the point right before video is initialized, then exit, so it's at least slightly faster than having to quit the game yourself.

Re: ZScript Discussion

Posted: Tue Mar 21, 2017 7:40 pm
by Nash
That works for me. :D Cheers!

Re: ZScript Discussion

Posted: Sun Mar 26, 2017 10:55 pm
by Nash
Theoretical question: would there be performance gains if ZScript had to be pre-compiled?

[like Unreal Engine's .uc -> .u]

Or is the entire system already running as fast as it can?

Re: ZScript Discussion

Posted: Sun Mar 26, 2017 11:08 pm
by Major Cooke
Only if something like JIT happened (not necessarily JIT itself mind) where code is compiled down to bytes.

Re: ZScript Discussion

Posted: Mon Mar 27, 2017 12:40 am
by Matt
Can we please not even theoretically go there!? :puke:

Re: ZScript Discussion

Posted: Mon Mar 27, 2017 12:57 am
by Kinsie
Nash wrote:Theoretical question: would there be performance gains if ZScript had to be pre-compiled?

[like Unreal Engine's .uc -> .u]

Or is the entire system already running as fast as it could.
This seems like a nice way to allow certain persons to only ship compiled scripts without the appropriate source, ala ACS. Which is to say, I don't think it's a good idea.

Re: ZScript Discussion

Posted: Mon Mar 27, 2017 12:59 am
by Rachael
Nash wrote:Theoretical question: would there be performance gains if ZScript had to be pre-compiled?

[like Unreal Engine's .uc -> .u]

Or is the entire system already running as fast as it could.
ZScript already compiles internally at startup, the only thing it would improve is startup times.
Kinsie wrote:This seems like a nice way to allow certain persons to only ship compiled scripts without the appropriate source, ala ACS. Which is to say, I don't think it's a good idea.
Which contributes in small part to why I agree with this.

Re: ZScript Discussion

Posted: Mon Mar 27, 2017 1:58 am
by ZZYZX
Nash wrote:Theoretical question: would there be performance gains if ZScript had to be pre-compiled?
None, but it would also introduce various issues with binary incompatibility (currently even offsets into structures are stored as ints) and let people hide possibly malicious source code.
Currently GZDoom's save games are completely incompatible between major stable releases (I've seen a report on a Russian Doom board where someone tried to open a 2.3 save with GZDoom 2.4, which resulted in monsters walking through walls and shit, and it didn't even fix itself on the next level, only on a new game...). This is exactly because savegames store partial object data, and if it doesn't match, everything breaks.

Also, I'm strongly against the idea of allowing source-less ZScript.
Not only is closed source ideologically bad in the world of non-profit arts, it also becomes genuinely dangerous to allow the more advanced ZScript gets.
I mean, it's not like people thoroughly examine the source code of every WAD they run, but then again, if ZScript ever gets direct filesystem access of any kind, it would make sense to have a permission system that disallows calling certain methods unless the user explicitly allows them with a command-line switch or something.
Rachael wrote:ZScript already compiles internally at startup, the only thing it would improve is startup times.
Script compilation takes a second or so even on ZScript-heavy wads. Just saying. That's nothing compared to the 5+ (usually 10+) seconds spent on texture conversion during the "init refresh something" stage.

Re: ZScript Discussion

Posted: Mon Mar 27, 2017 2:18 am
by Rachael
And yeah - with everything ZZYZX said - all the more reason precompiled ZScript is a bad thing. Let's just leave it open as it is. :)

Re: ZScript Discussion

Posted: Mon Mar 27, 2017 2:32 am
by Graf Zahl
Nash wrote:Theoretical question: would there be performance gains if ZScript had to be pre-compiled?

[like Unreal Engine's .uc -> .u]

Or is the entire system already running as fast as it could.
ZScript cannot be precompiled: the compiler creates a lot of data that is needed at run time, and the script code itself also heavily depends on being able to address static game content as constants, none of which can be done with precompilation. The generated data is also machine-dependent, i.e. it differs between 32- and 64-bit targets.

At the moment the compile time for the internal scripts lies around 0.2 s, depending on the CPU's speed.

Aside from the technical issues, let me make abundantly clear that I consider compiled scripting languages in open source projects the antithesis of what these projects stand for - and that includes ACS, btw.

Re: ZScript Discussion

Posted: Mon Mar 27, 2017 2:38 am
by Graf Zahl
ZZYZX wrote:This is exactly because savegames store partial object data and if it doesn't match, everything breaks.
The reason savegames between 2.3 and 2.4 are incompatible has nothing to do with partial saves; it's simply that a lot of internal stuff changed between those versions (e.g. the entire inventory system changed its internal layout and saves its data differently.) I should have bumped the savegame version, but I simply forgot; I only did it in the post-2.4 branch where I introduced a real breaking change. Once the current rework is done, it should stabilize and do what the JSON rewrite was supposed to accomplish: ensure long-term stability. But that can only happen if the engine doesn't see constant fundamental refactoring.
ZZYZX wrote: Also, I'm strongly against the idea of allowing source-less ZScript.
Not only closed source is ideologically bad in the world of non-profit arts, but also it's actually dangerous to allow things like that the more advanced ZScript becomes.
Not only that - it'd also prevent future changes, like implementing a faster VM, JIT compilation (which would work better if done directly from the IR instead of the byte code), and fixing other things like code generation errors. This isn't GLSL, where 3 vendors implemented 6 different compilers.

And let's not forget: compiled scripts can only be done if the internal scripts are also compiled. Worse, it'd block many potential internal improvements. All summed up, it's a bad idea from start to finish - ideologically wrong, and it would require a gigantic amount of work.

Here's a fun story: 6 years ago I was working on a scripting engine whose lead designer had all the wrong ideas about efficiency. He actually did implement all that crap, from an intermediate binary representation to a linker to a final binary format - and nothing in that thing worked right. It was a bug-ridden mess that couldn't be fixed, because all those binary formats constantly got in the way, and the guy couldn't be convinced to simplify his IR and compile it directly to the final byte code. Needless to say, that thing was orders of magnitude slower than the ZScript compiler. Compiling half a megabyte of script code took several seconds - because every file was compiled on its own, and in order to resolve its external symbols it had to pull in everything it referenced, i.e. each file was processed 30-40 times per compile run. That's where things would end up with binary precompilation.

I know one thing: should I ever professionally need a homegrown scripting language, I'll use ZScript as a base. It may not be perfect, but since I had that bad example as a guideline, I was able to avoid the design errors it made.

Re: ZScript Discussion

Posted: Mon Mar 27, 2017 2:41 am
by Nash
Alright, thanks for the technical info.

And of course I agree; closed-source hoarding like in the QuakeC community is something I utterly despise. It's the exact reason why I'm not modding for Quake after all these years - I couldn't learn shit.

Re: ZScript Discussion

Posted: Mon Mar 27, 2017 3:25 am
by ZZYZX
Graf Zahl wrote:Here's a fun story: 6 years ago I was working on a scripting engine whose lead designer got all the wrong ideas about efficiency. He actually did implement all the crap from an intermediate binary representation, to a linker, to a final binary format - and nothing in that thing worked right. It was a bug ridden mess that couldn't be fixed because all those binary formats constantly got into the way and this guy couldn't be convinced to simplify his IR and compile it directly to the final byte code. Needless to say, that thing was magnitudes slower than the ZScript compiler. Compiling half a megabyte of script code took several seconds - because every file was compiled on its own, and in order to resolve its external symbols had to pull in everything it referenced, i.e. each file was processed up to 30-40 times for each compile run. That's where things would end up with binary precompilation.
Technically, the main problem there is the absence of caching and of modules (in the Python sense), and the presence of a linker instead - not that it was compiling to bytecode. I get the main idea, though.
This problem haunts all these complex compiler toolchains that, for whatever reason, try to be as separate as possible - not in the actual compiler structure, but in the executables used... (I hate C++ header-heavy libraries like Boost)

Re: ZScript Discussion

Posted: Mon Mar 27, 2017 4:13 am
by Graf Zahl
ZZYZX wrote: Technically, the main problem here is the absence of caching and modules (in the Python sense) and presence of the linker instead, not that it was compiling to bytecode. I get the main idea though.
Correct, but the need to compile to static byte code made some things a bit more complicated, too.

And that part is the biggest issue with ZScript.

ZScript generates run-time types while it compiles - all of these would need to be stored in the bytecode as well. That's easy for integral types, but quite tricky if the type system can liberally create complex types. And the game needs the types for serialization, parameter checking and many other things.
ZScript also statically resolves references to class types, sound indices, names, fonts, global CVARs and functions.
So the least a static byte code format would need is a relocation table for all of this, unless the code were to look everything up dynamically.
And last but not least, the different size of pointers makes field indices architecture-dependent. I think something like this doomed Eternity's flirt with the Small language: it took the worst of both worlds - compiling to byte code while making the output architecture-dependent - a recipe for guaranteed failure.

Re: ZScript Discussion

Posted: Mon Mar 27, 2017 4:25 am
by dpJudas
Having toyed a bit with LLVM of late, I find that any IR that has to be backwards compatible (which any bytecode format does) causes a lot of problems when it comes to passing optimization opportunities from the frontend to the backend.

For example, telling the backend in LLVM to unroll certain loops required adding an entire "metadata" system to their IR model, and some of the intrinsic functions weren't implemented in all of their backends. The worst part was that the optimizing backend ended up relying on the IR looking exactly like Clang's output - stray even a little from how Clang emitted things, and half the optimization passes would stop working. The Law of Leaky Abstractions in full effect.