ZScript Discussion

Ask about ACS, DECORATE, ZScript, or any other scripting questions here!

Moderator: GZDoom Developers

Forum rules
Before asking how to use a ZDoom feature, read the ZDoom wiki first. If you still don't understand how to use a feature, ask here.

Please bear in mind that the people helping you do not automatically know how much you know. You may be asked to upload your project file so it can be looked at. Don't be afraid to ask questions about what things mean, but also please be patient with the people trying to help you. (And helpers, please be patient with the person you're trying to help!)
AFADoomer
Posts: 1337
Joined: Tue Jul 15, 2003 4:18 pm

Re: ZScript Discussion

Post by AFADoomer »

Nash wrote:Man, I wish there was a way to only compile ZScript without actually running the game. I'm guessing the sum of the total accumulated time every time my ZScript fails to make the game start has reached hours by now, haha...
Not the fastest route, since you're still essentially loading the game, but a batch file that looks something like this works for me:

Code:

gzdoom.exe -file myfile.pk3 +log logfile.txt -stdout -norun
pause
This dumps the output both to a text file and to the command prompt window, so I can immediately review the errors once it's done. '-norun' causes the engine to load only up to right before video is initialized and then exit, so it's at least slightly faster than having to exit the game yourself.
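The same idea can be wrapped in a small script that runs the engine with '-norun' and filters the log for compile errors. A rough Python sketch - the gzdoom binary name and the 'Script error' log prefix are assumptions about your setup, so adjust as needed:

```python
import re
import subprocess


def parse_script_errors(log_text):
    """Collect lines that look like script compile errors.

    Matching on a leading 'Script error' / 'Error' is an assumption
    about the log format; adjust the pattern for your GZDoom version.
    """
    pattern = re.compile(r"^(script error|error)", re.IGNORECASE)
    return [line for line in log_text.splitlines()
            if pattern.match(line.strip())]


def check_mod(pk3_path, gzdoom="gzdoom.exe"):
    # -norun exits right before video init; -stdout mirrors the log
    # to the console so we can capture it here.
    result = subprocess.run(
        [gzdoom, "-file", pk3_path, "-norun", "-stdout"],
        capture_output=True, text=True)
    return parse_script_errors(result.stdout)
```

Calling check_mod('myfile.pk3') then returns only the error lines, so a failed compile is obvious at a glance.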
Nash
Posts: 17465
Joined: Mon Oct 27, 2003 12:07 am
Location: Kuala Lumpur, Malaysia

Re: ZScript Discussion

Post by Nash »

That works for me. :D Cheers!
Nash
Posts: 17465
Joined: Mon Oct 27, 2003 12:07 am
Location: Kuala Lumpur, Malaysia

Re: ZScript Discussion

Post by Nash »

Theoretical question: would there be performance gains if ZScript had to be pre-compiled?

[like Unreal Engine's .uc -> .u]

Or is the entire system already running as fast as it can?
Major Cooke
Posts: 8196
Joined: Sun Jan 28, 2007 3:55 pm
Preferred Pronouns: He/Him
Location: QZDoom Maintenance Team

Re: ZScript Discussion

Post by Major Cooke »

Only if something like JIT happened (not necessarily JIT itself, mind), where the code is compiled down to native code.
Matt
Posts: 9696
Joined: Sun Jan 04, 2004 5:37 pm
Preferred Pronouns: They/Them
Operating System Version (Optional): Debian Bullseye
Location: Gotham City SAR, Wyld-Lands of the Lotus People, Dominionist PetroConfederacy of Saudi Canadia

Re: ZScript Discussion

Post by Matt »

Can we please not even theoretically go there!? :puke:
Kinsie
Posts: 7402
Joined: Fri Oct 22, 2004 9:22 am
Graphics Processor: nVidia with Vulkan support
Location: MAP33

Re: ZScript Discussion

Post by Kinsie »

Nash wrote:Theoretical question: would there be performance gains if ZScript had to be pre-compiled?

[like Unreal Engine's .uc -> .u]

Or is the entire system already running as fast as it can?
This seems like a nice way to allow certain persons to only ship compiled scripts without the appropriate source, à la ACS. Which is to say, I don't think it's a good idea.
Rachael
Posts: 13797
Joined: Tue Jan 13, 2004 1:31 pm
Preferred Pronouns: She/Her

Re: ZScript Discussion

Post by Rachael »

Nash wrote:Theoretical question: would there be performance gains if ZScript had to be pre-compiled?

[like Unreal Engine's .uc -> .u]

Or is the entire system already running as fast as it can?
ZScript already compiles internally at startup; the only thing it would improve is startup times.
Kinsie wrote:This seems like a nice way to allow certain persons to only ship compiled scripts without the appropriate source, à la ACS. Which is to say, I don't think it's a good idea.
Which contributes in small part to why I agree with this.
ZZYZX
Posts: 1384
Joined: Sun Oct 14, 2012 1:43 am
Location: Ukraine

Re: ZScript Discussion

Post by ZZYZX »

Nash wrote:Theoretical question: would there be performance gains if ZScript had to be pre-compiled?
None, but it would also introduce various issues with binary incompatibility (currently even offsets into structures are stored as ints) and with people hiding their possibly malicious source code.
Currently GZDoom's savegames are completely incompatible between major stable releases (I've seen a report on a Russian Doom board where someone tried to open a 2.3 save with GZDoom 2.4; it resulted in monsters walking through walls and shit, and this didn't even fix itself on the next level, only on a new game...). This is exactly because savegames store partial object data, and if it doesn't match, everything breaks.
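A toy Python sketch of why layout-dependent object data breaks like this (field names invented, nothing here is GZDoom's actual format): a positional blob silently scrambles values once the field layout changes between versions, while a name-keyed format - the direction the JSON savegames take - degrades gracefully:

```python
OLD_LAYOUT = ["health", "armor", "x"]          # version A's field order
NEW_LAYOUT = ["health", "x", "armor", "mass"]  # version B reordered and grew


def save_positional(obj, layout):
    # Raw dump: only values are stored, in layout order.
    return [obj[name] for name in layout]


def load_positional(blob, layout):
    # No names in the blob, so values match up purely by position.
    return dict(zip(layout, blob))


def save_named(obj):
    # Keys travel with the values, as in a JSON savegame.
    return dict(obj)


def load_named(blob, layout, defaults):
    # Unknown fields are ignored, missing ones get defaults.
    return {name: blob.get(name, defaults.get(name)) for name in layout}


actor = {"health": 100, "armor": 50, "x": 320}

# Version A saves, version B loads: armor's value lands in 'x'.
broken = load_positional(save_positional(actor, OLD_LAYOUT), NEW_LAYOUT)

# The name-keyed round trip survives the layout change.
intact = load_named(save_named(actor), NEW_LAYOUT, {"mass": 1000})
```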

Also, I'm strongly against the idea of allowing source-less ZScript.
Not only is closed source ideologically bad in the world of non-profit art, it's also actually dangerous to allow things like that the more advanced ZScript becomes.
I mean, it's not like people are thoroughly examining the source code of every WAD they run, but then again, if ZScript ever gets direct FS access of any kind, it'd make sense to have a permissioning system that disallows calling certain methods unless the user allows it explicitly with a command-line switch or something.
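What such a permissioning system could look like, sketched in Python - this is entirely hypothetical, nothing like it exists in GZDoom, and every name below is invented:

```python
class CapabilityError(Exception):
    """Raised when a script calls something the user hasn't allowed."""


class ScriptSandbox:
    def __init__(self, granted=()):
        # Would be populated from explicit command-line switches,
        # e.g. a hypothetical +grant fs.read.
        self.granted = set(granted)

    def require(self, capability):
        if capability not in self.granted:
            raise CapabilityError(
                f"script needs '{capability}'; run with it granted to allow")

    def read_file(self, path):
        # Dangerous script-facing calls check their capability first.
        self.require("fs.read")
        with open(path) as f:
            return f.read()


untrusted = ScriptSandbox()            # default: nothing granted
trusted = ScriptSandbox(["fs.read"])   # user opted in explicitly
```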
Rachael wrote:ZScript already compiles internally at startup; the only thing it would improve is startup times.
Script compilation takes a second or so on ZScript-heavy wads. Just saying. That's nothing compared to the 5+ (usually 10+) seconds spent on texture conversion during the "init refresh something" stage.
Rachael
Posts: 13797
Joined: Tue Jan 13, 2004 1:31 pm
Preferred Pronouns: She/Her

Re: ZScript Discussion

Post by Rachael »

And yeah - with everything ZZYZX said - all the more reason precompiled ZScript is a bad thing. Let's just leave it open as it is. :)
Graf Zahl
Lead GZDoom+Raze Developer
Posts: 49184
Joined: Sat Jul 19, 2003 10:19 am
Location: Germany

Re: ZScript Discussion

Post by Graf Zahl »

Nash wrote:Theoretical question: would there be performance gains if ZScript had to be pre-compiled?

[like Unreal Engine's .uc -> .u]

Or is the entire system already running as fast as it can?
ZScript cannot be precompiled: the compiler creates a lot of data that is needed at run time, and the script code itself also heavily depends on being able to address static game content as constants, none of which can be done with precompilation. The generated data is also machine-dependent, i.e. it differs between 32- and 64-bit targets.
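The architecture dependence is easy to demonstrate: any field that follows a pointer sits at a different offset on 32-bit and 64-bit builds, so offsets baked into generated code can't be target-neutral. A small ctypes illustration (the struct is invented, not GZDoom's actual layout):

```python
import ctypes


class Actor(ctypes.Structure):
    _fields_ = [
        # A pointer field: 4 bytes on 32-bit targets, 8 on 64-bit.
        ("target", ctypes.c_void_p),
        ("health", ctypes.c_int32),
    ]


pointer_size = ctypes.sizeof(ctypes.c_void_p)
# The offset of 'health' - which generated code would have to bake
# in - shifts with the pointer size of the build.
health_offset = Actor.health.offset
```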

At the moment the compile time for the internal scripts is around 0.2 s, depending on the CPU's speed.

Aside from technical issues, let me make abundantly clear that I consider compiled scripting languages in open-source projects the antithesis of what these projects stand for - that includes ACS, btw.
Graf Zahl
Lead GZDoom+Raze Developer
Posts: 49184
Joined: Sat Jul 19, 2003 10:19 am
Location: Germany

Re: ZScript Discussion

Post by Graf Zahl »

ZZYZX wrote:This is exactly because savegames store partial object data, and if it doesn't match, everything breaks.
The reason savegames between 2.3 and 2.4 are incompatible has nothing to do with partial saves; the reason simply is that a lot of internal stuff got changed between these versions (e.g. the entire inventory system has changed its internal layout and saves data differently). I should have bumped the savegame version, but I simply forgot; I only did it in the post-2.4 branch, where I introduced a real breaking change. Once the current rework is done it should stabilize and do what the JSON rewrite was supposed to accomplish: ensure long-term stability. But that can only happen if the engine doesn't see constant fundamental refactoring.
ZZYZX wrote: Also, I'm strongly against the idea of allowing source-less ZScript.
Not only is closed source ideologically bad in the world of non-profit art, it's also actually dangerous to allow things like that the more advanced ZScript becomes.
Not only that - it'd also prevent future changes, like implementing a faster VM, doing JIT compilation (which would work better if done directly from the IR instead of the byte code), and fixing code generation errors. This isn't GLSL, where 3 vendors implemented 6 different compilers.

And let's not forget: compiled scripts can only be done if the internal scripts are also compiled. Worse, it'd block many potential internal improvements. All summed up, it's a bad idea from start to finish - it's wrong ideologically and would require a gigantic amount of work.

Here's a fun story: 6 years ago I was working on a scripting engine whose lead designer got all the wrong ideas about efficiency. He actually did implement all the crap, from an intermediate binary representation to a linker to a final binary format - and nothing in that thing worked right. It was a bug-ridden mess that couldn't be fixed, because all those binary formats constantly got in the way and this guy couldn't be convinced to simplify his IR and compile it directly to the final byte code. Needless to say, that thing was orders of magnitude slower than the ZScript compiler. Compiling half a megabyte of script code took several seconds - because every file was compiled on its own and, in order to resolve its external symbols, had to pull in everything it referenced, i.e. each file was processed up to 30-40 times per compile run. That's where things would end up with binary precompilation.

I know one thing: should I ever professionally need a homegrown scripting language, I'll use ZScript as a base. It may not be perfect, but since I had that bad example as a guideline, I was able to avoid the design errors it made.
Last edited by Graf Zahl on Mon Mar 27, 2017 2:52 am, edited 1 time in total.
Nash
Posts: 17465
Joined: Mon Oct 27, 2003 12:07 am
Location: Kuala Lumpur, Malaysia

Re: ZScript Discussion

Post by Nash »

Alright, thanks for the technical info.

And of course I agree; closed-source hoarding like in the QuakeC community is something I utterly despise. That's the exact reason why I'm not modding for Quake after all these years. I couldn't learn shit.
ZZYZX
Posts: 1384
Joined: Sun Oct 14, 2012 1:43 am
Location: Ukraine

Re: ZScript Discussion

Post by ZZYZX »

Graf Zahl wrote:Here's a fun story: 6 years ago I was working on a scripting engine whose lead designer got all the wrong ideas about efficiency. He actually did implement all the crap, from an intermediate binary representation to a linker to a final binary format - and nothing in that thing worked right. It was a bug-ridden mess that couldn't be fixed, because all those binary formats constantly got in the way and this guy couldn't be convinced to simplify his IR and compile it directly to the final byte code. Needless to say, that thing was orders of magnitude slower than the ZScript compiler. Compiling half a megabyte of script code took several seconds - because every file was compiled on its own and, in order to resolve its external symbols, had to pull in everything it referenced, i.e. each file was processed up to 30-40 times per compile run. That's where things would end up with binary precompilation.
Technically, the main problem here is the absence of caching and modules (in the Python sense) and the presence of a linker instead - not that it was compiling to bytecode. I get the main idea, though.
This problem haunts all these complex compiler toolchains that, for whatever reason, try to be as separate as possible - not in the actual compiler structure, but in the executables used... (I hate C++ header-heavy libraries like Boost.)
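The difference a cache makes can be shown with a toy dependency graph in Python (module names invented): without one, a shared dependency is recompiled once per reference - the same multiplied reprocessing described in the story above - while with one, each module compiles exactly once:

```python
compile_count = {}
cache = {}

# Invented module graph: 'util' is referenced by two other modules.
DEPENDENCIES = {
    "game": ["actors", "weapons"],
    "actors": ["util"],
    "weapons": ["util", "actors"],
    "util": [],
}


def compile_module(name, use_cache=True):
    # With the cache, a module already compiled is reused outright.
    if use_cache and name in cache:
        return cache[name]
    compile_count[name] = compile_count.get(name, 0) + 1
    for dep in DEPENDENCIES[name]:
        compile_module(dep, use_cache)
    result = f"<compiled {name}>"
    if use_cache:
        cache[name] = result
    return result


compile_module("game")                    # cached: each module once
cached_counts = dict(compile_count)

compile_count.clear()
compile_module("game", use_cache=False)   # uncached: shared deps repeat
uncached_counts = dict(compile_count)
```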
Graf Zahl
Lead GZDoom+Raze Developer
Posts: 49184
Joined: Sat Jul 19, 2003 10:19 am
Location: Germany

Re: ZScript Discussion

Post by Graf Zahl »

ZZYZX wrote: Technically, the main problem here is the absence of caching and modules (in the Python sense) and the presence of a linker instead - not that it was compiling to bytecode. I get the main idea, though.
Correct, but the need to compile to static byte code made some things a bit more complicated, too.

And that part is the biggest issue with ZScript.

ZScript generates run-time types while it compiles - all of these would need to be stored in the bytecode as well. That's easy for integral types but quite tricky if the type system can liberally create complex types, and the game needs the types for serialization, parameter checking and many other things.
ZScript also statically resolves references to class types, sound indices, names, fonts, global CVARs and functions.
So the least a static byte code format would need is a relocation table for all of this, unless the code were to look it up dynamically.
And last but not least, the different size of pointers makes field indices architecture-dependent. I think something like this doomed Eternity's flirt with the Small language: it took the worst of both worlds - compiling to byte code while making the output architecture-dependent - a recipe for guaranteed failure.
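The relocation-table idea can be sketched in a few lines of Python - opcode names, symbols and indices are all invented for illustration. The stored byte code keeps symbolic references, and a loader patches in the indices that only exist at run time:

```python
def link(bytecode, relocations, symbols):
    """Patch placeholder operands with indices resolved on this host.

    bytecode:    list of (opcode, operand) pairs; None marks an
                 unresolved operand
    relocations: {instruction index: symbol name}
    symbols:     {symbol name: runtime index}, e.g. this session's
                 sound or class tables
    """
    linked = list(bytecode)
    for pos, name in relocations.items():
        opcode, _ = linked[pos]
        # A missing symbol raises KeyError, i.e. a load-time link error.
        linked[pos] = (opcode, symbols[name])
    return linked


# What would be stored on disk: code plus a relocation table.
stored = [("PUSH", 10), ("PLAY_SOUND", None), ("SPAWN_CLASS", None)]
relocs = {1: "weapons/shotgf", 2: "DoomImp"}

# What only the running engine knows; these indices can change per session.
runtime_symbols = {"weapons/shotgf": 37, "DoomImp": 5}

program = link(stored, relocs, runtime_symbols)
```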
dpJudas
Posts: 3134
Joined: Sat May 28, 2016 1:01 pm

Re: ZScript Discussion

Post by dpJudas »

Having toyed a bit with LLVM of late, I find that any IR that has to be backwards compatible (which any bytecode does) causes so many problems when it comes to passing optimization opportunities from the frontend to the backend.

For example, telling the backend in LLVM to unroll certain loops required adding an entire "metadata" system to their IR model, and some of the intrinsic functions weren't implemented in all their backends. The worst part was that the optimizing backend ended up relying on IR that looked exactly like Clang's output - stray even a little from how Clang emitted things and half the optimization passes would not work. The Law of Leaky Abstractions in full effect.
