Shader Cache

Moderator: GZDoom Developers

Rachael
Posts: 13562
Joined: Tue Jan 13, 2004 1:31 pm
Preferred Pronouns: She/Her

Shader Cache

Post by Rachael »

As evidenced here, people who use Intel GPUs are in desperate need of a shader cache. In my tests, unless I run my processor at full turbo (which, for reasons you'll figure out later in this post, I don't like to do), it can take upwards of 1-2 minutes to start GZDoom.

The reason I am using my Intel GPU more lately is that GZDoom has been getting more and more demanding on the NVidia GPU, which makes my fans go crazy, and I don't like that sound. I still get a smooth frame rate on Intel anyway, so I figure why not just use that instead? The computer runs silently this way, and that's a better experience for me.

So is it possible to do such a thing? One way to do it would be to cache the compiled binaries keyed on the Git version, writing them to GZDoom's cache folder via the "ARB_get_program_binary" extension. To save on disk space it could delete anything that hasn't been used in over a month.
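As a rough illustration of the pruning part only, here is a standalone sketch using C++17's std::filesystem. This is not GZDoom code; the function name, directory layout, and 30-day cutoff are all assumptions for illustration.

```cpp
#include <chrono>
#include <filesystem>

namespace fs = std::filesystem;

// Delete cached shader binaries that haven't been touched in about a month.
// 'cacheDir' stands in for GZDoom's cache folder; returns how many were removed.
inline int PruneShaderCache(const fs::path &cacheDir,
                            fs::file_time_type::duration maxAge = std::chrono::hours(24 * 30))
{
    int removed = 0;
    if (!fs::exists(cacheDir))
        return removed;

    const auto now = fs::file_time_type::clock::now();
    for (const auto &entry : fs::directory_iterator(cacheDir))
    {
        // Anything whose last write is older than the cutoff gets deleted.
        if (entry.is_regular_file() && now - entry.last_write_time() > maxAge)
        {
            fs::remove(entry.path());
            ++removed;
        }
    }
    return removed;
}
```

A real implementation would also need to key entries so a stale binary is never loaded, which is what the Git-version idea above covers.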
Graf Zahl
Lead GZDoom+Raze Developer
Posts: 49067
Joined: Sat Jul 19, 2003 10:19 am
Location: Germany

Re: Shader Cache

Post by Graf Zahl »

1-2 minutes? For me a complete shader recompile takes 10 seconds. Intel must really be doing something wrong... :?
dpJudas
Posts: 3040
Joined: Sat May 28, 2016 1:01 pm

Re: Shader Cache

Post by dpJudas »

@Rachael: if you want to code support for this, what you need to do is call glGetProgramBinary right after each glLinkProgram call. Then, on the next launch, instead of calling glCompileShader and glLinkProgram, it should call glProgramBinary with the output from the previous call to glGetProgramBinary.
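A minimal sketch of that save/load cycle might look like the following. This assumes a current OpenGL context with ARB_get_program_binary available and a GL loader (glad, GLEW, or similar) already initialized; the function names are hypothetical, not GZDoom's, and error handling is trimmed.

```cpp
#include <vector>
// plus your GL loader header, e.g. <glad/glad.h> or <GL/glew.h>

// After a successful glLinkProgram: ask the driver for its compiled binary.
std::vector<char> SaveProgramBinary(GLuint program, GLenum &formatOut)
{
    GLint length = 0;
    glGetProgramiv(program, GL_PROGRAM_BINARY_LENGTH, &length);
    std::vector<char> binary(length);
    glGetProgramBinary(program, length, nullptr, &formatOut, binary.data());
    return binary;  // persist binary + formatOut in the cache file
}

// On the next launch: feed the cached binary back instead of compiling.
bool LoadProgramBinary(GLuint program, GLenum format, const std::vector<char> &binary)
{
    glProgramBinary(program, format, binary.data(), (GLsizei)binary.size());
    GLint status = GL_FALSE;
    glGetProgramiv(program, GL_LINK_STATUS, &status);
    return status == GL_TRUE;  // false means the binary is stale: recompile normally
}
```

Note that glProgramBinary is allowed to reject a cached binary at any time (driver update, different GPU), so the normal compile-and-link path has to remain as a fallback.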
Graf Zahl
Lead GZDoom+Raze Developer
Posts: 49067
Joined: Sat Jul 19, 2003 10:19 am
Location: Germany

Re: Shader Cache

Post by Graf Zahl »

Furthermore, it should only be active for Intel; with NVidia and AMD it makes absolutely no sense to cache the shader binaries ourselves.
Rachael
Posts: 13562
Joined: Tue Jan 13, 2004 1:31 pm
Preferred Pronouns: She/Her

Re: Shader Cache

Post by Rachael »

Alright. I might be able to manage that.

If I do this, is it going to interfere with the backend abstraction refactor?
Graf Zahl
Lead GZDoom+Raze Developer
Posts: 49067
Joined: Sat Jul 19, 2003 10:19 am
Location: Germany

Re: Shader Cache

Post by Graf Zahl »

If you keep your code inside gl_shader.cpp, no.
Marisa the Magician
Posts: 3886
Joined: Fri Feb 08, 2008 9:15 am
Preferred Pronouns: She/Her
Operating System Version (Optional): (btw I use) Arch
Graphics Processor: nVidia with Vulkan support
Location: Vigo, Galicia

Re: Shader Cache

Post by Marisa the Magician »

If you're doing a shader cache for Intel, it should be only for Windows. On Linux (and pretty much any other platform that uses Mesa) there already is a system-wide shader cache by default on most distros.
drfrag
Vintage GZDoom Developer
Posts: 3141
Joined: Fri Apr 23, 2004 3:51 am
Location: Spain

Re: Shader Cache

Post by drfrag »

What happened with this? Any technical problems?
Rachael
Posts: 13562
Joined: Tue Jan 13, 2004 1:31 pm
Preferred Pronouns: She/Her

Re: Shader Cache

Post by Rachael »

I haven't started yet. Needless to say, the posts in this thread instituting arbitrary rules on this have killed my motivation. The reasons behind them are understandable, but easily mitigated: hashing the GPU name together with the pre-compiled code would make such restrictions wholly unnecessary.
drfrag
Vintage GZDoom Developer
Posts: 3141
Joined: Fri Apr 23, 2004 3:51 am
Location: Spain

Re: Shader Cache

Post by drfrag »

I might look into it myself, but I'd only check the graphics vendor.
Edit: Testing could be a problem, since I only have access to my sister-in-law's laptop once every few days. I chose AMD over Intel for my new "crappy" laptop, but it's great compared to my three Pentium M ones (Radeon 7000, Radeon 9200, Intel GMA 900). Unfortunately it's a GMA 900 and not a 950 like yours, so it's even slower. BTW you should change the thermal paste (in case you didn't). And let's not forget my P4 2.8 with a Trident Blade XP, the crappiest of them all. :)
Rachael
Posts: 13562
Joined: Tue Jan 13, 2004 1:31 pm
Preferred Pronouns: She/Her

Re: Shader Cache

Post by Rachael »

To be quite frank, you shouldn't even need to do that.

There are two things that can invalidate a compiled shader: the driver doing the compiling, and the shader program itself. As long as those two stay consistent, the cache stays valid, and that is what a shader cache is meant to help with. If you have a dual-GPU system, fine, all is dandy there: you compile two sets of shaders instead of one. It shouldn't be that complicated. Either way, Intel benefits, and NVidia sees no real drawbacks.

If a shader changes, obviously it has to be recompiled. If you hash the original program properly, that's not an issue; it simply means there is no cache entry yet for the new shader source's hash. So? Recompile! Simple!

If you change your GPU, that again means the hash changes. Simply recompile; problem solved.

As for old caches hanging around, that's what the prune is meant to solve: anything around two months old is gone.
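The cache-key scheme described above could be sketched like this. FNV-1a is an arbitrary hash choice for illustration; in a real build the driver identity string would come from glGetString (GL_VENDOR, GL_RENDERER, GL_VERSION), and the function names are hypothetical.

```cpp
#include <cstdint>
#include <string>

// Mix a string into a running FNV-1a hash state.
inline uint64_t Fnv1a(uint64_t h, const std::string &s)
{
    for (unsigned char c : s)
    {
        h ^= c;
        h *= 1099511628211ULL;  // FNV 64-bit prime
    }
    return h;
}

// A cache key covering everything that can invalidate a compiled shader:
// the driver identity and the shader source itself.
inline uint64_t ShaderCacheKey(const std::string &driverIdent, const std::string &shaderSource)
{
    uint64_t h = 14695981039346656037ULL;  // FNV 64-bit offset basis
    h = Fnv1a(h, driverIdent);
    h = Fnv1a(h, shaderSource);
    return h;
}
```

If either the driver string or the shader source changes, the key changes, the cache lookup misses, and the shader is simply recompiled and re-cached.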

The issues brought up in this thread overcomplicate the original issue unnecessarily.
drfrag
Vintage GZDoom Developer
Posts: 3141
Joined: Fri Apr 23, 2004 3:51 am
Location: Spain

Re: Shader Cache

Post by drfrag »

I've seen dpJudas has added this. Well done! :wub: This was way more complicated than I expected; I probably couldn't have done it. :oops: :)
I've tested this with the vintage build only on nvidia.
Graf Zahl
Lead GZDoom+Raze Developer
Posts: 49067
Joined: Sat Jul 19, 2003 10:19 am
Location: Germany

Re: Shader Cache

Post by Graf Zahl »

I'd still want to disable this on cards that have a native shader cache. There it will only create some needless overhead.
Rachael
Posts: 13562
Joined: Tue Jan 13, 2004 1:31 pm
Preferred Pronouns: She/Her

Re: Shader Cache

Post by Rachael »

I really would prefer to do a CVAR check rather than a card check, and make the whole thing optional. And before you think I am insane, hear me out:

There is no guarantee that every NVidia and AMD card will have a cache. There is no guarantee that every Intel card won't. As a feature it's a godsend for those who don't have one, and it WILL reduce the number of reports appearing in Bugs and/or Technical Issues.

Even so, I've noticed NVidia's caches are, what, 2-3 megabytes? Peanuts. Intel's, on the other hand, is 37 megabytes. So overall, I think you're dramatically overstating the "harm" that this feature is doing.
Graf Zahl
Lead GZDoom+Raze Developer
Posts: 49067
Joined: Sat Jul 19, 2003 10:19 am
Location: Germany

Re: Shader Cache

Post by Graf Zahl »

The thing with NVidia is: it doesn't really cache a compiled binary! The entire feature is faked on their hardware so that third parties cannot analyze their implementation.

Return to “Closed Feature Suggestions [GZDoom]”