[Added] Shader Cache

Postby Rachael » Sat Oct 27, 2018 8:33 am

As evidenced here, people who use Intel GPUs are in desperate need of a shader cache. In my tests, unless I run my processor at full turbo (which, for reasons you'll figure out later in this post, I don't like to do), it can take upwards of 1-2 minutes to start GZDoom.

The reason I have been using my Intel GPU more lately is that GZDoom has become increasingly demanding on the NVidia GPU, which sends my fans into overdrive, and I don't like that sound. I still get a smooth frame rate on Intel anyway, so I figure: why not just use that instead? The computer runs silently this way, and that's a better experience for me.

So is it possible to do such a thing? One idea would be to cache the binaries keyed to the Git version and write them to GZDoom's cache folder; it would have to use the "ARB_get_program_binary" extension. To save disk space, it could delete anything that hasn't been used in over a month.
Rachael
QZDoom + Webmaster
 
Joined: 13 Jan 2004

Re: Shader Cache

Postby Graf Zahl » Sat Oct 27, 2018 8:47 am

1-2 minutes? For me a complete shader recompile takes 10 seconds. Intel must really be doing something wrong... :?
Graf Zahl
Lead GZDoom Developer
 
Joined: 19 Jul 2003
Location: Germany

Re: Shader Cache

Postby dpJudas » Sat Oct 27, 2018 1:27 pm

@Rachael: if you want to code support for this, what you need to do is call glGetProgramBinary right after each glLinkProgram call. Then, on the next launch, instead of calling glCompileShader and glLinkProgram it should call glProgramBinary with the output from the previous call to glGetProgramBinary.
dpJudas
 
 
 
Joined: 28 May 2016

Re: Shader Cache

Postby Graf Zahl » Sat Oct 27, 2018 2:47 pm

Furthermore, it should only be active for Intel; it makes absolutely no sense to cache the shader binaries ourselves on NVidia and AMD.
Graf Zahl
Lead GZDoom Developer
 
Joined: 19 Jul 2003
Location: Germany

Re: Shader Cache

Postby Rachael » Sat Oct 27, 2018 2:56 pm

Alright. I might be able to manage that.

If I do this, is it going to interfere with the backend abstraction refactor?
Rachael
QZDoom + Webmaster
 
Joined: 13 Jan 2004

Re: Shader Cache

Postby Graf Zahl » Sat Oct 27, 2018 3:48 pm

If you keep your code inside gl_shader.cpp, no.
Graf Zahl
Lead GZDoom Developer
 
Joined: 19 Jul 2003
Location: Germany

Re: Shader Cache

Postby Marisa Kirisame » Sun Oct 28, 2018 5:34 am

If you're doing a shader cache for Intel, it should be Windows-only. On Linux (and pretty much any other platform that uses Mesa) there is already a system-wide shader cache, enabled by default on most distros.
Marisa Kirisame
ZScript Magician
 
 
 
Joined: 08 Feb 2008
Location: Vigo, Galicia
Discord: Marisa Kirisame#4689
Twitch ID: magusmarisa

Re: Shader Cache

Postby drfrag » Mon Nov 19, 2018 9:46 am

What happened with this? Any technical problems?
drfrag
I.R developer, I.R smart
Vintage GZDoom Developer
 
Joined: 23 Apr 2004
Location: Spain

Re: Shader Cache

Postby Rachael » Mon Nov 19, 2018 1:29 pm

I haven't started yet. Needless to say, the posts in this thread instituting arbitrary rules on this have killed my motivation. The reasons behind those rules are understandable, but hashing the GPU name together with the pre-compiled code would make such restrictions wholly unnecessary.
Rachael
QZDoom + Webmaster
 
Joined: 13 Jan 2004

Re: Shader Cache

Postby drfrag » Mon Nov 19, 2018 2:44 pm

I might look into it myself, but I'd only check the graphics vendor.
Edit: testing could be a problem, since I only have access to my sister-in-law's laptop once every few days. I chose AMD over Intel for my new "crappy" laptop, but it's great compared to my three Pentium M ones (Radeon 7000, Radeon 9200, Intel GMA 900). Unfortunately it's a GMA 900 and not a 950 like yours, so it's even slower. BTW, you should change the thermal paste (in case you haven't). And let's not forget my P4 2.8 with a Trident Blade XP, the crappiest of them all. :)
drfrag
I.R developer, I.R smart
Vintage GZDoom Developer
 
Joined: 23 Apr 2004
Location: Spain

Re: Shader Cache

Postby Rachael » Mon Nov 19, 2018 3:10 pm

To be quite frank, you shouldn't even need to do that.

There are two things that can change the shader compilation: the driver doing the compiling, and the shader program itself. As long as those two stay consistent, the cache stays valid - that is what the shader cache is meant to help you with. If you have a dual-GPU system - fine, all is dandy there; you compile two sets of shaders instead of one. It shouldn't be that complicated. Either way, Intel benefits, and NVidia sees no real drawbacks.

If a shader changes, obviously it has to be recompiled. If you properly hash the original shader source, that's not an issue - the new source simply produces a hash with no cache entry. So? Recompile! Simple!

If you change your GPU, that again changes the hash - simply recompile, problem solved.

As for old caches hanging around, that's what the prune is meant to solve - anything that is ~2 months old is gone.

The issues brought up in this thread overcomplicate the original issue unnecessarily.
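The hashing scheme described above - hash the GPU/driver identity together with the shader source, so any mismatch simply misses the cache and triggers a recompile - could look like this. FNV-1a is used purely for illustration, and none of this is GZDoom's actual code:

```cpp
#include <cstdint>
#include <cstdio>
#include <string>

// FNV-1a, used here only as a simple, stable hash for the cache key.
static uint64_t Fnv1a(uint64_t h, const std::string& s)
{
    for (unsigned char c : s)
    {
        h ^= c;
        h *= 1099511628211ULL;  // FNV prime
    }
    return h;
}

// Cache key covering everything that can invalidate a binary: the
// driver identity (the GL_VENDOR/GL_RENDERER/GL_VERSION strings) and
// the shader source itself.  Any change yields a different file name,
// so a stale entry is simply never found and the shader recompiles.
std::string MakeCacheKey(const std::string& vendor,
                         const std::string& renderer,
                         const std::string& version,
                         const std::string& shaderSource)
{
    uint64_t h = 14695981039346656037ULL;  // FNV offset basis
    h = Fnv1a(h, vendor);
    h = Fnv1a(h, renderer);
    h = Fnv1a(h, version);
    h = Fnv1a(h, shaderSource);

    char buf[17];
    std::snprintf(buf, sizeof buf, "%016llx",
                  static_cast<unsigned long long>(h));
    return std::string(buf) + ".bin";
}
```

Swapping the GPU, updating the driver, or editing a shader all change one of the inputs, so each case falls out of the same "no entry found, recompile" path - no vendor checks needed.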
Rachael
QZDoom + Webmaster
 
Joined: 13 Jan 2004

Re: Shader Cache

Postby drfrag » Wed Nov 21, 2018 3:42 pm

I've seen dpJudas has added this. Well done! :wub: This was way more complicated than I expected; I probably couldn't have done it. :oops: :)
I've tested this with the vintage build, only on NVidia.
drfrag
I.R developer, I.R smart
Vintage GZDoom Developer
 
Joined: 23 Apr 2004
Location: Spain

Re: Shader Cache

Postby Graf Zahl » Wed Nov 21, 2018 4:09 pm

I'd still want to disable this on cards that have a native shader cache. There it will only create some needless overhead.
Graf Zahl
Lead GZDoom Developer
 
Joined: 19 Jul 2003
Location: Germany

Re: Shader Cache

Postby Rachael » Wed Nov 21, 2018 5:20 pm

I really would prefer, instead of doing a card check, to do a CVAR check instead, and make the whole thing optional. And before you think I am insane, hear me out:

There is no guarantee every NVidia or AMD card will have a cache, and no guarantee every Intel won't. For those who don't have such a feature it's a godsend, and it WILL reduce the number of reports appearing in Bugs and/or Technical Issues.

Even so - I've noticed NVidia's caches are like 2-3 megabytes, maybe? Peanuts. Intel's, on the other hand, is 37 megabytes. So overall, I think you're dramatically overstating the "harm" this feature does.
Rachael
QZDoom + Webmaster
 
Joined: 13 Jan 2004

Re: Shader Cache

Postby Graf Zahl » Wed Nov 21, 2018 6:06 pm

The thing with NVidia is: It doesn't really cache a compiled binary! The entire feature is a fake on their hardware so that third parties cannot analyze their hardware implementation.
Graf Zahl
Lead GZDoom Developer
 
Joined: 19 Jul 2003
Location: Germany

