Re: [Blade of Agony] Road to Wolfenstein Devblog Part 07 | p
Posted: Thu May 16, 2019 10:42 am
I mostly concur with what Rachael said. The first impression is the most important one, and a bad one can be devastating.
Let's take one example: By default, some image-degrading postprocessing filters are active. No, they do not make the game look 'realistic' or 'cool', they make it look broken. The average user will have no clue why the visuals are so noisy. Worse, they have a huge impact on performance: on my Geforce 1060, in C3M3_A right after surfacing, the game runs at 39 FPS with those effects on versus 49 FPS with them off. And this is with a modern high end card and a fast CPU. Go a few generations back (i.e. what the average is for GZDoom's user base) and the frame rate will utterly tank under these effects.
There's a very good reason why GZDoom ships with lightmaps set to OFF: we know that many of our users cannot handle the feature. It is virtually the same here. The entire game is tuned to high end hardware, without giving the user any clue whatsoever how to make it run better. You absolutely cannot take a high end graphics card for granted. If these effects have such an easily measurable performance impact even on high end hardware, that spells Doom for more average GPUs, because the effect is 100% GPU overhead. A Geforce 1060 is roughly 4x faster than a Geforce 550Ti, for example, and 8x faster than a current Intel integrated chipset.
A little math: The default postprocessing effects cost 2-3 ms per frame on my system. Multiply that by 3 for a Geforce 550Ti and by 6 for a modern Intel GPU (3 and 6 rather than the full 4 and 8, to be conservative about other overhead) and you suddenly get frame times that make the effect game breaking. Have it on by default and people will start complaining left and right.
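To make the arithmetic concrete, here's a quick sketch of how a fixed per-frame GPU cost eats into frame rate as the hardware gets slower. The 60 FPS baseline and the 2.5 ms overhead are my own illustrative assumptions (the post only gives a 2-3 ms range on a 1060), and the 3x/6x multipliers are the conservative scaling factors from above:

```python
def fps_with_overhead(base_fps, overhead_ms):
    """FPS after adding a fixed per-frame cost in milliseconds."""
    frame_ms = 1000.0 / base_fps      # frame time without the effect
    return 1000.0 / (frame_ms + overhead_ms)

BASE_FPS = 60.0       # assumed baseline with postprocessing off
OVERHEAD_MS = 2.5     # ~2-3 ms measured on a Geforce 1060

# Scale the fixed GPU cost by how much slower each card is.
for card, scale in [("Geforce 1060", 1), ("Geforce 550Ti", 3), ("Intel iGPU", 6)]:
    fps = fps_with_overhead(BASE_FPS, OVERHEAD_MS * scale)
    print(f"{card}: {fps:.1f} FPS")
```

The fixed cost that shaves off a few FPS on a 1060 cuts the hypothetical slower cards down far harder, which is the whole point: a "cheap" effect on high end hardware is anything but cheap further down the range.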