When did ATI performance on GZ stop being equal to nvidia?
-
- Posts: 431
- Joined: Fri Aug 03, 2012 6:44 am
When did ATI performance on GZ stop being equal to nvidia?
I'm shopping for a card that will work in both types of AGP slots and also has OpenGL 2 support. They're really, really hard to find: a lot of the GeForce FX series don't have hardware OpenGL 2, and anything newer than that generation is only keyed for the newer AGP slot. I'm finding a few Radeon 9600s, but I'm not sure what ATI was like back then? Also got a 3650, which is much closer to when development was halted over the whole ATI issue (2010).
-
- Posts: 13718
- Joined: Tue Jan 13, 2004 1:31 pm
- Preferred Pronouns: She/Her
Re: When did ATI performance on GZ stop being equal to nvidi
It never was. ATI's draw call performance was terrible from the very beginning. That is using proprietary drivers, though. OSS drivers might be a different story - from what I understand ATI's hardware is very powerful, but it's the drivers themselves that are always such shit.
Obviously this means the performance comparison will probably look dramatically different between the two on Linux versus Windows.
-
- Lead GZDoom+Raze Developer
- Posts: 49130
- Joined: Sat Jul 19, 2003 10:19 am
- Location: Germany
Re: When did ATI performance on GZ stop being equal to nvidi
ATI performance has never been equal to NVidia. The drawcall performance issue has been present since the earliest days I know of, i.e. the Radeon 9xxx series. NVidia never had that; even a lowly Geforce 4 had low drawcall overhead. Of course, back in the day graphics hardware performance was a far larger issue than today. A low end card like the Geforce 5200 couldn't even remotely hold up to a Geforce 5900, for example.
A good indicator for this is the texture atlases in the software renderer. Apparently performance on SM 1.4 cards, i.e. the low end Radeon 9xxx models, was bad enough that even for a 2D HUD with few elements it was necessary to manually batch draw calls as much as possible. On the other hand I'm not sure, because this was Randi, who had a tendency for doing optimizations which neither did the code any favors nor provided any real improvement. I'm still occasionally faced with such code and most of the time it can be removed without a second thought, as it won't break anything.
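To make concrete what manual batching means here (a rough fixed-function OpenGL sketch in C++, not the actual ZDoom code): instead of one draw call per HUD element, quads that share the same atlas texture are collected and flushed with a single glDrawArrays.
[code]
// Sketch of manual 2D batching - illustrative only, not ZDoom's code.
#include <GL/gl.h>
#include <vector>

struct HudVertex { float x, y, u, v; };

class HudBatcher
{
    std::vector<HudVertex> verts;
    GLuint atlasTexture = 0;

public:
    void AddQuad(GLuint tex, float x, float y, float w, float h,
                 float u0, float v0, float u1, float v1)
    {
        if (tex != atlasTexture) Flush();   // a texture change forces a flush
        atlasTexture = tex;
        // two triangles per quad
        verts.push_back({x,     y,     u0, v0});
        verts.push_back({x + w, y,     u1, v0});
        verts.push_back({x + w, y + h, u1, v1});
        verts.push_back({x,     y,     u0, v0});
        verts.push_back({x + w, y + h, u1, v1});
        verts.push_back({x,     y + h, u0, v1});
    }

    void Flush()
    {
        if (verts.empty()) return;
        glBindTexture(GL_TEXTURE_2D, atlasTexture);
        glEnableClientState(GL_VERTEX_ARRAY);
        glEnableClientState(GL_TEXTURE_COORD_ARRAY);
        glVertexPointer(2, GL_FLOAT, sizeof(HudVertex), &verts[0].x);
        glTexCoordPointer(2, GL_FLOAT, sizeof(HudVertex), &verts[0].u);
        glDrawArrays(GL_TRIANGLES, 0, (GLsizei)verts.size()); // one call for the whole batch
        verts.clear();
    }
};
[/code]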
And never forget that fog is very much broken on any ATI hardware before the implementation of shaders. They had some weird screen space clipping issue in their fog routines that turned it into a visual mess - and they never ever fixed the bug, despite having it reported several times over the years. Apparently commercial games did not use OpenGL fog and it was deemed not relevant enough.
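For reference, here is roughly what the two fog paths look like (a minimal sketch under generic OpenGL assumptions, not GZDoom's actual code): the fixed-function setup is the one that hit ATI's clipping bug, while the shader version computes the fog factor per fragment and bypasses the driver's fog code entirely.
[code]
// Sketch only - illustrative, not GZDoom's actual implementation.
#include <GL/gl.h>

// The fixed-function path: this is what ATI's broken fog routines mangled.
void SetupFixedFunctionFog(float density, const float color[4])
{
    glEnable(GL_FOG);
    glFogi(GL_FOG_MODE, GL_EXP);
    glFogf(GL_FOG_DENSITY, density);
    glFogfv(GL_FOG_COLOR, color);
}

// The shader path: once SM 2.0-class hardware could run this fast enough,
// the driver's fog code no longer mattered. Minimal GLSL, embedded as a
// C++ string for illustration.
static const char *fogFragmentShader = R"(
    uniform sampler2D tex;
    uniform vec4  fogColor;
    uniform float fogDensity;
    varying float fogDist;   // eye-space distance from the vertex shader

    void main()
    {
        vec4 base = texture2D(tex, gl_TexCoord[0].st);
        // exponential fog: 1.0 = no fog, 0.0 = fully fogged
        float f = clamp(exp(-fogDensity * fogDist), 0.0, 1.0);
        gl_FragColor = mix(fogColor, base, f);
    }
)";
[/code]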
Ironically the drawcall overhead issue became far more problematic at the time when shaders became fast enough to render fog properly. Running the game on a Geforce 8600 vs. a high end ATI card of the same vintage was a night and day difference with the mid range Geforce literally running circles around ATI's most expensive product.
-
- Vintage GZDoom Developer
- Posts: 3146
- Joined: Fri Apr 23, 2004 3:51 am
- Location: Spain
Re: When did ATI performance on GZ stop being equal to nvidi
FX cards had GL 2.0 support and probably even incomplete 2.1 support. I know they work with the old renderer for sure. But have you actually tested the card?
I remember the Radeon 9600 was much faster than the FX cards with SM 2.0. BTW, on SM 1.4 cards the old truecolor renderer has serious graphics glitches, and the new one probably does too.
-
- Lead GZDoom+Raze Developer
- Posts: 49130
- Joined: Sat Jul 19, 2003 10:19 am
- Location: Germany
Re: When did ATI performance on GZ stop being equal to nvidi
The shaders on NVidia's FX series were a total disaster. ATI was definitely better back then. But for vintage GZDoom that's mostly irrelevant, because it only got working shader support after I had a Geforce 8600 a few years later, and shader support was never active on that old hardware.
Regarding SM 1.4, it is very likely that support will be removed after the next release. This is the classic case of stuff so old that it cannot be tested anymore. I plan to raise the requirements to SM 2.0 plus NPOT texture support to be able to do some necessary refactoring of the 2D drawing, and these old cards stand in the way.
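The kind of startup check that requirement implies would look roughly like this (a hypothetical sketch; the extension strings are the standard OpenGL ones, but this is not the actual GZDoom code):
[code]
// Hypothetical sketch of a minimum-requirements check, not the actual code.
#include <GL/gl.h>
#include <cstring>

// Crude substring match - good enough for a sketch, though a real check
// should match whole extension tokens.
static bool HasExtension(const char *name)
{
    const char *ext = reinterpret_cast<const char *>(glGetString(GL_EXTENSIONS));
    return ext != nullptr && strstr(ext, name) != nullptr;
}

bool MeetsMinimumRequirements()
{
    return HasExtension("GL_ARB_shading_language_100") &&      // SM 2.0-class shaders
           HasExtension("GL_ARB_texture_non_power_of_two");    // NPOT textures
}
[/code]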
-
- Posts: 431
- Joined: Fri Aug 03, 2012 6:44 am
Re: When did ATI performance on GZ stop being equal to nvidi
Graf Zahl wrote:
The shaders on NVidia's FX series were a total disaster. ATI was definitely better back then. But for vintage GZDoom that's mostly irrelevant, because it only got working shader support after I had a Geforce 8600 a few years later, and shader support was never active on that old hardware.
Regarding SM 1.4, it is very likely that support will be removed after the next release. This is the classic case of stuff so old that it cannot be tested anymore. I plan to raise the requirements to SM 2.0 plus NPOT texture support to be able to do some necessary refactoring of the 2D drawing, and these old cards stand in the way.

How about compared to Risen3D? I think that port was created around the time you would have had your 6800.
-
- Lead GZDoom+Raze Developer
- Posts: 49130
- Joined: Sat Jul 19, 2003 10:19 am
- Location: Germany
Re: When did ATI performance on GZ stop being equal to nvidi
The only port that ever performed better than GZDoom is PrBoom. Which is not really surprising because both renderers share the same roots.
Both are also the ones with the simplest render loop. Most of the other ports lose all their time updating separate render data from the game data, instead of rendering from the game data directly.
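A deliberately simplified illustration of that difference (hypothetical code, not taken from any of the actual ports):
[code]
// Hypothetical sketch contrasting the two renderer architectures.

struct sector_t { float lightlevel; int floorpic; };  // game-side data

void DrawFlat(int pic, float light);  // stand-in for the actual draw routine

// GZDoom/PrBoom style: render straight from the game data every frame.
void RenderSectorDirect(const sector_t &sec)
{
    DrawFlat(sec.floorpic, sec.lightlevel);   // no intermediate copy to maintain
}

// The other style: a render-side copy that has to be kept in sync.
struct RenderSector { float lightlevel; int floorpic; };

void SyncRenderData(const sector_t &sec, RenderSector &rs)
{
    // This per-frame synchronization pass is where the time is lost.
    if (rs.lightlevel != sec.lightlevel || rs.floorpic != sec.floorpic)
    {
        rs.lightlevel = sec.lightlevel;
        rs.floorpic   = sec.floorpic;
    }
}
[/code]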