GZDoom in Development Hell?


Re: GZDoom in Development Hell?

Postby Kappes Buur » Mon Apr 19, 2010 3:05 am

Graf Zahl wrote:... What makes this so hard is that Doom works by changing sector and wall properties, meaning that the geometry is not static. This makes it very hard - almost impossible - to use a 'modern' approach to rendering the level. ...


The analogy of driving a square peg into a round hole comes to mind. :D

Re: GZDoom in Development Hell?

Postby ReX » Mon Apr 19, 2010 8:21 am

Kappes Buur wrote:
ReX wrote:... * GZDooM is dead. ...


My take on this is that GZDoom is not dead, only on hiatus. :D

One can only hope, but I do get your meaning. As you probably figured, I was only referring to the death of GZDooM in jest.

For my part, I continue with development on GZDooM projects. [Please note emphasis on the plural form.]

Kappes Buur wrote:However, would you mind still going through the process of the benchmarks and posting the results?

Indeed, that was my intention.

Re: GZDoom in Development Hell?

Postby VortexCortex » Tue Apr 20, 2010 11:57 am

Graf Zahl wrote:This is simply something the old-fashioned rendering methods don't handle well unless the driver has been specifically optimized for it. NVidia's driver has been, ATI's apparently not, as they seem to concentrate on the 'modern' stuff. The unfortunate result is that GZDoom on the latest ATI HD 5870 barely matches the performance of a three-year-old GeForce 8600.


I have to disagree. I don't know how much clearer I can be: Graf, please stop spreading ATI FUD -- it sounds very foolish.

There's a huge difference between saying, "I don't know why GZDoom runs faster on NVidia than on ATI, but it does." and saying, "NVidia's driver has been specifically optimized for the old-fashioned methods, while ATI's driver concentrates on the 'modern' stuff, which is why GZDoom runs faster on three-year-old NVidia hardware than on newer ATI hardware." The first statement might possibly be true, while the second statement is utterly false. I know you used "seem" and "apparently" when referring to ATI drivers, which indicates that you have some doubt that your statement about ATI is a fact. However, that doesn't change the fact that you're still making baseless assumptions and false claims about ATI's and NVidia's drivers, i.e. "NVidia's driver has been [specifically optimized for old-fashioned rendering methods]." No, it hasn't...

Spoiler: "Redundant Redundancy check is Redundant."


I had hoped that at least someone had tried to test GZDoom to find out whether my claims of inefficient code were true or false... If one does a simple Google search for "Debug OpenGL" they will find more than enough resources to test GZDoom with. Many free OpenGL testing tools output results in plain text (which may be hard for some to understand) or require a bit of work to provide the testing features you may desire. For these reasons I'll post a screenshot that I hope everyone can understand.

Spoiler: "gDEBugger trial + GZDoom 1.4.08"


Spoiler: "Explanation of above screen."


If gDEBugger is too pricey, you may find the free and open-source GLIntercept a suitable OpenGL testing tool (both can be found by searching Google for "Debug OpenGL"). Note: all of gDEBugger's features can be easily duplicated by extending the current X11+OpenGL source code (which can be obtained from nearly any Linux distro repository), or by extending GLIntercept's functionality. There are other OpenGL testing tools besides these as well...
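
To give a concrete idea of what these interception tools look for, here is a rough sketch of a redundant-state counter. The wrapper names are invented for this illustration -- this is not GLIntercept's or gDEBugger's actual API; those tools hook the real GL entry points instead.

[code]
// Rough sketch of what an OpenGL interception layer checks for: a thin wrapper
// that counts calls which set a state to the value it already has. The names
// (GLStateAudit, AuditEnable, ...) are invented for this example; real tools
// such as GLIntercept or gDEBugger hook the GL entry points themselves.
#include <GL/gl.h>
#include <cstdio>
#include <unordered_map>

struct GLStateAudit
{
    std::unordered_map<GLenum, bool> enableState;  // last value requested per capability
    unsigned totalCalls = 0;
    unsigned redundantCalls = 0;

    void AuditEnable(GLenum cap, bool value)
    {
        ++totalCalls;
        auto it = enableState.find(cap);
        if (it != enableState.end() && it->second == value)
            ++redundantCalls;                      // state is already in the requested mode
        enableState[cap] = value;
        if (value) glEnable(cap); else glDisable(cap);
    }

    void Report() const
    {
        std::printf("enable/disable calls: %u, redundant: %u (%.1f%%)\n",
                    totalCalls, redundantCalls,
                    totalCalls ? 100.0 * redundantCalls / totalCalls : 0.0);
    }
};
[/code]

Routing an application's glEnable/glDisable calls through something like AuditEnable and printing Report() after a benchmark run would show what fraction of those calls never changed anything.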

Graf Zahl wrote:So the only way to bring the performance up would be to dump everything that's there, fully concentrate on somehow optimizing it for more modern methods and then hope that in the end it all works out (emphasis on 'hope'!)

Eeek! "only way" ::runs screaming, "Nooo!"::

Seriously though, I've yet to find a problem that has only one solution... For one, you could run a performance analyzer tool on GZDoom and fix any issues it uncovers (like redundant state changes) to improve performance. However, I do agree that the most performance can be gained via a total rewrite of the code -- in which case it would be better not to use GZDoom as a base, to avoid licensing issues, IMHO.

Moral of the story: "It's not what we know or don't know that brings us down. It's what we know for sure, but just ain't true."

Re: GZDoom in Development Hell?

Postby Gez » Tue Apr 20, 2010 2:23 pm

All that is just fine and dandy, but it's a bit useless now. Work on the OpenGL renderer has ceased for the time being. Even if it resumes, Graf has often stated he's not going to invest money in a hobby project, so stuff like your gDEBugger is a waste of time; in its free version it's unusable, and nobody's going to buy a license for GZDoom. Your alternatives aren't any more attractive either, since by your own admission one first has to extend their functionality, i.e. it's a big DIY project.

Now, as for removing redundant state calls: how would you do it, technically? I don't think there are many places where the calls are always redundant and could be deleted entirely. More probably, the redundancy would have to be removed through equality checks.
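
For illustration, a minimal equality-check wrapper of that kind might look something like this sketch (the class and member names are invented; this is not GZDoom's actual render-state code):

[code]
// Minimal sketch of removing redundant state calls through equality checks:
// each setter remembers the last value it forwarded and skips the GL call if
// nothing changed. Class and member names are invented for this illustration;
// GZDoom's actual render-state code is organized differently.
#include <GL/gl.h>

class FStateCache
{
    GLuint lastTexture = 0;
    bool lastBlend = false;
    bool textureValid = false, blendValid = false;

public:
    void SetTexture(GLuint tex)
    {
        if (textureValid && tex == lastTexture) return;  // redundant: skip the driver call
        glBindTexture(GL_TEXTURE_2D, tex);
        lastTexture = tex;
        textureValid = true;
    }

    void SetBlend(bool on)
    {
        if (blendValid && on == lastBlend) return;
        if (on) glEnable(GL_BLEND); else glDisable(GL_BLEND);
        lastBlend = on;
        blendValid = true;
    }
};
[/code]

The whole idea sits in the early returns: a call only reaches the driver when the requested value actually differs from the last one forwarded.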

If you've got performance patches to propose, I'll be interested. But given GZDoom's current situation, do not expect Graf to work on it (he quit) and do not expect me either (I am absolutely not an OpenGL developer). Sooo... Too little, too late.

Re: GZDoom in Development Hell?

Postby Graf Zahl » Wed Apr 21, 2010 7:10 am

VortexCortex wrote:I have to disagree. I don't know how much clearer I can be: Graf, please stop spreading ATI FUD -- it sounds very foolish.


So it's FUD, just because you don't believe the benchmarks I made that clearly traced the performance loss to the number of draw calls being made? This was one of the things I investigated extensively last year. The results made it pretty clear what was causing the slowdowns.

Spoiler: "Redundant Redundancy check is Redundant."




Ugh...

Sorry, but I'll have to disagree here. I can't imagine any scenario which guarantees that two successive changes of the same state are never identical, unless the application does this kind of check itself. Even in the least optimal situation I can think of, the caching would easily pay for its own overhead. (Say the caching overhead is 10% of the overall time for a state change: then for the check to win on ATI, only 1 out of 10 changes needs to be redundant, which is a ridiculously low bar.) So ATI just shoots itself in the head here by being less efficient than they could be.
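
Spelling that arithmetic out, with the 10% figure above taken as an assumed cost of the equality check:

[code]
% Assumptions: t = cost of forwarding one state change to the driver,
%              r = fraction of state changes that are redundant,
%              equality check costs 0.1 t per change (the 10% figure above).
\[
  \underbrace{0.1\,t + (1 - r)\,t}_{\text{with the check}}
  \;<\;
  \underbrace{t}_{\text{without it}}
  \quad\Longleftrightarrow\quad
  r > 0.1
\]
[/code]

In other words, the check pays for itself as soon as more than 1 change in 10 is redundant.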

BTW, you can check in GZDoom (not Skulltag!) whether state caching actually helps, by setting the CVAR gl_direct_state_change to false. This will cache all state changes and, when a draw call is started, apply only the ones that really change something. On NVidia this doesn't help; it causes more overhead than it saves, even though it does significantly reduce the number of state changes that actually reach the driver. The added overhead of application-side caching isn't even compensated by the savings inside the driver (which is why this feature is not active right now). If ATI's drivers work as you say it might help, but I doubt it's the main problem.
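
A very rough sketch of that deferred path, just to make the mechanism concrete (invented names and a deliberately tiny amount of state; the real gl_direct_state_change handling covers far more than this):

[code]
// Sketch of deferred state application: setters only record the desired state,
// and Apply() -- called right before each draw call -- forwards only the values
// that differ from what the driver last received. Names and structure are
// hypothetical, not GZDoom's actual implementation.
#include <GL/gl.h>

struct DeferredGLState
{
    bool directMode = true;   // stands in for the gl_direct_state_change CVAR

    GLuint wantedTexture = 0, currentTexture = 0;
    bool wantedBlend = false, currentBlend = false;

    void SetTexture(GLuint tex)
    {
        if (directMode)
        {
            glBindTexture(GL_TEXTURE_2D, tex);   // direct mode: forward immediately
            currentTexture = tex;
        }
        wantedTexture = tex;                     // deferred mode: just remember the request
    }

    void SetBlend(bool on)
    {
        if (directMode)
        {
            if (on) glEnable(GL_BLEND); else glDisable(GL_BLEND);
            currentBlend = on;
        }
        wantedBlend = on;
    }

    void Apply()              // call immediately before glDrawArrays / glDrawElements
    {
        if (directMode) return;
        if (wantedTexture != currentTexture)
        {
            glBindTexture(GL_TEXTURE_2D, wantedTexture);
            currentTexture = wantedTexture;
        }
        if (wantedBlend != currentBlend)
        {
            if (wantedBlend) glEnable(GL_BLEND); else glDisable(GL_BLEND);
            currentBlend = wantedBlend;
        }
    }
};
[/code]

With directMode true, every request goes straight to the driver; with it false, Apply() right before each draw call forwards only the values that actually changed.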

VortexCortex wrote:If gDEBugger is too pricey, you may find the free and open-source GLIntercept a suitable OpenGL testing tool (both can be found by searching Google for "Debug OpenGL"). Note: all of gDEBugger's features can be easily duplicated by extending the current X11+OpenGL source code (which can be obtained from nearly any Linux distro repository), or by extending GLIntercept's functionality. There are other OpenGL testing tools besides these as well...


I'll try GLIntercept, but you could do me a real favor if you checked one large level (Hellcore's MAP09 would be good). Just bind a key to 'bench', create a savegame from a good position (the steeple is perfect for this), load this savegame with the CVAR on and off, and press the key bound to 'bench' in each case. Then post the output (benchmarks.txt) here.



Graf Zahl wrote:So the only way to bring the performance up would be to dump everything that's there, fully concentrate on somehow optimizing it for more modern methods and then hope that in the end it all works out (emphasis on 'hope'!)

VortexCortex wrote:Eeek! "only way" ::runs screaming, "Nooo!"::

Seriously though, I've yet to find a problem that has only one solution... For one, you could run a performance analyzer tool on GZDoom and fix any issues it uncovers (like redundant state changes) to improve performance. However, I do agree that the most performance can be gained via a total rewrite of the code -- in which case it would be better not to use GZDoom as a base, to avoid licensing issues, IMHO.

Moral of the story: "It's not what we know or don't know that brings us down. It's what we know for sure, but just ain't true."


Let's be frank here:

The wall rendering code in GZDoom is an utter mess that has grown way beyond maintainability. No matter what I tried over the last few years, I always gave up when it came to the walls. Far too much there is just hacked in and causes problems now. If someone could clean up that mess in a way that still allows all the stuff that can be done right now, I'm sure the renderer would be 50% faster on modern hardware, where it's mostly CPU-limited.

But things like decals, translucent sorting and portals all depend on the crappy data structures used for walls. This has the unfortunate effect that once I start working on it, the whole thing comes crumbling down and everything needs to be rebuilt, preferably in a more robust fashion.

Trust me, that code is not salvageable if optimization is the main concern.

Re: GZDoom in Development Hell?

Postby Caligari87 » Wed Apr 21, 2010 11:22 am

You know, all else aside, I was going to make my next graphics card an nVidia just for GZDoom. :cry:

(I still will, but this is sad.)

8-)

Re: GZDoom in Development Hell?

Postby VortexCortex » Wed Apr 21, 2010 12:44 pm

Graf Zahl wrote:
VortexCortex wrote:... Graf, please stop spreading ATI FUD ...
So it's FUD, just because you don't believe the benchmarks I made that clearly traced the performance loss to the number of draw calls being made? This was one of the things I investigated extensively last year. The results made it pretty clear what was causing the slowdowns.

No, it's FUD because you've been making false statements about ATI and NVidia. As I said, there's a big difference between making false statements and simply saying, "I'm not sure why GZDoom runs better on NVidia than on ATI, but it does."

As for everything else, it looks like we're on the same page, eh?
Graf Zahl wrote:If ATI's drivers work as you say it might help, but I doubt it's the main problem.
VortexCortex wrote:This is just one of several screens that show problems with GZDoom...


Graf Zahl wrote:Let's be frank here: [...] Trust me, that code is not salvageable if optimization is the main concern.
VortexCortex wrote:I do agree that the most performance can be gained via a total rewrite of the code.


Graf Zahl wrote:I'll try GLIntercept, but you could do me a real favor...
VortexCortex wrote:There are other OpenGL testing tools besides these as well...


The usefulness of the benchmarks you suggest pales in comparison to that of the performance tools I've referred you to (which was a favor, IMHO), especially when you have stated that you will not be contributing to further GZDoom development. In addition to the tools I've already mentioned, you might find it beneficial to do some searching of your own. You might try using NVidia's Performance Kit instead of guessing and narrowing down the sections of your code until you discover areas that appear to need optimization (if for no other reason than it will save you some time -- as I have said previously). Since it seems apparent that you like developing with NVidia hardware, I may have incorrectly assumed that you were already familiar with NVidia's tools. Links to NVPerfKit can be found on the NVidia Developer's home page http://developer.nvidia.com/ (bottom center -- the section titled "Tools").

Forgive me if I perceive the GZDoom project as a close-minded and hostile environment to contribute to, but it's taken far too long to get the developers to stop laying blame on imagined ATI hardware and driver faults -- a simple task which I'm not even sure I've accomplished yet. The goal of my posts has been to put aside the ATI and Intel scapegoats, since this seems to be an issue that is actively harming the GZDoom community. I can barely justify my time spent in this regard, but I cannot justify spending my time performing tasks that many of your users could have performed (or have already performed)...

I prioritize my time, and working on code that benefits the Skulltag project must take precedence in most cases. None of my time spent working with the latest GZDoom is of any benefit to Skulltag, since your license change prevents Skulltag from using the newer GZDoom versions. It is regrettable that I don't have more time available and that Skulltag is closed source, but in its current state it must remain closed or be overrun with cheaters (I believe this is one reason that ZDaemon is closed now as well). Even if your license permitted me to use the latest GZDoom code in Skulltag, I could not promise to immediately begin work on GZDoom, since there are other, more important Skulltag-centric issues that I must attend to first.

Re: GZDoom in Development Hell?

Postby Graf Zahl » Wed Apr 21, 2010 1:17 pm

You know, I could respect you a lot more if you'd actually helped instead of constantly launching attack after attack. But apparently you don't want it to be improved, and as long as this is everything I get from the Skulltag community, I really see no point in changing the license. So far I only get the feeling that they expect me to give them everything while I get nothing in return, not even the slightest favor.

All I asked was for you to verify that changing the stuff you pointed out so verbosely actually helps improve performance, and you refused. Well done, pal!

Re: GZDoom in Development Hell?

Postby Shadelight » Wed Apr 21, 2010 2:23 pm

I'm not quite sure what's going on here, but it looks like all VortexCortex is doing is showing that he's quite capable of trolling Graf Zahl in a thread where we specifically asked that there should be no more drama in here. Cut it out, please.

Re: GZDoom in Development Hell?

Postby VortexCortex » Wed Apr 21, 2010 2:31 pm

Graf Zahl wrote:You know, I could respect you a lot more if you'd actually helped instead of constantly launching attack after attack.

Me (paraphrased): Stop saying things that are not true. Also, use some free tools you should already know about, and why must everything be so close-minded and hostile?

Yeah, I guess that might sound offensive to you. I'm in no need of your respect, and it's your decision to take my words as an attack. However, all of my posts have contained helpful suggestions and even tools you can use yourself (let us not overlook this fact).

Graf Zahl wrote:But apparently you don't want it to be improved, and as long as this is everything I get from the Skulltag community, I really see no point in changing the license. So far I only get the feeling that they expect me to give them everything while I get nothing in return, not even the slightest favor.

If I didn't want GZDoom to improve, I wouldn't have mentioned those helpful tools... I still don't see why I must be the one to do this trivial "favor" for you (BTW, I'm at work, so I couldn't run GZDoom even if I wanted to), especially when we're both in agreement that this isn't the only issue GZDoom has.

If you or anyone else wants to see whether the tools I've mentioned show improvements, why not use them instead of asking me to do it for you? None of the tools I've mentioned depend on hardware that you don't have: many of the NVidia tools require NVidia hardware, which you have, and all of the tools I've mentioned are free or have free trial versions you could use immediately. I believe it would be more useful to you to have these tools available on your own machine than to ask others to use them for you, but that's just my opinion -- correct me if I'm wrong.

There's an old saying that comes to mind: You can lead a horse to water, but you can't make it drink.

Re: GZDoom in Development Hell?

Postby zap610 » Wed Apr 21, 2010 2:41 pm

BlazingPhoenix wrote:I'm not quite sure what's going on here, but it looks like all VortexCortex is doing is showing that he's quite capable of trolling Graf Zahl in a thread where we specifically asked that there should be no more drama in here. Cut it out, please.


How is that trolling? He is actually backing up his statements and it's kind of hypocritical to tell him to stop posting and completely disregard the other user. He isn't above the rules... Right?

Re: GZDoom in Development Hell?

Postby wildweasel » Wed Apr 21, 2010 2:47 pm

It's still apparently making Graf angry, judging by the way he's wording his posts. I wouldn't say there's any wrongdoing going on yet, but we're still monitoring the thread because this is a difficult topic.

Re: GZDoom in Development Hell?

Postby Gez » Wed Apr 21, 2010 2:47 pm

VortexCortex wrote:I still don't see why I must be the one to grant this trivial "favor" for you

You're the one positing that the optimizations you propose to find through these tools are needed. Graf has said that these optimizations have been made and are available as an option, but that their effect wasn't especially noticeable. Given that he has already run the tests (with a benchmark campaign over at the DRD Team forums), the onus probandi is on you -- you have to prove that yes, they would improve things significantly.

Otherwise, you'd just be repeating misleading information whose veracity you do not know for sure -- and that's exactly what you're accusing Graf of doing wrt. ATI drivers. So, when you eventually leave your work environment and come back home, do that benchmark test and use the results to back up your claims or to drop them, and in the meantime -- please stop your GZDoom FUD.

It's not like spending one minute starting a game and typing a few commands in the console is more work than investigating several debuggers, reading their code to extend their functionality, recompiling them, and using them to profile software under various conditions...

Re: GZDoom in Development Hell?

Postby Zippy » Wed Apr 21, 2010 3:05 pm

VortexCortex wrote:There's an old saying that comes to mind: You can lead a horse to water, but you can't make it drink.
There are also many old sayings along the lines of "the burden of proof is on the accuser". Or in the context of GZDoom development, this:
Gez wrote:You're the one positing that the optimizations you propose to find through these tools are needed. Graf has said that these optimizations have been made and are available as an option, but that their effect wasn't especially noticeable. Given that he has already run the tests (with a benchmark campaign over at the DRD Team forums), the onus probandi is on you -- you have to prove that yes, they would improve things significantly.

Otherwise, you'd just be repeating misleading information whose veracity you do not know for sure -- and that's exactly what you're accusing Graf of doing wrt. ATI drivers. So, when you eventually leave your work environment and come back home, do that benchmark test and use the results to back up your claims or to drop them, and in the meantime -- please stop your GZDoom FUD.



Also, a little note: when you constantly praise your own incredible generosity, it usually has exactly the opposite of the intended effect. Strange but true. You could probably get more mileage out of avoiding saying things like this. This is incredibly helpful advice on my part that you should definitely take, because I am an awesome, helpful, superior person who knows better, and really anybody with any normal amount of intellectual soundness would listen and be thankful. Have I mentioned how damned nice I am? <-- See what I did there? It's like that (comically exaggerated for clarity).

Re: GZDoom in Development Hell?

Postby VortexCortex » Wed Apr 21, 2010 4:18 pm

Zippy wrote:You could probably get more mileage out of avoiding saying things like this. This is incredibly helpful advice on my part that you should definitely take, because I am an awesome...

I started out by claiming GZDoom's code was the problem, not ATI hardware and drivers. For this I have been called "mindless" and even a "bitch" by Graf. Since then I've made inflammatory statements in an attempt to dispel the rumors of faulty ATI hardware and drivers, and I've even apologized for my statements being so inflammatory.

Let's do a quick paraphrased run-down...

Me: Hey, stop lying about ATI! GZDoom's code is all out of whack -- that's why it fails or performs poorly on ATI; it's not ATI's fault.

[Various statements supporting and contradicting mine. Graf publicly decides to halt GZDoom development. Claims are made that my statements are false and that I'm just a troll.]

Me: For those that still don't believe me: Here's some tools you can use to find out for yourselves.

[Those tools are useless, too expensive, or too hard to use.]
Graf: "Let's be frank:" [GZDoom's code is out of whack. Do me a favor and run some more tests]

Me: I'm short on time, and helping you out doesn't really help me out, but did you know about these other free tools that you can use yourselves?

[Stop being a troll. Prove yourself to us. I guess you don't want to help GZDoom out.]

Point being: The only thing I was trying to prove in the first place was that ATI and Intel weren't the cause of GZDoom's issues on those platforms.

I realize that no amount of proof that I can provide will be enough for people who have decided otherwise.
Most of the folks who dismiss my claims or think me a troll won't even attempt to use the free software that's available to investigate things for themselves...

Now that Graf has stated that GZDoom's code is inefficient and unmanageable, which is the main reason for halting GZDoom's development (not because everyone else thinks his code is bad), and that he may check out some GL testing tools, I've got nothing more to say except this:

Thanks for GZDoom, Graf. Good luck with your future projects.

If you want to improve the performance of GZDoom (or any OpenGL application), you might be able to save yourself some time by checking out the tools I've previously mentioned, or by finding something similar via your favorite web search engine.
