Support of new image formats

Discuss anything ZDoom-related that doesn't fall into one of the other categories.
dpJudas
Posts: 3132
Joined: Sat May 28, 2016 1:01 pm

Re: Support of new image formats

Post by dpJudas »

Texture compression isn't so much about saving memory as it is about improving texture unit cache behavior. In the past I've noticed that the cache behaves a lot better with BC textures than with the uncompressed versions.
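As a back-of-the-envelope illustration of where that win comes from (standard BC block sizes, nothing engine-specific): RGBA8 needs 4 bytes per texel, while BC1 and BC7 pack each 4x4 block into a fixed 8 or 16 bytes, so the sampler pulls a quarter to an eighth as much data through the cache for the same texture.

```cpp
// Rough footprint math for a single mip level, no engine code involved:
// RGBA8 stores 4 bytes per texel; BC1/BC7 store 8/16 bytes per 4x4 texel block.
#include <cstdio>
#include <cstddef>

static std::size_t RgbaBytes(std::size_t w, std::size_t h) { return w * h * 4; }

static std::size_t BcBytes(std::size_t w, std::size_t h, std::size_t bytesPerBlock)
{
    std::size_t blocksX = (w + 3) / 4, blocksY = (h + 3) / 4;
    return blocksX * blocksY * bytesPerBlock;
}

int main()
{
    const std::size_t w = 1024, h = 1024;
    std::printf("RGBA8: %zu KB\n", RgbaBytes(w, h) / 1024);   // 4096 KB
    std::printf("BC1:   %zu KB\n", BcBytes(w, h, 8) / 1024);  // 512 KB (8:1)
    std::printf("BC7:   %zu KB\n", BcBytes(w, h, 16) / 1024); // 1024 KB (4:1)
    return 0;
}
```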

You shouldn't buy a GPU today if it doesn't have at least 12 gigabytes of RAM. Anything less and you'll have problems with modern titles swapping out the textures at higher texture quality levels.
NeuralStunner
Posts: 12328
Joined: Tue Jul 21, 2009 12:04 pm
Preferred Pronouns: He/Him
Operating System Version (Optional): Windows 11
Graphics Processor: nVidia with Vulkan support
Location: capital N, capital S, no space

Re: Support of new image formats

Post by NeuralStunner »

I admit my only experience with webp has been seeing swaths of paletted gifs that have been reencoded into it and basically destroyed in the process, so I can't say I'm too taken with it. :P

In terms of game projects, what users are most likely to notice is the dreaded Load Time™, so I'd be more concerned about decoding/decompression speed. When you're already using a compressed archive, squeezing a few extra bytes out of individual files starts to become meaningless. (In terms of speed, you're even better off using STORE in some cases.)
dpJudas wrote: Sat Aug 26, 2023 8:06 am I'm not a fan of DDS because it is just a garbage format that was never really meant to be used for anything other than Direct3D sample projects
Unfortunately, "it sucks but it's what everybody else uses" is how these things go. (Hi, MP3.)
wildweasel
Posts: 21706
Joined: Tue Jul 15, 2003 7:33 pm
Preferred Pronouns: He/Him
Operating System Version (Optional): A lot of them
Graphics Processor: Not Listed

Re: Support of new image formats

Post by wildweasel »

NeuralStunner wrote: Sun Aug 27, 2023 7:47 am In terms of game projects, what users are most likely to notice is the dreaded Load Time™, so I'd be more concerned about decoding/decompression speed. When you're already using a compressed archive, squeezing a few extra bytes out of individual files starts to become meaningless. (In terms of speed, you're even better off using STORE in some cases.)
This is roughly where my point of view sits these days; at the risk of sounding like John Triple-A Publisher, we live in an era in which practically everybody has what amounts to broadband internet. I think we can afford not to save a megabyte or two of file size if it means the result loads and performs faster for more people.

Of course, that's largely contingent on mods still being roughly the size they were ten years ago. More and more I'm seeing stuff like UltimateDoomVisor, a mod that exclusively changes the HUD, weighing in at several hundred megabytes because its author insisted on the entire thing being made of full-screen 1920x1080 PNGs.
Graf Zahl
Lead GZDoom+Raze Developer
Posts: 49179
Joined: Sat Jul 19, 2003 10:19 am
Location: Germany

Re: Support of new image formats

Post by Graf Zahl »

The main reason for WebP is not better compression of lossless data but fewer artifacts in lossy compression at file sizes roughly the same as JPEG. If the artifacts are less noticeable, you can encode a lot more textures with it and won't have to fall back to much larger PNGs.

Overall, I think it says a lot that none of these more modern formats is really gaining in popularity. Most of the time JPEG and PNG are good enough, and they have the invaluable advantage that they can be decoded with a reasonable amount of code, unlike WebP, AVIF and JPEG XL (stb-image covers both of them in less than 300 KB of source code, while each of the newfangled formats is a multi-MB library).
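To illustrate the "reasonable amount of code" point, decoding with stb-image is essentially one call, and the STBI_ONLY_* switches drop every format you don't ask for at compile time. A minimal standalone sketch, not GZDoom's actual loader:

```cpp
// Minimal PNG/JPEG decode with the single-header stb_image library.
#define STBI_ONLY_PNG
#define STBI_ONLY_JPEG
#define STB_IMAGE_IMPLEMENTATION
#include "stb_image.h"

#include <cstdio>

int main(int argc, char **argv)
{
    if (argc < 2) return 1;

    int w = 0, h = 0, sourceChannels = 0;
    // Request 4 channels so the result is always RGBA, whatever the file stores.
    unsigned char *pixels = stbi_load(argv[1], &w, &h, &sourceChannels, 4);
    if (!pixels)
    {
        std::printf("decode failed: %s\n", stbi_failure_reason());
        return 1;
    }
    std::printf("%dx%d, %d channels in the source file\n", w, h, sourceChannels);
    stbi_image_free(pixels);
    return 0;
}
```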

This is what technicians often tend to forget: sometimes smaller is better and less is more. If I had to take a guess, these old formats are here to stay for a long time.
Blzut3
Posts: 3178
Joined: Wed Nov 24, 2004 12:59 pm
Graphics Processor: ATI/AMD with Vulkan/Metal Support

Re: Support of new image formats

Post by Blzut3 »

My information may be old, but I believe one of the major limitations of WebP is that in lossy mode it only supports 4:2:0 chroma subsampling. The codec was originally for HD video, after all. (WebP does have a lossless mode, which is obviously 4:4:4.) Given that, I don't know how much better it actually is at reducing artifacts vs. JPEG at typical resolutions for Doom textures. Of course, JPEG hasn't been particularly popular for Doom-sized textures anyway, so the point may be moot.
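For anyone unfamiliar with the notation, a quick sketch of what 4:2:0 costs before any entropy coding even starts: both chroma planes are stored at half resolution in each dimension, so three quarters of the colour samples are simply gone, which is exactly what hurts at Doom-texture sizes.

```cpp
// Raw planar sizes for an 8-bit image, before any entropy coding:
// 4:4:4 keeps chroma at full resolution, 4:2:0 halves it in both dimensions.
#include <cstdio>
#include <cstddef>

static std::size_t Yuv444Bytes(std::size_t w, std::size_t h) { return w * h * 3; }

static std::size_t Yuv420Bytes(std::size_t w, std::size_t h)
{
    std::size_t luma   = w * h;
    std::size_t chroma = ((w + 1) / 2) * ((h + 1) / 2); // one subsampled plane
    return luma + 2 * chroma;                           // Y + U + V
}

int main()
{
    std::printf("128x128 4:4:4: %zu bytes\n", Yuv444Bytes(128, 128)); // 49152
    std::printf("128x128 4:2:0: %zu bytes\n", Yuv420Bytes(128, 128)); // 24576
    return 0;
}
```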

I will note for the sake of comparison that neither AVIF nor JPEG XL has this limitation, but that's not a call for GZDoom to adopt those formats (at least not until there's a mature ecosystem around them).
Professor Hastig
Posts: 251
Joined: Mon Jan 09, 2023 2:02 am
Graphics Processor: nVidia (Modern GZDoom)

Re: Support of new image formats

Post by Professor Hastig »

Blzut3 wrote: Mon Aug 28, 2023 12:14 am My information may be old, but I believe one of the major limitations of WebP is that in lossy mode it only supports 4:2:0 chroma subsampling. The codec was originally for HD video, after all. (WebP does have a lossless mode, which is obviously 4:4:4.) Given that, I don't know how much better it actually is at reducing artifacts vs. JPEG at typical resolutions for Doom textures. Of course, JPEG hasn't been particularly popular for Doom-sized textures anyway, so the point may be moot.
I've tried JPG mainly with skybox textures. You'll need at least 512x512 pixels to not notice the compression artifacts. The format is definitely not good for lower res stuff. At 128x128 the result is mostly unacceptable, even with high quality settings.
Phredreeke
Posts: 309
Joined: Tue Apr 10, 2018 8:14 am

Re: Support of new image formats

Post by Phredreeke »

128x128 is tiny, why would you need lossy compression for that?
Graf Zahl
Lead GZDoom+Raze Developer
Posts: 49179
Joined: Sat Jul 19, 2003 10:19 am
Location: Germany

Re: Support of new image formats

Post by Graf Zahl »

That's the point. The only places where I've seen JPEG make sense are skyboxes and hires replacements, and hires replacements always come with a significant performance hit from the texture upload.
Darkcrafter
Posts: 571
Joined: Sat Sep 23, 2017 8:42 am
Preferred Pronouns: He/Him
Operating System Version (Optional): Windows 10
Graphics Processor: nVidia with Vulkan support

Re: Support of new image formats

Post by Darkcrafter »

I tried AVIF image compression; it's good at very high resolutions. You can basically get files 1.5-2 times smaller with it, but again, that's not just "hi-res from the 2000s" territory, it's really big high-res like 4096x4096 and above. Sometimes, on some textures, it still smears away detail way too much.

There is no quality slider; instead there are two "color quantization" values, one for RGB and one for alpha, going up to 63, and the higher the value, the stronger the compression. Encoding can be hellishly slow if the speed setting is at its minimum (up to 20 minutes on one thread of my Ryzen 5 1600 to save such files). The speed setting goes up to 10, which looks worst; 7 is a somewhat better tradeoff between quality and speed.
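These knobs line up with libavif's encoder settings; the sketch below is purely illustrative, with field names quoted from memory from avif.h, so check them against the header before relying on it.

```cpp
// Illustrative only: the quantizer/speed knobs described above, as libavif
// exposes them. Field names are from memory (avif.h), not verified here.
#include <avif/avif.h>

void ConfigureEncoder(avifEncoder *encoder)
{
    // Quantizers run 0..63; higher means stronger compression and lower quality,
    // i.e. the inverted "quality slider" described above.
    encoder->minQuantizer      = 20;
    encoder->maxQuantizer      = 52;
    encoder->minQuantizerAlpha = 0;
    encoder->maxQuantizerAlpha = 0;   // leave the alpha channel effectively untouched

    // Speed runs 0..10: 0 is the hellishly slow end, 10 is fastest and looks worst;
    // 6-7 is the usual quality/time compromise.
    encoder->speed = 7;

    // The 20-minute figure above was single-threaded; more threads help a lot.
    encoder->maxThreads = 8;
}
```

An encoder created with avifEncoderCreate() picks these fields up when avifEncoderWrite() is called; the values themselves are just examples.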

Overall, some blockiness is still there, especially if saved at a fast speed setting, but detail handling is better than JPEG's.

There is something else that can be better than JPEG and has been around for decades: JPEG 2000. The biggest issue with JPEG compression, most obvious when the image is magnified, is blockiness, and JP2 handles that aspect better. Google "says" it even supports an alpha channel.

If you're willing to accept the bloat, I'd still go with AVIF any day over older formats like JP2.

Update: I just compressed my 6144x3200 skybox texture and it looks pretty good at 1.9 MB in AVIF at quality 52, versus 12.7 MB in JPEG at quality 98.
Graf Zahl
Lead GZDoom+Raze Developer
Posts: 49179
Joined: Sat Jul 19, 2003 10:19 am
Location: Germany

Re: Support of new image formats

Post by Graf Zahl »

I think WebP is a good compromise between being modern and keeping the bloat down. It looks a lot better than JPEG, at least, and unlike JPEG 2000 it has found some actual use.
The reason JPEG 2000 failed was simply the patent mess surrounding it.
Rachael
Posts: 13782
Joined: Tue Jan 13, 2004 1:31 pm
Preferred Pronouns: She/Her

Re: Support of new image formats

Post by Rachael »

Unfortunately some apps that have survived from ye olde 2000's still use jpeg2000 for some things. Otherwise it'd likely have been a completely dead format by now.
NeuralStunner
Posts: 12328
Joined: Tue Jul 21, 2009 12:04 pm
Preferred Pronouns: He/Him
Operating System Version (Optional): Windows 11
Graphics Processor: nVidia with Vulkan support
Location: capital N, capital S, no space

Re: Support of new image formats

Post by NeuralStunner »

Graf Zahl wrote: Mon Aug 28, 2023 12:06 pm That's the point. The only places where I've seen JPEG make sense are skyboxes and hires replacements, and hires replacements always come with a significant performance hit from the texture upload.
Solving this would effectively make GZDoom a viable engine for more modern stylings. It's never going to compete with UE of course, but it's probably worth thinking about.
Chris
Posts: 2954
Joined: Thu Jul 17, 2003 12:07 am
Graphics Processor: ATI/AMD with Vulkan/Metal Support

Re: Support of new image formats

Post by Chris »

NeuralStunner wrote: Wed Aug 30, 2023 8:13 am
Graf Zahl wrote: Mon Aug 28, 2023 12:06 pm That's the point. The only places where I've seen JPEG make sense are skyboxes and hires replacements, and hires replacements always come with a significant performance hit from the texture upload.
Solving this would effectively make GZDoom a viable engine for more modern stylings. It's never going to compete with UE of course, but it's probably worth thinking about.
That's where GPU-native compression formats come in: formats that take up less space than uncompressed data and can be copied directly into VRAM, without spending time decompressing on the CPU and copying a full uncompressed image. JPEG, PNG, WebP, AVIF, JPEG XL, etc. all require CPU decompression and a full uncompressed image transfer, while GPU formats like S3TC/DXT, RGTC, BPTC, ETC1 and ASTC can (when handled properly) be loaded with no extra CPU-side decompression work and less data to copy to VRAM.
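A minimal sketch of what that upload looks like in OpenGL, assuming a loader such as GLEW or glad is already initialized and the DDS/KTX container has already been parsed down to raw block data (the UploadBC3 helper is hypothetical, not something in GZDoom):

```cpp
// Upload one precompressed BC3/DXT5 mip level: the blocks go to the driver
// as-is, with no CPU-side decode and no full RGBA expansion.
#include <GL/glew.h>

GLuint UploadBC3(int width, int height, const void *blocks)
{
    // BC3 stores 16 bytes per 4x4 texel block.
    GLsizei size = GLsizei(((width + 3) / 4) * ((height + 3) / 4) * 16);

    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glCompressedTexImage2D(GL_TEXTURE_2D, 0, GL_COMPRESSED_RGBA_S3TC_DXT5_EXT,
                           width, height, 0, size, blocks);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    return tex;
}
```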
NeuralStunner
Posts: 12328
Joined: Tue Jul 21, 2009 12:04 pm
Preferred Pronouns: He/Him
Operating System Version (Optional): Windows 11
Graphics Processor: nVidia with Vulkan support
Location: capital N, capital S, no space

Re: Support of new image formats

Post by NeuralStunner »

Chris wrote: Wed Aug 30, 2023 8:46 pm That's where GPU-native compression formats come in: formats that take up less space than uncompressed data and can be copied directly into VRAM, without spending time decompressing on the CPU and copying a full uncompressed image.
Yep... Thinking back to the days when game graphics were raw bits they could send straight to video memory, and wondering if maybe they weren't quite as crazy as everyone thought? :wink:
Graf Zahl
Lead GZDoom+Raze Developer
Posts: 49179
Joined: Sat Jul 19, 2003 10:19 am
Location: Germany

Re: Support of new image formats

Post by Graf Zahl »

Here are some interesting numbers:

Disabling the WebP decoder reduces EXE size by 150 KB.
Disabling JPEG-Turbo reduces EXE size by 340 KB.
Enabling JPEG support in stb-image increases EXE size by 27 KB!
Forcing the internal JPEG decoder instead of Turbo reduces EXE size by 190 KB.

WebP is quite efficient code-size-wise; 150 KB for an image decoder is absolutely fine.
WebP support is at a stage where even the source package (the pure library source is 2.6 MB, and I'm sure it could be stripped further) could be embedded into a project that doesn't want to hook up with vcpkg.
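The decode side is compact too: libwebp's "simple" API in webp/decode.h comes down to a couple of calls. A rough sketch with error handling trimmed to the minimum (not GZDoom's actual code path):

```cpp
// Decode a WebP buffer to RGBA using libwebp's simple API (webp/decode.h).
#include <webp/decode.h>

#include <cstdint>
#include <vector>

std::vector<uint8_t> DecodeWebP(const uint8_t *data, size_t size, int &w, int &h)
{
    if (!WebPGetInfo(data, size, &w, &h))
        return {};                                  // not a valid WebP header

    uint8_t *rgba = WebPDecodeRGBA(data, size, &w, &h);
    if (!rgba)
        return {};                                  // decode error

    std::vector<uint8_t> pixels(rgba, rgba + size_t(w) * size_t(h) * 4);
    WebPFree(rgba);                                 // release the library-owned buffer
    return pixels;
}
```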

It gets more interesting with the 3 JPEG decoders.

Turbo is 340 KB.
The old library is 150 KB - the same as WebP.
STB manages with 27 KB, which is quite surprising.

I'm going to run some performance tests here to see how they fare. Turbo really is only worth it if it is significantly faster.


By comparison, JXL adds 5 MB and AVIF at least 2 MB, and that's only if we hack the library not to include stuff we don't want; otherwise it's 12 MB. And the source code size for these libraries is HUGE.
Quite a difference, actually, and I have a feeling this decoder bloat really won't help these two formats.
