by Chris » Tue Mar 06, 2018 11:27 am
dpJudas wrote:All the games I've studied over the years stored textures in their own custom format, usually generated by a baking step. Even when stored as DDS with DXT/S3TC it is usually the baking tool that generated it from a higher quality of source.
Take a normal map, for example - the content author might supply a normal map as PNG, but in order to properly mipmap a normal map it cannot use the standard scaling algorithms as used for images. If they used DDS directly then not only would it present a problem for the content author (poor general support for DDS), but they would also need a complex description of how to create that DDS file as most tools would assume its an image and not a normal map.
I don't doubt that there are extra (automated) steps to "bake" textures to prepare them for use by the engine. Most paint programs I'm aware of have support for DDS, but their DXT support tends to be lacking. As for normal maps, that's an issue regardless: it comes down to abusing color channels to represent direction vectors, which is true no matter how you store them. This is actually where DDS/KTX can help since, unlike PNG, your baking tool can generate and store premade mipmaps, using custom algorithms that understand a given texture to be a normal map. The GPU couldn't auto-generate correct mipmaps itself, and doing it at load time would just add to loading times.
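To illustrate why normal maps can't use ordinary image scaling: averaging neighboring texels (as a box filter does) produces vectors that are no longer unit length, so a normal-map-aware baker renormalizes after filtering. A minimal sketch of one mip step, assuming the map has already been decoded from [0,255] color to [-1,1] vectors:

```python
import numpy as np

def downsample_normal_map(normals):
    """Average each 2x2 block of unit normals, then renormalize.

    `normals` is a float array of shape (H, W, 3) holding decoded
    direction vectors. A plain image box filter would stop after the
    averaging step, but averaged normals are no longer unit length,
    so each resulting vector must be renormalized.
    """
    # Box-filter 2x2 blocks (assumes H and W are even, for brevity).
    avg = (normals[0::2, 0::2] + normals[1::2, 0::2] +
           normals[0::2, 1::2] + normals[1::2, 1::2]) / 4.0
    # Renormalize so every mip texel is a valid unit direction again.
    lengths = np.linalg.norm(avg, axis=2, keepdims=True)
    return avg / np.maximum(lengths, 1e-8)
```

Real bakers do fancier filtering than this, but the renormalize step is the part a generic image scaler would miss.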
DXT is certainly not a good compression format for normal maps, though there are others that work better (like RGTC, red-green texture compression, which uses only two color channels for the x/y vectors; blue/z can be reconstructed in the shader). I've heard of [url=https://www.youtube.com/watch?v=Q6tC7IVUuEc]some interesting things[/url] going on with texture compression lately.
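The z-reconstruction trick works because the stored normal is unit length, so z follows from x and y. A sketch of the math a shader would perform (written here in Python for illustration):

```python
import math

def reconstruct_z(x, y):
    """Recover z from the x/y components of a unit normal stored in a
    two-channel (RGTC/BC5) texture: z = sqrt(max(0, 1 - x^2 - y^2)).
    Tangent-space normals point outward, so the positive root is taken.
    The max() guards against compression error pushing x^2 + y^2
    slightly past 1.
    """
    return math.sqrt(max(0.0, 1.0 - x * x - y * y))
```

In GLSL this is the usual `sqrt(1.0 - dot(n.xy, n.xy))` one-liner after fetching the two channels.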
In any case, if you want to load textures efficiently, using a container like DDS or KTX is a must, as is using a pixel format that can be uploaded to the GPU as-is, like raw RGBA8 or DXT. Having premade mipmaps is also good for quickly loading low-res versions of textures (and is just a good idea in general for normal maps).
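Part of why these formats load fast is that block-compressed sizes are fixed by the format, so a loader knows exactly how many bytes each mip level occupies and can read the whole chain straight into GPU memory. A sketch of the size calculation, assuming DXT1/BC1 (8 bytes per 4x4 block; DXT5/BC3 would use 16):

```python
def dxt_mip_chain_sizes(width, height, block_bytes=8):
    """Byte sizes of each mip level for a block-compressed texture.

    DXT1/BC1 packs each 4x4 texel block into 8 bytes (pass 16 for
    DXT5/BC3). Each level is the previous one halved in each
    dimension, clamped to a minimum of 1, down to a 1x1 level.
    """
    sizes = []
    w, h = width, height
    while True:
        # Dimensions round up to whole 4x4 blocks.
        blocks_w = max(1, (w + 3) // 4)
        blocks_h = max(1, (h + 3) // 4)
        sizes.append(blocks_w * blocks_h * block_bytes)
        if w == 1 and h == 1:
            break
        w, h = max(1, w // 2), max(1, h // 2)
    return sizes
```

With numbers like these in hand, a loader can also stop early and upload only the smaller levels first, which is exactly the low-res-first streaming trick mentioned above.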
Kinsie wrote:dpJudas wrote:PUBG is a good example of terrible prediction. But there are other games that are very good at it - GTA San Andreas is a good example of a game where I virtually never see that its streaming in the textures.
PUBG demands a minimum of 6gb of RAM and 2gb of VRAM, while GTA:SA was developed for a console with 32mb of RAM and 4mb of VRAM. Slightly different economies of scale here... :P
That just proves the point, though. Despite having far more resources available, PUBG still has trouble making sure textures are loaded in time, whereas GTA:SA had both less memory and slower disk access, yet still made sure textures were loaded before they became visible. But we should all know that PUBG is a good example of horrible engine optimization.