by Boondorl » Sun Aug 17, 2025 9:51 pm
I think a hard limit is out of the question because it becomes impossible to know what a "safe" amount of textures is. One thing I haven't seen mentioned at all is the idea that multiple mods could be allocating textures for similar purposes. Assuming we have a tiny 1 GB pool, mod A might think it's safe having only 600 MB of textures while mod B also thinks it's safe having only 700 MB. Combine these texture pools and they now want 1.3 GB out of a 1 GB budget, so you suddenly get random errors with seemingly no explanation: both mods work fine alone but break together. Fundamentally, this is a bad system that makes the idea of a safe range impossible, especially if someone decides to immediately allocate the entire space for their own mod (but I'm sure no one would do that out of bad faith, right? It's not like mods have been caught sabotaging each other in the past or anything).
I think one option here is to require a definition forward-declaring how much memory you'll need so it can be tracked on the CPU side. Multiple mods' required amounts are then summed, which gives much better error tracing because we can actually detect these problems in advance and emit proper warnings. That seems to be the single largest problem: it's too hard to track and debug issues if mods are just allocating memory out of the blue. Mods can't really step on each other's toes with an arena system either, because a mod that declares less memory than it needs simply won't work standalone. Piggybacking can only happen in an add-on intentionally aimed at an existing memory pool (still a bad idea, but, again, easily debuggable for the modder). This fixes the huge problem of knowing the limit at all, without making it trivial for mods to get in each other's way. If you wanted to introduce a hard limit on allocated space, the declaration is realistically where it'd be done, so it could still be handled on a per-mod basis.
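To make that concrete, here's a minimal sketch of what such a declaration could look like. Everything in it is invented for illustration: there's no such lump in GZDoom today, and the names and syntax are just one possible shape for the idea.

[code]
// Hypothetical "TEXBUDGET" lump, one per mod. On startup the engine would
// sum the Reserve of every loaded mod, compare the total against the real
// pool, and warn or error out before any ZScript allocates anything.

TextureBudget
{
    Owner   "MyCoolMod"   // who to name in the error message
    Reserve 256           // worst-case dynamic texture memory, in MB
}
[/code]

Because the total is known at load time, "mod A + mod B overflow the pool" becomes a startup error that names both mods, instead of a random allocation failure mid-game.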
As for tracking textures, to be honest, I think this is being overthought. We don't need the GC to handle this imo; I think the simplest solution is to always wipe textures on level change and keep them in memory otherwise. With the above arena we can catch runaway creation and unexpected sizes, so tracking isn't necessary anymore: we already know exactly how much memory can exist for textures at a time.

But wouldn't this be really awful to work with, having to constantly recreate them and all? Well, let's ask another question: are we planning on allowing these textures to be serialized? No other texture in the engine is, so I doubt these will be. Since they can't be serialized, they already have to be recreatable from scratch, otherwise loading a game would lose the texture. An example of this is Disdain: for our bloodied player model system, we track blood splats on Objects that can actually be serialized out, then draw onto the texture whenever they're modified, creating a simple controller structure (sketched below). Modders are already going to be forced down this route if they want to not break saves, so I don't see any harm in just clearing textures at barriers that can already handle it.

The only issue I can think of is if GZDoom is fundamentally incapable of ever releasing resources on the GPU, which feels like a problem outside the scope of this feature (and should maybe be said ahead of time instead of misleading people by telling them the problem is that bad ZScript coders exist).
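For anyone who hasn't used the pattern, here's a minimal sketch of that controller structure, built on the existing canvas texture API (TexMan.GetCanvas). The BLOODCVS canvas name, the BloodSplat holder, and the 1024x1024 size are all made up for the example; Disdain's real implementation is more involved.

[code]
// Assumes a canvas texture declared in ANIMDEFS, e.g.:
//   canvastexture BLOODCVS 1024 1024

class BloodSplat : Object
{
    // Plain data fields, so these serialize into saves just fine.
    double x, y;
    int size;
}

class BloodController : Thinker
{
    Array<BloodSplat> splats; // saved with the game
    transient bool drawn;     // not serialized, so it resets to false on load

    void AddSplat(double x, double y, int size)
    {
        let s = BloodSplat(new("BloodSplat"));
        s.x = x;
        s.y = y;
        s.size = size;
        splats.Push(s);
        drawn = false;
    }

    override void Tick()
    {
        // "drawn" being transient means that after loading a save (or any
        // wipe that blanked the canvas) it's false again, so the texture is
        // rebuilt from the serialized splat list automatically.
        if (!drawn)
        {
            Redraw();
            drawn = true;
        }
    }

    void Redraw()
    {
        let canvas = TexMan.GetCanvas("BLOODCVS");
        if (!canvas) return;

        canvas.Clear(0, 0, 1024, 1024, 0xFF000000);
        for (int i = 0; i < splats.Size(); ++i)
        {
            // Stamp splats[i] back onto the canvas here, e.g. with
            // canvas.DrawTexture(...) using a splat graphic.
        }
    }
}
[/code]

The texture itself is disposable; the splat list is the source of truth, which is exactly why wiping it on level change costs nothing here.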