I wonder if someone can explain how the alpha property works for texture shaders? It's always seemed non-linear to me and impossible to counteract. Something I've just tried has me suspecting it's actually a bug, but I want to make sure I'm not missing something.
So I have a 3D floor with the line opacity property set to 254. I have the software fuzz shader being applied to its texture, multiplying each pixel's alpha (i.e. colour.a) by values between 0.1875 and 0.65625. But what I get are low alpha values being shunted to zero (or lower?) instead. Previously I assumed it was my own error and just abandoned projects until I could figure it out, but this time I'm using someone else's proven code verbatim.
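For reference, here's the arithmetic I'd naively expect if the multipliers combine linearly. This is just a sketch of my assumption (plain multiplication of colour.a, the fuzz factor, and the 254/255 line opacity), not the engine's actual blending math:

```python
# Naive expectation: final alpha = texel alpha * fuzz factor * line opacity.
# Assumes a fully opaque texel (alpha = 1.0) and linear combination.
line_opacity = 254 / 255  # 3D floor line opacity property set to 254

for fuzz in (0.1875, 0.65625):  # min and max fuzz multipliers
    final_alpha = 1.0 * fuzz * line_opacity
    print(f"fuzz={fuzz}: final alpha = {final_alpha:.4f}")
```

Under that assumption even the darkest fuzz pixel should end up around 0.19 alpha, nowhere near zero, which is why the result I'm seeing looks wrong to me.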

You can see the detail is greatly reduced in the bridge compared to the spectre, even though they share identical shader code. Are we supposed to treat colour.a differently in custom shaders?