Re: Old laptop /Linux question

Posted: Tue Feb 11, 2020 9:44 pm
by Blzut3
ketmar wrote:assuming that the code has zero bugs is crazy. can you give a 100% guarantee that your code is flawless? if not, then your "optimised" code is silently turned into nonsense, and you can never know for sure what is wrong, because so-called "optimisations" are absolutely unpredictable.
Assuming code has zero bugs is crazy, but intentionally writing bugs because you know better than the compiler is even crazier. This is a classic reaction to something that's a rare occurrence if you're not trying to actively defeat your compiler. Most of the time you just need to look for the type cast in the relevant code.

Re: Old laptop /Linux question

Posted: Tue Feb 11, 2020 10:03 pm
by ketmar
i *definitely* know better, that's why compilers still cannot write code for me. if i wrote some check, i want it to be there, regardless of what compiler believes. if i am left-shifting into sign bit, i know what i am doing. and so on.

is there any way to left-shift into the sign bit w/o UB, for example? nope. casting from `unsigned` to `int` is implementation-defined (pre-C++20) when the value doesn't fit, so it's not portable either. and using a union is in no way better, because the code is still assuming two's-complement math. sure, i can dance around it by writing more code with checks, but why? two's-complement integer math is the de-facto standard, and i don't believe that it will die soon. there is simply no reason to jump through that hoop.

and most "UB" things are perfectly defined too (yeah, dereferencing a null pointer causes a segfault; it is DEFINED DE-FACTO). i don't care if the compiler can "deduce" that my null check is not required, because "dereferencing a null pointer cannot happen, ever." if i wrote that check, i want it to be there, and i know better.

Re: Old laptop /Linux question

Posted: Tue Feb 11, 2020 10:20 pm
by dpJudas
I don't see how compiler optimizations make code unpredictable. It also doesn't require the code to be bug-free. What is required is that the code doesn't rely on side effects of the compiler or hardware unless they're explicitly guaranteed. Not doing that is easy.

Vector optimizations are a bad example for this. Better examples are where the compiler figures out half the code can be reduced to constants, more efficient register allocation, reordering of instructions for better scheduling, inlining functions where half the body can be removed because its result isn't used, and so on. All this stuff adds up quickly and can easily double the speed of your program compared to a policy where no assumptions can be made. But hey, you're welcome to compile all your programs at -O0, where it won't optimize your code away. Just don't complain about the speed then.

By the way, if you really want to shift like that, use the intrinsics made for it.

Re: Old laptop /Linux question

Posted: Tue Feb 11, 2020 10:24 pm
by Blzut3
ketmar wrote:i *definitely* know better, that's why compilers still cannot write code for me. if i wrote some check, i want it to be there, regardless of what compiler believes. if i am left-shifting into sign bit, i know what i am doing. and so on.
So we should always compile with -O0? All optimizations are tossing some of your code out. Or writing code for you in the case of things like inlining which you know could break any undefined back-patching you do to your functions.
ketmar wrote:is there any way to left-shift into sign bit w/o UB, for example?
My cop-out answer: use C++20.
ketmar wrote:and using union is in no way better, because the code is still assuming two-complement math.
More specifically using a union in that way is undefined behavior regardless of what you're doing with it. The standard says you can only read from the member you last assigned to. IIRC C is slightly relaxed in this regard, but most union tricks like this are just a way of doing undefined behavior in a way the compiler can't warn about even in C.
ketmar wrote:i don't care if compiler can "deduce" that my null check is not required, because "dereferencing null pointer cannot happen, ever." if i wrote that check, i want it to be there, and i know better.
I on the other hand would like it to deduce these things so when it inlines code it can remove branches that may only be needed in some cases.
dpJudas wrote:Vector optimizations is a bad example for this.
While I may agree that your examples are better in regards to most applicable speed-ups, the main reason I cited vector optimizations is that they're the primary source of "new GCC broke my code" reports in my experience. And the cause is almost always a type cast breaking the data alignment guarantees for a given type.

Re: Old laptop /Linux question

Posted: Tue Feb 11, 2020 10:41 pm
by wildweasel
ketmar wrote:>Has OP's question been answered?
several times, by different people. that's why we're going wild here.
A little bit TOO wild, honestly; this thread seems to be more about arguing over programming workflow than any kind of advice for the OP. This isn't really the best foot to put forward.