Installing Visual Studio 2017

Post by Chris »

Graf Zahl wrote:And as things stand, GCC is by far the worst in this regard, mainly because it advertises itself as a standard compliant compiler and thus making many programmers use its non-standard features.
Whenever possible, I try to compile with -std=c11 or -std=c++11 (or whatever), rather than the gnu11 or gnu++11 variants. That disables the GNU extension keywords and requires the double-underscore spellings instead: typeof(foo) and asm("...") won't work, you need __typeof(foo) and __asm("...") if you really need them, which makes it more apparent when you're relying on non-standard features. Unfortunately CMake selects the gnu/gnu++ variants when you request a specific standard version, and people don't generally consider the downsides of the gnu/gnu++ modes for code that's supposed to be portable; they just think "more is better".
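
A minimal sketch of the difference (illustrative only; the exact diagnostics vary by GCC version):

```cpp
// Builds with: g++ -std=gnu++11 example.cpp
// Under plain -std=c++11 the commented-out line is an error, the other one still works.
int main()
{
    int value = 42;

    // typeof(value) copy = value;   // GNU keyword: rejected in strict -std=c++11 mode
    __typeof__(value) copy = value;  // reserved double-underscore spelling always works

    return copy - value;             // 0
}
```

For what it's worth, CMake does have a switch for this: setting CMAKE_C_EXTENSIONS / CMAKE_CXX_EXTENSIONS to OFF makes it pass the plain -std= flags instead of the gnu variants.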
Of course all that doesn't even come close to touching the most annoying subject of C++ programming, and that is the bootload of undefined behavior and a committee that seems to be so stuck in the past - forgetting that any bit of undefined or implementation defined behavior is a source for major trouble.
Undefined behavior is what allows the kind of optimizations C and C++ thrive on. Aliasing rules (it's undefined behavior to access the same memory through pointers to incompatible types) and the memory model (it's undefined behavior to non-atomically read memory in one thread that could be written in another) are what allow the compiler to avoid treating every memory access as volatile with hard memory barriers. Signed overflow (the result is undefined if you increment or multiply a signed integer beyond its maximum value) is what lets all those for (int i = 0; condition; i++) loops remain efficient on x86-64, since 64-bit CPUs internally use 64-bit offset indexing while an int is typically only 32 bits. The compiler can substitute the 32-bit variable with a native 64-bit register offset, since it may assume i will never be incremented beyond INT_MAX. Unsigned integer overflow, in contrast, is defined (it must wrap), which means the compiler has to make CPUs with 64-bit addressing do extra work to ensure for (unsigned int i = 0; condition; i++) keeps indexing correctly in the event of an overflow.
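
A sketch of what I mean for the indexing case (hypothetical functions, nothing from a real codebase; a compiler can often prove the bound either way in a loop this simple, but the point is which assumption it is allowed to make):

```cpp
#include <cstdint>

// Signed counter: overflow would be undefined behaviour, so the compiler may assume
// i never wraps and can keep it in a 64-bit register for the address arithmetic.
int64_t sum_signed(const int *data, int n)
{
    int64_t sum = 0;
    for (int i = 0; i < n; i++)
        sum += data[i];
    return sum;
}

// Unsigned counter: wrapping is defined behaviour, so the 32-bit wraparound
// semantics of i have to be preserved in the generated address computation.
int64_t sum_unsigned(const int *data, unsigned int n)
{
    int64_t sum = 0;
    for (unsigned int i = 0; i < n; i++)
        sum += data[i];
    return sum;
}
```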

Undefined and implementation-defined behavior is also what allows C++ to be portable and somewhat future-proof. Not all microprocessors implement types the same way. Two's complement isn't the only way to handle signed integers, for instance: there's one's complement, and also sign-magnitude (I believe that's the term), a design more akin to how floats work: a sign bit that just flags the value as positive or negative, separate from the actual magnitude (e.g. 0x00000001 is +1 and 0x80000001 is -1). Defining signed integer overflow would require all but one of these designs to take a hit on every addition, subtraction, and multiplication to guarantee the language-defined result. Similarly, there's nothing that says floats in C++ must be IEEE-754; floating-point on at least some ARM chips isn't fully IEEE-754-compliant, notably with regard to denormals (it doesn't support them, since they're easy to generate inadvertently, tough to protect against, and murder performance). Even x86 with SSE has an option to turn them off.
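
The SSE option I mean is the FTZ/DAZ bits in MXCSR; a minimal sketch using the standard intrinsics (x86-only, obviously):

```cpp
#include <xmmintrin.h>   // _MM_SET_FLUSH_ZERO_MODE (SSE)
#include <pmmintrin.h>   // _MM_SET_DENORMALS_ZERO_MODE (SSE3)
#include <cstdio>

int main()
{
    // Flush results that would be denormal to zero, and treat denormal inputs as zero.
    _MM_SET_FLUSH_ZERO_MODE(_MM_FLUSH_ZERO_ON);
    _MM_SET_DENORMALS_ZERO_MODE(_MM_DENORMALS_ZERO_ON);

    volatile float tiny = 1.0e-38f;       // close to the smallest normal float
    volatile float result = tiny * 0.1f;  // ~1e-39: denormal under strict IEEE-754
    std::printf("%g\n", result);          // prints 0 with flush-to-zero enabled
    return 0;
}
```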

If C++ defined as much as it could, it would be restricted in what hardware it can reasonably target, and it wouldn't be able to adapt as future microprocessor designs introduce different low-level behaviors (at least not without breaking code that was previously compliant; sure, that still happens now sometimes, but it would be many times worse the more previously-defined behavior you break).

Post by Graf Zahl »

I see you are also one of the people missing the forest for the trees.
The problem here is that C and C++ sacrifice a significant portion of robustness for being able to run on some outlandish hardware.

A good example is shifting negative numbers. Why is this undefined? The result is that the feature as specified is effectively unusable, and the shift operator might as well be tagged "do not use", if this little gotcha weren't simply ignored by everyone.
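
The usual workaround is to route the shift through the unsigned type; a sketch (the helper name is just illustrative):

```cpp
#include <cstdint>

// Shift on the unsigned representation, then convert back. The shift itself is
// well-defined; the conversion back is implementation-defined rather than
// undefined, and every mainstream compiler documents it as a 2's complement wrap.
inline int32_t shift_left_signed(int32_t value, unsigned amount)
{
    return static_cast<int32_t>(static_cast<uint32_t>(value) << amount);
}

int main()
{
    // int bad = -1 << 4;                 // undefined behaviour in C++11/14/17
    int good = shift_left_signed(-1, 4);  // -16 on any 2's complement target
    return good == -16 ? 0 : 1;
}
```
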
Another sore spot is that the sizes of the integer types are left to the implementation. Yes, yes, I know, it has to be that way to allow compilation on every imaginable piece of junk hardware, but as things stand, using data types with different sizes on different platforms only creates code incompatibilities. The 'long' type is already totally useless these days: its size differs across architectures, it doesn't even guarantee anymore that it's the size of a pointer, so using it is a foolhardy proposition. It might as well be dumped entirely because of its wishy-washy definition. There's no guarantee it gives that isn't given in a more reliable form by another type.
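
Which is exactly why portable code ends up spelling out what it actually needs instead; a sketch of the usual replacements:

```cpp
#include <cstdint>
#include <cstddef>

// What 'long' gets replaced with in practice:
std::int32_t  flags   = 0;   // exactly 32 bits everywhere
std::int64_t  filepos = 0;   // exactly 64 bits everywhere
std::intptr_t handle  = 0;   // guaranteed to be able to hold a pointer
std::size_t   count   = 0;   // guaranteed to be able to index any object

// 'long' itself guarantees none of that: it is 32 bits on 64-bit Windows and
// 64 bits on 64-bit Linux, so code relying on its width breaks somewhere.
long legacy = 0;

int main() { return 0; }
```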

In all seriousness, though, the problems all these issues create in mainstream software far outweigh that perceived advantage. It is virtually impossible to robustly test C code that's supposed to run on different types of hardware. In the end it would be better if the software had to specify what behavior it expects, so that any non-conforming compiler could outright refuse to compile it instead of treating it as "undefined", which is ALWAYS a concern.

Instead of laying out some minimum requirements for the hardware it's supposed to run on, which would place the burden in the hands of the small number of people who develop the hardware and could actually do something about it (or, if they don't want to, let their shitty products disappear), the burden is placed in the hands of the entire programming community at large, most of which has no grasp of these things and happily produces broken software as a result.

And in the end, where's the proof that C's philosophy leads to faster software? I cannot see it because the more undefined shit a programmer needs to work around to create something robust, the more cruft will accumulate, easily nullifying any potential advantage. To make it short: If I had to write my code to fully avoid any case of undefined behavior, it'd be a lot slower than if I could work against a sane baseline and let the compiler deal with hardware that isn't fully compliant.

Let's not delude ourselves: 99.9% of existing software takes for granted that integers overflow the way they do on any mainstream CPU, where negatives are 2's complement, where shifting negative numbers works the way people have seen it work for decades, and so on and so on. If some lunatics feel the need to design something different, their hardware deserves to rot in hell. C++'s only saving grace these days is that nobody seems to care about developing a language that's designed to be compiled to native code. Everything new is designed to use large, bloated runtimes and preferably compiles to some sort of intermediate byte code.

The entire thing reeks of an 80's mindset from top to bottom. Which ultimately isn't surprising, because that's probably where most of these people come from: the older generation of programmers who never adapted to modern development philosophies.

Post by dpJudas »

Graf Zahl wrote:C++'s only saving grace these days is that nobody seems to care about developing a language that's designed to be compiled to native code. Everything new is designed to use large bloated runtimes and preferably compiles to some sort of intermediate byte code.
Nobody developed such a language because every time someone tries to save us from C++'s faults, they end up re-evaluating and changing everything in such a way that they replace one set of faults with another.

Java and C#, for example, wanted to be a "C++ killer", but some of their assumptions/improvements turned out to be wrong. They were designed around the assumption that you could compile to an intermediate byte code and still maintain performance. It turned out that was only possible in lab tests on simple codebases; as soon as the code grew large enough, the JIT time became a significant problem. All the other languages trying to replace C++ made similar kinds of mistakes.

Post by Graf Zahl »

Java and C# are hardly good examples.

Java is pretty much missing everything that makes a language performant by design. IIRC it was initially developed as a learning language, where these omissions and the poor underlying design don't hurt, but it ended up becoming a production language.

C# by itself could have been the C++ killer, if it hadn't been designed to run on bytecode with a large and bulky runtime needed to support it. I think the language itself is great, but at its core it simply misses what still makes C++ the first choice for low-level programming by a wide margin.

For more recent attempts, there's also Swift. At least it got the compile-to-native part right, but overall much of the language is just sugarcoating around Objective-C's deficiencies turned into actual language features.

Post by dpJudas »

Graf Zahl wrote:C# by itself could have been the C++ killer, if it hadn't been designed to run on bytecode with a large and bulky runtime needed to support it. I think the language itself is great, but at its core it simply misses what still makes C++ the first choice for low-level programming by a wide margin.
The sad thing about C# is that the language itself doesn't actually require bytecode or that huge runtime. Or at least that was the case for the C# 2.0 spec I once used when implementing an LLVM frontend for C#. I never finished it, but I had implemented enough of the language to see that it would work without any large issues.

But my point with C# is that the language implementors decided to use bytecode precisely because it was hot and new at the time. And every feature they've added since 2.0 has been about trying to be trendy with syntax rather than fixing the true limitation of C#: no decent RAII pattern or ARC support. The finalizers, the using keyword and the IDisposable interface are just "lalala we can't hear you" excuses for the parts of the codebase where you absolutely cannot wait for a GC to clean things up. The struct syntax is also absolutely insane.

So to sum it up: C# could be fixed and it could have native compilation, but that will never, ever happen, because it ain't cool for those driving the development of the language. In fact, it would require them to admit they were wrong about key decisions they made in the past.
Graf Zahl wrote:For more recent attempts, there's also Swift. At least it got the compile-to-native part right but overall much of the language is just sugarcoating around Objective-C's deficiencies turned into actual language features.
Yet another example of a language that should just have fixed the actual problem with Objective-C: the syntax. But no, they had to go all-in and have "courage" and do brain-dead stuff like removing semicolons and a ton of other garbage that changes things without the designers fully understanding the consequences of those choices.

Post by Graf Zahl »

dpJudas wrote: But my point with C# is that the language implementors decided to use bytecode precisely because it was hot and new at the time. And every feature they've added since 2.0 has been about trying to be trendy with syntax rather than fixing the true limitation of C#: no decent RAII pattern or ARC support.
Actually, this is where *ALL* OOP languages tend to fall apart. Having true objects that live in their own space is surely fine when dealing with a game scripting language like ZScript, although I wouldn't mind having some RAII capabilities in there, too (sadly I lack the experience in that particular field to make it happen...)

If you look at the whole picture, C++ seems to be the only language which defined proper value semantics for objects, which is an absolute necessity for RAII.
ARC isn't really that much better than garbage collection. Both concepts revolve around recklessly allocating stuff and letting the system deal with the consequences, which normally are not nice.

All this is what really drives me up the wall about all that undefined crap in the spec: C++ really is the only language with a concept of lightweight local objects, essentially making it the only good language where you can use those features without making a mess. It's too bad that, from a strict spec-adherence viewpoint, many features technically can't be used. So what happens? Do people avoid those features? Surely not; they use what works on their system. I'd guess that a vast portion of existing C++ code is not truly portable because its authors aren't even aware of how sloppy the spec is.
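
A trivial sketch of what I mean by a lightweight local object (a made-up class, not from any real codebase):

```cpp
#include <cstdio>

// A file handle whose lifetime is exactly the enclosing scope: no GC,
// no finalizer, no 'using' block, just a destructor.
class ScopedFile
{
public:
    explicit ScopedFile(const char *name) : handle(std::fopen(name, "rb")) {}
    ~ScopedFile() { if (handle) std::fclose(handle); }  // released deterministically

    ScopedFile(const ScopedFile &) = delete;            // a value, not a shared reference
    ScopedFile &operator=(const ScopedFile &) = delete;

    FILE *get() const { return handle; }

private:
    FILE *handle;
};

void dumpFirstByte(const char *name)
{
    ScopedFile f(name);                   // lives on the stack
    if (f.get())
        std::printf("%d\n", std::fgetc(f.get()));
}                                         // closed here, whatever path we take out
```

The destructor runs at the closing brace no matter how the scope is left, which is exactly the guarantee the GC languages refuse to give you.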

Post by drfrag »

So I assume the layout I created for VS 2017 is fine now.
I still don't think the MinGW toolchain is actually bad: it's very lightweight (<90 MB), the compiler is good and it's free software. The debugger seems to be the weakest link. Code::Blocks is a good IDE. As far as I know many people use MinGW.

https://www.codeproject.com/Articles/30 ... ual-Studio

I've fixed compilation for MinGW (see the other thread with links to the code and info on how to set up the poor man's VS :) ).
There are a couple of crashes: the VM crash on exit is still there, and the other one is in the SSE2 truecolor multithreaded drawers.
leileilol wrote:Not adhering to Microsoft's insistent platform regressions and VC20XX library requirements is an advantage - at least until gcc 4.9 anyway before they went on a MS-like deprecating spree.

The exes built with GCC 5.1 still run on Win 98 (my legacy projects do).

Edit: I've just seen the massive include cleanup. I think VS can compile with missing includes, so this could be a problem.

Post by Graf Zahl »

drfrag wrote:The debugger seems to be the weakest link.
That it is. It's just too bad that it's also the tool where most time is spent, so having a sub-par one here surely does not help.
drfrag wrote: As far as I know many people use MinGW.
From my personal experience with programmers, it appears that those who use it tend to come from a Linux environment and aren't used to anything more than stone-age development tools like makefiles and autotools.

Post by Rachael »

As far as I know, most people use whatever toolchain is most suited for their operating system. Sadly, since most people use Windows, that means most people use Microsoft Visual Studio, not MinGW.

MinGW is, and always has been, a very niche development tool, used mostly in either older open-source projects or projects with an emphasis on cross-platform support. Newer ones tend to use Visual Studio, from what I have seen.

It's actually only on Linux that I see GCC in widespread use. Even the BSD OSes use Clang as the default compiler.

Post by Graf Zahl »

Rachael wrote: MinGW is, and always has been, a very niche development tool, used mostly in either older open-source projects or projects with an emphasis on cross-platform support.

Make that "cross platform support" in the sense that "we do not want to write cross-platform code but still support all platforms."
In other words: It is often used as a crutch to make code work that isn't really portable.

We also shouldn't forget that MinGW comes from a time when Visual Studio was only available as an expensive commercial product. Once Microsoft realized that they needed the support of programmers who couldn't afford to buy it and started offering the Express editions, MinGW's days were inevitably numbered.

And just looking at drfrag's other thread shows that a Windows compiler that isn't fully compatible with Windows code is not going to fly. So essentially, these days MinGW is mainly restricted to compiling some cross-platform libraries that only have basic interaction with the underlying OS.

Post by drfrag »

But GZDoom is actually cross-platform, right? If it can compile GZDoom, it can't be that bad.

Post by Graf Zahl »

GZDoom *is* cross-platform, but it is also cross-compiler compatible: the code can be compiled with GCC, Clang and VC++.

As the opposite example, take Doom Legacy. It has been developed exclusively on GCC for nearly two decades, and the result is a mess that can no longer be made to work right on other compilers. That's the type of "cross-platform" I was referring to. Yes, a Windows version exists, but due to how the source evolved, the only way to make it work on Windows is with something based on GCC.

Post by Graf Zahl »

dpJudas wrote: Yet another example of a language that should just have fixed the actual problem with Objective-C: the syntax. But no, they had to go all-in and have "courage" and do brain-dead stuff like removing semicolons and a ton of other garbage that changes things without the designers fully understanding the consequences of those choices.
I missed that part earlier but I still have to say something about it.

No, the syntax is not the main problem with Objective-C. That I could live with. What bothers me far more is that the language has some truly insane conventions, like 'sending a message' (i.e. calling a member function) to a null pointer being supposed to either do nothing or return a null value. That was probably the biggest issue Swift had to face, and what did they come up with? Instead of enforcing stricter checks, they added some weird operators to guard pointers which, if you use them, just silently ignore null pointers as a potential source of problems. In the end they made something bad far worse.

Another thing I find tiresome about Swift is that each version of the language introduces new incompatibilities with previous versions, resulting in some incredible code rot.

Regarding Objective-C, I still ask myself how anyone could consider using it as a system programming language. It's missing nearly everything that's essential for compile-time error checking. If you are lucky, you get a warning. If you are not so lucky, your predecessor on the project had a tendency to just ignore warnings (easy, since any software for Apple devices tends to get ever noisier over the years as it's recompiled), and if you are really unlucky, you get hard-to-trace crashes that would be peanuts to fix in a robust language that simply errors out on such code during compilation.

Post by dpJudas »

When I blame the syntax of Objective-C, it is mostly because the entire language seems to have been written initially as some kind of preprocessor for C, like the initial version of C++. As the years went by they made a few minor adjustments here and there, like the automatic properties and such, but they never revisited the overall structure of a .m file.

IMO Swift was that revisit, but they chose the wrong people to do it. Like all language designers, they took it as carte blanche to try to change all kinds of pointless shit, and my bet is that eventually reality hit them and they realized their shiny new language would have to integrate with Objective-C. If they hadn't met that requirement they would have had to rewrite their entire codebase, and so all the sad hacks you talk about showed up. And yes, their constant adjustment of the language makes me consider them totally unsuited for the task.

Post by Graf Zahl »

dpJudas wrote:When I blame the syntax of Objective-C, it is mostly because the entire language seems to have been written initially as some kind of preprocessor for C, like the initial version of C++. As the years went by they made a few minor adjustments here and there, like the automatic properties and such, but they never revisited the overall structure of a .m file.
That is indeed the case, and the funny thing is that under the hood it still produces code that essentially works like that.
dpJudas wrote: IMO Swift was that revisit, but they chose the wrong people to do it. Like all language designers, they took it as carte blanche to try to change all kinds of pointless shit, and my bet is that eventually reality hit them and they realized their shiny new language would have to integrate with Objective-C. If they hadn't met that requirement they would have had to rewrite their entire codebase, and so all the sad hacks you talk about showed up. And yes, their constant adjustment of the language makes me consider them totally unsuited for the task.

I think a lot here is a result of the Apple mentality of treating external developers like crap, constantly forcing them to redo their code for some pointless optimizations in their operating systems and throwing old code under the bus after only a few years.