Well, this is where you get into complicated semantic arguments. I'm sure you know that Linux proper is just the kernel: nothing more, not even an init program/process. What we refer to as Linux colloquially is more aptly called GNU/Linux, since it pairs the Linux kernel with GNU-developed software to actually control the system. However, even that isn't entirely accurate, because while portions of it may be GNU (particularly the compiler and libc), GNU doesn't cover everything most people need, and many people/distributions replace some GNU components with alternatives.

Graf Zahl wrote:
I think you misunderstood what this was about.
On Windows and Mac the OS provides the facilities so any middleware is on even ground, but it ultimately calls down to the system API as well which ensures that everything is consistent.
Now, obviously you can't have a stable system for people to target software at if everyone is doing something different. This issue is dealt with through ubiquitous packages that have become the de facto standard of a "Linux system": things like X11, ALSA, etc. Nothing guarantees a Linux system will have these, but if you're writing software for a general Linux system, you can safely assume they're there when a task needs them. API standards and compatibility layers also play a part.
That is all just a long-winded way of saying: what you may think of as a system API on Windows or macOS may actually be nothing more than an extra utility library pre-bundled with the OS. Is it really much different for a developer to be told "if you need XYZ, use these functions provided by the OS" versus "if you need XYZ, use these functions provided by this ubiquitous library"? As long as it works, the job's done either way. Granted, there are times it would be beneficial to have something pre-bundled that everyone is guaranteed to use, for the consistency you mentioned (it's painfully obvious when one app uses Qt while another uses GTK, for example). But other times that can lead to disaster too (Internet Explorer).
Which makes me wonder why X is still a thing if it's this far behind current technology.

That's been an ongoing argument for a while now. Some people say we need to replace it; others say X is good enough and/or can be fixed as needed. I'm sure it doesn't help that Qt and GTK have been effective at covering up most of its issues, so not many people actually have to deal with the lower-level stuff where those issues are most apparent. And X had gotten stuck before, when the XFree86 project was slow to include features people wanted, leading to X.Org being forked from it, which revitalized X for a time.
It's also a big task to replace. A lot has built up around X, both as a natural part of extending it with new features and to deal with issues as they turned up (GLX, XInput, XRandR, ...). A replacement also needs compatibility layers, so that all the existing X11-based programs continue to work, and ideally the support of toolkits and graphics libraries, so that apps written against them can work natively with the new APIs.
Luckily it seems some headway is being made in that department. Wayland has been coming along, getting native support from Qt, GTK, and SDL, and they're also making sure XWayland is available, a compatibility layer that lets X11-based apps run under Wayland. I couldn't tell you how far along they are to being ready for "prime time", or what obstacles they still have to overcome, but it's much farther along than previous attempts at replacing X in recent memory, and is still going.