What kind of hardware are you people using?
I got my first 16:9 flatscreen more than 8 years ago and since that time have neither touched a 4:3/5:4 nor a CRT monitor again.
^
On the topic at hand, not many people are using 5:4/4:3 aspect ratio monitors anymore. 1366x768 here.
I have an Acer 5:4 monitor and a laptop with 16:9, but the one I prefer to use is the 5:4 monitor.
And every cyber café and house I have been to uses 4:3/5:4, and my own resolution is 800x600 (crappy iGPU), so yes, 5:4/4:3 is still kinda popular.
I'm still using 800x600 on the desktop mostly (my monitor is a BenQ GL2250, 16:9; I don't know if it's even sold outside where I live), but recently, since I needed to play some mobile games with BlueStacks (windowed fullscreen only, great!), I was forced to use a higher resolution like 1600x900 or the games would look like unreadable shit... Recently I also decided to run GZDoom at 1920x1080 for some silly reasons (no, I won't tell why! :P), so although my eyes hurt from the small fonts (I explained in another thread my own unfixable reasons for refusing to adjust the system DPI to actually use 1080p on the desktop), I'm still trying to get used to 1600x900/1920x1080, at least for playing games.
Spoiler: Silly rant inside, read on your own risk.
Yes, I prefer real fullscreen; screw M$ and their silly reasons for not properly supporting real fullscreen in Windows 10. This is sad for me, but I guess I have to stay on Win7 (forever?) if they never change their minds... The lack of the classic UI is another major reason, but I can try to fix that myself, while real fullscreen is unfixable for me too.
PlayerLin wrote:Yes, I prefer real fullscreen; screw M$ and their silly reasons for not properly supporting real fullscreen in Windows 10. This is sad for me, but I guess I have to stay on Win7 (forever?) if they never change their minds... The lack of the classic UI is another major reason, but I can try to fix that myself, while real fullscreen is unfixable for me too. :3:
Except "real" fullscreen (exclusive-mode fullscreen) has been deprecated in every OS for ages, and none of them really officially supports it, not even Windows 7.
"Real" fullscreen has so many problems that it's pretty much garbage.
phantombeta wrote:
Except "real" fullscreen (exclusive-mode fullscreen) has been deprecated in every OS for ages, and none of them really officially supports it, not even Windows 7.
"Real" fullscreen has so many problems that it's pretty much garbage.
Well, at least it still works in Windows 7... but I guess I just expected too much.
Garbage or not, it doesn't change my mind.
EDIT, about half an hour later:
Come to think of it, maybe most games actually use windowed fullscreen with proper handling: they maintain the in-game resolution and switch back to the desktop one when you Alt+Tab out, so they never touch that "problematic exclusive-mode fullscreen". But I don't know; I don't like programs that can't properly change resolution on startup/Alt+Tab/losing focus without a proper reason (a proper reason being something like OpenGL software, where I know it's a limitation, so they're stuck at the in-game resolution when Alt+Tabbed out)... but I guess even if I don't like them, I have to accept them, since there's no real way around that...
In these days of LCDs and the like, changing the resolution for fullscreen is pretty much useless. LCDs have a fixed native resolution, and when you "change" the resolution, you're just having the monitor accept a smaller image that it stretches to fill the screen. There's little advantage to having the monitor do this scaling, but a good number of advantages to letting the graphics card/app do it. With the app doing the stretching in hardware, you avoid the desktop resizing and throwing your desktop icons or other windows out of whack to fit a new size. With the app doing it, you can also get the option to retain the correct aspect ratio of the image, adding letterboxing or pillarboxing as needed (the monitors I've seen all use stretch-to-fill, which destroys the pixel aspect ratio).
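The aspect-correct scaling described above comes down to a small bit of math. Here's a sketch in Python (the function name and layout are my own illustration, not from any particular engine):

```python
def fit_letterbox(src_w, src_h, dst_w, dst_h):
    """Scale a src_w x src_h image to fit inside dst_w x dst_h while
    preserving aspect ratio. Returns (x, y, w, h) of the scaled image
    within the destination; the leftover space becomes black bars."""
    # Use the smaller of the two scale factors so nothing is cropped.
    scale = min(dst_w / src_w, dst_h / src_h)
    w = round(src_w * scale)
    h = round(src_h * scale)
    # Center the image; unequal leftover width/height gives
    # pillarboxing or letterboxing respectively.
    x = (dst_w - w) // 2
    y = (dst_h - h) // 2
    return x, y, w, h

# 800x600 (4:3) on a 1920x1080 (16:9) screen -> pillarboxed
print(fit_letterbox(800, 600, 1920, 1080))  # (240, 0, 1440, 1080)
```

A stretch-to-fill monitor effectively skips the `min()` and uses both scale factors independently, which is exactly what distorts the pixel aspect ratio.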
Additionally, most games these days render to offscreen framebuffers and don't write to the window/screen backbuffer until after post-processing has been handled. That write to the backbuffer can incorporate the necessary stretching, after all the hard work has been done, which also has the important implication of allowing "dynamic resolution". Since the pixel shaders and fill rate of drawing the scene account for a large percentage of the game's rendering time, the game can keep a running profile of how long each frame took to draw, and if one took longer than desirable, the next frame can render to a smaller portion of its offscreen framebuffer and adjust the scaling metrics accordingly (fewer pixels = faster frame rendering). The intention is that rather than scene complexity causing the framerate to drop, the game instead reduces the internal render resolution by some percentage, adding a slight scaling blur (not unlike using a lower screen resolution) to retain smooth motion. The advantage here is that it's not stuck to a predefined resolution list, and the UI elements (which are often very cheap to draw) can stay at full resolution. The game can also cap the resolution scale, or set a static scale via game options for user choice.
This works the other way around, too. Rather than lowering the internal resolution below the screen size, you can raise it above the screen size to get a form of super-sampling anti-aliasing (SSAA), which, if you have the power to spare, is far superior to the more common MSAA or FXAA techniques.
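Both directions are the same mechanism, just with a scale factor on either side of 1.0. A quick illustration of the framebuffer sizes involved (function name is my own):

```python
def render_target_size(screen_w, screen_h, scale):
    """Internal framebuffer size for a given resolution scale.
    scale < 1.0 -> dynamic-resolution downscale; scale > 1.0 -> SSAA."""
    return round(screen_w * scale), round(screen_h * scale)

# 2x SSAA on a 1920x1080 screen renders 4x the pixels (2x per axis),
# then filters down to screen size on the final blit.
print(render_target_size(1920, 1080, 2.0))   # (3840, 2160)
print(render_target_size(1920, 1080, 0.75))  # (1440, 810)
```

Note that the pixel cost grows with the square of the scale, which is why SSAA is so expensive compared to MSAA or FXAA.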
TDRR wrote:I have an Acer 5:4 monitor and a laptop with 16:9, but the one I prefer to use is the 5:4 monitor.
And every cyber café and house I have been to uses 4:3/5:4, and my own resolution is 800x600 (crappy iGPU), so yes, 5:4/4:3 is still kinda popular.
What kind of cyber cafés are those that run on such stone-age hardware? Where I live they'd have to close for lack of customers.
I can't even remember when I last SAW a 4:3 or 5:4 monitor. Wherever I go, it's TFT 16:9 or 16:10, and nothing else.
Until very recently I ran 3x 1600x1200 IPS displays. One of them has a bad capacitor, so I replaced it with a 2560x1440 144Hz IPS FreeSync display, which I now use for gaming because, between the lower latency and the higher refresh rate, it just feels a lot nicer. Although, given the awkward positioning of that monitor on my desk due to its size, I sometimes still game on the 1600x1200 displays. I'm not holding my breath that NEC will ever put high refresh rates and adaptive sync into their 4:3 displays, so I imagine the days of 4:3 are numbered for me. With all that said, I always game windowed and generally use 16:9 anyway for the greater FOV.
Also my laptop has a 1400x1050 IPS screen and was retrofitted with a modern (Broadwell) motherboard.
Recently I had to install some stuff where the installer was some broken piece of shit that only worked on Windows 9x, so I had to set up a Windows 98 VM to unpack the thing and copy it elsewhere.
And as it just so happened, that VM ran at 800x600 and nothing higher. It was an utter hassle to get anything done on that tiny screen.