Old laptop /Linux question
Re: Old laptop /Linux question
Clearly, you need to enable hardware acceleration of some sort. Trying to run higher resolutions without it, especially for web browsing, is completely insane.
- drfrag
- Vintage GZDoom Developer
- Posts: 3200
- Joined: Fri Apr 23, 2004 3:51 am
- Location: Spain
- Contact:
Re: Old laptop /Linux question
I know. Perhaps I should try LXDE; last time I tried it, it sucked big time, and maybe it doesn't boot either.
Edit: I don't think that would help. I can't get to the login screen with the graphics driver, and I must choose between them there.
- Redneckerz
- Spotlight Team
- Posts: 1134
- Joined: Mon Nov 25, 2019 8:54 am
- Graphics Processor: Intel (Modern GZDoom)
Re: Old laptop /Linux question
drfrag wrote: I know. Perhaps i should try LXDE, last time i tried to me it sucked big time and may be it doesn't boot either.
Edit: I don't think that would help, i can't get to the login screen with the graphics driver and i must choose between them there.
With KDE, are you running the Plasma Shell and KWin?
Similar to the old Compiz (from what I remember of my time with Linux), KWin uses compositing for advanced graphical effects. It may be that your card, with only GL 1.4, simply refuses to do this. There are two options I can think of right now if you prefer KDE:
- See the section ''If you prefer a classic KDE3-style desktop'' of the KWin page.
- Consider looking at the Trinity Desktop Environment, a maintained continuation of the old KDE3 desktop that has far lower system requirements and needs no dedicated 3D acceleration.
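As a side note, compositing can also be switched off without changing desktops. This is a sketch from memory of how Plasma 5 stores the setting; the `kwriteconfig5` tool and the kwinrc group/key names below are worth double-checking against your version:

```shell
# Disable KWin's compositor entirely (takes effect after restarting KWin or re-logging)
kwriteconfig5 --file kwinrc --group Compositing --key Enabled false

# Or keep compositing but force the XRender backend instead of OpenGL,
# which a GL 1.4 card may cope with better
kwriteconfig5 --file kwinrc --group Compositing --key Backend XRender
```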
- drfrag
- Vintage GZDoom Developer
- Posts: 3200
- Joined: Fri Apr 23, 2004 3:51 am
- Location: Spain
- Contact:
Re: Old laptop /Linux question
I already disabled the compositor; it said it was using GL 2, but maybe it wasn't even working, or was software rendering. I can't get to the login screen without disabling the driver, so I don't think changing desktops would help. But I don't know; the requirements said just a 1 GHz CPU and 1 GB of RAM, but they don't update that.
- drfrag
- Vintage GZDoom Developer
- Posts: 3200
- Joined: Fri Apr 23, 2004 3:51 am
- Location: Spain
- Contact:
Re: Old laptop /Linux question
Now I've tried to install the xserver-xorg-video-radeon package with the package manager, and I get an error saying that it's a virtual package and can't be installed. Then I tried from the terminal with sudo apt (because you must type a lot to prove you're a real hacker, you know), and I got a message that there were a lot of problems and I had kept broken packages.
This is almost fun, considering I'm a computer technician.
Edit: it's already installed, the hwe package.
- Posts: 3213
- Joined: Wed Nov 24, 2004 12:59 pm
- Operating System Version (Optional): Kubuntu
- Graphics Processor: ATI/AMD with Vulkan/Metal Support
- Contact:
Re: Old laptop /Linux question
Do you know if Windows worked on that machine? The T40 series was very prone to dead GPUs. I can't say I've experienced any freezing with the R100 chip in my A31p, but it has been a long while since I've used that laptop for an extended period of time.
- drfrag
- Vintage GZDoom Developer
- Posts: 3200
- Joined: Fri Apr 23, 2004 3:51 am
- Location: Spain
- Contact:
Re: Old laptop /Linux question
But 2D at least should work; I only tried Mini XP and the driver supported the display's native resolution. If it were fried I would not get any output.
The "only" hardware problems here are a dying keyboard and the CMOS battery. Now power management works.
Mint MATE works with a Radeon 9600; I could have installed MATE here with another desktop. Maybe it's the distro or a kernel update, no idea.
- drfrag
- Vintage GZDoom Developer
- Posts: 3200
- Joined: Fri Apr 23, 2004 3:51 am
- Location: Spain
- Contact:
Re: Old laptop /Linux question
I would need to install an older version of the package supporting user-space mode setting, but that's impossible. The driver version was 6.14.4, and here both the video-ati and video-radeon drivers are installed. With recent drivers only kernel mode setting is supported.
The newest kernels have moved video mode setting into the kernel, so all the programming of the hardware-specific clock rates and registers on the video card happens in the kernel rather than in the X driver when the X server starts. This makes it possible to have nice-looking high-resolution splash (boot) screens and flicker-free transitions from boot splash to login screen. Unfortunately, on some cards this doesn't work properly and you end up with a black screen. Adding the nomodeset parameter instructs the kernel to not load video drivers and to use BIOS modes instead until X is loaded.
Many open-source drivers have removed support for non-kernel mode setting, so in those cases when you use nomodeset you will end up falling back to the very basic, unaccelerated VESA driver. This is very much a performance and feature hit.
At least now I know that LZDoom still runs. And next, that other thing I want to do.
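For reference, on Ubuntu-family distros the usual way to make nomodeset permanent is through GRUB; this is a sketch of the standard procedure, not specific to this machine:

```shell
# Edit /etc/default/grub and add nomodeset to the default kernel command line:
#   GRUB_CMDLINE_LINUX_DEFAULT="quiet splash nomodeset"
# then regenerate the boot configuration and reboot:
sudo update-grub
```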
Re: Old laptop /Linux question
>GDB is incredibly powerful if you know how to use it.
and so is printf. ;-)
an interactive debugger is the best way to waste your time pretending to be doing something useful. once i realised this, i stopped using interactive debuggers, and i never looked back.
- Chris
- Posts: 2978
- Joined: Thu Jul 17, 2003 12:07 am
- Graphics Processor: ATI/AMD with Vulkan/Metal Support
Re: Old laptop /Linux question
ketmar wrote: >GDB is incredibly powerful if you know how to use it.
and so is printf.
Except the language itself makes this invalid. A program that does something wrong (invalid pointer access, uninitialized memory, etc.) is by definition an invalid program, and an invalid program makes no guarantees about its behavior at all. Bugs can go "back in time", essentially affecting the behavior of code "before" the buggy line. As an example:
Code: Select all
#include <stdio.h>
#include <string.h>

static char temp[512];

void foo(void *dst, void *src, size_t count)
{
    if(src)
    {
        count += *(unsigned char*)src;
        src = (unsigned char*)src + 1;
    }
    printf("Copying %zu bytes from %p\n", count, src);
    fflush(stdout); // Make sure output is flushed
    memcpy(temp, src, count); // BUG: need a null check here; memcpy parameters must not be NULL even if count is 0.
}
...
int main()
{
    foo(NULL, NULL, 0);
    ...
    return 0;
}
Code: Select all
$ ./a.out
Segmentation fault
Code: Select all
void foo(void *dst, void *src, size_t count)
{
    count += *(unsigned char*)src; // Removed the check: src is unconditionally given to memcpy, which can't be NULL, so the compiler assumes src is non-null
    src = (unsigned char*)src + 1;
    printf("Copying %zu bytes from %p\n", count, src);
    fflush(stdout); // Only user-space buffers are flushed
    memcpy(temp, src, count);
}
Code: Select all
Note that fflush() flushes only the user-space buffers provided by the C library.
This is why it's important to have tools that can inspect the state of the program at the time of the crash, rather than depending on an invalid program to make a valid log. Such a log can be useful to have as extra information, but certainly not on its own and not as the primary data point.
Re: Old laptop /Linux question
>A program that does something wrong (invalid pointer access, uninitialized
>memory, etc), is by definition an invalid program, and an invalid program makes no
>guarantee about the behavior of a program, at all.
that's why i always turn off optimisations based on the "your code is always valid" assumption (at least as much as modern compilers allow).
>This is why it's important to have tools that can inspect the state of the program at
>the time of the crash
as i wrote before, i *do* use gdb to get backtraces from coredumps. i said that i don't use *interactive* debuggers, not post-mortem ones. sure, post-mortem can be interactive too, but i think it is quite clear what i meant. ;-)
p.s.: segfaults are the easiest bugs to debug anyway. it segfaulted, you look at the coredump, you see what is wrong (if your code is not some insane spaghetti, of course). logic bugs are much harder to fix, and usually you have to add a lot of watch expressions in the debugger anyway. this is not really different from adding logging, but if you add logging, you can leave it in the code (i do), and you can later ask your users to run your app with a "--debuglogs" CLI arg to perform "remote debugging". i can't even count how many times this has saved me weeks of my life i would have wasted trying to reproduce a logic bug locally.
- Posts: 3213
- Joined: Wed Nov 24, 2004 12:59 pm
- Operating System Version (Optional): Kubuntu
- Graphics Processor: ATI/AMD with Vulkan/Metal Support
- Contact:
Re: Old laptop /Linux question
drfrag wrote: But 2d at least should work, i only tried mini xp and the driver supported the display native resolution. If it was fried i would not get any output.
Not necessarily. I have a GeForce4 Ti 4600 which will work with the XP desktop, but any accelerated program is a glitch fest. With modern operating systems the desktop, web browser, office suite, what have you, all use hardware acceleration.
drfrag wrote: Mint mate works with a radeon 9600, i could have installed mate here with another desktop. May be it's the distro or a kernel update no idea.
Trying another desktop environment would be worthwhile as a data point, but I'd expect that at best it lasts a little longer before hitting the same freeze.
drfrag wrote: I would need to install an older version of the package supporting the user space mode setting but that's impossible.
At this point I'm fairly sure you're looking at very dated advice. KMS is a very old feature at this point and works just fine. If an older driver were to fix your issue, that would just indicate there is indeed a bug in the newest drivers.
To that end, as far as your options go for installing an older driver, the only practical one is to turn off the hardware enablement stack and go back to the original 18.04 kernel and Mesa stack. In theory compiling old versions of Mesa is also viable, but I wouldn't attempt that unless you were actually comfortable with using Linux, which you clearly are not.
ketmar wrote: that's why i am always turning off optimisations based on "your code is always valid" assumption (at least as much as modern compilers allows that).
Enjoy not having any vector optimizations. Those rely on a lot of hidden assumptions granted by the C++ standard, including data alignment. Disabling compiler optimizations because they point out your code is poorly written is crazy.
Re: Old laptop /Linux question
>Enjoy not having any vector optimizations. Those rely on a lot of hidden
>assumptions granted by the C++ standard including data alignment. Disabling
>compiler optimizations because they point out your code is poorly written is
>crazy.
assuming that the code has zero bugs is crazy. can you give a 100% guarantee that your code is flawless? if not, then your "optimised" code is silently turned into nonsense, and you can never know for sure what went wrong, because so-called "optimisations" are absolutely unpredictable.
also, "vector optimisations" are overrated. if they're so great, then why are people still writing SIMD code manually? and oops... you have to organise your data structures to be "vector-friendly" first. meh. 2x the code complexity for 0.1% more speed? i couldn't care less. good algorithms, profiling, and manually improving bottlenecks give me a much better speed boost than any sophisticated compiler hyperoptimisation.
and what is important -- i know the compiler won't throw away some of my code because some idiot on the "standards committee" decided to turn a perfectly valid thing into "unspecified". like integer overflow -- because yeah, we have a lot of CPUs with non-two's-complement integer math, so we cannot put it in the standard, right? screw it. making a complex project UB-free is impossible. literally. and i won't even try -- not before we get sane standards.
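for what it's worth, the signed-overflow disagreement above is easy to make concrete. the function names here are made up; the behaviour they illustrate is standard C:

```c
#include <limits.h>

/* Signed overflow is undefined behaviour, so an optimizer may assume
   x + 1 never overflows and fold this whole function to "return 1". */
int looks_obvious(int x)
{
    return x + 1 > x;   /* UB when x == INT_MAX */
}

/* Unsigned arithmetic is defined to wrap modulo 2^N, so this is safe. */
unsigned wraps(unsigned x)
{
    return x + 1u;      /* UINT_MAX + 1u == 0, guaranteed by the standard */
}
```

gcc and clang's -fwrapv makes signed overflow wrap as well, which is effectively the behaviour being asked for here, at the cost of some loop optimisations.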
- wildweasel
- Posts: 21706
- Joined: Tue Jul 15, 2003 7:33 pm
- Preferred Pronouns: He/Him
- Operating System Version (Optional): A lot of them
- Graphics Processor: Not Listed
- Contact:
Re: Old laptop /Linux question
Has OP's question been answered?
Re: Old laptop /Linux question
>Has OP's question been answered?
several times, by different people. that's why we're going wild here.
User received a warning for this post -Rachael