News AMD's 22-year-old GPUs are still getting driver updates — ATI's R300-R500 from the early 2000s live on in Linux driver patches thanks to the open...

Status
Not open for further replies.
While I question the practicality of keeping this GPU (or any other ancient piece of hardware) fully functional, I do appreciate the effort and find it really cool.

We all like '60s Mustangs purring along, right?

Regards.
 

domih

Reputable
Jan 31, 2020
I have several Linux servers with Radeon HD 47xx cards from 2008-2010, just to have a local TTY console. I mostly access these servers over ssh or other services, but it is sometimes convenient to be able to use a local console. I thank the Linux developer community for maintaining the drivers for these old but very functional graphics cards.
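If you're curious which kernel driver is actually bound to one of these old cards, here is a minimal Python sketch (assuming a Linux box with sysfs mounted at the usual /sys path) that walks the PCI device tree and prints the driver for each display adapter. On a setup like the one above, you'd expect the open-source radeon driver to show up.

# Minimal sketch: list PCI display devices and the kernel driver bound
# to each, by reading sysfs. On an old Radeon HD 47xx this would
# typically report driver=radeon (the open-source DRM driver).
from pathlib import Path

PCI = Path("/sys/bus/pci/devices")

for dev in sorted(PCI.iterdir()):
    # PCI class 0x03xxxx = display controller (VGA, 3D, etc.)
    if not (dev / "class").read_text().startswith("0x03"):
        continue
    driver_link = dev / "driver"
    driver = driver_link.resolve().name if driver_link.exists() else "(none)"
    vendor = (dev / "vendor").read_text().strip()
    device = (dev / "device").read_text().strip()
    print(f"{dev.name}  {vendor}:{device}  driver={driver}")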
 
It is amazing that anyone is still running one of these cards, let alone supporting them in modern Linux operating systems, as the open-source community has done.

Not really, especially if all you need is a basic GPU for video output, since AMD CPUs don't include basic integrated graphics the way (most) Intel CPUs do. Even if it's not that old, there's still the Radeon HD 5450.

https://www.tomshardware.com/reviews/radeon-hd-5450,2549.html
 
This is why I like the FOSS community: there is no malicious segmentation or deliberate dropping of "support" as a way to force consumers to buy more stuff. If the HW hasn't changed in two decades, then there is absolutely no reason the drivers shouldn't keep on working.
 

purpleduggy

Proper
Apr 19, 2023
With the way things are going, I think in a generation or two GPUs will generate frames instead of raster-rendering them. I.e., you will buy a compute chip from Nvidia, and game devs will ship a specific model that has been trained to display a game. Frames will then be generated so fast that it will be imperceptible as a frame rate and super smooth, with effectively infinite resolution and zooming if needed, with more detail being added the closer you zoom.

Current compute chips are still primitive; give it a decade or so and they'll precisely generate images without flaws in real time at 120 fps at 8K. It only looks flawed now because we are still learning how to improve these models and prompt them into showing us exactly what we want to see. This will massively drop the power usage and performance cost of displaying a game. Stable Diffusion with precise prompts is a far more efficient way to do game design. A custom game engine will then be made to precisely and accurately prompt a model into creating what is required; think Unreal Engine, but with AI prompts to get it to do what you want.

When this happens, all these raster-based GPUs will seem outdated. Imagine how unique games can become when AI can generate new scenarios based on input and display them in real time, in fluid motion, with near-instant latency from controls/input (i.e., making a character move or look left/right/up/down). When this tech matures, we can expect hyperrealistic gaming with higher resolution than real life.
 