The five worst AMD GPUs of all time: So bad we can't forget them


bit_user

Polypheme
Ambassador
Yes, I would love this, as I was very young or not alive when these GPUs were released, and I would love to see an article about it.
If you guys want to go really obscure, SGI made a PC 3D accelerator, back in 1990:

Wikipedia:


It's so obscure that, despite reading industry publications about 3D workstations back in the early/mid 1990s - and being very keenly tuned into the emergence of consumer PC graphics cards with 3D acceleration - I don't think I had even heard of it until a couple of years ago!

I had thought it was the first ever 3D accelerator for the PC, but the above link contains this claim:

"Some may think that this was the first true 3D accelerator available for PC, but that’s not true. At that time, there were other more or less successful products, mainly oriented on 3D CAD. The first 3D accelerators for PC were available in the mid-80s. As far as I know, the oldest among the true real-time 3D accelerators is Matrox SM-640 / SM-1281 from 1988 (real-time means the support for double-buffering). These cards provided hardware geometry acceleration (through a licensed SGI chip), a “TIGA” rasterizer, 256 colors, smooth shading and (some say) even a Z-Buffer. It cost $6,995 at the end of 1988."

Amazing.
 
Last edited:
  • Like
Reactions: COLGeek
Nov 30, 2023
24
2
15
What I dislike about AMD isn't its products, it's its marketing. AMD offers cheap products to help people against expensive Intel and Nvidia, which seems very good, but the real reason is that AMD has no good product to fight with.
When AMD does have a good product, it won't be a cheap one. That's why I don't like AMD. The 7900 series is only a little cheaper than the 4090 and 4080, but much more expensive than the 4070 Ti, and the 7900 XTX even has a high repair rate.
Indeed, if people have enough money to squander, who cares about AMD? But we are limited on money, so we have to deal with AMD.
Many times I've felt despair at the expensive prices of Intel and Nvidia, and I've nearly given in and surrendered to AMD, but then I realize that if I accept AMD I'll spend a lot of time dealing with the various problems its products might bring me, so I give up.
Money is supposed to solve problems, not bring trouble.
AMD makes you spend money and also brings trouble, and that is disgusting.
 
The R5 230 existed for the purpose of adding (additional) display outputs if you didn't have any/enough. And/or HW video decoding. Same idea as the GT 720, around the same time and price from what I can tell. What were the better/cheaper options?
The Radeon HD 5450 or 6450 both come to mind. My mother used an HD 6450 until about four months ago. Glorified video adapters were perfected before 2010.
 
  • Like
Reactions: Order 66

Order 66

Grand Moff
Apr 13, 2023
2,163
909
2,570
There are only 6, and one of them just came out...It would also be the 5 best Intel GPUs.

I suppose you could go back and look at their last discrete cards and the ones that never made it to market.
It would be interesting to see a best and worst of Intel GPUs list, but maybe down the road when they have more than 6 released GPUs. Maybe in about 10 years or so if they are still making GPUs by then.
 

Crazyy8

Proper
Sep 22, 2023
120
72
160
What I dislike about AMD isn't its products, it's its marketing. AMD offers cheap products to help people against expensive Intel and Nvidia, which seems very good, but the real reason is that AMD has no good product to fight with.
When AMD does have a good product, it won't be a cheap one. That's why I don't like AMD. The 7900 series is only a little cheaper than the 4090 and 4080, but much more expensive than the 4070 Ti, and the 7900 XTX even has a high repair rate.
Indeed, if people have enough money to squander, who cares about AMD? But we are limited on money, so we have to deal with AMD.
Many times I've felt despair at the expensive prices of Intel and Nvidia, and I've nearly given in and surrendered to AMD, but then I realize that if I accept AMD I'll spend a lot of time dealing with the various problems its products might bring me, so I give up.
Money is supposed to solve problems, not bring trouble.
AMD makes you spend money and also brings trouble, and that is disgusting.
Userbenchmark in a nutshell. You complain about AMD offering budget options that are affordable, refuse to elaborate on how the budget products are bad, complain about marketing, and dismiss AMD when they have better performance. Pick a side, budget or performance. And those "problems" you talk about, what are they? Can you list them for us? No? Then why do you say those "problems" exist?
 
Last edited:

TJ Hooker

Titan
Ambassador
The Radeon HD 5450 or 6450 both come to mind. My mother used an HD 6450 until about four months ago. Glorified video adapters were perfected before 2010.
I mean, those are essentially the same GPU (the 6450 is literally the same GPU) as the R5 230, and they were the same price. But I see what you mean: they could have just kept making the 6450 (or whatever similar model) rather than rebadging it only to have it fill the same role at the same price. So it's not that the R5 230 was a worse option; it was the same option with an unnecessary new name.
 

Eximo

Titan
Ambassador
There is an argument to be made that AMD launched the 7900 XT and 7900 XTX at price points that didn't make a lot of sense, and the RX 7600, 7700 XT, and 7800 XT are all in the same boat. By pricing them higher, they're essentially keeping previous-generation prices artificially higher than they should be. That they have every opportunity to undercut Nvidia everywhere but the high end and choose not to is a little annoying.

Obviously it's against their best interest to offer great GPUs at good prices. Instead we have good GPUs at reasonable prices compared to the competition's overpricing. AMD just isn't in the market position to really push against Nvidia's volume, so they can't really start a price war. But that is the hope.

Which is why everyone is so interested in how Intel turns out.
 
  • Like
Reactions: Order 66

bit_user

Polypheme
Ambassador
The Radeon HD 5450 or 6450 both come to mind.
That little card is great for servers that mostly run headless. It's low-profile, idle power is in the single digits, and mine is even fanless. Plus, if you still have a VGA KVM switch, it supports analog from the DVI connector (it was their last generation with analog support).

The two downsides are its anemic performance (still fine for basic desktop graphics) and its resolution limit (2560x1440). I guess driver support on Windows might be an issue, but Linux still supports it quite well.
 
  • Like
Reactions: Order 66

bit_user

Polypheme
Ambassador
That's not true.
28% of the top 500 fastest supercomputers use an AMD processor, compared to 68% using Intel.
IMO, this is somewhat beside the point. HPC is mainly about GPUs, right now.

AMD's CPUs had a resurgence because AMD advanced in PCIe generation and lane count quicker than Intel, but that advantage will be short-lived. AMD did add AVX-512 and increase its memory bandwidth to 12-channel DDR5; meanwhile, Intel has its Xeon Max CPUs with HBM. So each is now bringing new weapons to the fight.

For AMD, it'll be interesting to watch uptake of MI300. That's the next potential game changer. Intel's hybrid Falcon Shores got delayed until 2025, so don't expect MI300 to be answered directly.
 
  • Like
Reactions: AgentBirdnest
Nov 30, 2023
24
2
15
This has to be one of the most ignorant and ill-informed posts that I've ever seen. All you've managed to do is tell on yourself. It's very clear that you know little to nothing about PC tech except for maybe what you read from LoserBenchmark.

In the CPU space, AMD is currently FAR ahead of Intel, in gaming, in productivity and most importantly, in the server space. Intel has nothing that can match a 96-core EPYC server CPU and that's why the majority of the world's top supercomputers are AMD-based. When El Capitan goes live, it will be the fastest supercomputer in the world by quite a margin and it will have AMD EPYC CPUs coupled with Radeon Instinct GPUs.

There are arguments to be made why AMD is inferior to nVidia (if you care about the aspects in which nVidia is actually better) but there are absolutely NO arguments to be made as to why Intel is better than AMD on the CPU side because it's just not true.

If you think that Intel is so great, maybe you should look at this:
and this:

The truth is that Intel has become inferior to AMD because they have to push more wattage through their CPUs than most GPUs draw to even get close to AMD's performance. Intel has become a joke.
Convincing me is easy, but how do you convince everyone? Many companies use Intel. Why? Is it just because the truth is only grasped by a few people, and you are one of the few? Who cares? When most adults go out and work while a few stay at home asking their parents for money, is that the "truth" of good living?
When the FX series was introduced to the market, I saw many articles singing its praises and saying it was fit for all applications and games, but the truth was that an Intel i3 could do better in nearly all games. Some of my friends were misled by AMD's advertising. Thirteen years ago, some people wasted their money buying those CPUs to play games and suffered; I'll never forget playing WoW and StarCraft back then, when people got low framerates just because they had AMD.
Now AMD is a shining star, but I will not forget AMD's <Mod Edit> history.
In the winter of 2001, I bought an Athlon 1200+, and it died two weeks later, with no overclocking or any other tampering with my hardware. Because the same CPU was no longer available for exchange when I asked for after-sales service, I had to pay extra to get an Athlon 1400+.
I also have experience with AMD GPUs; that's another sad story.
Over 25 years I've been through hundreds of Intel CPUs and Nvidia GPUs, and none of them left an impression quite like AMD did.
 
Last edited by a moderator:

abufrejoval

Reputable
Jun 19, 2020
390
262
5,060
The R9 390X feels like it should be on the list rather than the 290X? Unless I'm badly misremembering, the 290X was a solid engineering effort that got overshadowed by rapid releases and price drops, while the 390X was a desperate attempt to crank the factory OC up and raise the prices back up from where the 290X had dropped to. Guess they are pretty close to the same card?
To me the 390 was mostly a free upgrade to 8GB of VRAM... which I rather disliked not getting as a free swap, having just bought an R9 290X. Otherwise, my main complaint is that driver support ended far too long ago.

I had that 290 next to a GTX 780 system for a while, but the 290 ran on an AMD Phenom II x6 while the 780 had a Penryn Core2 Quad, both slightly overclocked beyond 3 GHz.

Both performed fairly similarly, though not spectacularly, and looked rather worse once I upgraded my display to 1920x1200 - even though their boxes and the press had shouted "4K gaming!" in very big letters.

At 4K, only the RTX 4090 finally managed to get beyond 30 FPS in ARK: Survival Evolved.
And only with DLSS 3 does it manage to sometimes exceed 60 FPS in the Unreal Engine 5-based successor, ARK: Survival Ascended - with epic settings, of course, because for me it's mostly about the eye candy while playing "Lego". I know why I only went with a 144Hz monitor...

The 290X still survives at 1920x1080 on an Ivy Bridge i7-3770 running at 4 GHz at my father-in-law's. He's happy and therefore so am I: it's far too epic a piece of hardware to just throw in the trash!

At the time, one of its main differentiation points was that its FP64 performance wasn't castrated the way consumer Nvidia cards' was. But the lack of GPGPU software for Radeons turned out to be an even more effective limiter...
 

abufrejoval

Reputable
Jun 19, 2020
390
262
5,060
When adjusting for inflation, $500 has never been the high end. Back in 1998, I had a Voodoo2 SLI paired with a Matrox Millennium II, and each of those cards retailed for $300 then. That's $900 even before adjusting for inflation. After adjusting, it's over $1700 today. Right about where we are with AIB 4090's before the China ban kicked in.
I can only agree, even if my Voodoo2 was paired with an ATI card, possibly the Mach8. I also got the VR shutter-glasses extension, which never really worked; I have no idea what I paid for it all, but while being at the forefront was expensive, it paid back much more for me in the end.

In 1986 I paid the equivalent of €10,000 for an 80286 with 640K of RAM, an EGA graphics card (640x350, 16 colors), a 20MB hard disk drive, and a 1.44MB floppy.

I could have bought a new VW or a used Porsche instead, but that computer was the start of my career.
And while the RTX 4090 is pricey, it still sustains my six-figure salary (with a bit of human brain thrown into all that AI).

Yes, top-of-the-line PCs have become rather expensive, but it's easy to overlook the explosion in their capabilities. I've worked with SGI workstations costing a few Maybachs that today are annihilated by budget smartphones.

Even a Chromebook makes a Cray X-MP look very bad indeed; a €100 Raspberry Pi 4 emulates a €1M ECL VAX 9000 faster than it ever ran, while the VAX cost more in electricity alone than most of us could ever afford.

So please take off your inflation-victim glasses, breathe in and out, and calmly enjoy what you can afford even as a pure hobbyist today, compared to what wasn't even remotely imaginable back then.
 
Last edited:

Alex/AT

Reputable
Aug 11, 2019
34
17
4,535
Well, not sure I agree.
R9 290X was a great card, albeit a bit too hot.
And Vega 56/64 is competitive even now, in normally rendered 1080p gaming.
 

bit_user

Polypheme
Ambassador
In 1986 I paid the equivalent of €10,000 for an 80286 with 640K of RAM, an EGA graphics card (640x350, 16 colors), a 20MB hard disk drive, and a 1.44MB floppy.

I could have bought a new VW or a used Porsche instead, but that computer was the start of my career.
This is a point that should be underscored. I'll leave aside the points about exotic workstations and minicomputers, although they're also worth keeping in mind. People have too quickly forgotten what a high-end PC cost in the 80's and early 90's. We're still enjoying historically low PC prices. Even if they're not the lowest-ever (inflation-adjusted), they're still very accessible.

Perhaps one problem is that more people than ever are trying to access them, and they're more of a stretch for some budgets than others. Plus, there are plenty of Millennials who remember what pre-inflation pricing was like over the past couple of decades, but who lack your longer hindsight.

So please take off your inflation-victim glasses, breathe in and out, and calmly enjoy what you can afford even as a pure hobbyist today, compared to what wasn't even remotely imaginable back then.
+1
 
  • Like
Reactions: AgentBirdnest
That's not true.
28% of the top 500 fastest supercomputers use an AMD processor, compared to 68% using Intel. (The percentage is about the same if you shrink the list down to the top 10 fastest. AMD powers 2, and Intel 5.)
Source
We're talking about current levels of technology here, not who bought what. The only two Supercomputers in the TOP500 list that matter in this case are Frontier and Aurora because those are AMD's most advanced and Intel's most advanced.

The numbers that matter are maximum continuous throughput per core (RMax) and power draw-per-core.

Frontier (EPYC/Instinct) with 8,699,904 cores:
Rmax: 1,194 TFlops  Power Draw: 22,703 kW

Rmax-per-core: 1,194,000,000,000,000 Flops ÷ 8,699,904 Cores = 137,242,894 Flops/Core
Power Draw-per-core: 22,703,000W ÷ 8,699,904 Cores = 2.6W/Core

Aurora (Xeon/GPU Max) with 4,742,808 cores:
Rmax: 585 TFlops  RPeak: 1,059 TFlops  Power Draw: 24,687 kW

Rmax-per-core: 585,000,000,000,000 Flops ÷ 4,742,808 Cores = 123,344,651 Flops/Core
Power Draw-per-core: 24,687,000W ÷ 4,742,808 Cores = 5.2W/Core
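
Here's the same arithmetic as a quick Python sketch (just the division, with the figures exactly as quoted above), in case anyone wants to plug in other systems from the list:

```python
# Back-of-the-envelope per-"core" metrics from the figures quoted above.
systems = {
    # name:      (cores,     Rmax [TFLOPS as quoted], power [kW])
    "Frontier": (8_699_904, 1_194,                    22_703),
    "Aurora":   (4_742_808,   585,                    24_687),
}

for name, (cores, rmax_tflops, power_kw) in systems.items():
    flops_per_core = rmax_tflops * 1e12 / cores  # FLOPS per listed "core"
    watts_per_core = power_kw * 1e3 / cores      # watts per listed "core"
    print(f"{name}: {flops_per_core:,.0f} FLOPS/core, {watts_per_core:.1f} W/core")
```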

It's quite clear that AMD has the superior technology at the moment and it's not even close. The performance of an EPYC core is 11.3% faster than that of Xeon despite drawing literally half the power. Drawing double the power with worse performance makes Xeon very unattractive to prospective supercomputer and/or server owners.

AMD has the more advanced tech which makes for a superior product. The numbers don't lie, people do.
 

punkncat

Polypheme
Ambassador
I finally just clicked on this article. I am not going to go so far as to say that I agree with all the selections but am going to reserve judgement until I see a similar article for Nvidia.

One of the first decent graphics cards I purchased to fit inside a 'standard' PC build was the HD 7770. It was a darned good card aside from the noise level of that blower. The card remains in use to this day. To try and get away from the blower noise, I fell for some of Nvidia marketing horse pucks with the "cool and quiet" (feature?) going at the time. It doesn't help that I made a super poor choice in getting a 960 with 2GB of RAM ...but that in a lot of ways BOTH of the 960 options were poor. There was loads of talk at the time about the uselessness of making these 4GB with a GP that didn't have the grunt to properly use it.

The biggest issue in my own situation was that the 'cool and quiet' feature just waited until the card was either 60C or 70C, I cannot recall for sure, but then just flooded your rig with super heated air and things went downhill quickly. I did later learn about changing fan curves, but then the card wasn't quieter than the 7770 that I changed simply for that feature.
 
  • Like
Reactions: cyrusfox

bit_user

Polypheme
Ambassador
We're talking about current levels of technology here, not who bought what. The only two Supercomputers in the TOP500 list that matter in this case are Frontier and Aurora because those are AMD's most advanced and Intel's most advanced.

The numbers that matter are maximum continuous throughput per core (RMax) and power draw-per-core.
From what I've seen, that list doesn't provide enough detail to do a strict CPU vs. CPU comparison, because the power and performance figures aren't broken down by CPU vs. GPU, but rather apply to the entire thing.

Frontier (EPYC/Instinct) with 8,699,904 cores:
Rmax: 1,194 TFlops  Power Draw: 22,703 kW
Oops. That's 1194 PFLOPS, which is 1,194,000 TFLOPS or 1.194 EFLOPS.

Also, some napkin math makes it clear the "core" count includes GPU shaders. Otherwise, you'd be talking about squeezing something like 1836 CPUs per cabinet, and that's not happening.
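
For reference, the napkin math, using my own assumptions of a 64-core EPYC per package and 74 cabinets for Frontier (neither figure comes from the TOP500 entry itself):

```python
# If all 8,699,904 listed "cores" were CPU cores, how many CPU packages
# per cabinet would that imply? Assumptions (mine, not from the TOP500
# entry): 64 cores per EPYC package, 74 cabinets in Frontier.
total_cores = 8_699_904
cores_per_epyc = 64
cabinets = 74

implied_cpus = total_cores / cores_per_epyc   # ~135,936 packages
per_cabinet = implied_cpus / cabinets         # ~1,837 per cabinet
print(f"{implied_cpus:,.0f} CPUs -> {per_cabinet:,.0f} per cabinet")
```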

The performance of an EPYC core is 11.3% faster than that of Xeon despite drawing literally half the power.
The metric is mostly dominated by GPU performance. Even there, it's not a very useful comparison for us lowly consumers, because both Intel and AMD use radically different GPU architectures for these machines than their consumer products and the kind of performance they're talking about involves 64-bit floating point arithmetic and solving large matrix problems and similar kinds of vector arithmetic (i.e. LINPACK and HPCG).

As an aside, most consumer GPUs have like 1/32 as much fp64 performance as fp32 (which is the mainstay of interactive graphics), but Intel's Alchemist and Xe-generation iGPUs have none! They run any fp64 computations in emulation! Funny enough, I once tried to compile a program using fp64 and run it on an Alder Lake's iGPU and it flat-out refused to run until I removed the fp64 arithmetic from it.
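
If you want to check whether a given OpenCL device advertises native fp64 at all, a rough sketch along these lines will tell you (assuming pyopencl is installed; this is just a quick capability check, not the program I mentioned):

```python
# Rough sketch: list OpenCL devices and whether they advertise native fp64
# via the cl_khr_fp64 extension. Devices without it (e.g. recent Intel
# iGPUs) either emulate doubles or reject fp64 kernels outright.
import pyopencl as cl

for platform in cl.get_platforms():
    for device in platform.get_devices():
        has_fp64 = "cl_khr_fp64" in device.extensions
        print(f"{device.name}: fp64 {'yes' if has_fp64 else 'no'}")
```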

AMD has the more advanced tech which makes for a superior product. The numbers don't lie, people do.
I love the attention to data, but two things to keep in mind.
  • You need to make sure you understand what the data is telling you. This ultimately extends to what the test actually measures and how the measurements were conducted.
  • People have been gaming HPC benchmarks since they first existed. The hardware is tuned for maximum benchmark performance and numbers reported are using custom software stacks (OS + compilers + libraries) that are all tuned to deliver the best numbers. Therefore, applicability to the rest of us is less than it might otherwise be.

Therefore, having data is merely the start of the discussion. We then have to agree on what it's telling us before we can try to reach firm conclusions. It helps to have some humility here, because we've all made our share of mistakes and oversights. It's much easier to move past such a misstep if you don't start out by making sweeping pronouncements or brash statements.

Finally, I think what we can say about the data and your analysis is that (aside from being off 3 orders of magnitude on your performance numbers) it does paint a more positive picture for AMD. Again, applicability to consumer workloads is questionable. A much better comparison would be to simply look at RDNA3 vs. Alchemist GPUs or Ryzen 7000 vs. Raptor Refresh.
 
  • Like
Reactions: AgentBirdnest

spongiemaster

Admirable
Dec 12, 2019
2,289
1,289
7,560
So please take off your inflation-victim glasses, breathe in and out, and calmly enjoy what you can afford even as a pure hobbyist today, compared to what wasn't even remotely imaginable back then.
What on earth are you talking about? At no point in my post did I say or imply any opinion on the cost of anything, nor did I say I regretted any purchase I made. All I did was state the math that halo GPU setups have never been in the $500 range the OP stated.

How about you climb off your high judgmental horse?
 

bit_user

Polypheme
Ambassador
What on earth are you talking about? At no point in my post did I say or imply any opinion on the cost of anything, nor did I say I regretted any purchase I made. All I did was state the math that halo GPU setups have never been in the $500 range the OP stated.

How about you climb off your high judgmental horse?
The post seemed to start out by agreeing with you. I took the last part as a statement aimed more broadly or maybe aimed at the poster you were originally replying to, but I could be wrong...
 
  • Like
Reactions: abufrejoval

epiczombiekill

Distinguished
Oct 6, 2013
69
2
18,645
The R9 290X section isn't accurate. According to PassMark, a 980 barely beats out my R9 290, which isn't even an X and is pretty clearly being bottlenecked by my 4690K @ 4.8 GHz.
So no, a 780 Ti or 970 isn't much faster than the R9 200 series; they're even at best.
 

mihen

Honorable
Oct 11, 2017
465
54
10,890
The HD 2900 XT was pretty bad. The HD 3870 was a good intermediate step until the HD 4870 mopped the floor with nVidia. The HD 3870 offered the performance of the HD 2900 XT, but more efficiently, at the $200 price point. It also had huge overclocking headroom. Its DX standard was basically DX11, but Microsoft hadn't finalized it yet.
I still have an R9 Fury X in a machine using a custom water loop. It came in three flavors, each with its own benefits: the R9 Nano, which was good in small form factors; the R9 Fury, which had a garbage air cooler and was uselessly long; and the R9 Fury X, which was good as long as you chose settings suited to its high compute throughput and 4GB memory limit. Realistically, to get the most out of any GPU at that time you had to tweak the settings. Most reviewers just pushed 4K ultra settings in nVidia-sponsored games. Now the drivers come with recommended settings.
One thing that was overlooked with the Vega Frontier Edition was powering it properly. Give it a good 850-watt PSU as recommended, put it on a custom water cooler with the recommended 360mm radiator, and it becomes a beast.
 
  • Like
Reactions: bit_user