The five worst AMD GPUs of all time: So bad we can't forget them

Neilbob

Distinguished
Mar 31, 2014
209
245
19,620
Not too sure I agree that the RX 7600 is one of the worst so much as a meh product with a slightly poor price and value proposition; trouble is, meh products with questionable pricing and value rather sum up the majority of the current generation.

I'm certain there have been far worse options over the years (although my wrinkly old noggin can't think of them at this moment in time).
 

garrett040

Distinguished
Jan 20, 2014
31
30
18,560
So one day we have a video of Nvidia touting its "500" games with dlss, then the following day we get "5 WORST AMD CARDS OF ALL TIME!"

yea this isn't sus at all.... this kind of behaviour specifically makes me WANT to get an amd gpu next upgrade...
 
  • Like
Reactions: 1ch0
Not too sure I agree that the RX 7600 is one of the worst so much as a meh product with a slightly poor price and value proposition; trouble is, meh products with questionable pricing and value rather sum up the majority of the current generation.

I'm certain there have been far worse options over the years (although my wrinkly old noggin can't think of them at this moment in time).
As noted in the intro, we've trended toward newer rather than digging way back into the pre-Radeon era. We'll be doing the same for Nvidia, just like we did with both the Best AMD and Best Nvidia articles. These are a "nostalgia series" of articles, talking about some of the good and bad old days. Don't worry, the RTX 40-series will also get its fair share of derision with the next and final piece.

The RX 7600 represents the worst of the 7000-series, though the 7700 XT certainly gives it some competition, and the 7800 XT isn't riding high on the hog either. It's simply lackluster in almost every important way. DP 2.1 is a checkbox feature that has almost no practical bearing on a ~$250 graphics card — are you really going to pair it with a new $750 monitor to make use of the ultra-high bandwidth it supports? AV1 and boosted AI performance are at least something, but these are primarily gaming cards and so a few percent improvement over the RX 6650 XT while bumping the price $20–$40 isn't a great showing.
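To put that value argument in rough numbers: a few percent more performance at a $20–$40 higher price can actually mean worse performance per dollar. The prices and uplift in this sketch are placeholder figures for illustration, not benchmark data:

```python
# Hypothetical value comparison: a small performance uplift plus a
# price bump can still be a net loss in performance per dollar.

def perf_per_dollar(relative_perf, price_usd):
    # Relative performance (baseline = 1.0) divided by price.
    return relative_perf / price_usd

# Placeholder figures: RX 6650 XT as the 1.00 baseline at $230,
# RX 7600 at +5% performance for $270. Not measured data.
baseline = perf_per_dollar(1.00, 230)
newcard = perf_per_dollar(1.05, 270)

change_pct = (newcard / baseline - 1) * 100
print(f"Perf-per-dollar change: {change_pct:+.1f}%")  # negative = worse value
```

With those assumed numbers, the newer card comes out roughly 10% worse on performance per dollar despite being faster in absolute terms.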
So one day we have a video of Nvidia touting its "500" games with dlss, then the following day we get "5 WORST AMD CARDS OF ALL TIME!"

yea this isn't sus at all.... this kind of behaviour specifically makes me WANT to get an amd gpu next upgrade...
These best/worst articles were planned weeks ago. Nvidia hitting 500+ was merely a news story, and while I did have something positive to say, if you actually read the text there's plenty of cynicism as well. We did the Best AMD GPUs before the Best Nvidia GPUs, and no one complained. Doing the Worst AMD GPUs before the Worst Nvidia GPUs just follows that pattern.
 

usertests

Distinguished
Mar 8, 2013
499
455
19,060
Only the price makes the RX 7600 bad, and it almost launched with a higher MSRP than it did. It's just boring otherwise.

The RX 6500 XT has truly earned its spot on the list, if not higher.
 

Order 66

Grand Moff
Apr 13, 2023
2,158
903
2,570
The 7600 XT is at least better than the 6500 XT, but that's about the only thing it has going for it. I think it's a shame that the Radeon VII, a GPU with 16GB of VRAM, doesn't have the horsepower to actually utilize it. Given how VRAM-hungry modern games are, that card could have been decent even in 2023 if not for the GPU itself holding it back.
 

Amdlova

Distinguished
Try to find a used Vega 64 and you will see the worst card on the market...
The RX 6400 and 6500 XT are the bad products because you need a new machine for them to work well.
The RX 7600 is a so-so product, another one made with notebooks in mind.
The Intel and AMD fusion, that one is bad :)
 
  • Like
Reactions: Order 66

Order 66

Grand Moff
Apr 13, 2023
2,158
903
2,570
It wasn't mentioned by name, but the i7-8809G was the Intel CPU with RX Vega M graphics. I have heard that it was roughly on par with an RX 570. Impressive for what is essentially an iGPU with dedicated VRAM.
 
Nvidia high-end GPUs continue to melt down at the power plug, destroying themselves, even with the new connectors, with an RMA rate many times higher than the worst any GPU manufacturer has had in over a decade, and THG runs this.

I'll never forget THG's rabid defense of the RTX 2000 series; "just buy it" was such a memorable moment.
 
Completely disagree on the RX 7600 when the Radeon VII exists.

Also, my signature has already made up its mind about the worst products from each.

EDIT: I just noticed the Radeon 8500 received a dishonorable mention? That card was rock solid for me. I had it in All-In-Wonder flavour. Best card I've ever owned, with no issues. In fact, it worked flawlessly for TV duty and such. Anecdotal and all, but competing nVidia cards back then were a crapshoot for video output. They had so many issues with a wide array of TVs and monitors it was laughable. That's why I moved to ATI back then.

Regards.
 

COLGeek

Cybernaut
Moderator
Aside from all the hyperbolic, indignant responses, these are interesting articles. They remind us of the changes and evolution of tech over the years. Some buys were good, some not so good. I had a couple of these and they were stinkers.

I would have tossed the Radeon VII and Vega FE into the pile as well.

Let's keep the fanboy and shill comments to a minimum, please.
 

Order 66

Grand Moff
Apr 13, 2023
2,158
903
2,570
I would have tossed the Radeon VII and Vega FE into the pile as well.
Would the HBM VRAM on the Radeon VII help in modern games if the GPU itself wasn't holding it back? The reason I'm asking is that, AFAIK, there is no gaming GPU with both enough horsepower and enough HBM VRAM to see whether it makes a difference in games vs GDDR6(X).
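For a rough sense of the gap being discussed: memory bandwidth is just bus width times per-pin data rate. The figures below are approximate spec-sheet numbers for the Radeon VII's 4096-bit HBM2 and a typical 128-bit GDDR6 card like the RX 7600:

```python
def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    # GB/s = (bus width in bits * per-pin rate in Gbps) / 8 bits per byte
    return bus_width_bits * data_rate_gbps / 8

# Radeon VII: 4096-bit HBM2 at ~2.0 Gbps per pin -> about 1 TB/s
print(bandwidth_gb_s(4096, 2.0))
# RX 7600: 128-bit GDDR6 at 18 Gbps per pin -> 288 GB/s
print(bandwidth_gb_s(128, 18.0))
```

The wide-but-slow HBM interface gives the Radeon VII several times the raw bandwidth of a narrow GDDR6 card, which is exactly why it's hard to isolate whether HBM itself helps in games: no comparable GDDR6 part matches it.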
 
Apr 1, 2020
1,483
1,155
7,060
I'm going to bash you for not mentioning the HD 5970 and HD 2900 (series). The former was expensive, hot, and loud as a jet thanks to the blower cooler, and a faulty design left an entire bank of VRMs cooled only by a thin thermal pad, so they overheated easily. The latter was expensive, hot, loud, and had embarrassingly low performance compared not only to Nvidia, but to the older X1950.

I say with shame I owned both and didn't learn my lesson and later bought a 4850, 4850x2, 7970 Ghz Edition, and a Fury Nano before jumping to the green team.
 
  • Like
Reactions: artk2219

gfg

Distinguished
Mar 30, 2005
93
27
18,670
I'm not even going to waste time reading this article because of the lack of ethics on this site. First the best Nvidia GPUs, followed by the worst AMD/ATI GPUs.
bye bye Tomshardware!!
 

rluker5

Distinguished
Jun 23, 2014
643
386
19,260
These AMD GPUs were good for most things, but not that competitive. They had caveats though, like the ubiquitous blower design. Undervolting really became a thing with them. And they had more graphical and stability issues in games, especially in mGPU, which worked far better with Nvidia GPUs until it faded out of popularity around 2016. The way my CFX 7970s performed in games would not pass muster nowadays; it would be a source of a sea of memes. Or CFX Fury Nitros. Kids nowadays have no idea how low the bar on performance can go. Literally unplayable, and that's not hyperbole.

An idea for Nvidia card shortcomings is the lack of hardware feature support on older cards. For example, Nvidia didn't gimp the 780 Ti with drivers; it just didn't have the hardware to effectively render HairWorks, do async compute, etc. Some features in games were better avoided, and some couldn't be avoided, like the interior lighting in Fallout 4. The GTX 580 is even worse with games that use any kind of compute. But Maxwell seems to be holding on much better in the game and feature categories.

Even without HairWorks, my 780 Tis in SLI were great until 2016 (really not that long, I guess), and I was able to get a smooth 60fps at med-high settings in The Witcher 3 at 4K. (Note: I had to run the 4K at 4:2:0 color because the 780 Tis would bottleneck on ROPs and get 1/3 less fps otherwise.) 300+ hours, starting when the game launched. But I also had to flash a custom vBIOS to get the things 24/7 stable at 1250MHz; that factory boost would just cook the card and throttle it. When I type it out, it sounds like a lot of hassle, not to mention that I had to actively cool my room so its temp wouldn't run away. But 4K Witcher 3 was easily the best gaming I had experienced up to that point. I even turned on the TV's interpolation to 120 when not in boss fights.

By comparison, my daughter's Fury is mostly limited by its VRAM, and even my old 6870 does almost as well as you would expect from its TFLOPS in games that can run within 1GB of VRAM.
 
  • Like
Reactions: Order 66

ezst036

Honorable
Oct 5, 2018
570
485
11,920
I was surprised to see the Radeon 8500 on the list. I owned one; it wasn't that bad.

I also don't think that card deserved to be on the list, but for a different reason. An open source driver for Linux. For those like me who remember:

Prior to AMD's purchase of ATI (going way back), you were mostly stuck with either Nvidia's closed-source driver or, even worse, ATI's terrible closed-source FGLRX driver.

Except.

The Radeon and Radeon 8500 somehow had some NDA action and a mostly feature-complete open source driver. This card was hands down the most stable card anybody could get on Linux anywhere, and with the 8500 being the top end, it was the fastest card, period, with an open source driver. There was nothing else, nothing, and that remained the case for many, many years.

That of course changed as AMD bought ATI, contracted with SuSE to create an open source video driver for the entire graphics stack, and the 8500 quickly faded.
 