AMD Radeon RX 6400 Review: Budget in Almost Every Way

King_V

Illustrious
Ambassador
If ever there was a GPU that strode forth, and boldly declared "Meh," this is it. I will, however, grant it points for its performance/watt, relative to its competitors.

I would be very surprised if the price held up where it is, though. Then again, a quick look at PC Part Picker for a GDDR5 version of the GT 1030 is showing a single passively cooled model for $90 directly from Asus, and the rest at $114 and higher. The horrible DDR4 version is just as pricey, which elicits a big DoubleYoo Tee Eff?

Ouch.
 
Just waiting for the GTX 1630 to arrive... It should give a little bit more meh to the GTX 16-series, because Nvidia can't let AMD run unchecked in the meh market segment of graphics cards! LOL

GPU price tiers:
Enthusiast/Extreme
High-end
Mainstream/Midrange
Budget/Entry-Level
Meh
 

Liquidrider

Honorable
Nov 25, 2016

I can think of another GPU that strode forth and boldly declared "Meh."
Do you mean competitors like Intel's Arc A380, which was just released in China only, costs more than AMD's RX 6400, and is slower?

Unlike Intel, however, at least AMD didn't build a bunch of hype around the 6400.
 
I'm not sure the A380 actually costs more than the RX 6400, and it has 2GB more VRAM and much better codec support... but questionable drivers at present. Yeah, it's not great, and the China-only business does not inspire any confidence in me whatsoever. But the theoretical price of the A380 is supposed to be under $150, as I understand things. And if Intel ever wants to be a real player in the GPU space, it absolutely has to fix the driver situation, which is something it knows and is working on. I'm pretty sure a big part of the delayed US launch is to give the driver teams three extra months of debugging and fixing. We'll find out in the next two months... But yes, I'm expecting to be underwhelmed by first-generation Arc performance. I'm also very hopeful that Intel will keep iterating and actually close the gap with AMD and Nvidia over time, because it would be great to have a third serious player in the GPU market.
 

shady28

Distinguished
Jan 29, 2007
It's pretty disappointing that the lower end of the market, around $150 MSRP, hasn't really moved in performance since the 1650 was released in April of 2019.

Yes, three years and there is really no movement here. This wasn't always the case: the 750 Ti (2014) and 1050 Ti (2016) were great cards for their time that sucked people into PC gaming for a fairly low price.

This failure to seed the market, so to speak, may backfire in coming years.

[Image: Intel Arc A380]
 
Thanks a lot for the review. This card actually had a lot of potential (much like the 6500 XT) to come and save the day for a lot of people, but they both fell so darn flat it wasn't even funny. It's like one of those bad movies that is so bad it's good, but in this case, the movie was just bad... At least you can decode the movie, right? Heh.

Anyway, I wish they'd pack in a few more features on <75W cards to justify them being half-width for slim cases. I still have my case waiting for that one card that is worthy of going into it. Ah, the dreams and hopes burned, haha.

Regards.
 

King_V

Illustrious
Ambassador
I don't think it's actually terrible for the niche it's supposed to cover: you need something for a system whose PSU doesn't have a PCIe power connector, with the option of also having a low-profile, single-slot version.

And, when I say that, I mean to include that the cooler itself is only single-slot height.

After all, as Jarred said:
AMD's Radeon RX 6400 is geared for a very specific niche. If you happen to fall into that niche, go ahead and add 1.5 stars to our score and pick one up.

I agree that, even for what it is, it most certainly is overpriced. Then again, I seem to recall R7 250E/7750 cards in single-slot, low-profile designs costing more than their normal-sized counterparts. Likewise, it was difficult (impossible?) to find a 750 Ti that fit that form factor at all (I was looking to squeeze something into a Dell Inspiron 3647 Small Desktop). They tended to be priced a little higher, probably because of the cold calculation of "a captive audience with very few options."

If a 1630 comes out, I can't imagine it being a contender. The 1650 and 6400 trade blows, with the 1650 dominating once the details are cranked up. The 1630 would probably be out of the running.

That leaves the real contest for New Meh to the 6400 and the A380, assuming the A380 can run without a PCIe power connector and offers low-profile, single-slot solutions. At 75W, I don't think that's going to be possible.

Best case, A380 vs 6400 becomes the Battle For Meh!... but I'm starting to suspect that even the A380 won't be able to compete there: it'll require physically larger cards/coolers and, similar to the 1650, most will require a PCIe power connector.

That'll put it at A380 vs 1650, which the Intel card is going to lose on performance, though it will likely win easily on price/performance.


(edit: grammar/clarity)
 
We have to assume that AMD and Nvidia have decades' worth of a head start in software and hardware IP to make games run better, and three months is not going to make a difference for Intel in closing that gap.

Intel has to focus on the future, and that's what they are doing: they provide all the tools and info to developers, have Arc integrated into Unity and Unreal, and focus on future games being specifically optimized for their cards.
It's still not a sure bet by any means that the cards will perform well in the future, but at least Intel has the groundwork laid out.
https://www.intel.com/content/www/us/en/developer/topic-technology/gamedev/overview.html
 

thisisaname

Distinguished
Feb 6, 2009

Loved the subheading on the article, harsh but true :giggle:
 

magbarn

Reputable
Dec 9, 2020
Too bad for Intel, as the train has already left the station. Knowing Intel, if Arc is a dud, they'll fire the whole team and we'll be back to the duopoly again. If it had launched just six months ago, Intel would've been seen as a savior of the GPU market. Here's hoping they give their GPU division at least three generations to catch up and gain sizable market share.
 
Actually, the lower end of the market has gone backwards since 2016. The Radeon RX 470 matched this performance for under $150 when it launched six years ago, in the summer of 2016, and it could be found on sale at $120 by November 2016. For comparison purposes, use the RX 570, which gets about 10% more performance out of a nearly identical chip.
https://www.tomshardware.com/reviews/gpu-hierarchy,4388.html
 
Could we get some of these super-low-end graphics cards compared to integrated graphics? There are a lot of people that ended up going with Ryzen chips with decent integrated graphics for Fortnite and lighter gaming over the last three years. I'm curious if this is actually any better than that.

For anyone that isn't a hardcore gamer, has integrated graphics reached the point where it compares to sub-$200 graphics? If it can play titles on medium, then it has.

Also, you need a test suite that isn't comparing this to the 6700 XT.
 
Honestly, I'm assuming that Intel's Arc graphics exist in hopes of making something scalable that they can use in integrated graphics that can keep pace with AMD. From what I can tell, AMD's integrated graphics are viable for playing games from a couple years ago (like Skyrim or CS:GO) and Intel's aren't.
 
They would only have to increase the EU count in the integrated graphics to achieve that; they wouldn't have to create discrete graphics for it.
They don't make iGPUs that can game because they would have to spend too much silicon on it, and their target market for systems without a GPU, mainly office systems, doesn't need gaming capabilities. At least, that's my guess.
 
Well, I included cards up to the 6700 XT just for kicks, so that I didn't have four super-low-end GPUs and nothing else. It's good to see what you can get for more money. I could have cut it off at the 6600 and 3050, but it doesn't change the performance offered.

As for integrated graphics, I tested Intel's Xe DG1 last year and compared it to GTX 1650, 1050 Ti, 1050, RX 560, Ryzen 5700G, GT 1030, and a Ryzen 4800U. So up until Rembrandt and the Ryzen 6000-series mobile chips launched, that was the best we could get for integrated. Here are the results:

[Benchmark charts: 720p and 1080p results]


At 720p, the 5700G integrated graphics was 41% slower than a GTX 1650. At 1080p, it was 56% slower. Even the RX 6400 easily outclasses current integrated graphics. We'll see what AMD does with Ryzen 7000 RDNA2 graphics and Zen 4 later this year, but I suspect AMD probably won't put a ton of CUs into the desktop parts. If it does go as high as 12 CUs and a 105W TDP, that should be quite close to GTX 1650 performance, I think.
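One thing worth flagging about the percentages above: "X% slower" and "X% faster" aren't symmetric, so "41% slower than a GTX 1650" means the 1650 is roughly 69% faster, not 41%. A quick sketch of the conversion (the fps numbers below are hypothetical, chosen only to reproduce the 41% figure):

```python
def percent_slower(fps_a, fps_b):
    """How much slower card A is than card B, as a percentage."""
    return (1 - fps_a / fps_b) * 100

def percent_faster(fps_b, fps_a):
    """How much faster card B is than card A, as a percentage."""
    return (fps_b / fps_a - 1) * 100

# Hypothetical numbers: iGPU at 35.4 fps vs a GTX 1650 at 60 fps.
igpu, gtx1650 = 35.4, 60.0
print(round(percent_slower(igpu, gtx1650)))   # 41 (% slower)
print(round(percent_faster(gtx1650, igpu)))   # 69 (% faster)
```

The gap between the two figures grows as the cards get further apart: 50% slower is the same relationship as 100% faster.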
 
That "target market" idea used to be true. But with the price of graphics lately, just building a system with a Ryzen 5600G seems like an intriguing stopgap until you can afford an RTX 3050, RX 6600, or similar.

Looking at the benchmarks above (and the 5600G and 5700G reviews), I think the integrated Vega chips will get about half the framerate of a dedicated RX 6400, and just a bit less than a dedicated GT 1030. While that's not good, I think it's enough to justify waiting to spend $300 on graphics instead of $150 on this RX 6400.
 
Thanks for sharing those. I still can't justify the RX 6400 price point, even though Vega's not nearly good enough to displace it. It seems like you should just get a Ryzen 5600G and save up longer. That, or a worse CPU, and spend more on graphics.
 
Sure, but how many of those can AMD realistically make?
Enough that nobody has to buy a $40-70 Intel CPU anymore and everyone instead goes for a $150 one?
Especially for cheap office systems.
 
A lot of it depends on what you currently have, both for CPU and GPU, and your intended goal. If you want a gaming PC, I'd strongly recommend trying to get to at least the RX 6600 / RTX 2060 level of performance. Right now that costs almost $300, which is a lot, but I suspect it will be closer to $200 by the end of the year. Either of those GPUs would be basically double the performance of RX 6400 (well, 85% faster with the 2060 at 1080p medium, and 130% faster at 1080p ultra). But to get most of the potential performance out of a GPU like the RX 6600, I'd say you'd want at least a Ryzen 5 5600X or Core i5-12400 (with the Intel chip being faster in virtually every case). You could go with older CPUs and probably only lose 10-15% performance, depending on how old a chip you're running, and at 1080p ultra you'll likely become more GPU limited and might only lose 0-5%.
 
True, it's not a replacement for office systems. But it's a valid choice if I'm building a Minecraft desktop for my kid or recommending a PC for my stepdad, who does light gaming. That makes it a pretty small market.
 
Keep in mind that "losing 15%" only matters if you're dropping below about 60fps. It's gonna be pretty rare that a modern CPU bottlenecks that low. A lot of these benchmarks we show are at 150 fps, which isn't anything people should be worrying about hitting if they want a gaming machine for $700.
 
Minimums (or at least 99th percentiles) will also be a factor. Average FPS might only drop a few percent, but if an older CPU has frequent hitching and stuttering, that would be a bad experience, especially if it's frequently oscillating between >60 fps and <60 fps.
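For reference, a 99th-percentile (1% low) figure is typically derived from per-frame render times rather than an fps average. A minimal nearest-rank sketch (the helper name and sample numbers are mine, for illustration):

```python
def percentile_fps(frame_times_ms, pct=99):
    """Return the fps corresponding to the pct-th percentile frame time.

    frame_times_ms: per-frame render times in milliseconds.
    Uses a simple nearest-rank percentile; real benchmark tools
    may interpolate between samples instead.
    """
    ordered = sorted(frame_times_ms)                 # slowest frames at the end
    rank = max(0, int(len(ordered) * pct / 100) - 1)
    worst = ordered[rank]                            # pct-th percentile frame time
    return 1000.0 / worst                            # convert ms/frame to fps

# Mostly-smooth run (16.7 ms ≈ 60 fps) with a few 40 ms hitches:
times = [16.7] * 97 + [40.0] * 3
print(round(percentile_fps(times)))  # 25 — the hitches dominate the 1% lows
```

This is exactly the hitching point: that sample run averages close to 57 fps, but its 99th-percentile figure is only 25 fps, which is what you'd actually feel.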
 
That's true. I was thinking more along the lines of recent Ryzen 3's and i3's. Most of those won't bottleneck most games.

At that price point, there are always compromises. I believe in compromising on the CPU for a gaming machine before a GPU.
 
Nov 4, 2022
Well, I may be old-fashioned, but how does a 6nm-process, 50-watt card that can churn out 60 FPS on max settings at 1080p in many of the recent games tested in reviews get only 2.5 stars?
I'm just not seeing it. How does it get better? Sure, the prices for low-end stuff are not what they used to be 10 years ago, but that's true across the board. And those low-end cards, in their time, had value if they could reach 40-50 FPS on medium. This one can manage 3x the FPS of my current RX 550. I am sorry to be all negative, but the review may not reflect the realities and needs of low-end computing.