News UserBenchmark bashes AMD GPUs and claims they lack real-world performance

I don't know where that chart came from; I would love a direct link. However, let's fast forward to FS 2024, and here you can see you get an extra 12 FPS from a 4080 Super vs a 4070 Ti Super. You get less than 1 frame from a 13900K vs a 9800X3D.

Again, we are talking about which component you should push money into, CPU or GPU, not Intel vs AMD.

For reference, here is a link to the charts below.

[FS 2024 GPU and CPU benchmark chart images]
You cannot completely dismiss a data point just because doing so is convenient for your argument, by pointing to a completely differently made game like FS 2024 instead of MSFS.
 
You are missing the point of how much the CPU matters in CPU-heavy titles. In normal scenarios, a 13600K and a 9800X3D are not that far apart.


Still 1440p with a 4090, as in the flight sim benchmark: about a 9% difference, same for the 5800X3D.
[Chart: relative-performance-games-2560-1440.png]


In flight sim the 5800x3d is 71% faster than a 13600k.
It's also not 71%; it's 9% according to the chart you posted (5800X3D 100%, 13600K 91.3%).

This isn't about CPU vs CPU; that is the point being missed here. It's about whether it makes more sense to spend on the GPU than the CPU, and this doesn't prove you should spend extra on a CPU. A 3080 is 15% better than a 3070 Ti, versus the 9% gap between the 5800X3D and the 13600K. Spending the money on a 3080 yielded better FPS than spending it on the 5800X3D.


[Chart: relative-performance_2560-1440.png]
 
You cannot completely dismiss a data point just because doing so is convenient for your argument, by pointing to a completely differently made game like FS 2024 instead of MSFS.
Same as you can't dismiss the aggregate game charts that show it makes more sense to splurge on the GPU over the CPU, and I'm not convinced you would see any difference for FS 2020 if we could find a similar GPU comparison (I still can't even find the chart posted for FS 2020, so there is that too). We are also talking about future scalability. Clearly FS 2024 shows the GPU gives you future scalability, regardless of what the older FS 2020 shows.

This is what I can find for FS 2020, which is wildly different from the chart posted. That's why I want a link to the Tom's Hardware chart, because it looks a lot different from the other charts I can find.
 
It's also not 71%; it's 9% according to the chart you posted (5800X3D 100%, 13600K 91.3%).

This isn't about CPU vs CPU; that is the point being missed here. It's about whether it makes more sense to spend on the GPU than the CPU, and this doesn't prove you should spend extra on a CPU. A 3080 is 15% better than a 3070 Ti, versus the 9% gap between the 5800X3D and the 13600K. Spending the money on a 3080 yielded better FPS than spending it on the 5800X3D.


[Chart: relative-performance_2560-1440.png]
You are missing the point. If I am playing Baldur's Gate 3 and I want the best-value CPU + GPU combo, it's certainly not a 9600X + 4090 for 1080p, costing 1,900 dollars at MSRP. A 4080 Super and a 9800X3D would get more performance at 1080p and cost 1,480 dollars at MSRP. According to this chart the 4090 is only 14.4% faster than the 4080S at 1080p. We know this is not true, because at the time there was not a powerful enough CPU to show more of a difference in their selection of benchmarked games. These topics are more nuanced than "average FPS over 20 games is higher = this one is always that much better."
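That "only 14.4% faster" point can be illustrated with a toy model. This is just a sketch: the CPU ceiling and the uncapped GPU throughputs below are hypothetical numbers I made up to show the effect, not benchmark results.

```python
# Toy model: the measured frame rate is capped by whichever is lower, the CPU's
# ceiling or what the GPU could otherwise deliver. All numbers are hypothetical.
def measured_fps(cpu_ceiling: float, gpu_capability: float) -> float:
    return min(cpu_ceiling, gpu_capability)

cpu_ceiling = 180.0   # hypothetical FPS the test CPU can feed at 1080p
gpu_4080s   = 160.0   # hypothetical uncapped 4080 Super throughput
gpu_4090    = 210.0   # hypothetical uncapped 4090 throughput (~31% faster on paper)

a = measured_fps(cpu_ceiling, gpu_4080s)
b = measured_fps(cpu_ceiling, gpu_4090)
print(f"Measured gap: {(b / a - 1):.1%}")  # ~12.5%: the CPU cap compresses the real GPU gap
```

With a CPU fast enough to lift that ceiling, the same two GPUs would show their full on-paper difference, which is the nuance the averaged charts hide.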
 
I don't know where that chart came from; I would love a direct link. However, let's fast forward to FS 2024, and here you can see you get an extra 12 FPS from a 4080 Super vs a 4070 Ti Super. You get less than 1 frame from a 13900K vs a 9800X3D.

Again, we are talking about which component you should push money into, CPU or GPU, not Intel vs AMD.

For reference, here is a link to the charts below.


Straight from the 7800X3D review, and also a different version from what you posted.

https://www.tomshardware.com/reviews/amd-ryzen-7-7800x3d-cpu-review/4


It's also not 71%; it's 9% according to the chart you posted (5800X3D 100%, 13600K 91.3%).

This isn't about CPU vs CPU; that is the point being missed here. It's about whether it makes more sense to spend on the GPU than the CPU, and this doesn't prove you should spend extra on a CPU. A 3080 is 15% better than a 3070 Ti, versus the 9% gap between the 5800X3D and the 13600K. Spending the money on a 3080 yielded better FPS than spending it on the 5800X3D.

You obviously didn't fully read what I posted. I said it is normally 9% for the 13600K and 5800X3D vs a 9800X3D, but in flight sim it was 71% for the 5800X3D vs the 13600K, which shows how CPU-dependent that game is when they are normally about even. You are trying to fit a one-size-fits-all box around gaming rigs. There are outliers and exceptions to everything. Being a 19-year veteran WoW player, I am one of those outliers. Even with my RX 6800 at 1440p, I would see far better 1% lows with a 9800X3D than I would with my 12700K. Even a 5800X3D would be superior.
 
Reactions: helper800
Same as you can't dismiss the aggregate game charts that show it makes more sense to splurge on the GPU over the CPU, and I'm not convinced you would see any difference for FS 2020 if we could find a similar GPU comparison (I still can't even find the chart posted for FS 2020, so there is that too). We are also talking about future scalability. Clearly FS 2024 shows the GPU gives you future scalability, regardless of what the older FS 2020 shows.
It's very reductionist to say a collection of 10-40 benchmarked games establishes as fact that all 1,000,000+ games ever made will perform as they did in that small snapshot. There are many games that are not benched that are heavily CPU-bound, where you could save money on the graphics card to spend on the CPU. MSFS was one of them that was benchmarked. Here is that link, by the way.
 
Fine, allow me to find CPU-bound games with a 4090 using a 9800X3D. I cannot show that the same CPU with a lesser GPU would have similar performance, because nobody tests that way. Here are some charts that show a 9800X3D pulling massively ahead of a 9600X, which logically means a lesser GPU than a 4090 can be paired with the 9800X3D to get the same or better FPS as a 9600X and a 4090.

Theoretically, you could take a GPU that is X percent slower than the 4090 in the 9600X + 4090 setup, where X is the percentage by which the 9800X3D is faster than the 9600X, and pair it with the 9800X3D instead (see the sketch after the list below). Sources for the below images here. There are tons of games that are not commonly benchmarked that would also support my argument. MSFS was not benchmarked here, nor were any of the games from the genres I mentioned above, anywhere in the entire 9800X3D review from TPU. Below are stock-for-stock comparisons.

Baldur's Gate 3: the 9800X3D performs 51.9% faster than the 9600X at 1080p
Cyberpunk 2077: the 9800X3D performs 21.5% faster than the 9600X at 1080p
Elden Ring: the 9800X3D performs 28.0% faster than the 9600X at 1080p
Remnant 2: the 9800X3D performs 21.2% faster than the 9600X at 1080p
Spider-Man Remastered: the 9800X3D performs 25.5% faster than the 9600X at 1080p
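Here is a minimal sketch of that "X percent slower GPU" reasoning, assuming the game stays fully CPU-limited (which is the premise above); the 51.9% figure is the Baldur's Gate 3 number from the list, everything else is just a normalized illustration.

```python
# In a fully CPU-limited title the frame rate is capped by the CPU, so a faster
# CPU raises the ceiling and leaves room to pair it with a slower (cheaper) GPU.
fps_9600x_plus_4090 = 100.0   # normalize the 9600X + 4090 result to 100
cpu_uplift_9800x3d  = 0.519   # 9800X3D vs 9600X at 1080p in BG3 (TPU figure quoted above)

ceiling_9800x3d = fps_9600x_plus_4090 * (1 + cpu_uplift_9800x3d)
print(f"9800X3D CPU ceiling (relative): {ceiling_9800x3d:.0f}")

# Any GPU that can still render ~100 relative FPS at these settings avoids becoming
# the new bottleneck, so it does not need to be a 4090 for this combo to match or
# beat the 9600X + 4090 in this particular game.
```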
I am not sure what the goal of the argument here is. You are trying really hard to show how important CPU cache is, correct? I think, logically speaking, it's not as important as the GPU in the majority of games. Why fight that?
 
Reactions: JamesJones44
It's very reductionist to say a collection of 10-40 benchmarked games establishes as fact that all 1,000,000+ games ever made will perform as they did in that small snapshot. There are many games that are not benched that are heavily CPU-bound, where you could save money on the graphics card to spend on the CPU. MSFS was one of them that was benchmarked. Here is that link, by the way.
Just as it's ridiculous to say one game proves your side of the argument. When doing research, do we look at one sample and say it proves the rule, or do we look at a confluence of samples?

At any rate, we've moved way off the topic of 4K gaming, which was the original post, and I've seen no evidence to prove it was incorrect. The posts have largely circled around finding specific benchmarks at other resolutions. In either case, for the vast majority of games and for future-proofing, it makes more sense to beef up the GPU over the CPU. FS 2024 proves this, and upgrade costs prove this. Unless you can show that, over a large set of games, it makes more sense at higher resolutions to invest in the CPU over the GPU, there is no sense in continuing this conversation.
 
Just as it's ridiculous to say one game proves your side of the argument. When doing research, do we look at one sample and say it proves the rule, or do we look at a confluence of samples?

At any rate, we've moved way off the topic of 4K gaming, which was the original post, and I've seen no evidence to prove it was incorrect. The posts have largely circled around finding specific benchmarks at other resolutions. In either case, for the vast majority of games and for future-proofing, it makes more sense to beef up the GPU over the CPU. FS 2024 proves this, and upgrade costs prove this. Unless you can show that, over a large set of games, it makes more sense at higher resolutions to invest in the CPU over the GPU, there is no sense in continuing this conversation.
If that one sample is the only game I play, it's all that determines my purchasing decision. You are trying to bludgeon everyone with averages that leave out some of the most decision-relevant data points. Your claim is false; one exception breaks the rule you made in your argument.
 
Reactions: logainofhades
If that one sample is the only game I play, it's all that determines my purchasing decision. You are trying to bludgeon everyone with averages that leave out some of the most decision-relevant data points. Your claim is false; one exception breaks the rule you made in your argument.
One personal preference does not disprove that in most cases it makes more sense to splurge on the GPU, and again, if you plan to play other games, or heck, even upgrade from FS 2020 to FS 2024, it still makes more sense.
 
It's still not 71%. In the link shared, the FPS for the 5800X3D is 170.5 and the 13600K is 123.3. Some basic math: 170.5 - 123.3 = 47.2 FPS, and 47.2 / 123.3 = 38.2%.


I think there is a misconception here that we are talking about Intel vs AMD. No one is talking about that here. We are talking about spending money on GPU vs CPU as it pertains to the original posts: higher resolutions and how well it future-proofs. One game was picked to try to argue against this, while a confluence of games shows that prioritizing CPU over GPU doesn't hold up. Even upgrading to a new game, FS 2024, shows spending on the GPU makes more sense.
Some even more basic math would be 170.5 / 123.3 = 1.3828 times the performance. He probably made the mistake of comparing the minimum FPS of the 13600K against the average of the 5800X3D, which is about 1.65 times.
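For anyone following the arithmetic, here is a small sketch using the MSFS figures quoted in this thread (average and 1%-low FPS); treat the numbers as illustrative rather than re-benchmarked results.

```python
# 5800X3D vs 13600K in MSFS at 1440p, using FPS figures quoted in this thread.
avg_5800x3d, avg_13600k = 170.5, 123.3   # averages
low_5800x3d, low_13600k = 129.1, 102.9   # 1% lows

def pct_faster(a: float, b: float) -> float:
    """How much faster a is than b, expressed as a percentage."""
    return (a / b - 1) * 100

print(f"avg vs avg:       {pct_faster(avg_5800x3d, avg_13600k):.1f}%")   # ~38%
print(f"1% low vs 1% low: {pct_faster(low_5800x3d, low_13600k):.1f}%")   # ~25%
# Mixing metrics (5800X3D average vs 13600K 1% low) inflates the gap:
print(f"avg vs 1% low:    {pct_faster(avg_5800x3d, low_13600k):.1f}%")   # ~66%
```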
 
You obviously didn't fully read what I posted. I said it is normally 9% for the 13600K and 5800X3D vs a 9800X3D, but in flight sim it was 71% for the 5800X3D vs the 13600K, which shows how CPU-dependent that game is when they are normally about even. You are trying to fit a one-size-fits-all box around gaming rigs. There are outliers and exceptions to everything. Being a 19-year veteran WoW player, I am one of those outliers. Even with my RX 6800 at 1440p, I would see far better 1% lows with a 9800X3D than I would with my 12700K. Even a 5800X3D would be superior.
It's still not 71%. In the link shared, the FPS for the 5800X3D is 166.7 and the 13600K is 119.2. Some basic math: 166.7 - 119.2 = 47.5 FPS, and 47.5 / 119.2 = 39.8%.


I think there is a misconception here that we are talking about Intel vs AMD. No one is talking about that here. We are talking about spending money on GPU vs CPU as it pertains to the original posts: higher resolutions and how well it future-proofs. One game was picked to try to argue against this, while a confluence of games shows that prioritizing CPU over GPU doesn't hold up. Even upgrading to a new game, FS 2024, shows spending on the GPU makes more sense.
 
UserBenchmark claims that AMD's Radeon GPUs fail to deliver real-world performance and are backed by an army of influencers it blames for duping customers.

UserBenchmark bashes AMD GPUs and claims they lack real-world performance : Read more
1000% agree with this. If you are new and broke, you choose AMD because of that, out of sheer ignorance. Through normal experience you realize that, unfortunately, AMD CPUs/GPUs are a highly inferior product, rife with problems and issues. Users are full of frustration, or, if they are ignorant, they don't even realize why the issue is happening at all; the answer always is: your hardware sucks. I'm sure we can all attest to the guy who has a great AMD experience, or a great Intel/Nvidia experience, but my opinion is that AMD products are inferior, period; they always have been. If you have experience in life, meaning more than 10 years in this PC hardware industry, you would know and understand these things well. But look, feel free to disagree, I don't care. I feel the price justifies the value; unfortunately for me, that is a fact.
 
One personal preference does not disprove that in most cases it makes more sense to splurge on the GPU, and again, if you plan to play other games, or heck, even upgrade from FS 2020 to FS 2024, it still makes more sense.
None of us ever said spending more on the CPU would be better on average; in fact, I explicitly said otherwise. I then gave examples where spending more on the CPU can matter.
 
Hey, look, it's the guy who STILL doesn't understand why you benchmark a CPU at 1080p. Despite Hardware Unboxed, GN, Tom's, etc. all explaining it many many times for the slow and dense.

Also, what sucks compared to previous gen cards? The 9070XT? A midrange card sucks compared to the previous gen's apex card? Huh. Imagine. The 5070ti isn't as fast as a 4090 (despite Jensen's claims.) Is it lacking real world performance for its price and placement in the stack? (Well, maybe that wasn't the best example.)
I understand why they say they benchmark at 1080p. And I still say it is WRONG. And no manner of name calling will change that, keyboard warrior.
Yes, I get that it takes away the bottleneck. But how much of a bottleneck is there when you're gaming at 4K? Or even 1440 Ultrawide? If I'm running at 4K, I want to know if it is worth upgrading my CPU. I'm not going to benchmark my computer at 1080p, think, "oh, wow, that's fast", then switch to 4K and expect the same increase in FPS.
 
1000% agree with this. If you are new and broke, you choose AMD because of that, out of sheer ignorance. Through normal experience you realize that, unfortunately, AMD CPUs/GPUs are a highly inferior product, rife with problems and issues. Users are full of frustration, or, if they are ignorant, they don't even realize why the issue is happening at all; the answer always is: your hardware sucks. I'm sure we can all attest to the guy who has a great AMD experience, or a great Intel/Nvidia experience, but my opinion is that AMD products are inferior, period; they always have been. If you have experience in life, meaning more than 10 years in this PC hardware industry, you would know and understand these things well. But look, feel free to disagree, I don't care. I feel the price justifies the value; unfortunately for me, that is a fact.
I have been in the PC space for 15 years and have had nearly every brand combination. To say unequivocally that AMD's products are, and always have been, inferior is a lie. I can give so many examples to the contrary, but I think I am done being baited today.
 
Reactions: King_V
Some even more basic math would be 170.5 / 123.3 = 1.3828 times the performance. He probably made the mistake of comparing the minimum FPS of the 13600K against the average of the 5800X3D, which is about 1.65 times.

For FS 2020 or another item?

For FS 2020, the 1440p minimums are 126.6 vs 104.3: 126.6 - 104.3 = 22.3, and 22.3 / 104.3 is about a 21% difference between the 5800X3D and the 13600K.
 
View: https://www.youtube.com/watch?v=Zy3w-VZyoiM


And there's a LOT of follow-ups and re-tests to continue explaining and trying to hammer the point into people's THICK skulls. Not just from HUB, but many others, but I think HUB-Steve has explained it the best.

I'll put a few others after I find them.

EDIT:
View: https://www.youtube.com/watch?v=5GIvrMWzr9k


View: https://www.youtube.com/watch?v=98RR0FVQeqs


Regards.
 
I understand why they say they benchmark at 1080p. And I still say it is WRONG. And no manner of name calling will change that, keyboard warrior.
Yes, I get that it takes away the bottleneck. But how much of a bottleneck is there when you're gaming at 4K? Or even 1440 Ultrawide? If I'm running at 4K, I want to know if it is worth upgrading my CPU. I'm not going to benchmark my computer at 1080p, think, "oh, wow, that's fast", then switch to 4K and expect the same increase in FPS.
Generally upgrading your CPU for 4k is not worth it at all unless you have a very old CPU or you play very specific games where it can matter.
 
None of us ever said spending more on the CPU would be better on average; in fact, I explicitly said otherwise. I then gave examples where spending more on the CPU can matter.

I'm not disagreeing with that at all. In my original post I said that at lower resolutions the CPU becomes more important, especially at 1080p and below. The GPU still wins there in general cases, but it becomes far less of a clear winner at lower resolutions. As you scale resolution, the GPU starts to become more important and wins more often.
 
Generally upgrading your CPU for 4k is not worth it at all unless you have a very old CPU or you play very specific games where it can matter.
But it would be nice to know. If someone is going to be doing a review of a CPU, then they should be giving full information for everyone.
I also suggested years ago that they should test games with and without hyperthreading, especially with the 13th and 14th gen CPUs that overheat. If you turn off hyperthreading, they don't get quite as hot. Do games run better or worse with that? Intel's new ones got rid of hyperthreading, so it seems like there was something there to look at all along.
 

I think that only holds if you compare the overclocked numbers against non-overclocked: 207.3 - 123.3 = 84, and 84 / 123.3 = 68%; for the 1% lows it's 168.0 - 102.9 = 65.1, and 65.1 / 102.9 = 63.2% (still not 71+%).

If we do a fair comparison, non-overclocked to non-overclocked: 170.5 - 123.3 = 47.2, and 47.2 / 123.3 = 38.2%; for the 1% lows, 129.1 - 102.9 = 26.2, and 26.2 / 102.9 = 25.5%. Overclocked to overclocked: 194.8 - 137.3 = 57.5, and 57.5 / 137.3 = 41.9%; for the 1% lows, 156.5 - 120.7 = 35.8, and 35.8 / 120.7 = 29.7%. It's a lot less than 70%.
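The same "fair comparison" laid out as a tiny script, using the stock and overclocked figures quoted above (illustrative numbers from this thread, not re-benchmarked):

```python
# 5800X3D vs 13600K in MSFS, like-for-like pairings from the figures quoted above.
pairs = {
    "stock, average": (170.5, 123.3),
    "stock, 1% low":  (129.1, 102.9),
    "OC, average":    (194.8, 137.3),
    "OC, 1% low":     (156.5, 120.7),
}

for label, (x3d, i5) in pairs.items():
    gap = (x3d / i5 - 1) * 100
    print(f"{label:14s}: 5800X3D ahead by {gap:.1f}%")
# Every like-for-like pairing lands between roughly 25% and 42%, nowhere near 71%.
```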
 
But it would be nice to know. If someone is going to be doing a review of a CPU, then they should be giving full information for everyone.
I also suggested years ago that they should test games with and without hyperthreading, especially with the 13th and 14th gen CPUs that overheat. If you turn off hyperthreading, they don't get quite as hot. Do games run better or worse with that? Intel's new ones got rid of hyperthreading, so it seems like there was something there to look at all along.
You are talking about tens to hundreds of extra review hours while the reviewers are already crunching to get their reviews up in time. Full information in a review is impossible: there are mathematically near-infinite combinations of BIOS settings, in-game settings, monitors and their settings, hardware configurations, et cetera. Reviews usually pick the ones that make the most sense to the average reader and test those. If you want testing that is not being done at Tom's, there are plenty of other reviewers that do different things.
 
I think that only holds if you compare the overclocked numbers against non-overclocked: 207.3 - 123.3 = 84, and 84 / 123.3 = 68%; for the 1% lows it's 168.0 - 102.9 = 65.1, and 65.1 / 102.9 = 63.2% (still not 71+%).

If we do a fair comparison, non-overclocked to non-overclocked: 170.5 - 123.3 = 47.2, and 47.2 / 123.3 = 38.2%; for the 1% lows, 129.1 - 102.9 = 26.2, and 26.2 / 102.9 = 25.5%. Overclocked to overclocked: 194.8 - 137.3 = 57.5, and 57.5 / 137.3 = 41.9%; for the 1% lows, 156.5 - 120.7 = 35.8, and 35.8 / 120.7 = 29.7%. It's a lot less than 70%.
I never said it was a 70% difference. I said that he most likely made a mistake and gave one example of such a mistake that was close to a 70% difference. I already did the fair comparison.