News UserBenchmark bashes AMD GPUs and claims they lack real-world performance

Not entirely true; their benchmarking of storage drives is top tier. It is a great aggregator of obscure performance figures and has helped me make many eBay purchase decisions when doing upgrades on the cheap for others.

For GPU and CPU, I would venture that if you compare only within the same company (Intel vs Intel, Nvidia vs Nvidia, or AMD vs AMD), you can trust those conclusions wholeheartedly, but I do believe there is some brand-vs-brand bias inherent at UserBenchmark.

I personally think it is rooted in some sort of deep bias. I can relate: after I spent a lot on an HP laptop and replaced the motherboard 3+ times, I have strong feelings about purchasing or using HP laptops that I now know are actually unfounded (the real issue was using the wrong DRAM; the motherboards were actually fine, they would work at first and then stop working).
I like the memory benchmark as well. And I think that, setting aside the benchmark's incorrect indifference to large caches, it is the best general bench outside of actual games for the gaming performance of a CPU.
Mind you, not considering the benefit of a large cache is a big hole, but everything else makes 3DMark look like trash for CPU gaming assessment.

I'm not bothered by the trash talk. And I think it would be nice if it were more consistent across different arches, but nothing is perfect and it still has plenty of good uses.
 
  • Like
Reactions: cyrusfox
But it does suck when it's compared to its own previous-gen cards. If it can't beat its own 7900 XTX in rasterization performance, then, yeah, it lacks real-world performance.
And your own quote, quoting them: "For context, it once recommended readers purchase a Core i5-13600K over the Ryzen 7 9800X3D, asserting, and I quote, 'Spending more on a gaming CPU is often pointless.'" I've often said your benchmarking and reviews of CPUs in gaming are greatly flawed. You take a top-of-the-line graphics card, then run it at 1080p to benchmark the CPU. No one is buying an RTX 4090 or RTX 5090 and gaming at 1080p. If you have those cards, you're gaming at 4K. What kind of difference in fps do you have then between the 13600K (which is highly overclockable) and the Ryzen 7 9800X3D?
You never do real-world benchmarks like that. How much of a difference would there be from the Core i7 to the Core i9 in those situations, when the Core i7 can overclock much more and has a lot less thermal throttling?
View: https://www.youtube.com/watch?v=Zy3w-VZyoiM


And there are a LOT of follow-ups and re-tests that continue explaining and trying to hammer the point into people's THICK skulls. Not just from HUB, but from many others, though I think HUB's Steve has explained it the best.

I'll put a few others after I find them.

EDIT:
View: https://www.youtube.com/watch?v=5GIvrMWzr9k


View: https://www.youtube.com/watch?v=98RR0FVQeqs


Regards.
 
AMD has never been known as the top of the high-performance GPUs, merely the cheapest alternative. Of course, with Intel entering the picture, they may lose that crown as well.
They were known for high-end GPUs; it's been a while, though. They held the performance crown multiple times with the Radeon 9000, X800, HD 5000, HD 6000, HD 7000, and R9 290 series. Nvidia generally released a card quickly that either matched or slightly beat those cards. But to say they've never had the crown is disingenuous; it's definitely been a while, though. They got close with the RX 6950 XT if you didn't care about ray tracing, but the RTX 3090 Ti sorted that out.

https://www.techpowerup.com/review/amd-radeon-rx-6950-xt-reference-design/30.html
 
Actually it’s people like you and these tech sites that lie. You sound angry like a woman so I know you're a liar.
So you try to make your point by perpetuating sexism? This isn't wccftech.
 
His statement that there is no advantage in buying a 3D CPU if it is married to an average GPU is correct. The GPU will be the bottleneck. You need a top of the line GPU for the 3D CPU to shine. The majority of users do not own a fast GPU although that will increase over time as people upgrade. It is also true that AMD drivers are usually not as good as Nvidia's.

He may have an anti AMD bent but his points are not without some merit. Simply denouncing him as crazy just lowers the conversation.

It's a blanket statement that isn't quite true. Some titles are very CPU-heavy and benefit from X3D chips, even with more modest graphics. I fall into that category. I only play WoW, and really wish I had an X3D chip. The cache helps keep the minimum FPS from totally tanking in highly populated situations like cities and raids. IIRC, Microsoft Flight Simulator loves the 3D cache too.
 
Here is a perfect example of the complete BS of UserBenchmark... When these CPUs are actually benchmarked against each other you get this, the 285K being 12.1% slower instead of just 1%:
[Chart: relative gaming performance, 1920x1080]

Source.
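For anyone wondering how a single "12.1% slower" figure falls out of a stack of per-game results: the usual approach at review sites is a geometric mean of per-game FPS ratios. A minimal sketch with made-up numbers (not TechPowerUp's or UserBenchmark's exact methodology):

```python
from math import prod

# Hypothetical per-game average FPS for two CPUs (illustrative numbers only).
fps_9800x3d = {"Game A": 210, "Game B": 165, "Game C": 142, "Game D": 188}
fps_285k    = {"Game A": 184, "Game B": 139, "Game C": 131, "Game D": 170}

# Per-game ratio of the 285K relative to the 9800X3D.
ratios = [fps_285k[g] / fps_9800x3d[g] for g in fps_9800x3d]

# Geometric mean keeps one outlier title from dominating the average.
geo_mean = prod(ratios) ** (1 / len(ratios))

print(f"285K relative performance: {geo_mean:.1%}")
print(f"i.e. roughly {1 - geo_mean:.1%} slower in this made-up sample")
```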

I'm not disagreeing with you about UserBenchmark not telling the whole story; they are biased without a doubt. Just wanted to start with that.

At 1080p, the CPU matters. At 4K, though, the CPU doesn't matter as much (it matters, but marginally). If someone wanted to game at 4K, I would argue skimping on the CPU and putting the saved money toward the GPU makes a lot of sense.

[Chart: relative gaming performance, 3840x2160]

Source.
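The reasoning can be sketched with a toy frame-time model: assume, purely as a simplification, that each frame costs roughly max(CPU time, GPU time). At 1080p the GPU finishes quickly, so the CPU difference shows up in full; at 4K the GPU term dominates and the gap mostly vanishes. Invented numbers, just to show the shape:

```python
def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Toy model: a frame is ready when the slower of CPU and GPU is done."""
    return 1000 / max(cpu_ms, gpu_ms)

# Invented per-frame costs for two CPUs and one GPU at two resolutions.
cpu_fast, cpu_slow = 4.0, 6.0          # ms of CPU work per frame
gpu_1080p, gpu_4k = 5.0, 16.0          # ms of GPU work per frame

for label, gpu_ms in [("1080p", gpu_1080p), ("4K", gpu_4k)]:
    fast = fps(cpu_fast, gpu_ms)
    slow = fps(cpu_slow, gpu_ms)
    print(f"{label}: fast CPU {fast:.0f} fps vs slow CPU {slow:.0f} fps "
          f"({fast / slow - 1:.0%} gap)")
```

That is also why reviewers test CPUs at 1080p with the fastest GPU available: it isolates the CPU term instead of measuring the GPU.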
 
There has to be a way to ask Google to blacklist them from tech-related search results. Or, at least, add some sort of warning about their very obvious biases.

This site, while a great idea on paper, is made terrible by the clown running the show for anyone looking for decent data and unbiased analysis.

Regards.
It would be better if Google did not put their thumb on the scales to squash websites for "bias". The correct solution is for other sites to up their SEO game. I think cpubenchmark.net (PassMark) has done this type of thing, getting every combination of two CPUs imaginable into searches like UserBenchmark does.
AMD should use a small percentage of their profits to sue that site for slander and put them out of business. It's one thing to not like a company and its products; it's a different thing to continually make false and misleading statements about a company and its product(s).
AMD should really take action. I don't know the legal grounds for this, but it is almost a direct attack on the company. Either that, or take action against Google for favoring such false content, misleading millions of people away from good products.
Strongly opinionated/bad reviews should not be grounds for a vindictive lawsuit aimed at destroying a website.
Actually it’s people like you and these tech sites that lie. You sound angry like a woman so I know you're a liar.
Are you the guy who runs UserBenchmark? We all want to know.
 
It's a blanket statement that isn't quite true. Some titles are very CPU-heavy and benefit from X3D chips, even with more modest graphics. I fall into that category. I only play WoW, and really wish I had an X3D chip. The cache helps keep the minimum FPS from totally tanking in highly populated situations like cities and raids. IIRC, Microsoft Flight Simulator loves the 3D cache too.
I don't think a faster processor will help much for dips in WoW cities. I know that it's generally thought of as a CPU-reliant game. In Valdrakken and Dornogal I get 99% GPU utilization and 50-60% CPU with FPS lows in the 40s. In raids, delves, and dungeons I am at 100+ FPS. This is all at 1440p ultrawide, max settings. Having read through lots of threads on the matter, the consensus seems to be poor optimization of the game by Blizzard. It has been going on for years.
 
  • Like
Reactions: artk2219
I'm not disagreeing with you about UserBenchmark not telling the whole story; they are biased without a doubt. Just wanted to start with that.

At 1080p, the CPU matters. At 4K, though, the CPU doesn't matter as much (it matters, but marginally). If someone wanted to game at 4K, I would argue skimping on the CPU and putting the saved money toward the GPU makes a lot of sense.

[Chart: relative gaming performance, 3840x2160]

Source.
Not necessarily. With every new generation of video cards we get higher 4K potential, and unless you want to keep buying a new cheap CPU every time you upgrade your card, it's best to get a CPU that can stay relevant longer.
[Chart: relative gaming performance, 3840x2160]
 
His statement that there is no advantage in buying a 3D CPU if it is married to an average GPU is correct. The GPU will be the bottleneck. You need a top of the line GPU for the 3D CPU to shine.
Please do not post disinformation. What you say about CPU vs GPU is heavily dependent on which game is being played, what resolution is being run, detail settings, the FPS desired, and so forth, so it is not even close to a universal truth.
The majority of users do not own a fast GPU although that will increase over time as people upgrade.
See above. Also, define "fast."
It is also true that AMD drivers are usually not as good as Nvidia's.
Now you're repeating outright disinformation.
He may have an anti AMD bent but his points are not without some merit. Simply denouncing him as crazy just lowers the conversation.
He is very rabidly anti-AMD, to the point where, to an unbiased observer, he sounds like a lunatic, and he is absolutely a conspiracy theorist.

It doesn't lower the conversation. He might accidentally stumble across something accurate, but he's either well-paid, or crazy. He was so anti-AMD that Intel banned him from their subreddit.
 
I don't think a faster processor will help much for dips in WoW cities. I know that its generally thought of as a CPU reliant game. In Valdrakken and Dornogal I get 99% GPU utilization and 50-60% CPU with FPS lows in the 40s. In raids, delves and dungeons I am 100+ FPS. This is all at 1440 Ultrawide max settings. Having read through lots of threads on the matter, consensus seems to be poor optimization of the game by Blizzard. Has been going on for years.
The ultimate CPU-crusher games are simulation/builder games that require many path/route calculations and small object image updates.
These simulation/builder games run well at the start, but the user's building adds complexity. Eventually, the map becomes so complex that it chokes the CPU and the game will run at <30 fps.

Off the top of my head, these are games like Satisfactory, Factorio, Timberborn, and Transport Fever 2.
These games are not particularly demanding on the GPU, and are decently optimized, but some saves can hit <10 fps due to the sheer number of entities requiring path calculations. The games start off at well over 120 fps, btw.
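To put a toy number on that, assume (purely for illustration, not how any particular engine actually schedules its work) that per-frame CPU cost grows roughly linearly with the number of path-following entities while GPU cost stays flat:

```python
# Toy model of a factory/builder game becoming CPU-bound as the save grows.
# All costs are invented; real engines batch, cache, and multithread this work.
BASE_FRAME_MS = 4.0        # rendering + fixed simulation overhead per frame
PER_ENTITY_US = 2.5        # microseconds of path/logistics work per entity

for entities in (1_000, 10_000, 50_000, 200_000, 400_000):
    frame_ms = BASE_FRAME_MS + entities * PER_ENTITY_US / 1000
    print(f"{entities:>7} entities -> {1000 / frame_ms:6.1f} fps")
```

The numbers are made up, but the shape of the curve is the point: the simulation cost scales with the size of the factory while the GPU barely notices.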
 
  • Like
Reactions: artk2219
The ultimate CPU-crusher games are simulation/builder games that require many path/route calculations and small object image updates.
These simulation/builder games run well at the start, but the user's building adds complexity. Eventually, the map becomes so complex that it chokes the CPU and the game will run at <30 fps.

Off the top of my head, these are games like Satisfactory, Factorio, Timberborn, and Transport Fever 2.
These games are not particularly demanding on the GPU, and are decently optimized, but some saves can hit <10 fps due to the sheer number of entities requiring path calculations. The games start off at well over 120 fps, btw.
Yes, I hear that. My wife runs Forum8 software in her city-planning research, and the best video cards available can never get it to 60 FPS.
 
  • Like
Reactions: artk2219
I don't see why everyone gets so up in arms about this guy's bias against AMD; his biased comments are at the very bottom of the page, so just ignore them.

It's a useful site with the best comparison UI anywhere, at least that I'm aware of.
 
  • Like
Reactions: rluker5
Not necessarily. With every new generation of video cards we get higher 4K potential, and unless you want to keep buying a new cheap CPU every time you upgrade your card, it's best to get a CPU that can stay relevant longer.
[Chart: relative gaming performance, 3840x2160]

Even in that scenario, it's better to put the most money toward the best GPU. There is a 14% FPS difference between a 5070 Ti and a 5080 at 4K with an MSRP difference of $150 (if you're patient and don't rush out and buy today), while a 9800X3D only yields a 2.1% FPS improvement at 4K over a 9700X for $120 more MSRP than the 9700X.

[Chart: relative performance, 3840x2160]
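Putting rough per-dollar numbers on the trade-off above (taking those percentages at face value and the MSRPs as exact, which is only an approximation):

```python
# Rough value-per-dollar comparison of the two upgrade paths quoted above.
# Baseline numbers are illustrative; real pricing and scaling vary by game.
upgrades = {
    "5070 Ti -> 5080":  {"fps_gain_pct": 14.0, "extra_cost": 150},
    "9700X -> 9800X3D": {"fps_gain_pct": 2.1,  "extra_cost": 120},
}

for name, u in upgrades.items():
    per_100 = u["fps_gain_pct"] / u["extra_cost"] * 100
    print(f"{name}: {u['fps_gain_pct']:.1f}% more 4K fps for ${u['extra_cost']}"
          f" -> about {per_100:.1f}% per $100")
```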
 
I saw that yesterday and just had a laugh at how much they despise AMD. They put the 30-series cards ahead of the 9070 XT and even the 12600K over the 9800X3D. People who don't know any better go off their site, and that could be a huge number, which could be part, even if a small part, of what drives Nvidia's and Intel's popularity. They are still the first result on Google when looking up comparisons of CPUs and GPUs.
 
I don't see why everyone gets so up in arms about this guy's bias against AMD; his biased comments are at the very bottom of the page, so just ignore them.

It's a useful site with the best comparison UI anywhere, at least that I'm aware of.
The data is invalid because he uses his bias to massage the data. It's not accurate and cannot be trusted. At all.
 
I don't see why everyone gets so up in arms about this guy's bias against AMD; his biased comments are at the very bottom of the page, so just ignore them.

It's a useful site with the best comparison UI anywhere, at least that I'm aware of.
The comparisons are ridiculous, though. They do not even come close to real-world benchmarks, nor are they correct or reflective of the true performance of AMD's hardware. They also embellish Nvidia's performance by a large margin.
 
For anyone who missed it, on the UB comparison page between the 5070 Ti and 9070 XT, look at the "Value & Sentiment" comparison, where the zealot claims the Nvidia GPU literally has "infinity better" value.

https://gpu.userbenchmark.com/Compare/Nvidia-RTX-5070-Ti-vs-AMD-RX-9070-XT/4181vsm2395341

I don't see why everyone gets so up in arms about this guy's bias against AMD; his biased comments are at the very bottom of the page, so just ignore them.

It's a useful site with the best comparison UI anywhere, at least that I'm aware of.
Because it's bad data. Even without the ridiculous write-up at the end, the data is flawed, and people look at it believing that it's legitimate.