News UserBenchmark bashes AMD GPUs and claims they lack real-world performance

I can't honestly see how anybody could prefer Intel's current generation to AMD's in terms of CPUs. I *guess* the platform could be more solid overall? I know AMD contracted out the design of their current chipset to ASMedia, which seems somewhat iffy to me, but in terms of the CPUs themselves, AMD seems to rule the roost.
Do you even know what we are talking about here? Do some research about UserBenchmark's irrational hate against AMD. It has nothing to do with what people prefer.
 
Hey, look, it's the guy who STILL doesn't understand why you benchmark a CPU at 1080p. Despite Hardware Unboxed, GN, Tom's, etc. all explaining it many many times for the slow and dense.

Also, what sucks compared to previous gen cards? The 9070XT? A midrange card sucks compared to the previous gen's apex card? Huh. Imagine. The 5070ti isn't as fast as a 4090 (despite Jensen's claims.) Is it lacking real world performance for its price and placement in the stack? (Well, maybe that wasn't the best example.)
The sites should just use professional class GPU cards to test CPUs. That would really provide better data. It clearly doesn’t matter if no one has the GPU they are using.
 
I'm not disagreeing with you about UserBenchmark not telling the whole story, they are biased without a doubt. Just wanted to start with that.

At 1080p, the CPU matters. At 4K, though, the CPU doesn't matter as much (it matters, but marginally). If someone wanted to game at 4K, I would argue skimping on the CPU and putting the saved money toward the GPU makes a lot of sense.

[Chart: relative gaming performance at 3840x2160]

Source.
I was talking about comparative CPU performance, so 1080p is the obvious resolution to use. UB does all of its "benchmarks" at low resolutions, yet it puts the 285K only 1% behind the 9800X3D; meanwhile, UB ranks the entire 13th- and 14th-gen Intel lineup, from the i5s on up, as faster than the 9800X3D in those benchmarks. That claim is obviously dumb even in a vacuum, and with the added context of other reviews the maliciousness becomes clear. I would not recommend a 9800X3D for 4K unless the games someone plays specifically benefit from it, or it fits their budget without forcing a sacrifice elsewhere. There are a few games that are particularly CPU intensive and rarely tested, like Beyond All Reason (BAR), X4: Foundations, WoW, Anno, et cetera, plus eSports titles where you want as much latency reduction as possible with high-refresh-rate monitors.
 
It feels like a huge mistake to give UserBenchmark any press. Please ignore them in the future.

-kp
I disagree. Just looking at this thread on one of the most popular PC tech forums on the internet shows that it needs to be broadcast far & wide that UB is <Mod Edit> junk because many many people, including those who should probably know better, don't seem to get it.
 
I disagree. Just looking at this thread on one of the most popular PC tech forums on the internet shows that it needs to be broadcast far & wide that UB is <Mod Edit> because many many people, including those who should probably know better, don't seem to get it.
I agree! You don't foil bad ideas, data, or philosophy by sequestering them away; you shine a light on them with investigative journalism. If you just ignore them, bad ideas will fester like a cancer.
 
I'm not disagreeing with you about UserBenchmark not telling the whole story, they are biased without a doubt. Just wanted to start with that.

At 1080p, the CPU matters. At 4K, though, the CPU doesn't matter as much (it matters, but marginally). If someone wanted to game at 4K, I would argue skimping on the CPU and putting the saved money toward the GPU makes a lot of sense.

[Chart: relative gaming performance at 3840x2160]

Source.
That's not completely true. The CPU still matters at 4K, mostly for two reasons:

1- The 1% low FPS improves significantly with a better CPU, more than the average frame rate does, and that has a big impact on the gaming experience.

2- Most people don't play at native 4K nowadays, even with a 4090. They play with upscaling, and the CPU matters in that case since the game is rendered at a lower resolution (rough numbers in the sketch at the end of this post).

And if you have the money for a 4K GPU, you can also afford a high-end CPU.
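
To put rough numbers on point 2, here's a minimal Python sketch of what the internal render resolution looks like at a 4K output. The scale factors are the approximate per-axis values commonly quoted for DLSS/FSR presets, not figures taken from this thread:

```python
# Approximate per-axis render scales commonly quoted for DLSS/FSR presets.
# Illustrative assumption only; exact values vary by vendor and version.
PRESETS = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
}

def internal_resolution(width: int, height: int, scale: float) -> tuple[int, int]:
    """Resolution the game actually renders at before being upscaled."""
    return round(width * scale), round(height * scale)

for name, scale in PRESETS.items():
    w, h = internal_resolution(3840, 2160, scale)
    print(f"4K output, {name} preset -> renders at ~{w}x{h}")
```

At the Quality preset that's roughly 1440p internally, which is exactly where CPU differences start showing up again.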
 
For anyone who missed it, on the UB comparison page between the 5070 Ti and 9070 XT, look at the "Value & Sentiment" comparison, where the zealot claims the Nvidia GPU literally has "infinity better" value.

https://gpu.userbenchmark.com/Compare/Nvidia-RTX-5070-Ti-vs-AMD-RX-9070-XT/4181vsm2395341
Oh, that's only because nobody has ever purchased a 9070 XT (according to UB), versus the non-zero number of 5070Ti cards sold, therefore the Infinity Better Market Share.
 
The "army of influencers" he's talking about is literally just the current discourse on Nvidia pricing lol. Unfortunately Google will continue to show UserBenchmark first as long as they continue to pay up.
And all that without any technical documentation describing their tests. At this point that paragraph should be treated as what it is: spin, posted by an "influencer."
 
No, it's accurate. Nvidia already had better-specced cards made and simply wasn't releasing them until it saw what AMD was going to do, and then it released its better cards. AMD doesn't even come close on the software side, and they are completely lacking in all of the other products Nvidia brings to the table. They do make cheap cards, though, with cheap components that underperform what Nvidia makes.
Still inaccurate. I'll just pick one example, and it'll even be after AMD officially retired the ATI name: December 2011, the Radeon HD 7000 series. The Radeon HD 7970 beat the GTX 580 as the fastest single GPU available, and Nvidia wouldn't have a rebuttal until March 2012 with the GTX 600 series. That was at least a three-month period where Nvidia had no response available. The software side is really debatable, especially historically, with each offering features the other didn't have until they caught up. If you want to be a fanboy, fine, that's on you, but I don't see why you would have such a problem acknowledging that AMD has ever held the crown. Just about any time that has happened, Nvidia has been thoroughly offended and put out an excellent response. When there is proper high-end competition, we all win.

https://www.techpowerup.com/review/amd-hd-7970/28.html

https://www.techpowerup.com/review/nvidia-geforce-gtx-680/27.html
 
Still inaccurate. I'll just pick one example, and it'll even be after AMD officially retired the ATI name: December 2011, the Radeon HD 7000 series. The Radeon HD 7970 beat the GTX 580 as the fastest single GPU available, and Nvidia wouldn't have a rebuttal until March 2012 with the GTX 600 series. That was at least a three-month period where Nvidia had no response available. The software side is really debatable, especially historically, with each offering features the other didn't have until they caught up. If you want to be a fanboy, fine, that's on you, but I don't see why you would have such a problem acknowledging that AMD has ever held the crown. Just about any time that has happened, Nvidia has been thoroughly offended and put out an excellent response. When there is proper high-end competition, we all win.

https://www.techpowerup.com/review/amd-hd-7970/28.html

https://www.techpowerup.com/review/nvidia-geforce-gtx-680/27.html
And then they launched the 7970 GHz Edition, with a 1GHz clock that took it back to the top for a while.
 
"But Nvidia does have DLSS in about 10x as many games as AMD has FSR4 and that is real world performance that is not shown in reviews."
Not trying to start an argument, but I made a spreadsheet, copied all the games Nvidia lists on its DLSS supported-games page, and then did the same with AMD. By my method (a rough sketch of the tally is below the link) there are 579 games that support some form of DLSS and 387 that have some FSR features, so Nvidia has a 192-game lead overall if we count all versions of DLSS and FSR. With each generation of DLSS and FSR, AMD has been moderately to severely behind in features, image quality, and the sheer number of games with official integration. That said, the gap has narrowed substantially over the last couple of generations of FSR... Since the Nvidia page doesn't clarify which games support which version of DLSS, I'm not sure how many are fully DLSS 4 enabled. AMD shows 36 as FSR 4 enabled.

https://www.amd.com/en/products/gra...ames.html#tabs-ab87f43a0c-item-60178a43f6-tab
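
Something along these lines would reproduce the tally, assuming each vendor's supported-games list is exported to a plain text file with one title per line (the file names are placeholders, not official downloads):

```python
# Placeholder file names; each file is assumed to hold one game title per line,
# copied from the respective vendor's supported-games page.
def load_titles(path: str) -> set[str]:
    with open(path, encoding="utf-8") as f:
        # Normalize case and whitespace so the same title isn't counted twice.
        return {line.strip().lower() for line in f if line.strip()}

dlss = load_titles("nvidia_dlss_games.txt")
fsr = load_titles("amd_fsr_games.txt")

print(f"DLSS (any version): {len(dlss)}")
print(f"FSR (any version):  {len(fsr)}")
print(f"Nvidia's overall lead: {len(dlss) - len(fsr)}")
print(f"Games on both lists:   {len(dlss & fsr)}")
```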

Each generation of DLSS looks better than the equivalent FSR the vast majority of the time, and that is definitely a real-world performance advantage when true. That said, the new frame-gen tech from both is generally just not there yet. The inaccuracy in images and the input latency it introduces make it a no-go for me. I have two desktop systems: an Intel CPU with an AMD GPU in one, an AMD CPU with an Nvidia GPU in the other. In rasterized performance across the board, the RX 6700 XT 12GB massively beats the RTX 4060 (8GB) I have in the other, for basically the same price.

For those reasons, I think the mainstream reviewers' method of testing pure raster performance at various resolutions, plus ray-tracing performance on top, is still the best way to compare value for most gamers who aren't also creators needing CUDA, etc. If you fall into that camp and can afford it, you go with Nvidia.

(Yes, I knew the 4060 was/is a bad value buy on price-to-performance alone, but I wanted to make a tiny low-profile SFF build for behind my mounted TV. As far as form factor goes, it was the most price-efficient and performant without modding a larger card.)
 
Not trying to start an argument, but I made a spreadsheet, copied all the games Nvidia lists on its DLSS supported-games page, and then did the same with AMD. By my method there are 579 games that support some form of DLSS and 387 that have some FSR features, so Nvidia has a 192-game lead overall if we count all versions of DLSS and FSR. With each generation of DLSS and FSR, AMD has been moderately to severely behind in features, image quality, and the sheer number of games with official integration. That said, the gap has narrowed substantially over the last couple of generations of FSR... Since the Nvidia page doesn't clarify which games support which version of DLSS, I'm not sure how many are fully DLSS 4 enabled. AMD shows 36 as FSR 4 enabled.

https://www.amd.com/en/products/gra...ames.html#tabs-ab87f43a0c-item-60178a43f6-tab

Each generation of DLSS looks better than the equivalent FSR the vast majority of the time, and that is definitely a real-world performance advantage when true. That said, the new frame-gen tech from both is generally just not there yet. The inaccuracy in images and the input latency it introduces make it a no-go for me. I have two desktop systems: an Intel CPU with an AMD GPU in one, an AMD CPU with an Nvidia GPU in the other. In rasterized performance across the board, the RX 6700 XT 12GB massively beats the RTX 4060 (8GB) I have in the other, for basically the same price.

For those reasons, I think the mainstream reviewers' method of testing pure raster performance at various resolutions, plus ray-tracing performance on top, is still the best way to compare value for most gamers who aren't also creators needing CUDA, etc. If you fall into that camp and can afford it, you go with Nvidia.

(Yes, I knew the 4060 was/is a bad value buy on price-to-performance alone, but I wanted to make a tiny low-profile SFF build for behind my mounted TV. As far as form factor goes, it was the most price-efficient and performant without modding a larger card.)
Between FSR4 and DLSS4, it's all irrelevant.

"Benchmark videos" always focus on stationary imagery. I don't know about you but I play games moving. Not just that, benchmarks never are 1:1 unless a games inbuilt benchmark is used, further leading to inconsistencies.

Both technologies look fine now, and furthermore, compensation technology shouldn't be pitted against raw performance. If it could be, 4X FG wouldn't look like the trash it does.

Lossless Scaling for $8 on Steam does a better job at FG, and that's not a jab at the app; on the contrary, they pulled off for $8 what Nvidia barely can for $2,000.
 
UserBenchmark claims that AMD's Radeon GPUs fail to deliver real-world performance and are backed by an army of influencers it blames for duping customers.

UserBenchmark bashes AMD GPUs and claims they lack real-world performance: Read more
UserBenchmark does have a lot of very biased review notes; however, the note in the 9070 XT benchmark about the marketing is, to quote the site, "Based on 1 user benchmark." The data for the 9070 XT comes from a single sample, and the submitter is the one writing the comment. You could not get more biased.
It is interesting to note that the same site's review of the 5070 Ti is generally positive about the performance, but gives it a -36% rating for value.
The best card on UserBenchmark by user rating is the 4060, closely followed by the 4070, which, as an AMD owner, I would agree with.
 
I find UserBenchmark to be a very reliable way to gauge an AMD product: the more virulent they get, the better the AMD product. When AMD competed with Intel and Nvidia by providing OK products at a great price (Zen/Zen+, Vega/Polaris), they were... Well, they actually sounded and looked quite honest in their assessments.
When AMD's CPUs and GPUs got really competitive, UB became outlandishly biased, and that's when I swapped my 2016 reference RX 480 8GB for an RX 6600 XT.
I might just get an RX 9060, actually.
 
They cherry-pick their benchmarks and have been caught changing their weighted scores to benefit Intel because they thought AMD's benchmark results were "unrealistic." Someone who is that rabidly against AMD (or any other company) can't be trusted to give accurate data without fudging the numbers. Lies, damned lies, and statistics.

https://www.tomshardware.com/news/userbenchmark-benchmark-change-criticism-amd-intel,40032.html
From your link:
UserBenchmark changed their weighting system to 40% single-core, 58% quad-core, and 2% multi-core.
Now explain to the rest of us how having more than 4 cores helps your grandma read her emails better...
For the general user, anything above 4 cores is basically useless; they might need it once a year if they have to decompress something.
Even for a gamer, the first 4 cores are the most important.
The 14100 gets you 134 FPS average; you have to be a hardcore user to think you need more than that.
https://www.techpowerup.com/review/intel-core-i3-14100/15.html

Bottom line, UserBenchmark isn't for the hardcore fanatic enthusiast; it's for the average Joe who needs something to do their basic computing on (a quick sketch of what that 40/58/2 weighting actually does is below).
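
To make that concrete, here's a minimal sketch of how a composite score would shake out under the 40/58/2 weighting described in the linked article. The sub-scores are made-up numbers, purely for illustration:

```python
# Weights as described in the linked Tom's Hardware article:
# 40% single-core, 58% quad-core, 2% multi-core.
WEIGHTS = {"single": 0.40, "quad": 0.58, "multi": 0.02}

def effective_speed(single: float, quad: float, multi: float) -> float:
    """Composite score under the 40/58/2 weighting (sub-scores in arbitrary units)."""
    return (WEIGHTS["single"] * single
            + WEIGHTS["quad"] * quad
            + WEIGHTS["multi"] * multi)

# Made-up sub-scores: doubling the multi-core result barely moves the total,
# which is exactly why high-core-count chips gain so little under this scheme.
print(effective_speed(single=100, quad=100, multi=100))  # 100.0
print(effective_speed(single=100, quad=100, multi=200))  # 102.0
```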
 
Do you even know what we are talking about here? Do some research about UserBenchmark's irrational hate against AMD. It has nothing to do with what people prefer.

Uh, yes, I do know what you are talking about. I was pointing to one significant product line AMD sells (CPUs) where you can't really even make a case for choosing the competitor, unless you have some kind of hangup about a third-party company making the chipsets.

Not sure why you are flying off the handle, since I'm AGREEING with you.
 
Another AMD-hating review from UserBenchmark. The owner's goal is just to crap on anything AMD and use SEO so his results appear at the top of any comparison search.

His reviews of the 9000X3D series are the same, with totally unfounded claims about how bad the V-Cache is. Thankfully most people now know to ignore their opinions and read other sites.
 
Downloaded the benchmark and tried it last night. Pretty interesting: I rated "outstanding" in everything but still landed in the middle of the curve. Lots of information, which, whether it should be taken with a grain of salt or not, was cool to see.
 
I am sure that site is meant to be satire.

They hate everything AMD, whether it's CPU or GPU.
They're at least semi-safe with Nvidia, given that Nvidia still makes the best GPUs.
But it's not going to be a sane review of an AMD product.

They've been suffering for the last few years, ever since Intel CPUs stopped being the best choice for gaming.

They used to be semi-sane, until Ryzen started to get better... and better.

At one stage last year their benchmark software was also malware; I got infected by it and it redirected my internet connection to India.
Now they charge you to use the benchmark if you run it more than a few times.

Or they show the "server busy" screen and offer either Cancel or Pro to run the benchmark, and Pro is a yearly subscription just to run a benchmark on a site with totally unreliable comparisons and reviews. Value for money right there.

I wonder if they'll change the review to counter the "backlash."

Shame they don't appear to have "reviewed" the 9950X3D yet.
 