News UserBenchmark bashes AMD GPUs and claims they lack real-world performance

I never said it was a 70% difference. I said that he most likely made a mistake and gave one example of such a mistake that was close to a 70% difference. I already did the fair comparison.

It was more to show that the comparison only holds at logainofhades' claimed 71% if it pits an overclocked result against a non-overclocked one. Otherwise the gap is closer to 40% overall, as you pointed out with the 1.3828x figure.
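
For clarity, here is the raw arithmetic behind those two figures (just a sketch, using only the numbers already quoted above):

```python
# A score ratio of r corresponds to a gap of (r - 1) * 100 percent.
def pct_gap(ratio: float) -> float:
    return (ratio - 1.0) * 100.0

print(pct_gap(1.3828))  # ~38.3%, i.e. "close to 40% overall"
print(pct_gap(1.71))    # a 71% gap would need a ~1.71x ratio, which the
                        # stock-vs-stock numbers discussed above don't show
```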
 
I understand why they say they benchmark at 1080p. And I still say it is WRONG. And no amount of name-calling will change that, keyboard warrior.
Yes, I get that it takes away the bottleneck. But how much of a bottleneck is there when you're gaming at 4K? Or even 1440 Ultrawide? If I'm running at 4K, I want to know if it is worth upgrading my CPU. I'm not going to benchmark my computer at 1080p, think, "oh, wow, that's fast", then switch to 4K and expect the same increase in FPS.
Oof... Where to even start...

Let's go with the simple one: not all gamers use their PCs the same way.

Outside of your regular Monster Hunters, Final Fantasy RPGs and MMORPGs, Call of Derps and so on, there are plenty of games that will take mods, encourage the use of mods and even play in different ways depending on the input method you want. For example: driving simulators with all the bells and whistles you can get, which could include VR HMDs, "proper" Wheel+Pedal+Gearbox combinations or flight sim rigs. All of those require way more CPU horsepower than your average "I watch YT while playing" gamer needs. I speak 100% from experience here and I can tell you, some CPUs that appear in charts as great "value" or even "great performers" fall flat when you need to put them in such scenarios. For that, you need to understand exactly where the CPU will land in situations where the graphical load is secondary to the CPU. Remember, the CPU is still in charge of input management and coordination at the end of the day. The GPU will have to wait for the CPU in a lot of situations where a combination of mods, different input types and, sometimes, I/O wait (there are some really interesting conversations to be had about networking and CPU dependency) will affect your overall framerate and experience.

I could say, for argument's sake: "hey, Tom's sucks at benchmarking because they're not using a fully built flight simulator with a feedback cockpit and controls while on a Starlink connection!" Your "I want 4K for my CPU tests" argument falls within striking distance of that one if we go by the Steam Survey and other somewhat respectable data that tracks monitor usage. Would it be nice if they did that? Absolutely. Is it realistic, or even valid, to do so? I'd argue no.

Another easy one, as has been discussed already, is longevity and GPU swaps: if you're planning to keep the motherboard, RAM (asterisks here) and CPU for a good while, then you will aim for the best CPU (let's call it platform) you can get and then the GPU as part of your budget. For throw-away builds (as I call them), there is an argument to be made for having 4K testing handy.

Still, even then, you can always extrapolate from the combination of GPU and CPU testing. It's not an outlandish conclusion to say: "mid GPU + mid CPU = mid overall performance", yes? And you can go from there. Sometimes the data you have is enough to still make a valid and informed decision.
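
If it helps, here is a rough way to do that extrapolation (a sketch only; the FPS numbers below are invented just to illustrate the idea): your real framerate is roughly capped by whichever side is the bottleneck, so combining a CPU review's low-resolution result with a GPU review's result at your target resolution gets you a usable estimate.

```python
# Rough first-order model: the framerate you get is approximately capped by
# whichever side is the bottleneck, i.e. min(CPU-limited FPS, GPU-limited FPS).
def estimated_fps(cpu_limited_fps: float, gpu_limited_fps: float) -> float:
    # cpu_limited_fps: what low-resolution CPU reviews measure (GPU bottleneck removed)
    # gpu_limited_fps: what GPU reviews measure at your target resolution
    return min(cpu_limited_fps, gpu_limited_fps)

# Hypothetical mid-range parts (illustrative numbers only):
print(estimated_fps(cpu_limited_fps=140, gpu_limited_fps=75))   # 4K: GPU-bound, ~75 FPS
print(estimated_fps(cpu_limited_fps=140, gpu_limited_fps=180))  # 1080p: CPU-bound, ~140 FPS
```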

So, do you really understand why they test at 1080p?

Regards.
 
>there are plenty of games that will take mods...All of those require way more CPU horsepower than your average "I watch YT while playing" gamer needs.

The answer to the above is to include more CPU-intensive games in the test mix to be representative, or to break them out into a sub-category. Running at an artificially low resolution, e.g. 1080p for a high-end build, doesn't address the above, and only further distorts the picture.

Running Wukong @ 1080p is not a substitute for running Dwarf Fortress at any resolution.

>I could say, for argument's sake: "hey, Tom's sucks at benchmarking because they're not using a fully built flight simulator with a feedback cockpit and controls while on a Starlink connection!" Your "I want 4K for my CPU tests"

Your analogy fails because you're citing a fringe use case. Running 4K on a high-end CPU/GPU would be a typical and expected use. It's using 1080p with a high-end setup that's fringe.

>longevity and GPU swaps: if you're planning to keep the motherboard, RAM (asterisks here) and CPU for a good while, then you will aim for the best CPU (let's call it platform) you can get

The above is a non sequitur. A CPU's gaming perf is unrelated to upgrade strategy.

That, and recognize that your cited upgrade strategy above is suboptimal. The GPU is the single most expensive part in a PC. It would make more sense to build around the GPU, i.e. buy the best GPU first. The GPU can then be carried forward to succeeding CPU/MB builds. Buying the best CPU first (and the GPU second) is a dumb idea.

>So, do you really understand why they test at 1080p?

I understand that 1080p testing has become entrenched as a convention. That doesn't invalidate the argument that testing at 1080p, when typical high-end use is at 1440p/4K, is a distortion. Your arguments above are unconvincing, as is your appeal to authority.
That's a lot of words to just say "I don't understand what I'm reading in your post".

Regards.
 
Technically the topic of this thread was Userbenchmark... just because almost every thread here turns into a feces-throwing contest is hardly the site's fault... well, not totally anyway. Some of the topics do induce that reaction from readers. (As a mod I have no control over the News/Review topics chosen.)

Userbenchmark do it to themselves almost every time a new CPU/GPU is released. Is it our fault we find it entertaining?
 
I've long held that the best use of UserBenchmark is to see if anyone has actually had a particular hardware combination run well enough to at least run the benchmark, not what the numbers produced are.

For example, if you have a question about whether an unsupported CPU will actually run in a particular motherboard, it greatly helps to see many other people running it, and even listing which BIOS revision they were running to do so. Or whether a mostly-for-UEFI hybrid vBIOS on a newer GPU will work in an old legacy-BIOS motherboard. If they didn't work, there would not be a score.
 
Still inaccurate. I'll just pick one example, and it'll even be after AMD officially retired the ATI name. December 2011, Radeon HD 7000 series. The Radeon HD 7970 beat the GTX 580 as the fastest single GPU available, and Nvidia wouldn't have a rebuttal until March 2012 with their GTX 600 series. That was at least a three-month period where Nvidia had no response available. The software side is really debatable, especially historically, with each offering features the other didn't have until they caught up. If you want to be a fanboy, fine, that's on you, but I don't see why you would have such a problem with acknowledging that AMD has ever had the crown. Just about any time that has happened, Nvidia has been incredibly offended and put out an excellent response. When there is proper high-end competition, we all win.

https://www.techpowerup.com/review/amd-hd-7970/28.html

https://www.techpowerup.com/review/nvidia-geforce-gtx-680/27.html
Lol, they don't even compete at the high end. They can't possibly win it. And AMD never had the crown as you say. They never sold more high-end products than Nvidia has.
 
Spoken like (personal attack). Nvidia designed multiple cards at each tier and only decided which to release as a response to AMD? That’s literally the dumbest thing I’ve ever heard. Also, literally 90% of Nvidia’s gaming customers don’t even know they have a software advantage. They don’t even know what CUDA is.
Personal attacks do not add credibility to your arguments. They do just the opposite in fact.
 
His statement that there is no advantage in buying a 3D CPU if it is married to an average GPU is correct. The GPU will be the bottleneck. You need a top of the line GPU for the 3D CPU to shine. The majority of users do not own a fast GPU although that will increase over time as people upgrade. It is also true that AMD drivers are usually not as good as Nvidia's.

He may have an anti-AMD bent, but his points are not without some merit. Simply denouncing him as crazy just lowers the conversation.
You sound like the guy who would gulp down propaganda as long as it vaguely gestures at something that might have actually happened.
 
From your link

Now explain to the rest of us how having more than 4 cores helps your grandma read her emails better.....
For the general user, anything above 4 cores is basically useless; they'll need it maybe once a year, when they have to decompress something.
Even for a gamer, the first 4 cores are the most important.
The 14100 gets you 134 FPS average; you have to be a hardcore user to think that you need more than that.
https://www.techpowerup.com/review/intel-core-i3-14100/15.html


Bottom line, userbenchmark isn't for the hardcore fanatic enthusiast, it's for the average Joe who needs something to do their basic computing on.
Grandmas usually do not buy computers. They buy tablets or something just as easy to use, like an iPad.
 
Lol, they don't even compete at the high end. They can't possibly win it. And AMD never had the crown as you say. They never sold more high-end products than Nvidia has.
(image attachment)
 
I would be willing to bet that this has already been considered. A company that size certainly keeps an eye out for things like that. It's also worth considering that they think "any press is good press" and, being rational, probably figure that everyone except those who wouldn't be convinced anyway can smell that smell.

I'm confident that if AMD had any inclination to sue Userbenchmark's parent entity, house counsel would have swiftly disabused them of that notion, given the very high burden of proof required to bring an action under the Lanham Act. Userbenchmark just being some combination of stupid/biased/lazy/hyperbolic is nowhere near enough.
 
Oof... Where to even start...

Let's go with the simple one: not all gamers use their PCs the same way.

Outside of your regular Monster Hunters, Final Fantasy RPGs and MMORPGs, Call of Derps and so on, there are plenty of games that will take mods, encourage the use of mods and even play in different ways depending on the input method you want. For example: driving simulators with all the bells and whistles you can get, which could include VR HMDs, "proper" Wheel+Pedal+Gearbox combinations or flight sim rigs. All of those require way more CPU horsepower than your average "I watch YT while playing" gamer needs. I speak 100% from experience here and I can tell you, some CPUs that appear in charts as great "value" or even "great performers" fall flat when you need to put them in such scenarios. For that, you need to understand exactly where the CPU will land in situations where the graphical load is secondary to the CPU. Remember, the CPU is still in charge of input management and coordination at the end of the day. The GPU will have to wait for the CPU in a lot of situations where a combination of mods, different input types and, sometimes, I/O wait (there are some really interesting conversations to be had about networking and CPU dependency) will affect your overall framerate and experience.

I could say, for argument's sake: "hey, Tom's sucks at benchmarking because they're not using a fully built flight simulator with a feedback cockpit and controls while on a Starlink connection!" Your "I want 4K for my CPU tests" argument falls within striking distance of that one if we go by the Steam Survey and other somewhat respectable data that tracks monitor usage. Would it be nice if they did that? Absolutely. Is it realistic, or even valid, to do so? I'd argue no.

Another easy one, as has been discussed already, is longevity and GPU swaps: if you're planning to keep the motherboard, RAM (asterisks here) and CPU for a good while, then you will aim for the best CPU (let's call it platform) you can get and then the GPU as part of your budget. For throw-away builds (as I call them), there is an argument to be made for having 4K testing handy.

Still, even then, you can always extrapolate from the combination of GPU and CPU testing. It's not an outlandish conclusion to say: "mid GPU + mid CPU = mid overall performance", yes? And you can go from there. Sometimes the data you have is enough to still make a valid and informed decision.

So, do you really understand why they test at 1080p?

Regards.
The most hardcore driving sims available actually aren't very heavy on the CPU at all. Also, using a wheel and pedals doesn't push your CPU harder either, other than maybe the USB controller…
 
The most hardcore driving sims available actually aren't very heavy on the CPU at all. Also, using a wheel and pedals doesn't push your CPU harder either, other than maybe the USB controller…
It will clearly vary from game to game, but it was just a simple-to-grasp example. Look at Assetto Corsa, for instance, and Flight Sim. The peripheral load will depend on how many devices you have and how they differ. Sure, a single device presented to Windows is a different beast than, say, 4 individual ones. It's not a massive extra load, but how much overhead they add to your system depends on the software they need to run behind the scenes. I have a couple of interesting examples of, say, iCUE robbing performance from friends when they wanted RGB effects on their mouse that had to be managed by the software (this was fixed/changed a long time ago), and similar scenarios. For this particular example, I have no idea how frequent it is, but I'm 100% sure peripheral software can tank performance if it is implemented cheaply or causes issues. Another good example has been Synapse for a long while. Etc, etc, etc. And I'm not even talking about VR and the many different gadgets you can use for many things. All of those are rather heavy CPU-wise.

The point is: different hardware configurations will stress the CPU differently, so you always need to plan for your specific needs and take the CPU with enough room for all the things you want to use for your games.
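
If you want to sanity-check this on your own machine, a quick way is to sample per-process CPU usage while the game or sim is running and see what the peripheral/RGB suites are doing in the background. A rough sketch using Python's psutil; the 5-second window is an arbitrary choice and the process names you'll see depend entirely on your setup:

```python
import time
import psutil

# Prime the per-process CPU counters, then sample over a short window.
for p in psutil.process_iter():
    try:
        p.cpu_percent(None)
    except (psutil.NoSuchProcess, psutil.AccessDenied):
        pass

time.sleep(5)  # sampling window; run this while the game/sim is active

usage = []
for p in psutil.process_iter(attrs=["name"]):
    try:
        usage.append((p.cpu_percent(None), p.info["name"]))
    except (psutil.NoSuchProcess, psutil.AccessDenied):
        pass

# Top 10 CPU consumers; look for peripheral software (RGB/macro suites) here.
for cpu, name in sorted(usage, key=lambda t: t[0], reverse=True)[:10]:
    print(f"{cpu:6.1f}%  {name}")
```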

Regards.
 
It will clearly vary from game to game, but it was just a simple-to-grasp example. Look at Assetto Corsa, for instance, and Flight Sim. The peripheral load will depend on how many devices you have and how they differ. Sure, a single device presented to Windows is a different beast than, say, 4 individual ones. It's not a massive extra load, but how much overhead they add to your system depends on the software they need to run behind the scenes. I have a couple of interesting examples of, say, iCUE robbing performance from friends when they wanted RGB effects on their mouse that had to be managed by the software (this was fixed/changed a long time ago), and similar scenarios. For this particular example, I have no idea how frequent it is, but I'm 100% sure peripheral software can tank performance if it is implemented cheaply or causes issues. Another good example has been Synapse for a long while. Etc, etc, etc. And I'm not even talking about VR and the many different gadgets you can use for many things. All of those are rather heavy CPU-wise.

The point is: different hardware configurations will stress the CPU differently, so you always need to plan for your specific needs and take the CPU with enough room for all the things you want to use for your games.

Regards.
Fair enough. I’m just heavily into sim racing, and I thought I should point out that it generally actually has lower PC requirements than AAA gaming. A hobby that requires expensive peripherals that can go into the thousands would be in a bad place if it required high end hardware too.
 
Saw a decent breakdown of userbenchmark over the years. Originally add in drive tests, little bias. Ryzen 1000 series, little bias. Ryzen 2000 series, praised the improvement, little bias.

With the Ryzen 3000 series and the Threadripper 2990WX, the benchmark algorithm and reviews changed. The weighting of multicore performance was reduced further, and the “AMD marketing machine” became a target for the ire of the site owner.
The rest is history: bias and disinformation in his reviews, coupled with warped benchmark results.

It’s a pity; had the site owner remained neutral, his site would be a useful resource.
 
Saw a decent breakdown of userbenchmark over the years. Originally add in drive tests, little bias. Ryzen 1000 series, little bias. Ryzen 2000 series, praised the improvement, little bias.

With the Ryzen 3000 series and the Threadripper 2990WX, the benchmark algorithm and reviews changed. The weighting of multicore performance was reduced further, and the “AMD marketing machine” became a target for the ire of the site owner.
The rest is history: bias and disinformation in his reviews, coupled with warped benchmark results.

It’s a pity; had the site owner remained neutral, his site would be a useful resource.
Pretty much. They weren't a great resource, but they weren't completely biased like they are now. It's not too late to change back, I just don't think that's possible under the current site leadership.
 
UserBenchmark's take on AMD GPUs, especially the RX 9070 XT, feels pretty biased as usual. They’ve been known for favoring Intel and NVIDIA, and their focus on single-core performance doesn't really reflect real-world gaming scenarios. Plenty of benchmarks from reputable sources show that the RX 9070 XT holds its own against NVIDIA’s RTX 5070 Ti, often delivering similar performance at a better price. Influencers aren't just hyping AMD for no reason—many gamers are genuinely satisfied with the value and performance. Always cross-check reviews from multiple sources rather than relying solely on UserBenchmark’s skewed metrics.
 
UserBenchmark's take on AMD GPUs, especially the RX 9070 XT, feels pretty biased as usual. They’ve been known for favoring Intel and NVIDIA, and their focus on single-core performance doesn't really reflect real-world gaming scenarios. Plenty of benchmarks from reputable sources show that the RX 9070 XT holds its own against NVIDIA’s RTX 5070 Ti, often delivering similar performance at a better price. Influencers aren't just hyping AMD for no reason—many gamers are genuinely satisfied with the value and performance. Always cross-check reviews from multiple sources rather than relying solely on UserBenchmark’s skewed metrics.
I think their reviews are somewhat vague, but reading the 9070 XT review I don't see a lot of negatives about "that" card. They claim "Excellent consistency" and "Outstanding average bench". Isn't that a terrific review?