Intel Core i5-12400 Review: Budget Gaming Supremacy

Yes, as you can see, I've been a member of this forum since 2007; I wasn't born yesterday. But eliminating bottlenecks isn't real world. Not in this way. Using an SSD to eliminate a bottleneck is fine, but this is getting stupid. I wasn't a fan of this principle in 2007, and I'm not a fan now either. As I said, I know why it's done the way it's done, but to make the picture whole, a simple set of benches with a normal, real-world-ish setup would be really welcome.
I don't understand why you think this is stupid. It just makes sure nothing else bottlenecks the CPU to show exactly which CPU is better.
 
I don't know where they got their pricing... $210 for the i5-12400, $180 for the i5-12400F. (Ha, we're paying about 17% more for the same aging HD 730 graphics! I was wrong, they are getting an upgrade from the aging HD 720 to the aging HD 730 here; I guess it's better than nothing. Well, probably no better than nothing if you have a GT 730 or better lying around.) These prices are more than fair. We've got some Z690 boards under $200, and if B660 boards come in at about $120, that's some extreme pressure on the AMD lineup.
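For what it's worth, here is a quick back-of-the-envelope check of that iGPU premium, using only the $210/$180 figures above (the review's listed prices, not official tray pricing):

# Rough check of the iGPU premium implied by the prices above
# ($210 for the i5-12400 with integrated graphics, $180 for the GPU-less i5-12400F).

price_12400 = 210
price_12400f = 180

premium = price_12400 - price_12400f
premium_pct = premium / price_12400f * 100
print(f"iGPU premium: ${premium}, about {premium_pct:.0f}% over the F model")
# -> iGPU premium: $30, about 17% over the F model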
 
I don't understand why you think this is stupid. It just makes sure nothing else bottlenecks the CPU to show exactly which CPU is better.

Because it shows an artificial difference of, say, 20% in a workload where the real-world difference is actually 1%. Hardware enthusiasts all claim synthetic benches are bad, when all those do is "remove bottlenecks" and show "which CPU is better". And I didn't say showing these numbers is bad, but that we really need a more realistic view ADDED as well. I really want to know whether upgrading from one CPU to another would be worth it in the real world. Not an artificial jump of 20%, but what I'd actually get. We get such articles once a year, with a big cross-bench test of 100 CPUs on 100 GPUs, but those are few and far between. So to learn what I want to know, I'd have to wait for some late-2022 article. Instead, if they added what I proposed, each article would have a few charts showing a small cross-section of the immediate competition in such workloads.

I forget which site it is, the one that shows things like "the best settings you can use to stay at 60 fps on the tested hardware". That's what's lacking here: a real-world view of the tested hardware.
 
Because it shows an artificial difference of, say, 20% in a workload where the real-world difference is actually 1%.
It's not an artificial difference; you just don't have a good enough GPU to see that amount of difference.
The FPS difference you get with the fastest GPU is real; you just have to adjust your quality settings until your GPU can match those FPS.
What you are talking about is outside the scope of a CPU benchmark and only applicable to GPU benchmarks, where they could show the settings needed to reach 60 FPS.

Obviously, if you have a GPU that is already maxed out by a 10-year-old CPU, then upgrading your CPU isn't going to make anything better; you shouldn't need a benchmark to tell you that.
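A rough way to picture why reviewers test this way: treat the observed frame rate as being capped by whichever of the CPU or GPU is slower. The numbers below are made up purely for illustration (they are not from the review); the point is that a real 20% CPU gap shows up in full with a fast enough GPU and all but vanishes once the GPU becomes the limit.

def observed_fps(cpu_limit_fps, gpu_limit_fps):
    # Frame rate is roughly capped by whichever component is slower.
    return min(cpu_limit_fps, gpu_limit_fps)

cpu_a = 120  # hypothetical CPU-bound ceiling for CPU A
cpu_b = 144  # hypothetical CPU B, 20% faster when nothing else limits it

for gpu_name, gpu_limit in [("top-end GPU", 300), ("midrange GPU", 90)]:
    fps_a = observed_fps(cpu_a, gpu_limit)
    fps_b = observed_fps(cpu_b, gpu_limit)
    gap_pct = (fps_b - fps_a) / fps_a * 100
    print(f"{gpu_name}: CPU A {fps_a} fps, CPU B {fps_b} fps (+{gap_pct:.0f}%)")

# top-end GPU: CPU A 120 fps, CPU B 144 fps (+20%)
# midrange GPU: CPU A 90 fps, CPU B 90 fps (+0%)

That is why reviews pair every CPU with the fastest GPU available: it exposes each chip's real ceiling, even if your own GPU won't let you reach that ceiling today.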
 
What's so hard about this?

A top-of-the-line GPU is used because the review is trying specifically to show the performance differences between the CPUs... where ONLY the CPU's performance is being analyzed.

Limiting a CPU by saddling it with a lower-end GPU means you are adding another variable into the mix, and therefore not making an accurate comparative analysis of the CPUs.
 
What's so hard about this?

A top-of-the-line GPU is used because the review is trying specifically to show the performance differences between the CPUs... where ONLY the CPU's performance is being analyzed.

Limiting a CPU by saddling it with a lower-end GPU means you are adding another variable into the mix, and therefore not making an accurate comparative analysis of the CPUs.

Exactly, just like how the fastest CPU is used for all GPU benchmarks. It eliminates the CPU as an extra variable, as much as possible.
 
I have this CPU and it sucks when it comes to Cyberpunk at 4K. The card is an RTX 3080 12GB, but the benchmark goes below 30 inside the bar: min 23 and max 45. So when it comes to a highly demanding, multi-threaded game like Cyberpunk, this one sucks, IMO.
 
I have this CPU and it sucks when it comes to Cyberpunk at 4K. The card is an RTX 3080 12GB, but the benchmark goes below 30 inside the bar: min 23 and max 45. So when it comes to a highly demanding, multi-threaded game like Cyberpunk, this one sucks, IMO.

At 4K the CPU isn't the problem; the GPU is. Considering the difference between a 12400 and a 12700K is fairly minor at 1080p, at 4K there would be essentially zero difference. The higher the resolution, the more demanding it is on the GPU.

[Image: CP2077.png]

4K GPU results, with a 5950X for the CPU:
[Image: CP2077_4K.png]
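To put some rough numbers on that, here is the same toy min(CPU limit, GPU limit) model as earlier in the thread, with a purely illustrative assumption that GPU-limited FPS falls off in proportion to pixel count; none of these figures are measured results.

# Toy illustration of why resolution shifts the bottleneck to the GPU.
# Assumes (purely for illustration) that GPU-limited FPS scales inversely
# with pixel count, while the CPU-limited ceiling stays roughly constant.

CPU_LIMIT_FPS = 70        # hypothetical CPU-bound ceiling in a heavy scene
GPU_FPS_AT_1080P = 110    # hypothetical GPU-bound FPS at 1920x1080

resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
base_pixels = 1920 * 1080

for name, (w, h) in resolutions.items():
    gpu_limit = GPU_FPS_AT_1080P * base_pixels / (w * h)
    fps = min(CPU_LIMIT_FPS, gpu_limit)
    bound_by = "CPU-bound" if gpu_limit > CPU_LIMIT_FPS else "GPU-bound"
    print(f"{name}: ~{fps:.0f} fps ({bound_by})")

# 1080p: ~70 fps (CPU-bound)
# 1440p: ~62 fps (GPU-bound)
# 4K:    ~28 fps (GPU-bound)

Once the frame rate is pinned to the GPU-bound line at 4K, a faster CPU only raises a ceiling you never touch.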
 