> That it makes absolutely no difference beyond around 75fps, as neither reaction time nor motion perception exceed that. *Minimum* frame rate is surely what matters?

Actually, tons of people who are into competitive shooters or sim racing play as fast as the game can run on their machine for reduced latency.
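For scale on the latency point, the win from a higher frame rate is easiest to see as frame time, the milliseconds between frames. A quick sketch (numbers are illustrative, not from any benchmark):

```python
# Frame time in milliseconds at a given frame rate: time = 1000 / fps.
def frame_time_ms(fps):
    return 1000.0 / fps

for fps in (75, 144, 240):
    print(f"{fps} fps -> {frame_time_ms(fps):.2f} ms per frame")
```

Going from 75fps to 240fps trims roughly 9 ms off every frame, which is the kind of input-to-photon latency reduction competitive players chase even when they can't "see" the extra frames.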
Their, not there.

This is totally opinion-based on my end: this article and these charts from UserBenchmark were put together by Intel fanboys who are having issues with the fact that AMD has, for the time being, found a solution that gives it unparalleled performance in gaming scenarios.
Even Tom's Hardware "had" one of these types of writers on staff a few years back (maybe quite a few, as the year slips my mind), but he soon fell by the wayside after an article I read where he did not give an objective view but forced his own Intel fanboyism into the piece.
Other than Intel's 13th-14th gen frying themselves, Intel had a good offering last gen, until the frying issues. Their performance is not in dispute, but their longevity is, and this fear will steer those looking to upgrade away from those CPUs. This is a fact. These issues are also not taken into consideration in UserBenchmark's reviews or charts. I personally will not buy a 14th gen for another 3+ years, until it is proven to be safe, but by then there will be a different CPU with better performance and newer tech, as things move so quickly these days. 3+ years is going to put it well past the end of life for this CPU, so why bother at this point.
Fanboyism has no place in benchmarks or reviews. The facts, just the facts, matter. The conclusion is subjective, but don't force your views on your readers without saying "opinion based". UserBenchmark is opinion-based, and they have a flawed system for evaluating hardware. Tom's Hardware does a good job, as does Gamers Nexus, of being transparent, trying to be objective, and only giving opinions in their conclusions, which are opinion-based to start with.
Just my 2 cents on this article in discussion.
No, that isn't it. The point is that anyone buying a high-end gaming rig is going to game at higher than 1080p. Once you reach 1440p, current GPUs aren't fast enough for a "gaming focused" CPU to make financial sense. A 14600K trails a 9800X3D by only 7.5% at 1440p when using a 4090. If you're not using the fastest GPU on the planet, those two are going to be a whole lot closer and imperceptibly different. A 14600K currently costs less than half what a 7800X3D costs. I can't even find a 9800X3D for sale. At 4K, you can spend under $200 on a 7600 non-X and get within 3.5% of a 9800X3D, again with a 4090. With any other GPU you're looking at 0%. If you don't have a 4090 and you spend $300 more on the CPU, it will give you 0% more performance on average. That's not a flex, that's stupid. Put the money towards a better GPU.

The argument here is basically that it doesn't matter if the 9800X3D is 500% faster than Intel's best, because the 13600K and 14600K are already more CPU than anyone actually needs and there's zero reason to buy anything else. So we've reached peak, and AMD and Intel should stop making newer and faster stuff.
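The cost/benefit arithmetic above (the commenter's figures: roughly $300 extra for about 7.5% more average fps at 1440p with a 4090, not verified data) can be sketched as dollars paid per percentage point of performance gained:

```python
# Illustrative price/performance comparison using the figures quoted above.
# The $300 delta and 7.5% uplift are the commenter's numbers, not verified data.
def dollars_per_percent(price_delta, perf_delta_pct):
    """Extra dollars paid per percentage point of extra performance."""
    return price_delta / perf_delta_pct

print(dollars_per_percent(300, 7.5))  # -> 40.0 dollars per percent of fps
```

By this framing, the same $300 put toward a GPU tier upgrade usually buys far more than 7.5%, which is the core of the "put the money towards a better GPU" argument.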
> No, that isn't it. The point is that anyone buying a high end gaming rig is going to game at higher than 1080p. […] Put the money towards a better GPU.

You have missed out on the future-proofing part of the issue. Most people who build their own rig upgrade parts, not the whole system in one go. So at this point in time you can get a 13600K and a 4070 and game well at 1440p or even 1080p, but when the next gen of GPUs comes out in a few months, or two generations later, the 13600K is highly likely to become a bottleneck, while a 9800X3D will likely remain relevant until the 6090 era. And considering that at least Zen 6 will still be on AM5, upgrading from a 9800X3D to the next-gen X3D could sustain your motherboard and RAM config even longer; that's where the money saving kicks in.
GPU performance isn't a static target. It will improve over time, so faster CPUs will be needed in the future.
Outside of very fringe use cases (you're a Flight Simulator hobbyist, or something of that ilk), if you don't own a 4090, you should not be looking at a 9800X3D. It's a total waste of money that's better spent elsewhere in your computer.
Usually, they are, but this take is one of the most reasonable ones he has. It's not even in the top 20 craziest things UserBenchmark has said.

It's kind of weird how desperate and pathetic they sound when they try to attack AMD. Oddly surreal. It almost feels like they're making a parody of themselves.
They don't even make remotely consistent or convincing arguments. Do they think sounding desperate in their claims will, uh, somehow convince the rest of us?
Hell, I never even read any reviews on UserBenchmark until relatively recently, and when I finally stumbled across one, I had to read it twice because it didn't even seem real.
It's just pathetic.
I actually tested this not too long ago: a 4090 paired with a 14700K (a rig built for my brother), a 10980XE, a 7960X, a 13900KS, and a 7800X3D (my friend's rig, usually paired with a 7900XT).

The entire point of using those settings as a benchmark is that beyond them, games can become GPU limited, obviously. But it's pretty simple to assume that a CPU isn't "optimized" just for the 1080p low benchmark, and that its performance will scale up at higher resolutions with better GPUs.
Put simply, the best "1080p low" CPU is probably also the best CPU for 2K/4K, in the sense that it's not dragging down performance on the system overall. Feel free to test this theory by putting a 4090 in a system with a Core i3-2120 and comparing 1440p benchmarks in any modern game of your choosing at high settings.
So, did they ever use that argument in the past? When will they stop using it in the future?

The anti-AMD spin is crazy, but the fundamental argument is sound.
Most people gaming at 1440p will never notice the difference between a 13600K and a 9800X3D. A 14900K or 285K makes even less sense for most gamers when there are cheap, new 12th and 13th gen and Zen 4 parts available.
If you have a limited budget and you are choosing between a 9800X3D + 4060 and a 13600K + 4070 Super, get the better GPU and the cheaper CPU.
I’d get a 7600X instead of the 13600K, but most modern CPUs are fine for anything up to a 4080 at 1440p.
I don't know, who can call out the real slander company? UBM is slandering AMD, and nobody can refute that?

So you're writing an article in order to slander some other company. Not cool.
What matters is the benchmarks. I've compared all my benchmarks to UserBenchmark's and they really haven't been that much different.
Get off the slander train. It doesn't make you look good.
I loved their 80386-40 more than Intel's -33.

Until Zen 3 came around, AMD had to rely on aggressive marketing to survive.
> Most people gaming at 1440p will never notice the difference between a 13600K and a 9800X3D.

Have you ever heard of minimum frame rate?
> A 14600K trails a 9800X3D by only 7.5% at 1440p when using a 4090.

Only 7.5%? May I give you my account number for transferring _only_ 7.5% of your monthly income? For 7.5% lower performance, one can buy a latest-gen 4080 card.
> A 14600K currently costs less than half what a 7800X3D costs.

It's always amazing when people discover the "value comparison" feature of money. I'd bet a Pentium G4400 costs even less!
> GPU performance isn't a static target. It will improve over time, so faster CPUs will be needed in the future.

That is where the 9800X3D stands. For games it is the fastest today, and it will provide the throughput to service the next generation (or two) of graphics cards.
> Have you ever heard of minimum frame rate?

The stock 9800X3D trails the 7800X3D, 7950X3D, 14900K, and 13900K, and barely beats (<1 fps difference) the 14700K, 13700K, and Ultra 5 245K in 4K minimum frame rate.
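For readers unfamiliar with the metric: "minimum" or "1% low" frame rates are typically derived from per-frame times rather than read off an fps counter. A sketch of one common way to compute it (not any specific reviewer's exact methodology):

```python
# Sketch of a "1% low" frame rate from recorded frame times in milliseconds.
# This mirrors the common approach; reviewers differ in exact methodology.
def one_percent_low_fps(frame_times_ms):
    """Average fps over the slowest 1% of frames in a capture."""
    worst_first = sorted(frame_times_ms, reverse=True)  # longest frames first
    n = max(1, len(worst_first) // 100)                 # slowest 1% (at least 1)
    slowest = worst_first[:n]
    avg_ms = sum(slowest) / len(slowest)
    return 1000.0 / avg_ms

# 99 smooth 10 ms frames (100 fps) plus a single 50 ms stutter:
times = [10.0] * 99 + [50.0]
print(round(one_percent_low_fps(times)))  # -> 20
```

The point of the metric is visible in the example: the average is still near 100 fps, but the 1% low of 20 fps exposes the stutter that players actually feel.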
Have a look at Hardware Unboxed; there is a video which covers your questions.

I myself am not interested in gaming, but it would be good if all CPU reviews focused on the CPU-GPU combinations, as well as the display resolutions, that are most relevant for most potential buyers of that CPU. People are increasingly gaming on 4K monitors, yet many reviews of "gaming CPUs" completely ignore 4K gaming, because for most games it does not matter at all whether it is the latest, fastest CPU or something else. That data is simply not shown in the reviews at all. OK, "everybody" knows that that is how it is, but still, that information should be given in ALL reviews that address gaming uses of CPUs. I wonder why this is not the case.

Lots of hardware purchase decisions are made because buyers do not understand the implications of those purchases. People want faster SSDs, faster Wi-Fi routers, faster memory, faster CPUs, even if in reality the "faster" is not really faster at all, and that "upgrade" for that reason is not really an upgrade at all, only a purchase made because it makes the buyer feel better.