News UserBenchmark suggests you buy the i5-13600K over the Ryzen 7 9800X3D — says AMD drives sales with 'aggressive marketing' rather than 'real-world performance'


jp7189

Distinguished
Feb 21, 2012
510
298
19,260
And they are consumers, not developers, and don't understand that the same could be achieved without all that overkill.

A game can render one frame right before each refresh and also get low latency. It's a simple problem with an equally simple solution.
And if that frame lands just after the refresh? If the machine is pumping out more frames, there's a better chance one lands just before the refresh.

Honestly, I'm just playing devil's advocate. I don't actually think it matters that much, but if you forget practicality for a minute, there are tons of examples in every facet of life where people spend big money chasing that last half percent for some kind of competitive edge.
 
  • Like
Reactions: Scraph

spongiemaster

Admirable
Dec 12, 2019
2,345
1,323
7,560
You have missed the future-proofing part of the issue. Most people who build their own rig upgrade parts, not the whole system in one go. So right now you can get a 13600K and a 4070 and game well at 1440p or even 1080p, but when the next-gen GPUs arrive in a few months, or two generations later, the 13600K is highly likely to become a bottleneck, while a 9800X3D will likely remain relevant into the 6090 era. And considering at least Zen 6 will still be on AM5, upgrading from the 9800X3D to the next-gen X3D could keep your motherboard and RAM config going even longer; that's where the money saving kicks in.

You get the LGA1700 chips at a low cost because the platform is now EOL: the best you can get is the 14900KS, in 1-2 years there will be no new replacement CPU for the LGA1700 platform, and the second-hand market will likely be flooded with degraded Raptor Lake chips. Were that not the situation, the i5s wouldn't be at their current price, which would make the X3D chips more attractive.
Future proofing has always been and will always be BS. Two GPU generations from now is going to be in the 3-4 years from now range. We're looking at Zen 7 at that point. To put that 3-4 year timeframe in perspective, the 5800X3D is 2.5 years old. In 2028, if you're shopping for an $800 RTX 6070, you're going to want a better CPU than the 9800X3D.
 

YSCCC

Commendable
Dec 10, 2022
566
460
1,260
Future proofing has always been and will always be BS. Two GPU generations from now is going to be in the 3-4 years from now range. We're looking at Zen 7 at that point. To put that 3-4 year timeframe in perspective, the 5800X3D is 2.5 years old. In 2028, if you're shopping for an $800 RTX 6070, you're going to want a better CPU than the 9800X3D.
Not really; quite a lot of people are still buying a 4070 just to plug and play on their 9th or 10th gen i7. At 1440p the CPU isn't bottlenecking it much, and you can still max out the effects with it. The cost of replacing the whole thing would easily double the $800 budget.
 
  • Like
Reactions: jp7189
I mean Zen 2 was about when they matched Intel. Zen 3 was generally AMD beating Intel overall, and by Zen 4/5 it was a no-brainer.

UserBenchmark is one of the worst sites to trust, given their biased history and reliance on massively variable datasets.

Can only imagine how bad it will get when they start using LLMs to sort stuff.
Zen 2 was ahead of Intel in terms of IPC; however, Coffee Lake Refresh was usually faster due to higher clock speeds. By Zen 3, AMD was winning across the board against Comet Lake, Rocket Lake, etc.
 
Mar 10, 2020
414
376
5,070
Future proofing has always been and will always be BS. 2 GPU generations from now is going to be in the 3-4 years from now range. We're looking at Zen 7 at that point. The 5800X3D is 2.5 years old to put that 3-4 year timeframe in perspective. in 2028, if you're shopping for an $800 RTX 6070, you're going to want a better CPU than the 9800X3D.
That is dependent on how well the 9800X3D can feed the 6070; the 5800X3D is still viable with the 4090.

Choose your desired frame rate in the games of your choice. Pick the hardware that matches or exceeds your requirements today and enjoy.

You can future proof to a degree; AM4 demonstrated with its longevity and upgradability that the supporting hardware can cope well. The longevity of performance parts, processor and GPU, is dependent on the upcoming software.

Yes, every current processor will eventually become the bottleneck, just as all that follow will meet the same fate; the same is true of GPUs.
The question is: do you buy one CPU that stays viable for 3, 4, 5 years, or do you buy two CPUs because the cheaper one has become non-viable with future software and needs replacing to play Starfield 2?

Only you can answer the question for you. It’s a gamble either way.
 
  • Like
Reactions: snemarch
Jun 5, 2024
11
22
15
And if that frame lands just after the refresh? If the machine is pumping out more frames, there's a better chance one lands just before the refresh.
Reasonable question.

If you only render the frames you need, you also know when they'll display. And if you know when they'll display, then you can render frames to show the right thing at the right time. In other words, you can _eliminate_ latency this way.

To avoid missing the refresh, you just start rendering early enough. The limiting factor is prediction: you need to be able to predict the positions of objects at display time, before you start rendering.

If your CPU is not spinning on all of these extra frames, you can get a higher boost clock, process user input with lower latency, etc.
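To make that concrete, here's a minimal sketch of the idea (assuming a fixed 60 Hz display and a known worst-case render budget; predict_position and the printf are stand-ins for real simulation and rendering, not any particular engine API):

```cpp
// Minimal sketch: render exactly one frame per refresh, timed so it lands
// just before the vblank, with object positions predicted for display time.
// Assumes a fixed 60 Hz display and a known worst-case render budget;
// predict_position and the printf are stand-ins for real engine work.
#include <chrono>
#include <cstdio>
#include <thread>

using Clock = std::chrono::steady_clock;

constexpr auto kRefresh = std::chrono::duration_cast<Clock::duration>(
    std::chrono::duration<double>(1.0 / 60.0));              // 60 Hz interval
constexpr auto kRenderBudget = std::chrono::milliseconds(4); // worst case

// Stand-in simulation: predict where an object will be at display time.
double predict_position(Clock::time_point display_time, Clock::time_point start) {
    return std::chrono::duration<double>(display_time - start).count(); // 1 unit/s
}

int main() {
    const auto start = Clock::now();
    auto next_vblank = start + kRefresh;
    for (int frame = 0; frame < 5; ++frame) {
        // Sleep until just early enough to finish before the vblank; the CPU
        // is idle the rest of the interval instead of spinning out frames.
        std::this_thread::sleep_until(next_vblank - kRenderBudget);
        double pos = predict_position(next_vblank, start); // aim at display time
        std::printf("frame %d: pos %.4f ready for vblank\n", frame, pos); // "render"
        next_vblank += kRefresh; // exactly one frame per refresh
    }
}
```

The point of the budget is that the loop sleeps for most of each interval, which is exactly where the boost-clock and input-latency headroom comes from.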
 
Oct 29, 2024
10
8
15
Hahahaha, the irony, that picture is so funny. "UserBenchmark", ranked by performance, and you can just see that the further down you go, the higher the user rating gets :D hahahaha wtf, good lolz.

"Something seems off my dear Wattson"
 
Oct 29, 2024
10
8
15
The anti AMD spin is crazy, but the fundamental argument is sound.

Most people gaming at 1440p will never notice the difference between a 13600K and a 9800X3D. A 14900K or 285K makes even less sense for most gamers when there are cheap, new 12th and 13th gen and Zen 4 chips available.

If you have a limited budget and you are choosing between a 9800X3D + 4060 and a 13600K + 4070 Super, get the better GPU and the cheaper CPU.

I'd get a 7600X instead of the 13600K, but most modern CPUs are fine for anything up to a 4080 at 1440p.
Both yes and no, because to high-fps gamers these things matter. I'm from the competitive FPS scene, so all my friends would love the 9800X3D. And then you can argue that if you want gaming value, something like the 5800X3D or 5700X3D is a no-brainer at the moment. At least for FPS players, which is my perspective in this reasoning.

In your suggested builds you are comparing a "mid CPU + mid GPU" against a "top-end CPU + low-end GPU"; it's just not an honest comparison. Yes, obviously it's a bad idea to go all-in on the CPU just to buy a GPU from the very low end.

Btw, I have an 11600K in my second PC, and while I would love a better CPU, I don't really have any problems that would justify an upgrade.
 
Last edited:
Mar 10, 2020
414
376
5,070
And if you know when they'll display, then you can render frames to show the right thing at the right time. In other words, you can _eliminate_ latency this way.
Which latencies?

Counter Strike at up to a nominal 500 fps has one major latency: the seat-to-keyboard interface.

It is up to programmers to program effectively in order to minimise latencies in their code. Programming for the latest and greatest CPU/GPU combinations is a mistake, as those will inherently reduce latencies wrt time, not cycles.
Slower CPU/GPU combinations present increasing latencies wrt time. PCs are hellish, with many combinations and permutations.
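For scale: at a nominal 500 fps the frame interval is 1000 / 500 = 2 ms, while typical human reaction time is somewhere around 150-250 ms, roughly two orders of magnitude larger.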
 
What data do you have to back up this statement?
I plugged an RX 6700 XT into my old 4770K (not overclocked) with a 3440x1440 monitor. It worked out quite well for a lot of games, even a decade later. However, since I upgraded to a 9700X I have noticed that my minimum FPS is much better than it was before, and to an extent my max FPS has slightly increased. That said, a 6700 XT is about the most that the 4770K could handle, but with the 9700X I am good for basically any GPU in the next few years, and even in a decade it can still take a midrange GPU and be OK.
 

Kentmos

Prominent
Jun 1, 2023
3
0
510
There is actually a point here. I run the good old Intel i7-3960X on an Asus Sabertooth X79 (clocked to 4178 MHz, with the RAM at 1676 MHz quad-channel) with an RTX 4070 Ti Super, and I get 76 avg FPS in Diablo 4 at 4K Ultra settings, no RT.
 
There is actually a point here. I run the good old Intel i7-3960X on an Asus Sabertooth X79 (clocked to 4178 MHz, with the RAM at 1676 MHz quad-channel) with an RTX 4070 Ti Super, and I get 76 avg FPS in Diablo 4 at 4K Ultra settings, no RT.
Overall, Diablo 4 isn't very CPU intensive. The minimum CPU is a 2500K and the recommended is a 4670K. Tom's Hardware did a GPU review of Diablo 4 across 36 GPUs with a 13900K for the CPU. The 4070 Ti was getting an average of 109 FPS at 4K with 73 FPS minimums. That means just a CPU upgrade could easily see you getting 40%+ more performance from your GPU.
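For reference, that works out to 109 / 76 ≈ 1.43, i.e. roughly 43% higher average FPS from the same class of GPU when paired with the 13900K.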
 
  • Like
Reactions: Kentmos

Elusive Ruse

Estimable
Nov 17, 2022
450
582
3,220
Overall, Diablo 4 isn't very CPU intensive. The minimum CPU is a 2500K and the recommended is a 4670K. Tom's Hardware did a GPU review of Diablo 4 across 36 GPUs with a 13900K for the CPU. The 4070 Ti was getting an average of 109 FPS at 4K with 73 FPS minimums. That means just a CPU upgrade could easily see you getting 40%+ more performance from your GPU.
The 9800X3D has incredible minimum framerates; I think this has gone under the radar because of the surprising gen-on-gen average framerate uplift.
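For anyone curious where those minimum figures come from, they're typically derived from the slowest frame times in a capture. A rough sketch of one common calculation, with made-up frame-time data:

```cpp
// Toy sketch of how average and "1% low" FPS are commonly derived from
// per-frame render times; the frame-time data here is made up.
#include <algorithm>
#include <cstdio>
#include <functional>
#include <vector>

int main() {
    std::vector<double> frame_ms;
    // Fabricated capture: mostly 8 ms frames with an occasional 20 ms stutter.
    for (int i = 0; i < 1000; ++i)
        frame_ms.push_back(i % 100 == 0 ? 20.0 : 8.0);

    double total = 0;
    for (double t : frame_ms) total += t;
    double avg_fps = 1000.0 * frame_ms.size() / total;

    // One common definition of the 1% low: the average frame rate over the
    // slowest 1% of frames (some tools use the 99th-percentile frame time instead).
    std::sort(frame_ms.begin(), frame_ms.end(), std::greater<>());
    std::size_t n = std::max<std::size_t>(1, frame_ms.size() / 100);
    double slow = 0;
    for (std::size_t i = 0; i < n; ++i) slow += frame_ms[i];
    double low_fps = 1000.0 * n / slow;

    std::printf("avg: %.1f fps, 1%% low: %.1f fps\n", avg_fps, low_fps);
}
```

With this toy data the run averages about 123 fps but the 1% low is 50 fps, which is why the two numbers can tell very different stories.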
 

TheHerald

Respectable
BANNED
Feb 15, 2024
1,633
501
2,060
Imagine if instead of a small site with a known bias it was all of tech media making the same claim.
rluker5

Distinguished
Jun 23, 2014
901
574
19,760
Imagine if instead of a small site with a known bias it was all of tech media making the same claim.
That's how I felt when Zen, Zen+ and Zen 2 launched.
Yes, this guy is very biased. Not a big deal from what I've seen.
 

TheHerald

Respectable
BANNED
Feb 15, 2024
1,633
501
2,060
Imagine if instead of a small site with a known bias it was all of tech media making the same claim.
That's how I felt when Zen, Zen+ and Zen 2 launched.
Yes, this guy is very biased. Not a big deal from what I've seen.
Tech channels are very biased too. They are literally getting paid by the viewers, and what brand is the large majority a fan of?

I mean, lots of content creators in this space have said that unless they make an "Intel/Nvidia bad, AMD good" video, they are not getting traffic.
 
  • Like
Reactions: rluker5
Mar 10, 2020
414
376
5,070
Tech channels are very biased too. They are literally getting paid by the viewers, and what brand is the large majority a fan of?

I mean, lots of content creators in this space have said that unless they make an "Intel/Nvidia bad, AMD good" video, they are not getting traffic.
Proof?

HUB slated the AMD 9000 series launch (X3D excluded)
JayzTwoCents too
GN as well…

The negativity wrt Intel does exist; it's a reaction to Intel's behaviour over the past 12 months and a lack of trust. Also underwhelming performance wrt games in the target market for most YouTube reviewers (285 crashes notwithstanding).

Perhaps there are production-oriented review sites that show the Intel 285 in a better light. I can't comment on this; I haven't looked for them.
 

TheHerald

Respectable
BANNED
Feb 15, 2024
1,633
501
2,060
Proof?

HUB slated the AMD 9000 series launch (X3D excluded)
JayzTwoCents too
GN as well…
Sure, HUB slated AMD 9000. And in the same video he slated them, he said stay away from Intel and buy AMD 7000. Which is fine, but I'm just pointing out that you can't take that as proof that he isn't pro-AMD.

Proof as in what? Off the top of my head, Daniel Owen made a video where he basically straight up said that any content that isn't ***ting on Intel and praising AMD doesn't get any traffic. He is not the only one; lots of creators have made similar claims on Twitter (GB techtesters etc.). I mean, even HUB himself said it on Twitter just a week ago.

The negativity towards Intel has nothing to do with the past 12 months. It has existed for the past 15 years, lol.

Just so I'm not misinterpreted: I don't think HUB is pro-AMD. I think he doesn't care about AMD or Intel. But his fans (as is the entirety of the vocal internet) are religiously pro-AMD, and so the content he is creating takes that into account.
 
  • Like
Reactions: rluker5
Mar 10, 2020
414
376
5,070
Proof as in what? Off the top of my head, Daniel Owen made a video where he basically straight up said that any content that isn't ***ting on Intel and praising AMD doesn't get any traffic. He is not the only one; lots of creators have made similar claims on Twitter (GB techtesters etc.). I mean, even HUB himself said it on Twitter just a week ago.
Ok… links?

AMD were not relevant until Ryzen within your stated time frame; Bulldozer was a huge misstep, the Nvidia rise to power was firmly under way, and thinking back to 2014 the good that was said was with regard to the R9 390 / Nvidia 970… "Should have bought an R9 390." The 980, 1080 and newer hammered AMD.
 

YSCCC

Commendable
Dec 10, 2022
566
460
1,260
What data do you have to back up this statement?
No solid numbers or surveys; basically just looking around the local gaming community and friends' purchasing habits. For example:

https://www.pcgamebenchmark.com/gpu...Search=GTA5&game=any&cpu=any&gpu=5895&ram=all

You can see that it isn't that rare for people to submit benchmark results with 10th gen or older hardware. And as for the argument that the 9800X3D is useless because, with the current-gen top-of-the-line 4090 at 4K, the GPU is basically the FPS bottleneck for anything above a 12600K or even an 11600K: the X3D will very likely still avoid bottlenecking the top-of-the-line GPU 3-4 years from now, at least.
 