News UserBenchmark suggests you buy the i5-13600K over the Ryzen 7 9800X3D — says AMD drives sales with 'aggressive marketing' rather than 'real-world p...

Status
Not open for further replies.

spongiemaster

Proof?

HUB slated the AMD 9000 series launch (X3D excluded).
JayzTwoCents too.
GN as well…

The negativity wrt Intel does exist; it's a reaction to Intel's behaviour over the past 12 months and a lack of trust. There's also the underwhelming gaming performance in the target market for most YouTube reviewers (285 crashes notwithstanding).

Perhaps there are production-oriented review sites that show the Intel 285 in a better light. I can't comment on this; I haven't looked for them.
JayzTwoCents has also mentioned, at least a couple of times in videos of his that I've watched, that videos praising AMD generate more traffic than videos praising Intel or Nvidia. There's no way I'd be able to find those videos now, as they were just side comments in videos about other topics. The only secondary proof you need for this is to look at the comment section of any tech site.
 

TheHerald


spongiemaster

I plugged an RX 6700 XT into my old 4770K (not overclocked) with a 3440x1440 monitor, and it worked out quite well for a lot of games even a decade later. Since upgrading to a 9700X, though, I've noticed my minimum FPS is much better than before, and my maximum FPS has increased slightly. A 6700 XT is about the most that 4770K could handle, but with the 9700X I'm good for basically any GPU for the next few years; even in a decade it could still take a midrange GPU and be OK.
I don't know what this has to do with the question you quoted. Did you reply to the right post?
 
https://factoriobox.1au.us/results?...9f36f221ca8ef1c45ba72be8620b&vl=&vh=&sort=ups

Check the results, scroll down to find where the 7800X3D is, and then, if you are so kind, explain to me why I keep reading again and again about how insanely fast AMD is in Factorio :ROFLMAO:
(Not digging at you).

Sadly the benchmarks, at least on the first pages, are pretty useless. They don't give the settings being used, other than heavily overclocked memory. The 4500 memory I have to assume is DDR4, and the 8000 memory will be DDR5.
Nothing is quoted for the processor settings, so it's difficult to gauge anything from them.
 

spongiemaster

No solid numbers from surveys, basically just looking around the local gaming community and friends' purchasing habits, and for example:

https://www.pcgamebenchmark.com/gpu...Search=GTA5&game=any&cpu=any&gpu=5895&ram=all

you can see that it isn't that rare for people to submit benchmark results using 10th gen or lower. And as for the argument that the 9800X3D is useless because, with the current top-of-the-line 4090 at 4K, the GPU basically bottlenecks gaming FPS for anything above a 12600K or even an 11600K: the X3D will very likely still not bottleneck a top-of-the-line GPU for at least another 3-4 years.
I'm not saying 9th and 10th gen users don't exist. I'm saying 9th and 10th gen users that have a GPU at a 4070 level or higher are likely an extremely niche portion of the market.

You're not paying attention to what I'm saying. I don't think there is anything wrong with a person using a 4090 buying a 9800X3D. If you want the best and the rest of your system is already maxed, go for it. What I'm saying is that if you don't have a 4090, there is no logical reason to go for a 9800X3D outside of fringe situations. A 14700K is $150 to $200 cheaper than a 7800X3D right now, despite being only 2.5% slower at 1440p. You're not going to future-proof anything buying the 7800X3D. In four years, when 1440p is maybe similar to 1080p today, the difference between the 14700K and 7800X3D will increase to maybe 5%, which is the difference at 1080p between those two today. Does spending an extra $150 to $200 today make sense to get 63 fps in four years vs 60 fps with a GPU equivalent to a 4090? I'd argue no. That money is better left unspent and put towards a better upgrade in the future.
 

mrsense

On the UserBenchmark website:

"We are an independent team of scientists and engineers. We do not have time for HR, PR or marketing."

They conveniently left out "funded by Intel".
 

TheHerald

(Not digging at you).

Sadly the benchmarks, at least on the first pages, are pretty useless. They don't give the settings being used, other than heavily overclocked memory. The 4500 memory I have to assume is DDR4, and the 8000 memory will be DDR5.
Nothing is quoted for the processor settings, so it's difficult to gauge anything from them.
Does it matter? The fastest 7800X3D is sitting below the 12900K and 12700K. Come on man...

Obviously everything you see at the top of the charts is megatuned.
 
Does it matter? The fastest 7800X3D is sitting below the 12900K and 12700K. Come on man...

Obviously everything you see at the top of the charts is megatuned.
Because Factorio is cache-intensive, it shouldn't be surprising that AMD's Ryzen 3D V-Cache parts dominate the leaderboards. The Ryzen 7 7800X3D led the rankings, 64% faster than the Core i9-14900K. Even the last-generation Ryzen 7 5800X3D outperformed the Core i9-14900K by 23%. The vanilla Ryzen models are no match for the Core i9-14900K.

Not having played Factorio, the benchmarks don't show me anything, but you have to look at benchmarks as a whole, not cherry-pick to prove a point. The leaderboard link in the Tom's Hardware quote (immediately above) paints a different picture.
 

TheHerald

Because Factorio is cache-intensive, it shouldn't be surprising that AMD's Ryzen 3D V-Cache parts dominate the leaderboards. The Ryzen 7 7800X3D led the rankings, 64% faster than the Core i9-14900K. Even the last-generation Ryzen 7 5800X3D outperformed the Core i9-14900K by 23%. The vanilla Ryzen models are no match for the Core i9-14900K.

Not having played Factorio, the benchmarks don't show me anything, but you have to look at benchmarks as a whole, not cherry-pick to prove a point. The leaderboard link in the Tom's Hardware quote (immediately above) paints a different picture.
I've repeated it a billion times, but we are going back to the usual "AMD is good".

Factorio has a limit of 60 UPS on its official servers. So getting 999,999 UPS on small maps does not matter, because the game is played at 60. What matters is the biggest base you can run while maintaining that 60. Those 3D chips do great on the tiny maps (which doesn't matter, because remember, 60 UPS), but they hit a wall on bigger maps, when it actually matters, because the data doesn't fit in the cache anymore.

Which is what I've been saying all along: 3D chips are 50% faster than everything else when you are looking at the sky (light gaming scenes). In heavier scenes, oh well...
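The cap argument above can be sketched as a toy model (all UPS numbers below are invented for illustration): benchmark headroom above the 60 UPS cap is invisible in actual play, so only maps that drop a chip below the cap separate the contenders.

```python
# Factorio's official servers run at a fixed 60 UPS (updates per second),
# so any benchmark throughput above 60 never reaches the player.
# The UPS figures below are made up purely for illustration.

UPS_CAP = 60

def in_game_ups(benchmark_ups: float) -> float:
    """What the player actually experiences: benchmark speed, capped."""
    return min(benchmark_ups, UPS_CAP)

# Tiny map: both hypothetical chips are far above the cap -> identical feel.
assert in_game_ups(400) == in_game_ups(250) == 60

# Huge map: raw throughput has fallen below the cap, and now it alone
# decides whether the base still runs at full speed.
assert in_game_ups(45) == 45
```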
 
I've repeated it a billion times, but we are going back to the usual "AMD is good".

Factorio has a limit of 60 UPS on its official servers. So getting 999,999 UPS on small maps does not matter, because the game is played at 60. What matters is the biggest base you can run while maintaining that 60. Those 3D chips do great on the tiny maps (which doesn't matter, because remember, 60 UPS), but they hit a wall on bigger maps, when it actually matters, because the data doesn't fit in the cache anymore.

Which is what I've been saying all along: 3D chips are 50% faster than everything else when you are looking at the sky (light gaming scenes). In heavier scenes, oh well...
Not having played Factorio, the benchmarks don't show me anything, but you have to look at benchmarks as a whole, not cherry-pick to prove a point. The leaderboard link in the Tom's Hardware quote (immediately above) paints a different picture.
 

TheHerald

Not having played Factorio, the benchmarks don't show me anything, but you have to look at benchmarks as a whole, not cherry-pick to prove a point. The leaderboard link in the Tom's Hardware quote (immediately above) paints a different picture.
Then you can go to the Factorio subreddit, where everyone actually playing the game says the same thing: the 3D chips' performance in Factorio is greatly overrated, and Intel does better.

Even HUB realized it:

View: https://youtu.be/0oALfgsyOg4?t=537
 
There are lies, damned lies, and benchmarks. The Factorio site shows two contradictory lists: the one you provided and the leaderboard I highlighted. I don't have a dog in this fight; it matters not to me.

Reddit is more rabid than (in your opinion) a bunch of AMD fanboys... (Intel fanboys are as bad or worse at the moment, since Intel really is in a bad place; they need a win but won't get one this product cycle, so we will have to tolerate their bluster.)

Anyway, back to the origin of this thread: UserBenchmark.
Have fun.
 

hannibal

Intel outsells AMD 72% to 28%, so UserBenchmark is right!

:ROFLMAO: :ROFLMAO: :ROFLMAO:

That site is a joke... but Intel is still outselling AMD; maybe by a smaller margin than before, but still.
So people buy Intel even though it is more expensive and slower... So UserBenchmark can retire and the world will still revolve around Intel...
... Sigh... I was hoping for a 50/50 split for better competition in the long run. No hope for that...
 

logainofhades

What I'm saying is that if you don't have a 4090, there is no logical reason to go for a 9800X3D outside of fringe situations.

CPU-heavy titles will benefit, even with a lesser GPU. Despite having a 12700K, as a WoW player I would love to have an X3D chip, as WoW loves that extra cache, and it can greatly help with min FPS. Before you say it, I wouldn't call 7.25 million subscribers fringe.

I went from a 3700X to a 5800X back during Shadowlands, and even with an RTX 2060 at 1440p I saw a significant boost to min FPS and far less stutter. Hence why I chose to keep Jay's 12700K rig that I won, as the 12700K was a bit better on the CPU side, over keeping my 5800X. The only thing I changed was swapping out the 3070 Ti for my RX 6800.

Microsoft Flight Simulator is another CPU-heavy title that loves that cache. Even the 5800X3D beats anything Intel has to offer.

Just because you don't have a reason for such a CPU doesn't mean there aren't millions that do.
 

TheHerald

CPU-heavy titles will benefit, even with a lesser GPU. Despite having a 12700K, as a WoW player I would love to have an X3D chip, as WoW loves that extra cache, and it can greatly help with min FPS. Before you say it, I wouldn't call 7.25 million subscribers fringe.

I went from a 3700X to a 5800X back during Shadowlands, and even with an RTX 2060 at 1440p I saw a significant boost to min FPS and far less stutter. Hence why I chose to keep Jay's 12700K rig that I won, as the 12700K was a bit better on the CPU side, over keeping my 5800X. The only thing I changed was swapping out the 3070 Ti for my RX 6800.

Microsoft Flight Simulator is another CPU-heavy title that loves that cache. Even the 5800X3D beats anything Intel has to offer.

Just because you don't have a reason for such a CPU doesn't mean there aren't millions that do.
This guy tests WoW, and it doesn't seem like you really want a 3D chip for it.

View: https://www.youtube.com/watch?app=desktop&v=zeAi41v75Kc
 

logainofhades

That 7950X3D was not set up properly, which is shown by the "7950X3D (ccd0+ccd1 enabled)" in the title. Without core parking enabled on those dual-CCD chips, performance tanks.


I wish this guy was still around. Not sure what happened to him, but Wowhead suggested his content for WoW benchmarks back then. The 5800X3D beat a 12900K unless you really spent a lot of time tuning.
View: https://www.youtube.com/watch?v=gOoB3dRcMtk&t=40s
 

TheHerald

That 7950X3D was not set up properly, which is shown by the "7950X3D (ccd0+ccd1 enabled)" in the title. Without core parking enabled on those dual-CCD chips, performance tanks.


I wish this guy was still around. Not sure what happened to him, but Wowhead suggested his content for WoW benchmarks back then. The 5800X3D beat a 12900K unless you really spent a lot of time tuning.
View: https://www.youtube.com/watch?v=gOoB3dRcMtk&t=40s
He has videos on his channel where he tested CCD0 vs CCD1; there was some improvement, but nothing drastic. I don't see it beating the 14900K, especially considering the 14900K was GPU-bound with that 4090 for the majority of the video.

The 12900K is crap at stock; the e-cores drop the cache clock by 1000 MHz, from 4.7 to 3.7 GHz. If you don't tune your RAM, it's garbage. The 14900K fixed the cache issue.
 
Intel outsells AMD 72% to 28%, so UserBenchmark is right!

:ROFLMAO: :ROFLMAO: :ROFLMAO:

That site is a joke... but Intel is still outselling AMD; maybe by a smaller margin than before, but still.
So people buy Intel even though it is more expensive and slower... So UserBenchmark can retire and the world will still revolve around Intel...
... Sigh... I was hoping for a 50/50 split for better competition in the long run. No hope for that...
Dell, Lenovo and HP account for 40% of Intel's revenue; I'm not sure how much of this goes to corporate and how much to retail. The split across the big three is 19%, 11% and 10% respectively.

Take the corporates out of the 72/28 figures and you will be closer to 50/50.

CPU revenue was split $10 billion, $17 billion and $2 billion across desktop, laptop and a nebulous "other".

2023 figures.

Purely anecdotal, but the two PC vendors in my town have sold few (fewer than 20) Intel CPUs this year. By contrast, they have orders in for 200 9800X3D chips, all sold; they have only taken orders up to their allocation.
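A back-of-envelope check of the "closer to 50/50" claim above. The assumptions are mine, not the poster's: that the big-three OEM share (19% + 11% + 10% of Intel revenue) is entirely corporate, that revenue roughly tracks unit volume, and that AMD's corporate volume is negligible.

```python
# Rough sanity check: strip the assumed-corporate big-three OEM share
# out of Intel's 72/28 lead and see where the retail split lands.
# All simplifying assumptions are noted above; this is illustrative only.

intel_share, amd_share = 72.0, 28.0
big3_fraction = 0.19 + 0.11 + 0.10   # Dell + Lenovo + HP, share of Intel revenue

intel_retail = intel_share * (1 - big3_fraction)   # 72 * 0.6 = 43.2
total = intel_retail + amd_share

print(f"Intel {100 * intel_retail / total:.1f}% / AMD {100 * amd_share / total:.1f}%")
# ~ Intel 60.7% / AMD 39.3%
```

Not quite 50/50, but noticeably closer to parity than the headline 72/28, which is the direction of the poster's point.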
 

spongiemaster

Considering Hardware Unboxed just released a video showing the 9800X3D being up to 35% faster at 4K compared to the 285K in some games, I'm sure people are definitely going to notice that difference when using a 13600K:
View: https://www.youtube.com/watch?v=5GIvrMWzr9k

9800X3D vs 285K @ 4K:
- Hogwarts Legacy = 35% faster
- Assetto Corsa = 60% faster
- Homeworld 3 = 34% faster
- Warhammer = 21% faster
- 14-game average @ 4K = 21% faster for the 9800X3D.

So yes, you will definitely notice those results, especially since these gaps will be larger at 1440p... There are games like Factorio that also run up to 6x faster on X3D chips.

You also aren't considering the power consumption of the 13600K compared to the 9800X3D or any other X3D chip out there, which is multiple times higher at full load and will require more expensive cooling to keep the temps down.
No one is talking about a 285K in this thread. If you only want a gaming CPU, you should not buy a 285K. No one is arguing otherwise.

The difference between a 13600K and a 9800X3D in gaming power usage is 14 watts (65 W vs 79 W). That's an irrelevant difference to base a decision on. If I gamed 24 hours a day, every day of the year, it would cost me an additional $13.50 per year. At a more reasonable, but still high, 2 hours a day (every single day of the year), I'm looking at $1.12 per year. It's going to take a long time to make up the cost difference at $1 a year.
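The electricity arithmetic above can be reproduced in a few lines. The $0.11/kWh rate is my assumption, back-solved from the quoted $13.50/year figure; adjust it for your local tariff.

```python
# Back-of-envelope cost of the ~14 W gaming-power gap (65 W vs 79 W)
# quoted above. The electricity rate is an assumed value, not from the post.

RATE_PER_KWH = 0.11      # assumed electricity price, USD/kWh
DELTA_WATTS = 79 - 65    # CPU power gap while gaming, from the post

def annual_cost(hours_per_day: float) -> float:
    """Extra electricity cost per year for a given daily gaming time."""
    kwh_per_year = DELTA_WATTS * hours_per_day * 365 / 1000
    return kwh_per_year * RATE_PER_KWH

print(f"24/7 gaming: ${annual_cost(24):.2f}/yr")   # ~ $13.49
print(f"2 h per day: ${annual_cost(2):.2f}/yr")    # ~ $1.12
```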
 

logainofhades

He has videos on his channel where he tested CCD0 vs CCD1; there was some improvement, but nothing drastic. I don't see it beating the 14900K, especially considering the 14900K was GPU-bound with that 4090 for the majority of the video.

The 12900K is crap at stock; the e-cores drop the cache clock by 1000 MHz, from 4.7 to 3.7 GHz. If you don't tune your RAM, it's garbage. The 14900K fixed the cache issue.


RTX on will cause a GPU-bound scenario, which I saw they had enabled for what I was able to watch. Almost nobody uses it in WoW. It only does RT shadows and offers no real visual improvement, for a big hit to FPS.

The 5800X3D beats a 13900K in Microsoft Flight Simulator too, so the 14900K wouldn't fare any better.

[Image: Microsoft Flight Simulator CPU benchmark chart]
 
Their, not there.
I apologize for the misspelling; while I was writing this I had the GF yapping in my ear. You really should have read it before I edited it, as autocorrect changed "unparalleled" to "unapparelled". While editing, the GF was still yapping in my ear.
I wonder how many people pay them for the ability to simply run their benchmark. For the last year or so, you've had to pay a subscription to the site just to see your results.

https://www.tomshardware.com/softwa...k-now-requires-a-pound10-monthly-subscription

Finally profiting off being on top of most search results.
It's a good question. I personally prefer to use 3DMark, since I have it, plus CPU-Z/GPU-Z for comparisons, to see where my system falls in the lineup. This gives me a good idea of whether I have issues with my setup, or a loss of performance over time for one reason or another.

I had missed that article when it came out; it was a good read and raised an eyebrow.
I myself am not interested in gaming, but it would be good if all CPU reviews focused on the CPU-GPU combinations, and display resolutions, that are most relevant for most potential buyers of that CPU. People are increasingly gaming on 4K monitors, yet many reviews of "gaming CPUs" completely ignore 4K gaming, because for most games it does not matter at all whether you have the latest, fastest CPU or something else. That data is simply not shown in the reviews at all. OK, "everybody" knows that that is how it is, but still, that information should be given in ALL reviews that address gaming uses of CPUs. I wonder why this is not the case.

Lots of hardware purchase decisions are made because buyers do not understand their implications. People want faster SSDs, faster Wi-Fi routers, faster memory, faster CPUs, even if in reality the "faster" is not really faster at all, and the "upgrade" for that reason is not really an upgrade, only a purchase done because it makes the buyer feel better.
The main reason you do not see 4K benchmarks in CPU reviews is that at that resolution the system is GPU-bound, so the results do not represent CPU performance; the bottleneck is the GPU, not the CPU. GamersNexus explains in detail why testing is done that way. If you want more FPS at 4K, you get a better GPU; that is a fact that is undeniable. If you want more FPS at 1080p or 1440p, you get a better CPU; that too is undeniable.

Getting even 1% more frame rate at 4K from a CPU is still nothing to ignore, given the GPU-bound constraint.
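The bottleneck logic above can be sketched with a toy model (all FPS numbers below are invented): the frame rate you see is roughly the slower of what the CPU and the GPU can each deliver, which is why 4K hides CPU differences and low resolutions expose them.

```python
# Toy model of why CPU reviews test at low resolution: the rendering
# pipeline runs at the pace of its slowest stage. Numbers are made up.

def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Observed frame rate: the slower of the CPU and GPU stages."""
    return min(cpu_fps, gpu_fps)

# Hypothetical CPUs: one can feed 200 fps, a faster one 260 fps.
# At 4K the GPU only renders ~90 fps, so both CPUs look identical:
assert effective_fps(200, 90) == effective_fps(260, 90) == 90

# At 1080p the GPU manages ~300 fps, and the CPU gap becomes visible:
assert effective_fps(200, 300) == 200
assert effective_fps(260, 300) == 260
```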
 