News AMD Ryzen 9 7950X vs Intel Core i9-13900K Face Off

Page 2 - Tom's Hardware community discussion.

AndrewJacksonZA

Distinguished
Aug 11, 2011
Data? What about the fact that AMD's new platform will be around for years and several generations to come, while Intel is on the way out with this one, just cranking up outdated stuff? Maybe that should be considered alongside the price.
And the higher cost of the new AMD platform?

What about the extra cost of the high-end cooling required for Intel's power-hungry chips? Where is that data in this article?
"the recommended 280mm AIO (or air equivalent) for Intel, or 240mm AIO (or air equivalent) for AMD."
 

fball922

Distinguished
Feb 23, 2008
FWIW, I'm still rocking my i7-7600 non-K. Only now, when paired with an RX 6800 XT, is it a bottleneck, and not a bad one. What is bad is that I'm busy converting a large video library to the AV1 codec with my Arc A750 (I have two cards in my system), and the HUGE bottleneck in Handbrake appears to be the CPU, not the A750.
I have been considering snagging an Arc for exactly that purpose... Happy with the performance and PQ? What settings are you using, and how small are your output files (assuming Blu-ray source, maybe 4K)?
 
I went with a 7950X. My reasons:

I don't like Intel's initial E-core design. I think Intel's 12th gen let consumers down horribly with its marketing around core counts, without really detailing the fact that one P-core is equal to (or better than) eight E-cores in some instances. Supposedly, the performance penalty for having the E-cores active on 13th gen is much smaller than on 12th. I know the whole big.LITTLE design is the way of the future; my disappointment is as much about how it was marketed as about the actual performance numbers.

BIGGEST REASON: LONGEVITY. Intel's 13th-gen chips are the LAST GEN on LGA 1700. All new Intel chips (14th gen and beyond) will require a new motherboard. Zen 4 brings a brand-new architecture on a brand-new socket, and AMD itself has said the socket will be supported at least through 2025; I bet it will go a year or two beyond that. I will be able to put two, three, maybe four new generations of AMD CPUs on the motherboard I just bought.

For those who already have some Intel components, the choice is harder, of course. If you already have an Intel LGA 1700 motherboard (one compatible with 13th gen), decent RAM, and a good PSU, Intel's 13th gen is a no-brainer for breathing new life into your system.
 

joker965

Prominent
Nov 4, 2021
The Intel one does use more power under heavy load. But to be relevant, you need to look at how you will actually be using the chip for your "8 hours a day".
Most of these heavy workloads are better done on a more efficient GPU, for time, part cost, and power. My typical use is frame-capped gaming, where Intel is fairly even, and light use, where HWiNFO64 tells me my overclocked 13900KF is not even using 10 W right now.

The assumption that everyone is running a CPU render farm on their home PC is flawed.

If we looked at more data on this, I believe it would show that the Intel chip uses more power on average than the AMD chip. Combine that with the extra power the Intel chip's cooling will need, and it is almost certain to cost more to run over time. It is like a car: the cost of ownership over, say, 5 years is the real number to consider when purchasing. That being said, I would take either one if they were handing them out. ;)
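The cost-over-5-years argument above boils down to simple arithmetic. Here is a minimal Python sketch of it; every price, wattage, and electricity rate below is a made-up placeholder, not measured data, so plug in your own numbers:

```python
# Back-of-envelope 5-year cost-of-ownership sketch for a CPU purchase.
# All figures are illustrative assumptions, not benchmarks or real prices.

def five_year_cost(chip_price, cooler_price, avg_watts,
                   hours_per_day=8, years=5, usd_per_kwh=0.15):
    """Purchase price plus electricity over the ownership period."""
    kwh = avg_watts / 1000 * hours_per_day * 365 * years
    return chip_price + cooler_price + kwh * usd_per_kwh

# Hypothetical numbers: tweak to match your usage and local rates.
intel = five_year_cost(chip_price=589, cooler_price=130, avg_watts=125)
amd = five_year_cost(chip_price=699, cooler_price=100, avg_watts=90)
print(f"Intel: ${intel:.2f}  AMD: ${amd:.2f}")
```

With these placeholder inputs the two totals land within a few dollars of each other, which is the point: average wattage at *your* workload moves the answer more than the sticker prices do.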
 

ekio

Reputable
Mar 24, 2021
I will never buy Intel again.
I will never forget the pre-Ryzen dark-age decade when Intel scammed the entire world.
I am so thankful to AMD that it's the 7950X for me!
Those who are bending over for a 5 percent performance advantage should think twice.

PS: Testing games at 1080p is bad faith. Nobody paying this much for a CPU plays at 1080p; 1440p and 4K should be the reference.
If you game at a higher resolution, the difference between Intel and AMD will shrink like an ice cube in the sun, but the journalist is obviously partial.
He wants the games to be as CPU-limited as possible to make AMD's side look worse.
 

uwhusky1991

Reputable
May 7, 2020
We put the Core i9-13900K and Ryzen 9 7950X through a six-round fight to see who comes out on top.

AMD Ryzen 9 7950X vs Intel Core i9-13900K Face Off : Read more
For me, once you add in upgradeability, I'll choose AMD. This is the first generation of AM5, so I suspect there will be at least 3 generations that use it, and potentially as many as 5, as there were with AM4. I'm not even sure Intel will use LGA 1700 for 14th-gen processors.
 

rluker5

Distinguished
Jun 23, 2014
If we looked at more data on this, I believe it would show that the Intel chip uses more power on average than the AMD chip. Combine that with the extra power the Intel chip's cooling will need, and it is almost certain to cost more to run over time. It is like a car: the cost of ownership over, say, 5 years is the real number to consider when purchasing. That being said, I would take either one if they were handing them out. ;)
You may be right, I don't know. HWiNFO64 shows me an 11.845 W total CPU package average over the last 3:19:00, and there are some benches in that window. I have chips that use less; this one can use less if I turn down the PCIe speed, and the cores have averaged 4.172 W. But I don't have a Ryzen to compare against. I have too many computers as it is.

Either way, both the Ryzen 7000 series and Intel's 13th gen use a lot less power than if they were running Cinebench 8 hours a day. Probably not even enough power to worry about. I probably save more on a single light bulb, going from a 75 W incandescent to an 8 W LED.
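The light-bulb comparison above is easy to sanity-check. A small sketch, with the 12 W idle figure and bulb wattages taken from the posts and the 8 hours/day assumed:

```python
# Yearly energy of a ~12 W idle CPU package vs swapping a 75 W
# incandescent bulb for an 8 W LED, both at an assumed 8 h/day.

def yearly_kwh(watts, hours_per_day):
    return watts / 1000 * hours_per_day * 365

cpu_idle = yearly_kwh(12, 8)          # ~12 W package at light use
bulb_saving = yearly_kwh(75 - 8, 8)   # incandescent -> LED, same hours
print(f"CPU idle: {cpu_idle:.2f} kWh/yr, bulb swap saves {bulb_saving:.2f} kWh/yr")
```

Under these assumptions the bulb swap saves roughly five times the energy the idle CPU package consumes, which supports the "not enough power to worry about" point for light use.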

And both series are fast enough to make most people happy, within the margin of personal preference. So it's really just a good time to upgrade if you have some legitimate benefit in mind.
 

AndrewJacksonZA

Distinguished
Aug 11, 2011
I have been considering snagging an Arc for exactly that purpose... Happy with the performance and PQ? What settings are you using, and how small are your output files (assuming Blu-ray source, maybe 4K)?
Very happy. I can't tell the difference between the original files and the output files. The sizes vary from as low as 13% of the original to about 110%, depending on the content, source codec, and resolution. I'm converting a large number of Microsoft conference and training videos (720p30 and 1080p30), video captured by my RX 470 of gameplay (1080p60), and encoding my DVD collection (720p25 and 720p30).

I'm using Handbrake Nightly, starting from the "H.264 MKV 1080p30" preset as a base, which I've modified and saved as a custom preset.
Summary tab:
Passthru Common Metadata = checked.

Video tab:
Video Encoder = "AV1 (Intel QSV)"
Framerate = "Same as Source"
Quality = Constant Quality 24
Encoder Preset = Quality
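For anyone who prefers the command line to the GUI preset, the settings above should translate roughly to a HandBrakeCLI invocation like the sketch below. The encoder name (`qsv_av1`) and flag spellings assume HandBrake 1.6+ with Intel QSV support; verify them against `HandBrakeCLI --help` before relying on this.

```python
# Sketch of the equivalent HandBrakeCLI command for the GUI settings above.
# Flag names are assumptions based on HandBrake 1.6+; double-check locally.
import shlex

def handbrake_cmd(src, dst, quality=24):
    args = [
        "HandBrakeCLI",
        "-i", src,
        "-o", dst,
        "--format", "av_mkv",         # MKV container, as in the preset
        "--encoder", "qsv_av1",       # AV1 via Intel Quick Sync (Arc)
        "--quality", str(quality),    # constant quality 24
        "--encoder-preset", "quality",
    ]
    return shlex.join(args)

print(handbrake_cmd("input.mkv", "output.mkv"))
```

Framerate is left at HandBrake's default, which matches the source, mirroring the "Same as Source" setting.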
 

rluker5

Distinguished
Jun 23, 2014
I will never buy Intel again.
I will never forget the pre-Ryzen dark-age decade when Intel scammed the entire world.
I am so thankful to AMD that it's the 7950X for me!
Those who are bending over for a 5 percent performance advantage should think twice.

PS: Testing games at 1080p is bad faith. Nobody paying this much for a CPU plays at 1080p; 1440p and 4K should be the reference.
If you game at a higher resolution, the difference between Intel and AMD will shrink like an ice cube in the sun, but the journalist is obviously partial.
He wants the games to be as CPU-limited as possible to make AMD's side look worse.
I was one of those scammed. I bought a 4770K rig in 2013 and it still runs everything at 60+ fps except CP2077. (It does 4.7 GHz quiet, 4.8 loud, and 4.9 for hard benching, with the 32GB of 2400C10 I got with it in 2013.) My daughter has it and doesn't want me messing with it. A $300 chip that is good for a decade. What a scam. If I had gone with the AM2/3/4/5 platforms and upgraded for low cost like I "should" have, I could have benefitted from savings of negative thousands over the last 9 years of complete stability and good-enough performance.
 

AndrewJacksonZA

Distinguished
Aug 11, 2011
@rluker5:

I'm holding out for a Meteor Lake or Arrow Lake i7, but let's see what the next gen from AMD brings to the table in terms of multithreading. And heck, it might be better to use my 6800 XT on an AMD platform instead of an Intel one.

One of the things Intel has on its side is its programming ecosystem and tooling. Combine that with (what appears to be) Visual Studio's preference for Intel architectures, and the reverse-engineering and debugging tools that are generally aimed at Intel platforms, and Intel is the less risky choice for people like me who generally buy in five-to-eight-year gaps.
 

rluker5

Distinguished
Jun 23, 2014
@rluker5:

I'm holding out for a Meteor Lake or Arrow Lake i7, but let's see what the next gen from AMD brings to the table in terms of multithreading. And heck, it might be better to use my 6800 XT on an AMD platform instead of an Intel one.

One of the things Intel has on its side is its programming ecosystem and tooling. Combine that with (what appears to be) Visual Studio's preference for Intel architectures, and the reverse-engineering and debugging tools that are generally aimed at Intel platforms, and Intel is the less risky choice for people like me who generally buy in five-to-eight-year gaps.
I'm less responsible with money, so I upgrade more. I went from DDR3 straight to DDR5 on desktop and bought as quickly as I could, even though I could get 60 fps on everything but CP2077. But you can see firsthand that your system is still doing fine. I even went from a 12700K to a 13900KF, and the framerate in games went up by about 30%, OC to OC, in scenarios I never use in real life. That didn't save me any money at all, and I'm kind of relieved I won't have the opportunity on this Z690 to "save" money like that again.

I have heard that SAM works better than ReBAR, so you will probably get a few more frames out of your 6800 XT on an AMD platform.

And while I have a 13900KF right now, next year it won't be the fastest; it will go into the huge category of "good enough for now". If you are fine waiting for Meteor Lake, Arrow Lake, or Zen 5 or 6, your CPU is going to stay as good as it is now, and those future CPUs won't be any slower when you buy them. And the current ones will still be as fast as they are now and will be selling at a discount. But savings is what you get when you have self-restraint :p
 

Zerk2012

Titan
Ambassador
It is still the truth if one looks at, e.g., 4K benchmarks if one games at 4K. Performance measurement is multifaceted and more valuable when it caters to a wider variety of audiences.
When you compare processors, you don't test at 4K, because you become GPU-limited, plain and simple. A CPU will put out the same FPS at 1080p as it will at 4K until you're GPU-limited.
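The GPU-limit point can be written as a one-line model: delivered fps is the minimum of the CPU's (roughly resolution-independent) ceiling and the GPU's throughput at the chosen resolution. All fps figures below are invented purely for illustration:

```python
# Toy bottleneck model: the slower of CPU and GPU sets the framerate.
# Hypothetical numbers; a real system also has overheads this ignores.

def delivered_fps(cpu_fps, gpu_fps_at_res):
    return min(cpu_fps, gpu_fps_at_res)

cpu_ceiling = 240                                  # resolution-independent
gpu_fps = {"1080p": 300, "1440p": 190, "4K": 90}   # falls with resolution

for res, g in gpu_fps.items():
    print(res, delivered_fps(cpu_ceiling, g))
```

With these made-up numbers, only the 1080p run exposes the CPU ceiling; at 1440p and 4K both CPUs would post the same GPU-bound figure, which is exactly why reviewers benchmark CPUs at low resolution.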
 

Arbie

Distinguished
Oct 8, 2007
I will never buy Intel again.
I will never forget the pre ryzen dark age decade when Intel scammed the entire world.
I am so thankful to AMD that 7950x it is for me!
Those who are bending over for a 5 percent perf advantage should consider twice.

I agree with that much of your post, but not with the rest, where you call this a biased article. It looks like Intel, under terrific pressure from AMD, has finally produced a competitive chip, though it's noticeably behind in efficiency, which is very important to me, so I wouldn't call them equal. THANK YOU, AMD. Without you, Intel would just sell us $300 14nm quad-cores forever. Everyone should remember that when they decide who gets the profits from their next CPU purchase.
 

tummybunny

Reputable
Apr 24, 2020
Why are these benchmark comparisons ALWAYS done in Windows 11, which is better for Intel thanks to the updated thread scheduler, rather than Windows 10, which the vast majority of people are actually using?

If you want to use Windows 10, winner = AMD.
 
Why are these benchmark comparisons ALWAYS done in Windows 11, which is better for Intel thanks to the updated thread scheduler, rather than Windows 10, which the vast majority of people are actually using?

If you want to use Windows 10, winner = AMD.
Soooo, what you are saying is that people should buy Intel because... future-proofing.
I mean, we all have to face it: sooner or later, everybody will be on Windows 11.
 

Brian D Smith

Commendable
Mar 13, 2022
But AMD Fanboys will ignore the data.

Ummm... what's to "ignore"?
Seriously, I've only ever owned Intel, so you might care to label me an "Intel fanboy"... but when I check out the areas I am most interested in, AMD now comes out on top. Therefore, my next desktop (and it will be soon) will be... an AMD. Go figure.
 

TheJoker2020

Commendable
Oct 13, 2020
Why no real-world gaming performance benchmarks?

Very few people are going to blow a massive wad of cash on a top-tier gaming system and then play at 1080p or 1440p, with the exception of a small number who play competitive twitch shooters at minimum settings at 1080p, whereupon better performance can be had from a better internet provider and better hardware between the ISP and the PC!

Spending $1,500 on a GPU and $550 on a CPU, plus the rest of the system, and NOT using a 4K screen describes a tiny fraction of those using these CPUs for gaming. So I ask: where are the 4K gaming results?

I can nitpick more, such as: why is the Intel system being tested with DDR5-6800 while the AMD system is not, with the cost of the memory not part of the pros-and-cons comparison? And then we have the motherboard pros and cons, where you include DDR4 (yes, FOUR) motherboard options to use with an i9-13900K, which can sacrifice a significant proportion of the performance (depending on what is being tested).

This review seems rather biased and has several glaring issues.

I am pointing this out because I have read and watched several other reviews that compare these CPUs and accompanying components, and they all have fair and balanced comparisons, and of course test multiple games at 1080p, 1440p, and 4K.

I am very disappointed, and I hope you take my criticism as it is intended and turn it into something constructive by adding those 4K results, using the same speed (and cost) of RAM for both systems, and fixing the other issues I have highlighted. Yes, this will mean re-doing half of the benchmarks and rewriting half of this article.
 
But AMD Fanboys will ignore the data.
That's what fanboys of anything do.

That's why they are fanboys.

That's all everyone says when Intel beats AMD: wait for AMD's stuff to come out. I know because I've been watching the show for decades.
It's a tug of war.

Intel was unmatched at the start.
AMD was best when they made their quad cores.
Intel pulled ahead later (and made the glorious 4770K, one of the best CPU purchases anyone could have made).
AMD was the best to OC around Bulldozer (it might have been one generation before or after, I forget the order), as you could push crazy high clocks.
Intel took the lead again and held it for ages (and got greedy on pricing).
AMD released Zen 2, matched Intel, and forced price cuts.
AMD's Zen 3 crushed Intel until 13th gen (and even then, in certain things, the 5800X3D is still king).
Intel currently has the better value-to-performance.

That's WHY we keep saying competition is good.
Also, AMD generally did save you money long term (thanks to multi-generational CPU upgrades on the same socket).

And if you only gamed and bought a 12th-gen 12900K... you did get boned, as two weeks later the 5800X3D came out, which was better (not in every game, as some always favor Intel or AMD) and cheaper, so that one would have been worth waiting for.

A lot of the "wait" is due to people over-hyping a release based on "leaks" that always over-promise (on both sides).

The only place AMD is entirely untouched, and always the answer, is server-side, as nothing Intel has (even their upcoming parts) can touch Zen 4 EPYC.


The older Intel chips, we now know, people were and still are effectively using 10 years on.
I plan to run my 5700X (I might step up to a 5800X3D in 2023, just because I might upgrade a relative's system with mine) for the next decade.

The power Zen 3 has is not going to be "obsolete" in the next decade.

In my 28 years of gaming I was purely Intel until my current build. Either side is great these days, as both are overkill for 90% of users.
AMD needs to redesign its CPUs for a better price-to-performance ratio to fight the overall cost of the system. It's not a good idea to upgrade while spending 100% more money (mobo and DDR5 plus the Ryzen chip) and getting only around 35% more performance (13% IPC and 25% clock speed).
They do.

AMD, unlike Intel (who historically didn't lower prices, though they do eventually now), reduces its chip prices over time (usually a year or so after launch).

And the high cost is due to a "new" process node. (Time gives wafers fewer defects, and lower costs as a result.)

Same for DDR5... it'll drop. (Any early-adopter tech is always "bad value"; that's the cost of being an early adopter.)

And nobody "should" be upgrading from the past gen to the next gen... it's rarely ever worth it.

Skip a gen and then upgrade for a noticeable increase, and thus better value.

And in closing:

Nobody (who is into tech) denies 13th gen offers better performance-per-dollar than Zen 4.

But again... either is good and more than enough for most users.

Then again, AMD's EPYC line is getting some interesting stuff (CXL is extremely interesting) that, if it makes its way down to the lower chips, could change the story (though it could be many years before it trickles down).
 

JoBalz

Distinguished
Sep 1, 2014
But AMD Fanboys will ignore the data.

I've used both Intel and AMD about equally over the years, but for the last five or so I'd be in your "fanboy" category. First, I don't play any games that would benefit from the Intel architecture. Second, power consumption is important to me: Intel's consumption is still higher than AMD's, and I believe in reducing my power footprint where I can. Third, I like supporting AMD so that we continue to have a viable alternative to Intel; when Intel gets stagnant and AMD comes out with CPUs that trounce them, it encourages Intel to innovate to stay competitive.

While AMD's prices are higher at launch, they do tend to drop after a time, bringing them more in line with Intel's. The biggest negatives right now are that AM5 boards have launched at higher prices than AM4 boards (though AM4 has been around for years, so lower prices are expected on a mature platform) and that DDR5 RAM also costs more than DDR4.

Intel's biggest draw is for serious gamers and for those who want the latest technology but have a limited budget. In the end, use what you personally like, since both will do the job.
 

TheOtherOne

Distinguished
Oct 19, 2013
I am not going to get into all the mumbo jumbo about price reductions or hardware compatibility, etc. The simple point is: Intel draws more power, and unless you are getting electricity for FREE or sharing bills with others, you WILL end up paying much more in the long run, and it won't really take that long.
 

HideOut

Distinguished
Dec 24, 2005
"AMD's Ryzen 7000 chips come to market with an entirely new architecture and process node. "

Not according to everything and everyone else, including AMD. This is a big refresh of the 5xxx/Zen 3 design; Zen 5 is the entirely new chip.
 