News AMD Unveils 5 Third-Gen Ryzen CPUs, Including 12-Core Flagship

The mobos allowing up to 300W to be pulled through the socket tells me someone is expecting OC potential in these chips. The chips themselves, even the older-gen Ryzens, sip power compared to their current Intel counterparts. I couldn't get my Gen 1 Ryzen to pull 300W through the socket by any stretch of the imagination, even with great cooling, because it just didn't OC high enough to consume that much power. However, one of these pushed to maybe 5GHz will start to suck back some power, maybe getting closer to 300W. Even still, I'd reserve that estimate for the 12-core and the likely-coming-at-some-point 16-core units.


Note this: the 9900K can pull more power through the socket than the FX 9590 did, and even more than the 16-core Threadripper. Are Intel enthusiasts worried about their motherboards crapping out after three years? Actually, that's something I've wondered ... And now they just announced another iteration of the 9900K with even higher all-core boosts, so expect that power consumption to climb even further.

All that said, I think Ryzen 3xxx might have some decent OC capability. Since ~5GHz is pretty much as high as x86 can practically go, the architecture still has the clock speed headroom that Intel has entirely run out of. Not that AMD necessarily wants to go there -- Intel 9th gen only sucks 50% more power than the equivalent Ryzen 2xxx because of that 5.0GHz clock speed ...
The 2700X draws 200W. Even if the 3000 series draws less at the same clocks, it also runs at higher clocks, so let's be safe and say the 8-core will still draw 200W. How much would 50% more be, to feed the 50% more cores of the 12-core one?
 
I don't understand your point about a TDP lock. What is a TDP lock? None of AMD's CPUs are locked to a given frequency or TDP. Saying there is a 100MHz difference is not completely true ...


Code:
AMD Ryzen 7 2700X:  3.7GHz/4.3GHz:  105W TDP:   8c/16t
AMD Ryzen 7 2700:   3.2GHz/4.1GHz:  65W  TDP:   8c/16t
AMD Ryzen 5 2600X:  3.6GHz/4.2GHz:  95W  TDP:   6c/12t
AMD Ryzen 5 2600:   3.4GHz/3.9GHz:  65W  TDP:   6c/12t

AMD Ryzen 9 3900X:  3.8GHz/4.6GHz:  105W TDP:  12c/24t
AMD Ryzen 7 3800X:  3.9GHz/4.5GHz:  105W TDP:   8c/16t
AMD Ryzen 7 3700X:  3.6GHz/4.4GHz:  65W  TDP:   8c/16t
AMD Ryzen 5 3600X:  3.8GHz/4.4GHz:  95W  TDP:   6c/12t
AMD Ryzen 5 3600:   3.6GHz/4.2GHz:  65W  TDP:   6c/12t

The R7 2700X has a 200MHz lower  base clock than the 3800X and a 200MHz lower boost clock.
The R7 2700X has a 100MHz higher base clock than the 3700X, a 100MHz slower boost clock and a 40W higher TDP.
The R7  2700 has a 400MHz lower  base clock than the 3700X and a 300MHz slower boost clock.  They have identical TDPs.
The R5 2600X has a 200MHz lower  base clock than the 3600X and a 200MHz slower boost clock.  They have identical TDPs.
The R5 2600X and R5 3600 have identical clock speeds but the R5 3600 has a 30W lower TDP.
The R5  2600 has a 200MHz lower  base clock than the 3600 and a 300MHz slower boost clock.  They have identical TDPs.

So not only does each 3000-series part have a 100MHz to 400MHz faster base clock and a 100MHz to 300MHz faster boost clock, they each have approximately a 15% IPC advantage, and some of the 3000 series have a lower TDP. With the stock cooler I imagine we will be able to get another 200-300MHz out of these CPUs on an X570 motherboard. With the right CPU, motherboard and CPU cooler, I think 5GHz is totally possible on the 3900X and especially the 3800X.
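As a rough sketch of how those two gains combine (assuming AMD's ~15% IPC claim holds and that clock and IPC uplifts simply multiply; the boost clocks come from the list above, and these are illustrative estimates, not benchmarks):

```python
# Back-of-the-envelope generational uplift estimate, assuming the ~15%
# IPC claim and that IPC and clock gains multiply. Illustrative only.
def uplift(old_boost_ghz, new_boost_ghz, ipc_gain=0.15):
    """Combined single-thread uplift: clock ratio times IPC ratio, minus 1."""
    return (new_boost_ghz / old_boost_ghz) * (1 + ipc_gain) - 1

# 2700X (4.3 GHz boost) -> 3800X (4.5 GHz boost)
print(f"2700X -> 3800X: {uplift(4.3, 4.5):.0%}")   # -> 20%
# 2600 (3.9 GHz boost) -> 3600 (4.2 GHz boost)
print(f"2600  -> 3600:  {uplift(3.9, 4.2):.0%}")   # -> 24%
```

Even with no clock change at all, the IPC claim alone would be a 15% uplift, which is why the two effects have to be kept separate.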
You see that TDP rating after every CPU in that list?
AMD can take their measurements at, say, a 95W TDP; the new smaller-node CPUs will manage to clock higher within a 95W TDP limit than what the older ones could reach.
They didn't show any actual numbers or FPS that you can compare with Zen+ numbers; they just claimed 15% more IPC, but that could just be 15% better clocks at, say, a 95W or 65W TDP.
 
Are you talking about power consumption? You don't believe the fake TDPs Intel puts on their chips, do you? Intel's own announcement was that their TDPs consider BASE CLOCKS ONLY. Intel said that after Ryzen launched to fool the fools ... thanks for providing the evidence of how well Intel propaganda works on fanbois.

Get up to speed. Look here: https://www.tomshardware.com/reviews/intel-core-i9-9900k-9th-gen-cpu,5847-11.html

See that? An Intel 8-core/16-thread consuming 100% more power than the AMD 2700X (8c/16t) under the same load. Sheesh. Reality completely escapes some people ... who might also be drunk.
In Prime95 with AVX, that's double the AVX load of what Ryzen handles ... the Intel chip will also give you much, much better performance in AVX workloads.
 
I'm hoping that this is why AMD kept the clock speeds down, especially if this is all the clock speed they need to keep pace with Intel in games now that they've improved IPC. It would be super cool if there is significant OC headroom in these chips with aftermarket cooling. That way the enthusiasts who don't care about power efficiency can overclock their CPU up to 5GHz and everybody else can get great power efficiency and good temps. Perhaps that's just wishful thinking, though.

Clocking the CPUs higher at stock would probably have required more than the Prism cooler they are providing with the 3700X, 3800X, and 3900X. They would have had to leave coolers out completely or include a very expensive stock cooler. It's hard to expect a better stock cooler than the Prism. I'm very annoyed they are shipping the 3600X with the Spire and the 3600 with the Stealth, though. If it's anything like the 2600X, stock boost is going to be too much for that cooler. I wish they'd provide the Prism on all X chips and the Spire on everything else. The Stealth cooler is better suited to a quad core.
To be honest, if the CPUs are able to maintain boost clocks with the stock coolers, we really shouldn't complain about the coolers that come packed in the box for free.
 

Math Geek

Titan
Ambassador
The 2700X draws 200W. Even if the 3000 series draws less at the same clocks, it also runs at higher clocks, so let's be safe and say the 8-core will still draw 200W. How much would 50% more be, to feed the 50% more cores of the 12-core one?

Not sure where you get that number from, but you couldn't hit 200W with a 2700X unless you went on LN2 and tried for 6GHz!!

https://www.tomshardware.com/reviews/amd-ryzen-7-2700x-review,5571-11.html

Here is Tom's review, where they overclocked it and torture-tested it and only reached 150W in Prime95, which we all know is nowhere close to actual real-world usage. 125W is the most you'll see in a normal gaming situation, and that's overclocked as well, in a game that uses the CPU 100%. Closer to 100W is the more likely scenario.

200W is just not gonna happen no matter how much you want it to.
 

joeblowsmynose

Distinguished
The 2700X draws 200W. Even if the 3000 series draws less at the same clocks, it also runs at higher clocks, so let's be safe and say the 8-core will still draw 200W. How much would 50% more be, to feed the 50% more cores of the 12-core one?

Not sure about your numbers ... the 2700X draws 104W under a torture loop according to Tom's review of the chip. Add 50% to that and you're looking at about 160W; then OC that and you get maybe 220W for the 12-core. Consider a 16-core under a good overclock and you might then be coming up on 300W, but that's assuming these chips like high clocks.
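A back-of-the-envelope sketch of that scaling (the 104W figure is Tom's measured 2700X torture-loop draw as cited above; the linear core scaling and the 1.4x overclock factor are my guesses for illustration, not measurements):

```python
# Naive package-power scaling with core count and a hypothetical
# overclock factor. Numbers are rough estimates, not measurements.
def scaled_power(base_w, core_ratio, oc_factor=1.0):
    """Linear scaling with cores, multiplied by an OC power penalty."""
    return base_w * core_ratio * oc_factor

stock_8c = 104                               # measured 2700X torture-loop draw (W)
print(scaled_power(stock_8c, 12 / 8))        # ~156 W: stock 12-core guess
print(scaled_power(stock_8c, 12 / 8, 1.4))   # ~218 W: overclocked 12-core guess
print(scaled_power(stock_8c, 16 / 8, 1.4))   # ~291 W: overclocked 16-core guess
```

The overclocked 16-core guess lands right around that 300W socket limit, which is the point.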
 

joeblowsmynose

Distinguished
In Prime95 with AVX, that's double the AVX load of what Ryzen handles ... the Intel chip will also give you much, much better performance in AVX workloads.
I don't care what the reason is for Intel CPUs drawing 2x the power ... the fact is that under stress testing (and lighter loads as well) they do. Please check Tom's review of either the 2700X or the 9900K and look at the power consumption pages.
 
Not sure where you get that number from, but you couldn't hit 200W with a 2700X unless you went on LN2 and tried for 6GHz!!

https://www.tomshardware.com/reviews/amd-ryzen-7-2700x-review,5571-11.html

Here is Tom's review, where they overclocked it and torture-tested it and only reached 150W in Prime95, which we all know is nowhere close to actual real-world usage. 125W is the most you'll see in a normal gaming situation, and that's overclocked as well, in a game that uses the CPU 100%. Closer to 100W is the more likely scenario.

200W is just not gonna happen no matter how much you want it to.
Yeah, sorry, I was looking at full system draw.
 
I don't care what the reason is for Intel CPUs drawing 2x the power ... the fact is that under stress testing ... they do. Please check Tom's review of either the 2700X or the 9900K and look at the power consumption pages.
It's great that you don't care about the reasons ...
I don't care about power draw under stress testing, because I wouldn't be using my CPU for stress testing,
and if I were using my CPU for work and burned 2x the power to do much more work, I wouldn't care much either.
 

joeblowsmynose

Distinguished
It's great that you don't care about the reasons ...
I don't care about power draw under stress testing, because I wouldn't be using my CPU for stress testing,
and if I were using my CPU for work and burned 2x the power to do much more work, I wouldn't care much either.

In line with that, I (and a growing number of others) have noticed that Intel "enthusiasts" only care about power draw and heat if it concerns AMD and not Intel ...

My CPU actually pulls more power doing 3D rendering than it does stress testing, and it's often rendering out an animation for hours or even days at a go, so it's not like it's an unrealistic scenario. I don't know, I buy an 8-core CPU to use all the cores and all the power it can provide. Not sure about anyone else ...
 

joeblowsmynose

Distinguished
You see that TDP rating after every CPU in that list?
AMD can take their measurements at, say, a 95W TDP; the new smaller-node CPUs will manage to clock higher within a 95W TDP limit than what the older ones could reach.
They didn't show any actual numbers or FPS that you can compare with Zen+ numbers; they just claimed 15% more IPC, but that could just be 15% better clocks at, say, a 95W or 65W TDP.

What? That's all wrong ... completely. IPC is ENTIRELY independent of clocks: IPC is how many instructions can be executed within one clock cycle, while clock speed is how many clock cycles occur within one second. Not the same thing. What AMD clearly showed was about a 5% increase in clocks and a claimed 15% increase in IPC; those two combine to give the general performance uplift.

TDP has NOTHING to do with performance. AMD and Intel measure TDP differently. AMD uses all-core boost numbers under a typical full load (hence the 2700X's 105W TDP drawing about 104W at stock under load, as Tom's review shows) and Intel uses base clocks ONLY. Intel even tried to (quietly) clarify this shortly after Ryzen launched. This is why an Intel supposed 95W TDP can balloon to 250W of power consumption while the Ryzen parts do not. Entirely different metrics, and because of that, the TDPs between the two DO NOT compare; Intel 8th and 9th gen pull far more power under load than their Ryzen counterparts.

None of this is really debatable ...
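A toy illustration of the distinction (arbitrary numbers, purely for illustration): throughput is IPC times clock speed, so a 15% IPC gain and a 15% clock gain are different routes to the same performance.

```python
# Toy model: single-thread throughput is IPC times clock speed.
# The IPC and frequency values here are arbitrary, for illustration only.
def perf(ipc, freq_ghz):
    """Billions of instructions retired per second."""
    return ipc * freq_ghz

print(perf(1.0, 4.6))    # high clocks, lower IPC  -> 4.6
print(perf(1.15, 4.0))   # +15% IPC, lower clocks  -> 4.6
```

Same throughput either way, but only the second route is an IPC increase; conflating the two is exactly the mistake being called out above.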
 
I have a theory...

I think that AMD has the 16 core CPU just as ready to go as the rest of the lineup... but they didn't show it. I think AMD had 2 slideshows ready to go and were waiting to see if Intel would release anything that they would consider a threat. Instead of a threat they got the pre-overclocked 9900K variant. So, instead of a 16 core CPU they only delivered what they needed to beat Intel. I think AMD is sandbagging. They are holding that 16 core chip in reserve until they feel like they NEED to release it. I don't know if they are doing that because yields were a bit on the low side, they aren't confident they can extract much more from their current chips, or if they are just trolling the crap out of Intel. In any case I think they didn't release it because they didn't have to.

But that's just a theory. A chip... (is shot)
 

Math Geek

Titan
Ambassador
Intel did the same thing. As soon as Ryzen came out with its core counts, all of a sudden Intel had more cores in like two days. It takes years to fully develop a CPU; they had that sitting there waiting in case they needed it, and when they did, they just let it into the wild. Intel only had to drop very small IPC improvements for years, so that's all they did, in the range of 4-5%. Now that AMD is dropping large gains, Intel all of a sudden has "18%" improvements. It's all a game that is not based on what is best for the consumer in any way.

Not like this isn't common practice in every industry.
 

hftvhftv

Distinguished
Ambassador
Not sure where you get that number from, but you couldn't hit 200W with a 2700X unless you went on LN2 and tried for 6GHz!!

https://www.tomshardware.com/reviews/amd-ryzen-7-2700x-review,5571-11.html

Here is Tom's review, where they overclocked it and torture-tested it and only reached 150W in Prime95, which we all know is nowhere close to actual real-world usage. 125W is the most you'll see in a normal gaming situation, and that's overclocked as well, in a game that uses the CPU 100%. Closer to 100W is the more likely scenario.

200W is just not gonna happen no matter how much you want it to.
I guess my Sea Sonic X850 is going to be way overkill unless I go X299 or X399 and get dual Radeon VII or RTX 2080 Ti's.
 
I think I might ... he's likely referring to maximum power consumption limits that the onboard sensors will regulate clocks and voltage to maintain.

GPUs tend to take full control, but with CPUs the user is still in control of clocks and volts. The only "lock" I am aware of with Ryzen is perhaps the IF possibly being a bottleneck to higher clock speeds.

With GPUs, the Radeon VII had an initial BIOS issue where, for no reason, the card could not be overclocked at all, but this has been rectified, and there have been some pretty good performance numbers out of the Radeon VII on water and OCed.

And of course with Vega, there's always the mighty unlimited power mod, which basically lets you use any amount of power she'll take without exploding. :)

I think that must be what he was on about. I don't really agree with him on that, though ...

It still makes no sense. TDP has nothing to do with IPC. IPC does not stand for performance or wattage; it stands for how many instructions the CPU can execute during a single clock tick. Keyword: a single clock tick, not 5,000,000,000 or 4,600,000,000 clock ticks.
 
You see that TDP rating after every CPU in that list?
AMD can take their measurements at, say, a 95W TDP; the new smaller-node CPUs will manage to clock higher within a 95W TDP limit than what the older ones could reach.
They didn't show any actual numbers or FPS that you can compare with Zen+ numbers; they just claimed 15% more IPC, but that could just be 15% better clocks at, say, a 95W or 65W TDP.
So you're admitting that you have no clue what IPC means?
 

King_V

Illustrious
Ambassador
Does Lisa The Great realize how they just screwed their company stock price in not providing 5.0GHz gamer CPU??? DIY pimp rig people will shun her offering as for the most part they got "it" already, the performance that is. No major performance kick is here now!

I mean, this is a crusher and now all the AMD FanBois will not "run" with the 5.0GHz Intel boys in online gaming, an arena in which the clockspeeds are so ever ever ever important, and the processing power BANDWIDTH & GIGAFLOPPAGE does not mean a thing.

I got a dull headache caused by AMD's offer of an upcoming INFERIOR CPU on the top end vs. Intel. I wanted top o' the hill stuff so I can pull
my Ryzen 2700[no X] out of my tower and put it in my HTPC running a 2200G + GTX580 8GB.

All Lisa Sue offers is the 7nm and in the middle is still the 8C 16T Processing power at virtually the same clockspeed
as the 27XX CPUs. For a guy like me there is not a great upgrade as the 3900X is not even a 5.0GHz CPU that matches Intel's clockspeed.
Well, this is a pathetic less than competitive offering. I wish it was a mistake, but the mistake part is reality. A pity.
I am thinking as you may suspect that these little "thangs" we see are perhaps H Y P E, perhaps the product of AMD agents in place?


valveman2012, is that you? If so, your grammar has improved considerably.
 

hftvhftv

Distinguished
Ambassador
No need for X299 or X399; just get an i9-9900K and Z390 combined with RTX 2080 Ti SLI and you will reach the limit even before starting to OC.
I'm currently running two GTX 770 4GB Windforce cards in SLI with an FX 8350 overclocked to 4.7GHz @ 1.5V, cooled by a Swiftech H240X, with two Noctua 3000RPM 140mm fans, another 140mm fan, two 230mm fans, plus two hard drives and two SATA SSDs. The RTX 2080 Ti is a 260W card; my 770s are right around the same with the extra memory and factory overclock. I think an overclocked FX 8350 would still be more power hungry than an overclocked 9900K.
 

joeblowsmynose

Distinguished
Didn't Intel also release its 10nm parts? We need to compare 7nm to 10nm parts to be fair(er).

What? Wait until the end of 2020 for Intel 10nm to actually hit desktop at these performance levels? That neither makes sense nor would it be fair.

Intel has traditionally always been ahead of AMD in process nodes; I don't think even the most hardened AMD fanbois demanded that we compare only like node to like node. You compare the latest offerings. The 9900K is only six months old, and Intel is still creating yet-to-launch new iterations of it. (KC, KFC, KS)

What would be fair is to compare performance against what is available at launch day. I don't think Intel will be pulling anything new out of their butt within one month, so comparing the Ryzen 3800X to the 9900K is fair.

What isn't fair is AMD obliterating the HEDT 12-core 9920X in core-for-core performance and then offering it at less than half the price ... THAT'S hardly fair, but it's what Intel had coming after a decade of milking their customers while offering negligible gains generation after generation.
 

joeblowsmynose

Distinguished
It still makes no sense. TDP has nothing to do with IPC. IPC does not stand for performance or wattage; it stands for how many instructions the CPU can execute during a single clock tick. Keyword: a single clock tick, not 5,000,000,000 or 4,600,000,000 clock ticks.

No, you are right ... it didn't make sense. I'm not even sure I was anywhere near accurate in trying to understand what he meant. It's clear he doesn't really understand what TDP or IPC really is. (post #87)
 

TJ Hooker

Titan
Ambassador
AMD can take their measurements at, say, a 95W TDP; the new smaller-node CPUs will manage to clock higher within a 95W TDP limit than what the older ones could reach.
They didn't show any actual numbers or FPS that you can compare with Zen+ numbers; they just claimed 15% more IPC, but that could just be 15% better clocks at, say, a 95W or 65W TDP.
Are you conflating IPC and (single threaded) performance? Because a 15% performance increase that came from a 15% clock speed increase would never be referred to as an IPC increase.
 

joeblowsmynose

Distinguished
In Prime95 with AVX, that's double the AVX load of what Ryzen handles ... the Intel chip will also give you much, much better performance in AVX workloads.

Look at the gaming loop on the same page, then ... the 9900K pulling over 20% more power in lightly threaded tasks ...

The fact that Intel lost the efficiency crown with their lofty 5.0GHz goals isn't debatable. It happened. Blame Intel for fooling all its supporters by vastly changing the way they determine what TDP to put on their chips. As is clear from the power consumption charts I linked, AMD's 105W TDP in reality indicates vastly lower power consumption than Intel's supposed 95W TDP ... the cat's out of the bag on that one.

Even with the 15% IPC uplift, Ryzen 3xxx (due to the 7nm node; smaller nodes mainly provide the thermal headroom that allows clock and IPC uplifts against that constraint) still looks about 30% more efficient than Ryzen 2xxx. That extra 30% efficiency can then go towards higher clocks (overclocking), as long as the architecture can actually reach the clocks that would eat the rest of that efficiency gain. That's how it all works together.
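As a toy model of why chasing clocks eats efficiency so quickly: dynamic power scales roughly with frequency times voltage squared, and higher clocks also demand higher voltage. The constant and the voltage points below are made up purely for illustration.

```python
# Toy model: dynamic CPU power scales roughly as P ~ C * f * V^2.
# The constant c and the operating points are invented for illustration.
def dynamic_power(freq_ghz, volts, c=20.0):
    """Relative dynamic power at a given frequency and core voltage."""
    return c * freq_ghz * volts ** 2

p_stock = dynamic_power(4.0, 1.20)   # stock-ish operating point
p_oc    = dynamic_power(4.6, 1.40)   # higher clocks need more voltage
print(f"power increase: {p_oc / p_stock - 1:.0%}")   # -> 57%
```

A 15% clock bump costing well over 50% more power (once the voltage rises with it) is exactly the trade-off that separates the efficiency-tuned stock parts from an all-out overclock.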

So I'll say it again ... that 300W socket capability tells me A) a 16-core is inbound at some point (pretty much confirmed) and B) the mobo makers (or AMD, directing them on these specs) are expecting this new architecture to OC a fair bit better than previous Ryzen, OR AMD is expecting future-gen chips to possibly clock higher and consume more power and wants to guarantee backward compatibility.

I think I heard that MSI is NOT supporting Ryzen 3xxx on any of their 1st-gen motherboards, even though AMD promised it would be possible ... the 300W-at-the-socket spec is future-proofing for the backward compatibility that AMD does offer, giving the choice to the consumer. If anyone knows I'm wrong on that rumour, please correct me.
 