Discussion AMD Ryzen MegaThread! FAQ and Resources


goldstone77

Distinguished
Aug 22, 2012
2,245
14
19,965


Finally, some news on it! I wonder how much that is going to cost. I think we might learn more next Tuesday!
 

jdwii

Splendid
So 1.225V at 3.7GHz for all my normal tasks, and whenever I need the power, 1.3V at 3.9GHz. The Ryzen software, I love it. Now I just wish I could get 3200MHz memory haha

All using AMD Balanced power settings. Really nice low temps, and when doing normal tasks it's at 25C-30C or so.

Sure, most of you guys have heard of Chew; he is an overclocking genius and AMD sent him to overclock Bulldozer back in the day. Well, he claims it's quite normal for lower binned chips to run at lower temps, but they typically need more voltage for maximum overclocks. Basically, that makes me happy I bought my 1700 over the 1800X or 1700X.


The main goal I had when upgrading from my 8350 to my Haswell was super quiet performance, and I easily get to keep that with Ryzen. It's odd and makes me feel weird to say that AMD runs super cool, even cooler than Intel; it's like I'm in a geek Twilight Zone episode.
 

jaymc

Distinguished
Dec 7, 2007
614
9
18,985


A little off topic, but could you post the same in the AMD Future Chips thread? Thanks.

 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


This is second gen. First gen was 2.4GHz base and 2.8GHz turbo.

This is basically two 1700 dies glued together. The TDP rises to the Moon: the marketing label is 180W, but the real TDP is >200W.
 

$hawn

Distinguished
Oct 28, 2009
854
1
19,060


What was the first gen?

Also, the 1700 is rated at 65W. So that's 65W x 2 plus a little extra for the extra connections.
180W or lower seems easily achievable.
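
For what it's worth, here's that back-of-the-envelope estimate written out; the ~15% interconnect overhead is purely my own assumption for illustration, not a figure AMD has published:

```python
# Rough dual-die TDP estimate from the rated Ryzen 7 1700 figure.
# The interconnect overhead percentage is a guess for illustration only.
single_die_tdp = 65            # W, AMD's rated TDP for the Ryzen 7 1700
interconnect_overhead = 0.15   # assumed ~15% extra for die-to-die links/IO

estimate = 2 * single_die_tdp * (1 + interconnect_overhead)
print(f"Estimated 16-core package TDP: {estimate:.0f} W")  # ~150 W, under the 180 W label
```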
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


The first gen was the samples AMD released before the second gen now leaked by videocardz.

The 65W was the marketing label. The real TDP is 90W:

What are the TDPs of the Ryzen chips, in the sense of the consumption limit and therefore the maximum number of watts to be dissipated? AMD also communicates this value, but less prominently: 128 watts for the 1800X/1700X, and 90 watts for the 1700. These are the values most comparable with the TDP communicated by Intel.

That is why AMD rates this X399 chip at 180W and not at 120-140W.
 


So, AMD is using its own TDP measurement and says 65W when they should be saying 90W; ok, I don't disagree.

Then, the new part is 180W (per AMD's info, I'd imagine?) and you say it's 200W+. But then, where exactly do those extra 20W+ come from? I really don't think the extra intra-connectivity of the CCXes or the MCM packaging interconnects would add that much to the TDP.

Also, shouldn't these be higher binned parts anyway? Or are we talking consumer parts and not server parts?

Cheers!
 

8350rocks

Distinguished


If I am reading correctly, Juan is not using TDP but total wattage consumption, which does not relate directly to TDP, as TDP is about thermal dissipation.

Max power draw for the 1700 is around 105W stock running prime95:

[chart: Ryzen 7 1700 power consumption under Prime95]


Also, X370 is rated as a 140W socket, even though the chips are all 125W TDP or less.

The thermal dissipation AMD calculates is not the same as Intel's...we have known that for some time.

As for the chip he is talking about, I could see it potentially drawing 200W from the wall under something like Prime95, but I doubt you'd require 200W of cooling.
 
But Power Consumption does not equate directly to TDP... Argh!

In any case, if it's power consumption and not TDP per se, then I see how they can suck 220W, but *under torture* and in a worst-case scenario.

So, more or less, my question still stands.

Cheers!
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


It is not about measurement; AMD invented a new definition of 'TDP' that doesn't correspond to the usual industry usage of TDP. That is why the new Ryzen '95W' chips can dissipate more power than older 125W Piledriver chips.

180W is the official wattage for the top 16C chip. The >200W real TDP is an estimation made by CanardPC and me. The extra 20W+ comes from the MCM package.
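
A quick sanity check of that estimate, plugging in CanardPC's 90W per-die figure and treating the 20W MCM overhead as the assumption it is:

```python
# Sketch of the >200 W estimate using CanardPC's per-die figure.
# The 20 W MCM overhead is the figure asserted above, not a measured value.
real_die_tdp = 90     # W per die, CanardPC's figure for the Ryzen 7 1700
mcm_overhead = 20     # W, assumed extra from the MCM package

estimate = 2 * real_die_tdp + mcm_overhead
print(f"Estimated real package TDP: {estimate} W")  # 200 W, vs. the 180 W official label
```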
 

goldstone77

Distinguished
Aug 22, 2012
2,245
14
19,965
TDP
Thermal design power
"The thermal design power (TDP), sometimes called thermal design point, is the maximum amount of heat generated by a computer chip or component (often the CPU or GPU) that the cooling system in a computer is designed to dissipate in typical operation. Rather than specifying CPU's real power dissipation, TDP serves as the nominal value for designing CPU cooling systems.[1]

The TDP is typically not the largest amount of heat the CPU could ever generate (peak power), such as by running a power virus, but rather the maximum amount of heat that it would generate when running "real applications." This ensures the computer will be able to handle essentially all applications without exceeding its thermal envelope, or requiring a cooling system for the maximum theoretical power (which would cost more but in favor of extra headroom for processing power).[2]

Some sources state that the peak power for a microprocessor is usually 1.5 times the TDP rating.[3] However, the TDP is a conventional figure while its measurement methodology has been the subject of controversy. In particular, until around 2006 AMD used to report the maximum power draw of its processors as TDP, but Intel changed this practice with the introduction of its Conroe family of processors.[4]

A similar but more recent controversy has involved the power TDP measurements of some Ivy Bridge Y-series processors, with which Intel has introduced a new metric called scenario design power (SDP).[5][6]"
https://en.wikipedia.org/wiki/Thermal_design_power
3:37 minute video
https://www.youtube.com/watch?v=yDWO177BjZY

Edit: Just to add, TDP is used to find the proper cooling for the CPU/GPU under normal conditions. Looking at the total power draw would help you find what power supply you need. Ryzen eats power pretty well once you turn up the frequency. TDP doesn't count boost clocks!
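
If you want to turn the 1.5x rule of thumb quoted above into rough headroom numbers, here's a minimal sketch; the CPUs and TDP labels plugged in are just illustrative, not measurements:

```python
# Turn a rated TDP into a rough peak-power figure using the ~1.5x rule of
# thumb quoted above. Illustrative numbers only, not measured values.
def estimated_peak_power(tdp_watts, peak_factor=1.5):
    """Estimated worst-case draw (e.g. a power virus) from a rated TDP."""
    return tdp_watts * peak_factor

for name, tdp in [("Ryzen 7 1700 (65 W label)", 65),
                  ("Ryzen 7 1800X (95 W label)", 95)]:
    print(f"{name}: allow ~{estimated_peak_power(tdp):.0f} W of headroom")
```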
 


As others have said, TDP is *thermal* power. In theory a chip could draw 300W of *electrical power* and only dissipate 100W thermally (heat is, after all, just wasted energy). It's all down to the type of transistor used, the design of the chip, and so on.

As an FX 8320 owner, I can tell you that to keep this thing cool at stock settings, I've had to undervolt it and switch to a closed-loop water cooler (just to stop it thermal throttling under load). That is supposedly a 125W chip that can be kept cool with an air cooler (how?!).

On the other hand, we can see Ryzen behaves itself using a 95W Wraith air cooler, so long as the clock speeds for all cores are kept below about 3.8GHz.

I think that suggests Ryzen is a very *thermally efficient design* (anecdotally, people like @jdwii have noted how cool it runs). However, TDP has nothing to do with *electrical efficiency*, which is a different metric. While I understand that TDP and EDP are fairly directly linked, it's worth keeping in mind that one is a proportion of the other, and the factor that differentiates the two can and will differ between designs. Given that Ryzen is built on what is essentially a mobile process, and given the issues phone and tablet manufacturers have in keeping modern parts cool (the 'tSkin' temp AMD was going on about with some of their tablet-oriented offerings a while back), I think it makes sense that Ryzen is a more thermally efficient design in terms of power used vs. heat generated.

TL;DR: to my mind a Ryzen 7 1700 could generate only around 65W of *thermal energy* and still consume 100+ W of electrical energy. I don't think AMD's measurements are necessarily wrong, and everything we are seeing suggests Ryzen runs cooler than equivalent Intel parts, albeit at similar electrical power consumption.
 

truegenius

Distinguished
BANNED
If we plot a graph of the % difference between 3200 and 2133, we can see that Ryzen gets 1.5x to 2x the scaling Intel does (e.g. if Intel's performance improves by 8%, Ryzen's improves by 15%), which means Ryzen benefits significantly more from faster RAM than Intel does.
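
To make that comparison concrete, here's a small sketch with made-up benchmark scores (the 8% and 15% gains are the hypothetical figures from the example above, not measured results):

```python
# Hypothetical benchmark scores to illustrate the comparison above.
def scaling_pct(score_2133, score_3200):
    """Percent gain going from DDR4-2133 to DDR4-3200."""
    return (score_3200 / score_2133 - 1) * 100

intel_gain = scaling_pct(100, 108)   # assumed 8% gain on Intel
ryzen_gain = scaling_pct(100, 115)   # assumed 15% gain on Ryzen

print(f"Intel: +{intel_gain:.0f}%, Ryzen: +{ryzen_gain:.0f}%")
print(f"Ryzen scales about {ryzen_gain / intel_gain:.1f}x as much from faster RAM")
```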
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


CPUs don't generate net work, so all the consumed energy is dissipated as heat by virtue of the first law of thermodynamics. That is how we measure TDPs, and how all reviews of Ryzen have found that the marketing TDPs of the R7 models don't correspond to the real TDPs.
 

8350rocks

Distinguished
http://www.barrons.com/articles/amd-analyst-day-on-tap-this-could-go-to-uncharted-territory-says-rosenblatt-1494618365?

Interesting read about financial analyst insight into AMD's roadmap, and also a few confirmations of upcoming products.
 

goldstone77

Distinguished
Aug 22, 2012
2,245
14
19,965


8350rocks, you can scroll back to when I first introduced myself in this thread and see where I talked about Intel's 10nm vs. AMD's 7nm! Exciting! People really don't understand how good the Ryzen architecture really is! Now, at an advertised 14nm it measures 17nm, vs. Intel's 14nm measuring 13nm: a 4nm gap, and it's still killing it!!! Wait till Intel drops 10nm measuring 9.5nm and Ryzen's 7nm measuring 9.2nm gets dropped on top of Intel's 10nm! Ryzen will reduce its process node by 7.8nm vs. Intel's 3.5nm reduction, almost cutting its process node in half. Intel has been getting away with basically the same architecture by shrinking the process down better than anyone else. Now that they are running out of room to run, it's about to get FUN! It's about to get biblical for Intel (end of days)! I've enjoyed watching this coming for the last year. Now more people will catch on!
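
Laying those figures out as stated above (they are the post's claimed measured sizes, not official foundry specs):

```python
# Node sizes as quoted in the post above (advertised vs. claimed measured
# size). Treat these as the post's figures, not authoritative foundry data.
nodes = {
    "AMD/GF 14nm":  17.0,  # claimed measured size
    "Intel 14nm":   13.0,
    "AMD 7nm":       9.2,
    "Intel 10nm":    9.5,
}

amd_shrink   = nodes["AMD/GF 14nm"] - nodes["AMD 7nm"]     # 7.8 nm
intel_shrink = nodes["Intel 14nm"]  - nodes["Intel 10nm"]  # 3.5 nm
print(f"AMD shrink: {amd_shrink:.1f} nm vs. Intel shrink: {intel_shrink:.1f} nm")
```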
 


AMD is still in a tough spot. While Ryzen is very competitive at the high end (though let's see how the Core i9 series performs and is priced), I still have doubts whether that is enough to get AMD into the OEM/business market they desperately need to get into. And Xeons still offer better price/performance in server land.

On the GPU front, AMD is squeezed between NVIDIA at the high end and their own APUs in the low-to-midrange market. I think the best they can do is basically try to fill the void between APU-level performance and the 1060, but are there enough potential sales there to justify all the R&D costs?

Right now, AMD's condition has gone from "dire" to "stable", but I don't really see a way forward for potential growth. I'm wondering if AMD's best bet right now is to just focus on APUs, given Intel is about to launch the i9 series, which competes directly against Ryzen. And if Intel can literally just slap a 12-core/24-thread CPU together in a couple of months (assuming it's a reaction to Ryzen and not previously planned), then AMD really can't compete against Intel head to head.

Going to be interesting to say the least.
 

jaymc

Distinguished
Dec 7, 2007
614
9
18,985


"We think AMD’s Zen in 2017/18 may be better than Intel x86 roadmaps based on Skylake, Kaby Lake, etc. Servers – AMD Naples 32 core (64 thread) will go up against Intel Skylake Purley 28 core (56 thread) this summer and both likely launch at Computex later this month. It is possible Intel may introduce a 32 core product to counter Naples, however it is not clear at this time. Naples will sport 8 DRAM channels vs. Skylake’s 6 DRAM channels for 33% memory bandwidth (this is hugely im- portant for hyperscalers). More cores, more threads, more bandwidth, better price performance, better performance/watt, for AMD."

Strong words, great article... I'm looking forward to Tuesday... (and the end of the month).
Things are really starting to hot up... The only piece of the puzzle that seems to be lacking, and also the only piece of their plan that appears to be going awry, is SK Hynix and HBM2 production numbers... I hope it's finally ramping up.
Where Naples is weak, Vega is strong.
I suppose if push comes to shove, customers could opt for Nvidia GPUs and still go with Naples. It would be a spanner in the works though.

What is going on with SK Hynix HBM2? Typical AMD, something always goes wrong. And it's beyond their control (this wrinkle anyway)... frustrating!!

I can't help wondering if Nvidia is buying up more HBM2 than they need from Samsung...
I mean, even if they had enough, they could just keep buying it or block-booking it anyway (even if they cancel later, Samsung cannot sell any to AMD)... or stockpile it; they would use it eventually, I guess. This would keep the brakes on Vega!! It would be one way of holding Vega back for the time being anyway. I wonder if this is what has been going on... How much HBM2 have they bought from Samsung, and what's their requirement?? This would be a typical example of a company with massive resources manipulating the market to its own ends. Just a thought.