Principled Technologies Says it Messed Up Intel 9th Gen Testing



XFR is thermal boosting, not throttling. The specified clock speed and power dissipation are what the chip is designed for. If the chip only has 105 W of power delivery available and/or is at 80°C, it won't go past its defined base frequencies, but neither will it drop below them when fully loaded. Throttling doesn't actually happen before the chip reaches 90-95°C, so there's still quite a bit of margin.

AMD are confident enough in their build quality to allow their chips to automatically go past their rated speeds if power delivery and thermal dissipation are good enough, but only up to a point.

On the other hand, Intel rate their chips to an all-core frequency, but there are documented cases of Intel chips throttling with the boxed coolers, meaning the chip drops to lower (and sometimes MUCH lower) clock speeds under load.

Or would you consider Intel's throttled speed the "base" clock, and Intel's base clock being maintained when properly cooled the "boost" clock?
 


EXCEPT: thermal throttling is always in reference to the base frequency, not boost frequencies or XFR's dynamic overclocking boost. Yes, the base frequency can be changed via an overclock (and that adjusts the normal minimum operating speed), but neither boost nor XFR touches the base clock speed. The same applies to Intel's Turbo Boost tech. I'll repeat: thermal throttling is always in reference to the base clock speed.
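That distinction can be sketched in a few lines of toy Python (my own illustrative model, not vendor firmware; the thresholds, scaling, and throttle penalty are all made up for the sketch):

```python
def effective_clock_mhz(temp_c, base_mhz, boost_mhz, throttle_temp_c=95):
    """Toy model of the definitions above: "thermal throttling" means
    dropping BELOW the base clock, and it only happens past a critical
    temperature. Everything between base and boost is normal operation,
    not throttling."""
    if temp_c >= throttle_temp_c:
        # True throttling: forced below the rated base clock.
        return int(base_mhz * 0.8)  # arbitrary penalty for the sketch
    # With thermal headroom, boost opportunistically -- never below base.
    headroom = (throttle_temp_c - temp_c) / throttle_temp_c
    return int(min(boost_mhz, base_mhz + (boost_mhz - base_mhz) * 2 * headroom))
```

A chip rated 3700/4300 MHz lands near its boost clock at 50°C and still holds at least 3700 MHz at 94°C; only past the critical temperature does it fall below base, and only that last case counts as throttling.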
 
I don't believe there is any standardization for thermal throttling. I really think it's a matter of viewing the glass half full instead of half empty.

Anyway, there are some things I've been completely ignorant about, and I feel I have disgraced the community. In light of this, I would like to apologize to everyone who has read this thread, past and future.

Furthermore, to try to get rid of my biases, I have decided to cancel my i9-9900K pre-order, at least until prices come down on it, and I've purchased an AMD Ryzen 7 2700X and an X470 motherboard with overnight shipping. They should be here tomorrow. Competition is good for the community. It's been too long since I've built an AMD system; I've literally got 3-4 recent Intel builds just lying around. If I like what I've bought, then I will likely end up selling all my Intel builds (maybe keeping just one) to offset the cost.

Is the next-gen Ryzen supposed to be compatible with current chipsets?
 


TL;DR: Compatible? Yes, with a BIOS update. Fully supported? It depends.

Long version :
AMD has said that the AM4 socket should stay relevant up until 2020, so yes, you should be able to plug a Zen 2 chip into a current motherboard (or any motherboard dating back to 2017 whose manufacturer seriously maintains its BIOS). They do warn that newer CPUs may come with features that wouldn't be supported by current chipsets, but we're talking 3-5% performance in edge cases. If, for example, XFR3 allowed a 300 MHz overboost on the CPU but the motherboard's chipset only supported XFR2's 200 MHz, you'd only get the latter.
Same thing for SenseMI: say a newer version handled clock speeds in 5 MHz increments instead of 25 MHz; you'd only get 25 MHz steps. That would cost you a few cents a year in "wasted" energy, but it wouldn't objectively be felt in normal use. If the motherboard maker goes through the process of implementing and then validating the feature in their software, you might actually get support for that feature when it comes out. Don't hold your breath though; it'll be the exception, not the rule. It did happen, however, with SenseMI being supported on some X370-based motherboards.
Same for power profiles: if future CPUs are for some reason rated above today's biggest beast from AMD (the Ryzen 7 1800X), then some motherboards may not support them (I'm thinking A320-based ones, or very cheap early B350-based ones), but most should (overclocking FTW).

In all cases, it will completely depend on BIOS updates. High-grade motherboards should be good for a couple years.
 


You should be nothing but happy with the 2700X.

It has been a common misconception in the PC gaming community that all AMD chips are garbage. The truth of the matter is that AMD has only produced one generation of garbage CPUs, and that was the original Phenoms (and arguably the K5, but that was just an attempt at a Pentium that fell flat on performance; it wasn't actually broken). They had a pretty big flaw that had to be corrected by a redesign, which turned into a new generation of chips (Phenom II). Many AMD chips simply weren't as fast as Intel's offerings. They under-performed due to poor design or lack of optimization, but were perfectly serviceable processors with respectable, if mid-range, performance. Then there have been generations that made Intel shake in their boots. Intel actually sued AMD over the 386 and 486; at the time AMD was producing superior versions of a licensed product. Fast forward a few years and the Athlon came out competing against the Pentium III; it was a superior chip as well as the first x86 CPU to pass 1 GHz. Then came the Athlon XP, which was faster, cheaper, and cooler than the Pentium 4. AMD's beatdown of Intel culminated in the release of the Athlon 64, stealing the x64 standard from Intel, and the Athlon 64 X2 and FX (not to be confused with the Bulldozer one). Intel had to make some radical changes, and from those ashes we got the Core series (God bless its divine architecture).

In many ways AMD made the bed they are trying to crawl out of right now. Their turn of the millennium CPUs were so good that they awoke the sleeping giant. If not for AMD we might still be talking about NetBurst architecture Pentium 9s right now. The good news is that AMD has finally risen to the challenge and produced a CPU worthy of putting up a fight against Intel's top chips again, and while it isn't always the best at everything, it should be respected for both its performance and for its pushing of Intel to release better CPUs much faster.
 
Thanks for that detailed summary. I love that type of nostalgia! And that's sort of my reason for buying this Ryzen 7 2700X. I hated missing out on the AMD Ryzen hype. I wanted to buy one from the start, but again, I didn't have the money and I already had a good Intel chip. I got a promotion at work this year, though, and I'm now able to make this purchase.

My first full build was a budget gaming AMD Athlon II X2 system. I couldn't afford anything better than the Athlon at the time and I was more than happy with it. I sort of missed out on building with the earlier Athlons, I was just too young to be able to buy PC parts like that.
 

You seriously bought the Ryzen system? I was convinced you were trolling us all!

I said I was stepping out of this thread but had to come back to say, if you're actually serious, then full respect to you, sir!

Just for the record, I don't think there's any doubt that the 9900K will be the faster CPU, often by a significant margin. Anyone denying that is wearing red-tinted glasses. But the 2700X is a great CPU at a great price. I also think we have AMD to thank for the 9900K, no way Intel would have released this CPU without serious competition. In any case, have fun!

In terms of the XFR vs "thermal throttling" debate (and not having a go at anyone here), it's worth pointing out that Nvidia's GPU Boost works in a very similar way. There is a hard temperature limit, at which point the CPU/GPU will throttle down. Stay under that and the chips are not technically throttling. They are, however, using real-time power and temperature data to determine the boost (Nvidia GPU) or XFR (AMD CPU) frequency. If you give an Nvidia GPU plenty of thermal headroom (for example, drop GPU temps down to the 50s instead of the 70s), it will boost significantly higher in most workloads. Even a very capable 2.5-slot aftermarket GTX 1080 Ti will run faster if you open all your windows in winter. Similarly, put that same card on a custom loop, without changing a single setting on the card, and you'll see higher boost clocks and a measurable (though not massive) performance increase.

I can certainly understand why the distinction between "boosting" and "throttling" seems like a petty debate over semantics. At the end of the day, whether "boosting" or "throttled", the product runs faster with a better cooler. I can then understand why people could consider that cooler inadequate. The key problem with that logic is that just about any cooler becomes inadequate, as long as the product performs better with a better cooler. So the 3-fan cooler on the ROG Strix 1080 Ti OC becomes inadequate because the card runs better on water. Or, by that logic, even the U14S is (potentially) inadequate for the 2700X because it would perform better with a sub-ambient chiller. That logic takes you to the ridiculous very quickly!

That's why Nvidia, AMD and Intel all specify a TDP and base clocks. Coolers should be at least good enough to keep the products within safe operating temperatures at base clocks, i.e., to avoid thermal throttling. Each company then has its own approach to taking advantage of additional thermal headroom when it's available, so its products benefit from better cooling or colder days, which shouldn't really be thought of as thermal throttling.
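The headroom-based behavior described above can be mimicked with a bin-stepping sketch (purely illustrative; the 13 MHz step and the temperature thresholds are my assumptions, not Nvidia's actual boost table):

```python
def boost_bin_clock_mhz(temp_c, max_boost_mhz, step_mhz=13,
                        bin_start_c=55, bin_width_c=5):
    """Step the clock down one `step_mhz` bin for every `bin_width_c`
    degrees above `bin_start_c`; cooler than that, run at full boost.
    This is opportunistic boost management above base clock, not
    thermal throttling."""
    if temp_c <= bin_start_c:
        return max_boost_mhz
    bins_down = int((temp_c - bin_start_c) // bin_width_c) + 1
    return max_boost_mhz - bins_down * step_mhz
```

In this model a card at 50°C holds its full boost while the same card at 75°C sits a few bins lower, which is exactly why winter air or a custom loop buys measurable clock speed without touching a single setting.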
 


The first PC I ever used was an AMD-based 386, with the math co-processor. It was a great introduction to computers. The first PC I helped build was my father's K6 200 MHz, which eventually became mine. The first PC I built all on my own was a K6-2 500 MHz. I built many other AMD systems over the years, the majority of which were Athlons (like 5 or 6 of them), but had some Intel machines sprinkled in there (laptops... AMD hasn't ever really done mobile well), up till my Phenom II. Since then it has been nothing but Intel machines, not because I've abandoned AMD, but because AMD doesn't do mobile very well, and I've only built one desktop, which I went Intel on for performance reasons. However, I am months away from buying either a 2600X or a 2700X. I actually quite like AMD, for mostly nostalgic reasons.

Which is why I'm pretty upset about Intel and PT right now. They didn't even have to cheat to win. Everyone knows that Intel has the clock speed advantage and the IPC advantage (if only just). All they had to do was release an 8 core CPU with hyperthreading and it would beat anything AMD had on the market... but instead we get this joke of a benchmark run that completely misrepresents Ryzen, and actually kinda misrepresents Intel's 8th gen as well (there is no reason the 8086K should lose ANY test to an 8700K... like AT ALL). How anyone with any knowledge of these products could publish this is completely beyond me.
 
Yes, I actually bought the Ryzen 7 2700X this morning. I wasn't intending to troll, but I guess it kinda went there and I am sorry.

*Now I'm trolling Intel for making me buy a new motherboard and their overpriced CPU.




Regarding throttling, I had this very debate regarding Nvidia's GPU Boost a while back. The 1080 Ti throttles in 12 MHz increments after reaching 55°C (or somewhere around there). But of course there are people who say that since it is still above base clock, it is not thermal throttling. So, basically every GTX 1080 Ti except a few of the liquid-cooled hybrids has an inadequate cooler (hey, you guys are the ones getting sideways over the stock AMD cooler). I just didn't know the 2700X couldn't boost all the way with the stock cooler. There's a lot about Ryzen I don't know, and I usually pass up the posts about it. But I want to know, and I learn best from first-hand experience. That's why I bought the 2700X.
 


I'm seriously not trolling here, but from what you say, the 2700X can hit max boost with the stock cooler, and given what you say about XFR, the Noctua cooler would have allowed XFR to essentially overclock the 2700X. So everyone is upset because the 2700X wasn't overclocked while the 9900K also wasn't overclocked? That doesn't seem fair, seeing how Intel's boost technology doesn't overclock past stock boost. That's the way I have seen this from the start; some people just made it sound like the 2700X wasn't or wouldn't have reached max boost with the stock cooler.
 


Stock settings are stock settings. XFR is technically an overclock by the strictest definition, but it is not a user-applied overclock; it is kind of like a system-curated overclock. As was said before, it is like Nvidia's boost, and no one considers that an overclock even though it clocks up the GPU when conditions allow it, and more cooling means more speed... well, until the power limit.

Compare that to the Intel boost technology and it is slightly different. Intel will boost 2 cores to the max frequency and leave it there until it reaches power or thermal limits, then dial it back. The only difference between the two is really that Intel sets a cap, then rates their chip for that speed that you won't always get. AMD rates its boost as its boost clock and even with the stock cooler it can run all day at that boost clock if the task requires it (a single threaded heavy program). XFR allows for additional performance whenever possible.

The all core boosts work the same way as well. AMD rates at the base clock speed, but XFR can push beyond that when possible. Intel used to specify the all core boost and rate their chips at it, but doesn't anymore[strike] partially because people were upset that it wasn't holding the all core boost due to thermal and power limits[/strike] because they just stopped for some "unknown" reason.
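One way to see the difference between the two rating policies described above is a toy comparison (my own framing and made-up example numbers, not vendor specs):

```python
def delivered_clock_ghz(policy, base, rated_boost, xfr_bonus, limited):
    """Toy model of the two rating policies: 'amd' rates a boost clock it
    can hold even on the stock cooler, with XFR adding extra on top when
    cooling allows; 'intel' rates a capped boost that gets dialed back
    toward base when power/thermal limits are hit."""
    if policy == "amd":
        return rated_boost if limited else rated_boost + xfr_bonus
    if policy == "intel":
        return base if limited else rated_boost
    raise ValueError(f"unknown policy: {policy}")
```

Under power/thermal limits, the toy AMD chip still delivers its advertised boost while the toy Intel chip falls back toward its base clock; with headroom, the AMD chip exceeds its rating and the Intel chip merely reaches it.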

So with that explanation who do you think is really the worse of the two?
 

alextheblue


I thought Mitch cleared this up for you already with his excellent posts on the matter. Let me try to take a different tack. Ryzen reaches max boost just fine. They don't advertise that you'll get MORE than that with stock cooling; that's why they separated out XFR. So you get what you pay for (boost) with the stock cooler, but you only get to take real advantage of dynamic overclocking (XFR) with better cooling (and appropriate chipset support). So it works as advertised.

Meanwhile, Intel's situation is fuzzy. All their greater-than-base clocks are lumped into "boost". But if you run an Intel chip hard with a stock cooler, you throttle down from those top boost clocks. They are essentially doing the same thing, but they blur the lines between guaranteed speeds, more-or-less guaranteed boost (really, boost isn't guaranteed at all, but you can hold at least SOME speed above base), and what might as well be called dynamic overclocking "boost". They also (due to architectural differences) have a wider range in terms of both boost and dynamic overclocking "boost", which only muddies the issue. To reiterate: they're doing the same thing. They just obfuscate things.

Intel's chip actually benefits more from the additional cooling headroom than AMD's, IMO. So it won't help AMD all that much, by itself. But it's only RIGHT to level the playing field - if you equip one with a premium aftermarket cooler that dissipates more heat, the same has to be done for the other. Even though it won't benefit Ryzen as much, it's only fair.
 

As I said in my post above, the problem with that logic is that things get ridiculous. On a hot summer day in Australia those hybrid models will run slower. By your logic, does that mean they're "inadequate"? Or, I could put an aquarium chiller on one and it would likely boost higher still. Someone could then come along and put the chip under LN2 and it may well boost yet higher. Do you then suggest that a chiller is "inadequate"? How does that make any sense whatsoever?

from what you say, the 2700X can hit max boost with the stock cooler, and given what you say about XFR, the Noctua cooler would have allowed XFR to essentially overclock the 2700X. So everyone is upset because the 2700X wasn't overclocked while the 9900K also wasn't overclocked?
Intel have turbo boost too, which is NOT guaranteed. It drives their CPUs well above the TDP, so it will only work when the CPU is equipped with a cooler that provides the additional headroom. The 9900K still has a TDP of 95 W, but it has an all-core turbo frequency of 4.7 GHz. We need to wait for reviews for confirmation, but I'd be extremely surprised if those 8 cores and 16 threads draw less than 150 W at full load at 4.7 GHz. Put a little 95 W cooler on the 9900K and it will not perform as well as it does with the U14S.
What I'm saying is, just like the Ryzen 2700X, the 9900K will run faster with better cooling. That's precisely why it's questionable to give the 9900K a superior cooler.
 
You make it sound like AMD holds its boost better than Intel. But I've been using Intel for a few years now, and if I run Prime95, that boost isn't going anywhere, even with a cheap cooler on an i7-7700K that reaches 90-95°C.

I don't know how AMD holds boosts. But I intend to find out. ;)
 

The 7700K is actually quite unusual among Intel's recent releases because the base clock (4.2 GHz) is so close to the max turbo clock (4.5 GHz). You're absolutely correct that with a 91 W cooler under load, the 7700K should not drop below its base clock of 4.2 GHz (we'll ignore AVX loads for this discussion).

The 9900K, in the same way, should not drop under its base clock under load with a 95 W cooler. The difference is that the base clock of the 9900K is 3.6 GHz. The all-core max turbo clock, on the other hand, is 4.7 GHz (on 8 cores!). I can all but guarantee you that the 9900K won't get anywhere near that all-core turbo clock under sustained load with a 95 W cooler. We can say that with confidence because the i7-8700K blows way past 95 W to maintain its 6-core turbo of 4.3 GHz [EDIT: I later realised this is an overstatement, but the next sentence still applies]. As we've all been saying, better coolers provide the potential for higher boost clocks and better performance.
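That claim can be sanity-checked with a back-of-envelope estimate. The assumption here (mine, not Intel's spec): dynamic power scales roughly with f·V², and voltage rises roughly with frequency, so power goes roughly as f³; real silicon is messier, but the order of magnitude is telling:

```python
def turbo_power_estimate_w(tdp_w, base_ghz, turbo_ghz):
    """Rough all-core turbo power: scale the base-clock TDP by the cube
    of the frequency ratio (the approximate f * V^2 dynamic-power law)."""
    return tdp_w * (turbo_ghz / base_ghz) ** 3

# 9900K numbers from the post: 95 W TDP at 3.6 GHz base, 4.7 GHz all-core turbo.
# (4.7 / 3.6)^3 is about 2.2x, i.e. on the order of 210 W.
```

Even if the real exponent is closer to 2, the result lands well above 150 W, which is why a 95 W cooler can't hold the full all-core turbo under sustained load.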
 

Actually I should correct myself. GamersNexus found the 8700K drawing around 95W with a Blender load. That was with a good cooler so we can safely assume it was running well above its base clocks. It looks like my earlier statement about the 8700K blowing way past 95W when turboing was an exaggeration. The actual power draw will vary depending on the CPU and the load, but it looks like I did overstate the power draw on the 8700K.

In any case, the principle is the same. Intel do not guarantee turbo clocks, just like AMD don't guarantee XFR. Both fully acknowledge power draw under turbo/XFR can exceed TDP. The power draw of the 7980XE, for example, usually exceeds TDP dramatically under all-core turbo. I suspect we'll see the same thing with the 9900K. The principle of better cooling potentially resulting in better performance is beyond question.
 
Some of these replies are just vacant-minded biases in and of themselves. It's painfully obvious that this testing was a planned stunt: they calculated whether the bad reception would still be worth posting the biased results. There is no accident here; anyone who thinks or says it's a reasonable mistake by any means is lost.
 
It's actually amazing how easily they ride the media wave and just get away with whatever the hell they like. "Principled Technologies": now there's a lesson in oxymorons, kids.
 

It wasn't a mistake at all; other sites found out the "up to 50% faster" thing long ago.
Yes, the test was flawed, but even if they had done it perfectly, it would still have shown the Intel system to be 50% faster.
https://www.techspot.com/review/1655-core-i7-8700k-vs-ryzen-7-2700x/page8.html
 
After being so biased against AMD, I went and bought the Ryzen 7 2700X. And I must admit, as a gamer, it's not impressive. When Intel's cheaper i5 outperforms it, I see no reason to buy it unless I were a streamer. Except, that is, for "branding" and AMD nostalgia: I like the Ryzen branding and the nostalgia of using an AMD processor.
 


"
Closing Remarks
Having established that the Core i7-8700K is hands down faster than the Ryzen 7 2700X for gaming, it's also not a great deal faster. Realistically at 1080p with a beastly graphics card you’ll stand to gain up to 15% performance at the high-end, but will more often see gains of 10% or less.
"
Where do you see 50% faster?
 


Nobody is forcing you to buy the 2700X; go with the cheaper i5 if you like :)