Intel Coffee Lake (8th & 9th Gen Core CPUs) + Skylake-X Refresh & W-3175X MegaThread! FAQ and Resources

Page 8 - Tom's Hardware community forum


Just because a CPU has more cores doesn't mean it needs a beefier cooler.
I'll admit a larger die, a bigger IHS, and improved power delivery can, depending on the application, help offset thermals while increasing power consumption within a given range.

In fact, Coffee Lake is noticeably more efficient per watt vs. Kaby Lake.
Depends on application and workload, but for the most part I agree.
 
Efficiency re-review of Covfefe Lake from TR: http://techreport.com/news/32666/revisiting-the-power-consumption-and-efficiency-of-intel-core-i7-8700k

The R7-1700 is a good match to the i7-8700K in terms of efficiency.

And I don't know why the "cooling" discussion has been taking so long... You can cool Covfefe Lake with a 95W-TDP-rated cooler. Just don't expect miracles if you try to OC it, and make sure you apply the paste properly so it doesn't throttle. The crappier the cooling you put in a closed case, the more throttling you will see. Hell, I'd even say your case airflow is more important than the HSF you choose (not talking water).

Cheers!
 


Interesting results
the Core i7-8700K consumes an estimated 2% less energy than the Ryzen 7 1800X, and about 4% less energy than the Ryzen 7 1700X, but it's anywhere from 2% slower against the 1700X to 7% slower against the 1800X.
If you're going for the least power consumed for rendering at the expense of a slightly longer task-to-completion, the Ryzen 7 1700 emerges victorious. It's also the only 65W CPU on our test bench, so keep that in mind.
Overclocking the Ryzen 7 1700 and i7-8700K does let them finish the bmw27 render faster, but the extra power consumed this way outstrips the time saved by hastening the pace of number-crunching on both chips. The Ryzen 7 1700 at 4 GHz finishes about 4.4% faster than the i7-8700K at 5 GHz, and it consumes 2% less power.
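
The point in the quote above is that energy-to-completion is average power multiplied by task time, so an overclock that finishes a render faster can still burn more total energy. A minimal sketch with made-up numbers (not the review's actual measurements):

```python
# Illustrative (invented) numbers: energy-to-completion = avg power * time.
# An overclocked run that is 20% faster but draws 50% more power consumes
# MORE total energy for the same task.
def energy_wh(avg_power_w: float, time_s: float) -> float:
    """Total energy consumed for a fixed task, in watt-hours."""
    return avg_power_w * time_s / 3600

stock = energy_wh(avg_power_w=120, time_s=300)  # slower but frugal
oc = energy_wh(avg_power_w=180, time_s=240)     # faster, but thirstier overall
print(f"stock: {stock:.1f} Wh, overclocked: {oc:.1f} Wh")
```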
 

The 1600 is only 5% faster multi-threaded, while the i5-8600K is 35% faster single-core.
http://cpu.userbenchmark.com/Compare/Intel-Core-i5-8600K-vs-AMD-Ryzen-5-1600/3941vs3919
 
^ the 1600 is a straight $100 cheaper though ($120-150 if you buy a really good cooler for the 8600K & account for motherboard price differences)

I can't understand for the life of me why people keep comparing Ryzen to the 8600K; that's a vast price difference.

It should be compared to the 8400 at the absolute most, and even then it's $20-30 more for the Intel, because a good B350 board only costs $80.

In the UK you can do a Ryzen 1700 build for about £40 more than an i5-8400 build, & about £60 less than an 8600K build.
 


Because it's mostly an exercise in, "I want what I want and you need to want it, too."
 
And correct me if I am wrong, but the 8600K wins in gaming, while the 1700 wins in "productivity".

And although that seems straightforward, every review I watched said don't get the 1700, get the R5 1600 instead. However, the 8600K annihilates the 1600, as it should (because it costs more).

Thus I am even more confused about what is best.
 


Please post exactly which reviews you were watching.
 
For straight gaming the 8400/8600K/8700/8700K are all better than any & all Ryzens 'at this point in time'.

That shouldn't make you think the Ryzens are bad, because they're not at all.

It's the same as it's been since Ryzen's release: 144Hz screen & a powerful enough GPU & you want to play AAA titles at 120fps+ ??
Unlocked Intel, plain as day.

Streaming/multitasking/background rendering while you're gaming ??
Coffee Lake i7 or Ryzen 6c/12t, 8c/16t.

Sub-75Hz screen & a mid-range GPU ??

Makes no odds at all.

Said this before: the ONLY Coffee Lake CPU that competes directly is the i5-8400 vs the Ryzen 1600, because it's the same price.

Every other combination has a wildly staggered price point, so to me they shouldn't be directly compared.

There is no right or wrong answer at all here.
 


That review uses a different concept of efficiency for its final graph. Using the ordinary concept of efficiency, the i7-8700K was more efficient than the R7-1800X, because the 1800X was 7.93% faster in Blender but consumed 9.15% more power.

Also, he only tested the top i7-8700K model, which will have worse efficiency than lower-clocked Coffee Lake models. It is the reason why the 1700 is more efficient than the 1800X: performance varies roughly linearly with frequency, but power consumption varies nonlinearly. So reducing frequency increases efficiency, everything else being equal.
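
The "performance linear, power nonlinear" argument can be sketched with the textbook dynamic-power model P ∝ C·V²·f: in the upper clock range, voltage has to rise roughly with frequency, so power grows closer to f³ while throughput grows only with f. The constants below are arbitrary illustrations, not measured values for any of these chips:

```python
# Rough dynamic-power model: P ~ C * V^2 * f. If voltage scales roughly
# linearly with frequency near the top of the curve, power grows ~f^3
# while performance grows only ~f, so perf/W falls as clocks rise.
def rel_power(f: float, f0: float = 1.0) -> float:
    return (f / f0) ** 3  # P proportional to V^2 * f, with V proportional to f

def rel_perf(f: float, f0: float = 1.0) -> float:
    return f / f0         # throughput scales ~linearly with clock

for f in (0.8, 1.0, 1.2):
    eff = rel_perf(f) / rel_power(f)  # performance per watt, relative
    print(f"clock x{f}: perf x{rel_perf(f):.2f}, power x{rel_power(f):.2f}, perf/W x{eff:.2f}")
```

Under this model, dropping clocks 20% gives up 20% performance but saves roughly half the power, which is exactly why the 1700 beats the 1800X (and an underclocked 8700K beats a stock one) on efficiency.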

Other reviews measured efficiency. Kitguru used CB15 and got that the i7-8700k was 24% more efficient than the R7-1800X.

https://www.kitguru.net/wp-content/uploads/2017/10/Power-per-Cinebench.png


HFR measured efficiency using x264 workload. Again the i7-8700k was more efficient than 1800X.

http://www.hardware.fr/getgraphimg.php?id=609&n=1


The more efficient chip was the i5-8400, again due to lower clocks.
 


I guess it depends on the title, but personally I have a 1080 and a 1700X and have had no issues getting above 100-120fps at 1440p in all my titles.
 


They can also claim the Earth is flat... it doesn't mean it is. I think it is common sense that a 95W i7-8700K at stock cannot consume similar power to a 180W TR-1950X. As mainstream reviews show, even when the i7 is massively overclocked to 5GHz or 5.1GHz, it still consumes less power than the TR-1950X.

"Auto settings" enables MCE on some mobos. MCE means the mobo no longer runs the chip at Intel specs; instead it locks the all-core turbo to the single-core turbo speed and over-volts to sustain the higher clocks. Serious reviews only got 80--90 ºC when the chip was massively overclocked to 5.0--5.1GHz, whereas temps were low at stock settings.

Another pair of useless/biased reviews from ArsTechnica and LinusTech. Nothing new here.
 
This is the only one I could find right off hand:

https://www.youtube.com/watch?v=eYD4vtelXEg

As far as the R5 vs R7 videos... I don't even know anymore, as I've watched so many between Bitwit, LTT, Jayz2Cents, Paul's Hardware, Hardware Unboxed, Science Studio, etc. The distinct impression I got was that the 1600X is VERY close to as good as the 1700, and that the 1600 can be OC'd to be as good as the 1600X... ergo... buy the 1600.

Currently this is me at 1080p: "sub-75Hz screen & a mid-range GPU".

Hopefully in 1-2 years I'll have a high end graphics card and a new monitor for 4K @ 60Hz.

Gaming is predominantly my major use, but I always think my son will get into streaming or editing. I guess it doesn't make much sense to purchase parts based on "what-ifs".

Will the i5-8600k be able to handle 4k?

Maybe I should just buy the damned 8700k and not worry lol.
 
@rob.salewytsch
The CPU has little to no bearing on resolution; CPUs handle physics & calculations, not graphics.
If your GPU can handle the resolution with absolutely no stress, the CPU will pump out exactly the same fps at 1080p/1440p/4K regardless.


4K 60Hz? Any CPU from the Ryzen 1600 up can handle that, paired with a powerful enough GPU.

60Hz essentially means 60fps (unless you're a CS:GO script kiddie who thinks 200fps on a 60Hz screen is somehow beneficial).

Ryzen's fps foothold is in the 90-120fps range, depending on the game.
 
@madmatt30

So... the GPU handles the resolution but the CPU determines how many FPS are pumped out? I would have thought both resolution and FPS were graphics-based outputs. Obviously I would have figured wrong.

I do wonder why the FPS I see in review videos (sorry, no links) go down as the resolution goes up, even when they keep a GTX 1080 Ti for all tests. Is it because the GPU is actually stressed and your comment

"If your GPU can handle the resolution with absolutely no stress the CPU will pump out exactly the same fps at 1080p/1440p/4k irregardless."

is more theoretical than practical? I would have thought the FPS would stay the same as pixel density goes up (even though that sounds ridiculous) if the CPU was determining FPS output. Wrong again, obviously.
 
^ The fps going down as resolution increases is down solely to the GPU; the CPU plays no part whatsoever in this.

Imagine a game that the 1080 Ti can easily manage at 100fps at 1080p, 1440p & 4K.

Imagine you have a CPU that can also hold a stable 100fps in that game.

Now if you artificially lock the fps to 100 so it's prevented from going higher, CPU usage would remain exactly the same no matter what resolution you use.
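
The back-and-forth above boils down to a simple bottleneck model: delivered fps is capped by whichever of the CPU or GPU is slower at a given resolution, and only the GPU side falls as pixel count rises. A toy sketch with invented numbers:

```python
# Toy bottleneck model: delivered fps = min(CPU fps, GPU fps).
# CPU fps is roughly resolution-independent; GPU fps drops as pixel
# count rises. All numbers here are invented for illustration.
def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    return min(cpu_fps, gpu_fps)

cpu_fps = 100                                    # same at any resolution
gpu_fps = {"1080p": 160, "1440p": 110, "4K": 55}
for res, g in gpu_fps.items():
    print(res, delivered_fps(cpu_fps, g))
```

This also explains the testing methodology discussed below: benchmarking at 1080p or 720p with an overpowered GPU pushes the GPU term far above the CPU term, so the measured fps exposes the CPU's limit alone.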
 
Ok, I get it. The GPU can't handle the same FPS as the resolution increases.

So then why do all these videos, which are supposed to be comparisons of CPUs, show all the FPS for the resolutions of 1080/1440/2160 if it has nothing to do with the CPU?
 
^ I have no idea ;-)

CPU testing to determine its gaming limits =

Stupidly overpowered GPU (1080/1080 Ti)
1080p (or even 720p) resolution

Max settings, disable vsync - fps minimums/maximums/averages are then down solely to the CPU - what you see are solely the CPU's limits, as you've removed any chance of GPU limitation.

You then get the odd numpty though who will exclaim 'that's stupid, who runs a 1080 Ti at 1080p?' when the whole point of the exercise is to remove any chance of a GPU bottleneck.
 


Here's the interesting page, since the image is not being displayed: https://www.kitguru.net/components/leo-waldock/intel-core-i7-8700k-and-core-i5-8400-with-z370-aorus-gaming-7/6/

They don't test the R7-1700 there, but given how the 1600X is right next to the i7-8700K, I would imagine the 1700 is up there as well. Keep in mind both have similar ranges: 3.6GHz base, 3.7GHz all-core, 4.0GHz 2-core and 4.1GHz XFR for the 1600X; and 3.0GHz base, 3.1GHz all-core (not 100% sure on this one), 3.7GHz 2-core and 3.75GHz XFR for the 1700. The interesting part is when the software is heavily multi-threaded: the 1700 will have monster efficiency and there's no way around it. Intel will have parity (or way better efficiency) with the i7-8700 instead, but until then, the R7-1700 is just better. Deal with it.

Also, interesting that in that same link they prove you wrong with the i5-8400. It's right *after* the 1600X.

As for the frenchy site, well, look at the R7-1700 being more efficient than the i7-8700K. Funny how 2 pieces of evidence you provide contradict each other. What gives?

Cheers!

EDIT: Tag hell.
 
Harder, better, faster, stronger.
by Michael Higham on October 5, 2017

Temperatures

As stated earlier in the methodology, we used the NZXT Kraken X62; an all-in-one liquid CPU cooler with a dual-fan 280mm radiator. However, the 8700K still got pretty hot. At idle, the CPU sat at a mild 32 degrees Celsius and went up to 78 degrees Celsius under load during our runs of X264. With using the 5.0GHz overclock profile, it reached 86 degrees Celsius under load in X264, which is considered higher than desirable.
 
Core i7-8700K Review: Coffee Lake Brews A Great Gaming CPU
by Paul Alcorn October 5, 2017 at 6:01 AM

I just wanted to point at Tom's Hardware's test setup; pay attention to the cooler used! They were able to just dial in whatever CPU temperature they wanted!
Test Systems
We introduced our new test system and methodology in How We Test Graphics Cards. If you'd like more detail about our general approach, check that piece out.
In this case, only the hardware configuration with CPU, RAM, mainboard, as well as the new cooling system are different, so the summary in table form gives a quick overview of the systems used:
Cooling (Germany): Alphacool Eiszeit 2000 Chiller (1,500W cooling capacity), Alphacool Eisblock XPX, Thermal Grizzly Kryonaut (for cooler switch)

[video="https://www.youtube.com/watch?v=VDEu7zPTNu4&ab_channel=AlphacoolInternationalGmbH"][/video]
Notice: cooled with a chiller at 20°C water temperature.

AVX without offset pushes the result as high as 170W. The Core i7-8700K at 4.9 GHz even throttles due to its package temperature. And that's in spite of our compressor cooler's efforts! Thermal paste under the IHS does us no favors.

The above graph shows that a closed-loop liquid cooler is able to keep an overclocked Core i7-8700K from throttling after 20 minutes of warming up. A good heat sink and fan combination should perform almost as well, again, given ample airflow.

Under our stress test, the overclocked processor gets uncomfortably hot, even under our compressor cooler.
While we're only measuring an average of 170W, thermal throttling keeps the 180W+ peaks from becoming our average power consumption result. At that point, even the most powerful coolers have to throw in the towel.

To be sure, it's surprising just how much power such a tiny processor can consume once it’s pushed to its limits. Nevertheless, Intel’s Core i7-8700K is relatively easy to cool, even on air. You'll just want to stay away from taxing rendering sessions and AVX-optimized workloads. At that point, you're best off with an all-in-one closed-loop liquid cooler.

Edit: Notice
graph shows that a closed-loop liquid cooler is able to keep an overclocked Core i7-8700K from throttling after 20 minutes of warming up.
The Alphacool Eisbaer 420mm AIO keeps the i7-8700K from throttling after 20 minutes of warm-up. But how long does it take for the water in a 420mm AIO to absorb heat and become saturated? More than double that 20-minute warm-up!
 


While going through this, I just realized that all the Coffee Lake chips (i5/i7), except the i3 of course, can handle just about any GPU you throw at them. So actually comparing the gaming benches is a moot point, and hence why we see a similar range of gaming performance from all the Coffee Lake processors right now. The cards are just not able to challenge the processors. It's when more powerful cards are introduced a couple of years down the line, and the i5s start bottlenecking while the i7s get optimized, that the actual class of each processor will become more prominent. :)
 


As explained just above, efficiency increases when clocks are lower. Underclock the 8700K and you increase its efficiency. No mystery here. This same physical law is the reason the 1700 is more efficient than the 1700X and the 1800X: same chip, but lower clocks equals more efficient. Take an 1800X and underclock it to 1700 levels and you get the same efficiency.

As shown, the i7-8700K is more efficient than the 1800X. The only reason your link claims the 1800X to be more efficient is that it changed the definition of efficiency; using the proper (standard) definition, the i7-8700K was more efficient (7.93% faster vs 9.15% more power).

The links aren't contradicting each other, because efficiency is not constant across workloads. What the three reviews show is that the i7-8700K is more efficient than the R7-1800X on three different workloads: Blender, x264, and CB15.

So the i7-8700K is cheaper, faster, and more efficient than the R7-1800X.
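
The arithmetic behind the claim can be checked directly from the relative numbers in the post (1800X 7.93% faster in Blender, 9.15% more power than the 8700K). Under the standard performance-per-watt definition, both the perf/W ratio and the relative energy-to-completion come out in the 8700K's favor:

```python
# Relative numbers from the post: 1800X is 7.93% faster on Blender
# but draws 9.15% more power than the 8700K (8700K normalized to 1.0).
speedup = 1.0793  # 1800X relative performance
power = 1.0915    # 1800X relative power draw

perf_per_watt = speedup / power     # < 1.0 means 8700K wins on perf/W
energy_to_finish = power / speedup  # > 1.0 means 1800X burns more energy per task
print(f"1800X perf/W ratio: {perf_per_watt:.4f}, relative energy: {energy_to_finish:.4f}")
```

So when speedup grows more slowly than power draw, both metrics agree; a review can only flip the ranking by weighting raw speed into its "efficiency" score.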
 


Temperatures in the 90 ºC range when overclocked to 4.9GHz. As stated before, the 92 ºC reported by LinusTechTips for stock settings (3.7GHz) was crazy nonsense.

Also, the temperatures LinusTechTips reports cannot be stock temperatures. At stock, i7-8700K temperatures are around 50--60 ºC.



https://www.kitguru.net/wp-content/uploads/2017/10/xTemperatures.png.pagespeed.ic.NROMEs-ck3.jpg

Temperatures in the 90 ºC range, as LinusTechTips measured, are only obtained with the CPU overclocked to 5GHz or so. Kitguru got 80--90 ºC. TechRadar measured a peak of 87 ºC when pushing the CPU to 5.1GHz.