Intel's Future Chips: News, Rumours & Reviews

Page 121

Gon Freecss

Reputable
Apr 28, 2015
448
0
4,810
Wrong, Coffee Lake is still using Skylake cores, not Cannonlake cores.

Cannonlake is under the same micro-architecture umbrella as Skylake, but it's not the same. It has IPC increases.
 

aldaia

Distinguished
Oct 22, 2010
533
18
18,995

Cannon Lake is also using Skylake cores. Here are some sources.
https://www.thomas-krenn.com/en/wiki/Intel_Microarchitecture_Overview
https://en.wikipedia.org/wiki/Cannon_Lake_(microarchitecture)
Cannon Lake (formerly Skymont and Cannonlake) is Intel's codename for the 10-nanometer die shrink of the Kaby Lake microarchitecture. The successor of Cannon Lake microarchitecture will be Ice Lake, which will represent the architecture phase in the Intel Process-Architecture-Optimization Model
I would appreciate sources indicating it's a new microarchitecture.

Regarding IPC, it's entirely possible to increase IPC with no changes in the microarchitecture. For instance, the 1700X and 2700X are the same microarchitecture, yet minor tweaks increased the IPC of the latter.

On that note, I have yet to see any review of Cannon Lake, since it was only paper launched. If you have one, please post a link.
 

Gon Freecss

Reputable
Apr 28, 2015
448
0
4,810
It's the same thing with Nehalem -> Westmere, Sandy Bridge -> Ivy Bridge, Haswell -> Broadwell. They have IPC increases from some architectural changes.

Cannonlake is the same as the above.

Also, about Zen+, the IPC increases are minor. They tweaked their immature cache/IMC. Intel's die shrinks (at least from the first generation) all contain some changes.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


10nm brings roughly twice the density of 14nm, so one can fit about twice as many cores in roughly the same die size. That was the original plan: 4-core Skylake on 14nm --> 8-core Icelake on 10nm.

8-core Coffee Lake is still a rumor (but a persistent one). However, Intel has 6-core parts today, so they could do a 12-core Icelake on 10nm. Technically it is possible; whether management decided to do it, I don't know. I asked a friend to see if he has any info about this.

I think what you're really asking is why they didn't just backport Icelake onto 14nm++. I believe it is because the Icelake core is wider and bigger. If the rumors are correct, Icelake has SIMD units twice as wide and 2x-4x larger caches. An 8-core Icelake die on 14nm++ would be noticeably larger than an 8-core Coffee Lake die on 14nm++.
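
Just to make the arithmetic explicit, here's a rough back-of-the-envelope sketch. The 2x density factor and the core-growth number are the rumored/assumed figures from above, not measured die areas.

```python
# Back-of-the-envelope only: all inputs are the rumored/assumed figures above.
DENSITY_GAIN_10NM = 2.0     # ~2x density going from 14nm to 10nm (the original plan)
SKYLAKE_CORES_14NM = 4      # 4-core Skylake die as the baseline

# Same die area at twice the density -> roughly twice the cores.
print(int(SKYLAKE_CORES_14NM * DENSITY_GAIN_10NM))   # 8, i.e. 4-core 14nm -> 8-core 10nm

# Why a 14nm++ backport hurts: if an Icelake core is, say, 1.5x the size of a
# Skylake core (wider SIMD, bigger caches -- rumored, purely illustrative), then
# the core area of an 8-core Icelake die on 14nm++ is ~1.5x that of 8-core Coffee Lake.
ICELAKE_CORE_GROWTH = 1.5
print(8 * ICELAKE_CORE_GROWTH, "vs", 8 * 1.0, "relative core-area units")
```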
 

aldaia

Distinguished
Oct 22, 2010
533
18
18,995
I just found a very instructive explanation, published yesterday, of what TDP is and what it is not.
Since we have had eternal discussions about TDP in this thread, I think it's worth posting here.
https://www.overclockingmadeinfrance.com/quest-ce-que-le-tdp/

Unfortunately it's in French, but the Google translation is relatively decent, so I'm posting the translation too for those who don't read French.
https://translate.google.com/translate?hl=es&sl=fr&tl=en&u=https%3A%2F%2Fwww.overclockingmadeinfrance.com%2Fquest-ce-que-le-tdp%2F&sandbox=1
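
The short version, as I read it: TDP describes a sustained power target (what Intel calls PL1), not the maximum the chip can draw; under boost it can sit at a higher PL2 limit for a while before dropping back. Here's a deliberately simplified toy model; the wattages and boost window are made-up illustrative values, and the real mechanism uses a moving-average energy budget rather than a hard timer.

```python
# Toy model of Intel's power limits -- illustrative numbers, not any real SKU.
PL1 = 95.0    # watts: sustained limit, roughly what the "TDP" figure describes
PL2 = 119.0   # watts: short-term boost limit
TAU = 28.0    # seconds: roughly how long the boost budget lasts

def allowed_package_power(seconds_into_load: float, demand: float) -> float:
    """Power the package may draw under a sustained heavy load (simplified)."""
    limit = PL2 if seconds_into_load < TAU else PL1
    return min(demand, limit)

for t in (1, 10, 30, 120):
    print(f"t={t:>3}s -> {allowed_package_power(t, demand=150.0):.0f} W")
```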
 

Gon Freecss

Reputable
Apr 28, 2015
448
0
4,810
10nm should be 2.7x as dense as 14nm. 10nm+ (the process Icelake is made on) is 10% denser than 10nm, while 14nm++ is actually a bit less dense than 14nm. So 10nm+ looks like it's going to be about 3x as dense as 14nm+ (2.7 x 1.1 ~ 3).

Yeah, the original plan was 4C Skylake -> 8C Cannonlake and then 8C Icelake. Now, it's 6C Coffee Lake (Skylake), with a rumored 8C. I can see them coming up with a 12C die for Icelake.

Ah, that's interesting. A wider core and much larger caches are gonna bring some much needed improvements. 10nm+ clocks nearly as high as 14nm++, but with much lower power consumption. Interesting times ahead.
 
Having bigger dies is good and all, but what are the plans for the iGPU then?

They're still, supposedly, improving the iGPU in the consumer-only parts, and eating into die area with more cores will limit the number of transistors they can assign to the iGPU. Or are they no longer improving the iGPU as much as they purportedly did before?

Cheers!
 

Gon Freecss

Reputable
Apr 28, 2015
448
0
4,810
Icelake will have Gen 11 graphics. Skylake is Gen 9. Kaby Lake and Coffee Lake are still Gen 9 (Gen 9.5 to be exact). Cannonlake is Gen 10 (Goldmont Plus uses Gen 9.5 execution units, and Gen 10 display). Anyway, execution units will be doubled with Gen 11 compared to Gen 9. Add architectural improvements (should be great gains), and graphics are gonna be good. Tigerlake will have Gen 12 graphics, and I expect it to improve drastically.

10nm+ is very dense, so I think 12 cores + iGPU would be plausible.

I don't think Intel will release a 10 core i5 chip nor an 8 core i3 chip. Maybe they'll go with 2 i7 SKUs.

i7: 10 and 12 cores.
i5: 8 cores.
i3: 6 cores.

What do you think?
 


Having 2 i7 flavours sounds awful, but they've done it for notebooks... I would rather they just use i9 in that case.

And as for the iGPU, what are the changes? I know they've been adding EUs with new gens, but I don't remember reading anything different from that trend.

Cheers!
 

Gon Freecss

Reputable
Apr 28, 2015
448
0
4,810
Uh, they've done it recently with Haswell-E, Broadwell-E, and Skylake-X. 6C and 8C as i7s, but that's HEDT.

I'd rather not they call a mainstream part an i9, but we'll see what happens.

I don't think we know anything beyond the EU count doubling. That hasn't happened before within the same tier: GT2 has been 24 EUs since Broadwell, which is Gen 8. GT2 for Gen 11 should be 48 EUs. Also, according to the wiki, Gen 10 GT2 has 40 EUs.
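
To put the EU doubling in perspective, here's a rough peak-FP32 estimate, assuming the usual Gen layout of two 4-wide FMA-capable ALUs per EU (16 FLOPs per clock); the clock speed is just an illustrative number:

```python
def peak_fp32_gflops(eus: int, clock_ghz: float, flops_per_eu_clock: int = 16) -> float:
    # 2 ALUs x 4 lanes x 2 ops (FMA) = 16 FP32 FLOPs per EU per clock (typical Gen layout).
    return eus * flops_per_eu_clock * clock_ghz

print(peak_fp32_gflops(24, 1.15))   # Gen9-class GT2, 24 EUs:  ~442 GFLOPS
print(peak_fp32_gflops(48, 1.15))   # Gen11-class GT2, 48 EUs: ~883 GFLOPS
```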
 


True. I forgot the X-pensive platform. It is annoying for me, TBH. At least they use numbers, so yay?

In any case, it only makes sense for Intel to bundle graphics with laptop-grade CPUs; for desktops, and especially the K versions, it makes little to no sense at all. I mean, I could understand it if all iGPUs were "Iris" parts, where they can actually pull their weight, but they're not. This is a side rant, but I don't think the iGPU improvements Intel is making (or claiming to make) are really worth the transistor count and extra heat. I've seen they'll try to start accelerating some tasks with the iGPU, but it sounds like they're really struggling to find a valid use case for it.

Cheers!
 

Gon Freecss

Reputable
Apr 28, 2015
448
0
4,810
We'll see what happens. I personally wouldn't care too much.

They do it because it's cost-effective. Why make two dies, one with an iGPU and one without? That would hurt yields and cut into margins. Also, it's pretty helpful to have an iGPU: Quick Sync is the best encoder outside of CPU encoding, and it's a feature a lot of people use. Intel can also offload work onto it. It's a good thing to have, IMO.
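
As a concrete example of the encoder point: with an ffmpeg build that includes Quick Sync (QSV) support, handing H.264 encoding to the iGPU is a one-liner. The file names and bitrate below are placeholders.

```python
import subprocess

# Placeholder file names; requires ffmpeg compiled with Quick Sync (QSV) support.
# "-c:v h264_qsv" selects the iGPU's fixed-function H.264 encoder instead of x264 on the CPU.
subprocess.run(
    ["ffmpeg", "-i", "input.mp4", "-c:v", "h264_qsv", "-b:v", "4M", "output.mp4"],
    check=True,
)
```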

Also, 48 EUs on GT2 plus the Gen 11 architectural improvements should do very well. GT3e (Iris) will only be on mobile from now on, I think.
 


Welp... i9 for laptops, lol.

https://twitter.com/IanCutress/status/986696498258866176

Cheers!
 

Gon Freecss

Reputable
Apr 28, 2015
448
0
4,810
I believe they'll use the i9 branding in the mainstream segment now. Lol

The i9 branding signifies the server architecture. The problem with using that scheme on mobile is that it offers no more cores, just a higher-binned chip.
 

aldaia

Distinguished
Oct 22, 2010
533
18
18,995
Intel Platform Vulnerability Lets Malware Erase or Block UEFI Firmware Updates
https://www.techpowerup.com/243422/intel-platform-vulnerability-lets-malware-erase-or-block-uefi-firmware-updates

A new Intel platform vulnerability emerged, chronicled by the company under CVE-2017-5703, dated April 3, which could let malware erase your motherboard UEFI BIOS, or render the EEPROM chip storing it "read-only" forever, preventing future BIOS updates, exploiting vulnerabilities in Intel's implementation of the SPI (serial peripheral interface) on its platforms. The vulnerability affects all Intel processors dating all the way back to 5th generation "Broadwell." The company quietly passed on fixes to its OEM partners to release as BIOS updates.

The vulnerability came to light in the public as Lenovo, Intel's largest OEM partner, deployed BIOS updates for its vulnerable products, while detailing it. Lenovo describes the vulnerability as "the configuration of the system firmware device (SPI flash) could allow an attacker to block BIOS/UEFI updates, or to selectively erase or corrupt portions of the firmware." It goes on to add that "this would most likely result in a visible malfunction, but could in rare circumstances result in arbitrary code execution." Intel said it discovered the vulnerability internally and hasn't noticed any exploits in the wild that take advantage of it. "Issue is root-caused, and the mitigation is known and available," the company said in a security advisory. "To Intel's knowledge, the issue has not been seen externally."
 


iGPUs are excellent for workstation PCs and low/mid range laptops. Going forward, you could see more compute/parallel workloads offloaded to them dynamically, but there currently isn't an easy way to do this.
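
For what it's worth, the usual route to those EUs today is OpenCL, which already shows why "dynamic" offload is hard: you have to discover the device and write kernels yourself. A minimal device-discovery sketch with pyopencl, assuming the Intel OpenCL runtime is installed:

```python
import pyopencl as cl

# List GPU devices and pick out the Intel iGPU, if the OpenCL runtime exposes one.
for platform in cl.get_platforms():
    for dev in platform.get_devices():
        if (dev.type & cl.device_type.GPU) and "Intel" in dev.vendor:
            print(f"{dev.name}: {dev.max_compute_units} compute units, "
                  f"{dev.global_mem_size // 2**20} MiB visible memory")

# Actually running work on it means writing kernels and managing buffers by hand,
# which is why transparently offloading ordinary code hasn't really happened yet.
```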

My stance is that iGPUs should compete with the corresponding NVIDIA x400 series card for that generation; they should be putting significant downward pressure on low to mid end GPUs.

It's also hard to just rip them out; sure, no one using a K-series chip is actually going to use the iGPU, but right now my work PC is an 8600 that's using one. That is a valid use case. Intel isn't going to do the work to remove the iGPU from just the K-series chips; at best they'll fuse it off, which doesn't gain the end user anything.

Put it this way: iGPUs are like onboard soundcards; they get the job done, but only cover the basic use case.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


That image is about transistor density, but I dislike transistor comparisons because not all transistors are the same. I prefer other metrics like this one

[Image: Intel 10nm die-area scaling chart]


So 10nm is 2.3x denser than 14nm.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


I asked him: Someone asked me this: "So, if Intel made an 8-core Coffee Lake die, do you think it's feasible that they're making a 10+ core Icelake die?"

Reply: "Absolutely. I actually expect it."
 

Gon Freecss

Reputable
Apr 28, 2015
448
0
4,810
Oh, then if 10nm is ~2.3x as dense as 14nm, 10nm+ should be ~2.5x as dense as 14nm. 14nm+ and 14nm++ are slightly less dense than 14nm.

Oh, nice.

I expect Zen 2 mainstream to top out around 12 cores: 3 CCXs with 4 cores each. That will push Intel to at least 10 cores on the top-end mainstream.
 

That sounds a bit scary, but I would imagine you still need people to actively run something to that effect with admin rights?



I don't disagree with your opening statement at all, but I still think Intel is barking up the wrong tree with their iGPUs. Adding more EUs to the iGPU is just useless for everyone who "works" with their CPU. In your own use case, those transistors would be better off invested in extra cache or more RAM channels. I really believe everything after the HD 3000 series (Ivy-era iGPUs?) is just a waste of space in mainstream CPUs. Plus, whenever I look at a desktop or laptop that HP or Dell sells to companies, they ALL come with discrete GPUs. It's really stupid even for big corporations.

Cheers!
 

Isaiah4110

Distinguished
Jan 12, 2012
603
0
19,010


I work in IT for an enterprise organization (700+ employees). None of the PCs we order come with discrete GPUs, and all are running Intel Core i7 CPUs. There are only a handful of PCs in the entire organization that have discrete GPUs, and those were added after the fact to facilitate 4K displays for improved multitasking/productivity. So I would definitely disagree with the claim that Intel's iGPUs are useless and unnecessary.

Just saying...
 


Like I said, on the low-end laptop side they're perfectly justified. All the mid-tier stuff I see from Dell and HP has a discrete video card. Even worse, they're either Quadro or FirePro cards ~_____~

For reference, I work at a place that employs a bit under 16K people.

Cheers!

EDIT: Mistake XD
 


And are you sure they don't just have discrete cards that are disabled? In our case, the laptops and PCs that do come with them ship with the cards disabled for some idiotic reason. What's more, when the models are listed, they don't even mention that they come with discrete GPUs. I find it so hard to believe that I wouldn't believe it myself if I didn't have one in my own hands to prove it. I had to download the drivers from AMD's site on my own to get the discrete card fully enabled, ask the tech team to enable it in the BIOS, and raise a friggin' request. It's like... WHY?!

But sorry, this is too long for a rant now. I'll stop here x_X

Cheers!
 

Isaiah4110

Distinguished
Jan 12, 2012
603
0
19,010


No, all our computers are desktops with Core i7 (mostly 3770, but some 4770 and 6700 IIRC) CPUs, and they do not even have discrete GPUs installed when we order them.

There is absolutely no need for discrete GPUs for "standard" desktop office usage. Those who have discrete GPUs in my organization have a 4K display (using an R7 250 to drive it) for the added productivity/multi-tasking benefits it provides.
 