Intel's Future Chips: News, Rumours & Reviews

Page 137
That's a good sign. As I said, it's similar to the Z68. Z68 had new features, the main one being LucidLogix Virtu, which allowed the system to use the built-in IGP on the desktop but the discrete GPU when gaming.

Basically it added new features that weren't required for the new CPU itself. I hope Intel sticks to this path. I actually liked the old setup: a new CPU launching with a new chipset, the die shrink/enhancement staying compatible with the old chipset, and a new chipset with new features if you wanted them (think 3/4 series).
 


I totally agree with you :)
 
Intel guts 10nm to get it out the door
What is going wrong with 10nm and what has changed
Aug 2, 2018 by Charlie Demerjian

So what is actually going wrong with Intel’s 10nm process? SemiAccurate has been researching that question for years and it is finally time for a comprehensive answer.
About a year later, in their Q1/2018 earnings call, Intel partially admitted what SemiAccurate had been saying for over a year: 10nm is due in Q4/2019, best case.
What Intel is not telling you, or the analysts, is that the 10nm you may get in late 2019 is not the 10nm they had intended to come out in 2015. More importantly, this new process is a significant step backward from the 10nm they promised and touted at their Manufacturing Day. How much of a step backward? Several of SemiAccurate's moles are saying it is effectively a 12nm process rather than a 10nm process, and the technical changes more than back that claim up. Don't expect this to ever be publicly admitted; it is still '10nm' and always will be, even if the tech doesn't back that name up.
https://www.semiaccurate.com/2018/08/02/intel-guts-10nm-to-get-it-out-the-door/
 
To be fair, it never really has been 10nm, much like AMD's 7nm is not really 7nm, nor is Intel's or GloFo's 14nm really 14nm.

Again with Charlie though. Man he has an article about it out every other day. Swear he is just trying to stir the pot.
 


Hey, Charlie has to justify those subscription costs somehow.
 
You guys can say what you want about Charlie, but he has been right about 10nm. You guys might not want to acknowledge that for whatever your reasons are, but there are many others who do respect his opinions, and they pay for a subscription. Sure, he adds some hyperbole at times, but the crux has been accurate. Late 2019 falls in line with Intel's internal roadmap. Problems with 10nm and multi-year delays are now common knowledge, and rumors from other sources suggesting Intel was changing 10nm to a less ambitious design are being confirmed by Charlie in this article. If you don't want to believe Charlie, you really shouldn't believe anything Intel has been saying for the last 3 years about 10nm, since it was supposed to start ramping in late 2015.
 
Ran across this video today; it's a great watch if you have the free time (41 minutes long). Computer buffs might recognize the name John Hennessy 😉
[video="https://www.youtube.com/watch?v=Azt8Nc-mtKM&feature=youtu.be"][/video]
The future of computing: a conversation with John Hennessy (Google I/O '18)
 
 


I am not saying I believe anything any company says; I only believe it when it actually is in the hands of trusted reviewers like TH or Anandtech.

However, my issue with Charlie is just that. He has been, on multiple occasions, a doom-and-gloom guy for Intel or Nvidia, and he also used to be an AMD "white knight". That leaves a bad taste in your mouth and makes it hard to take anything he says at his word. People can trust him all they want. Until 10nm is launched and someone delids it and digs into it, I won't trust a thing he says.

BTW, I don't see any links to anything other than his own articles. That slide is one thing, but an actual link to an actual statement from Intel might make it easier to believe a word of it.
 


I only believe it when it actually is in the hands of trusted reviewers like TH or Anandtech.
This is understandable; there are a lot of rumors floating around, and Charlie cites his sources as "Intel moles," so there should be some degree of skepticism. I'm just pointing out that he appears to have been right about 10nm. Maybe we will find out the truth eventually, but we know for sure that Intel has been lying about 10nm for multiple years. It's always right around the corner! We went from a late-2015 ramp-up to a maybe-late-2019 ramp-up. How many years do you have to be lied to before you start looking for answers somewhere else? Charlie just happens to be offering answers, even if the sources can't be verified by us personally. Just saying.


 


I agree that Intel has been pushing 10nm back. Lying is a stretch. I would say they tell people what they are expecting, but as with anything, that can change.

I don't trust that Intel will have it out until it's out. However, I would trust the information they give about the process over Charlie's, especially when his comes from nothing but "Intel moles" who could be entry-level people with no real knowledge.

Using that kind of reporting is just a way to generate revenue. True tech reporting would use only information that is verifiable not just by the author but by other people and sites as well. It's why I trust TH more than a lot of other sites: they try to stay away from rumors (although recently they have run more than usual) and they only report verifiable information.

Until Charlie does that, instead of using unverifiable sources, he is doing nothing but clickbaiting. Right or not, his style of reporting is why I would rather not read anything he says.
 
[attached images]

We'll have to see if Intel changes its messaging, but we're a long time removed from the Tick-Tock cadence. Considering that Intel hasn't delivered a smaller process in significant volumes since 2014, it's fair to say that the original Moore's Law is officially dead.
https://www.tomshardware.com/news/intel-cpu-10nm-earnings-amd,36967.html
Intel News
Intel CTO weighs in on the evolution of Moore’s Law: https://www.eetimes.com/author.asp?section_id=36&doc_id=1333549 … via @eetimes
https://twitter.com/intelnews/status/1025408756459352065

The Continuing Evolution of Moore’s Law
Michael Mayberry
8/2/2018 07:01 PM EDT
Moore’s Law is dead – Long live Moore’s Law! This was the essence of the debate at DARPA’s Electronics Resurgence Initiative (ERI) Summit in San Francisco.

But to understand the debate, we need to agree on what is meant by Moore’s Law.

Gordon Moore's original observation in 1965 was that as you packed more functions into an integrated circuit, the cost per function decreased. The first part of this observation is about economics, and its essence has remained constant even as the underlying technology and rate of improvement have evolved.
https://www.eetimes.com/author.asp?section_id=36&doc_id=1333549
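
As a back-of-envelope illustration of that cost-per-function point (a toy sketch with made-up numbers, not figures from Intel or EE Times): if the cost of fabricating a chip stays roughly flat while the transistor budget doubles each generation, the cost per function halves each time.
[code]
# Toy illustration of Moore's cost-per-function observation.
# All numbers are invented for the example; none come from Intel.
chip_cost_usd = 50.0          # assume a roughly constant cost per chip
transistors = 1_000_000       # starting transistor budget

for generation in range(1, 6):
    transistors *= 2          # each generation packs in ~2x the functions
    cost_per_m = chip_cost_usd / (transistors / 1_000_000)
    print(f"gen {generation}: {transistors:>11,} transistors -> "
          f"${cost_per_m:.4f} per million transistors")
[/code]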

This is just what I have on hand, but there are more misleading statements from earnings calls, basically saying 10nm is right around the corner, and it never materializes. When someone promotes something with dates, then resorts to doublespeak about future goals after the fact, it does not inspire confidence in anything they say. You get the feeling you just walked onto a used car lot and their employee of the month comes out to greet you with open arms and a hearty handshake, right after you've seen both sides of the highway littered with cars left for dead, still brandishing dealership tags!
 
So you would say that scientists who predicted certain technologies would be around by year X were lying?

Again, it's a stretch to call it that. Some of the information you have is very old, and as with anything tech-wise, anything can happen. Even within a year things can go wrong.

I wouldn't call AMD's Phenom I disaster them lying. I would call it the same as Intel's 10nm issues: an unfortunate cascade of problems. Remember, Phenom was supposed to launch at 2.8 or 2.6GHz. Then 2.6 or 2.4. Then 2.4. It finally launched at 2.2GHz. AMD didn't lie; they just ran into a situation that was not predicted. Intel isn't telling a lie. They have just run into issues that have caused them more problems than they imagined. If anything, they were overzealous and extremely ambitious with 10nm, and it has caused them to fail to deliver the product when intended.

Again, I am not saying I believe they will have it out in 2018 or 2019, as I wait until I see the actual product on shelves. But to say they were lying, about something that is extraordinarily hard to design in the first place and where issues can cause delays, is a stretch.
 
a·pol·o·gist
/əˈpäləjəst/
noun
noun: apologist; plural noun: apologists
a person who offers an argument in defense of something controversial.
"an enthusiastic apologist for fascism in the 1920s"
synonyms: defender, supporter, upholder, advocate, proponent, exponent, propagandist, champion, campaigner; informal: cheerleader
"one of Eisenhower's better-known apologists"
antonyms: critic
But to say they were lying, about something that is extraordinarily hard to design in the first place and where issues can cause delays, is a stretch.
You are taking an apologist view of Intel.

get oneself into or out of a situation by lying.
"you lied your way on to this voyage by implying you were an experienced sailor"
(of a thing) present a false impression; be deceptive.
"the camera cannot lie"

They are clearly being deceptive and presenting false impressions, which is, by definition, lying.
 
I guess we just have different viewpoints. This all started because I don't like Charlie's way of writing an article. I can post many he has done in a similar style, where he spells doom and gloom for other companies, yet those companies are still around, some stronger than others.

You may think I am being an apologist, though I have never apologized for anything, but I think I am being fair and objective. I stated I will wait for verifiable sources rather than unverifiable "moles", and that makes me an apologist.

Yay.
 
I really don't want to argue over such petty things, I'm just pointing out inaccuracies. I would much rather discuss topics about technology.
http://www.tomshardware.com/forum/id-1581001/intel-future-chips-news-rumours-reviews/page-69.html#21200639
You contradict yourself,
I am not saying I believe anything any company says
and then say
an actual link to an actual statement from Intel might make it easier to believe a word of it.
http://www.tomshardware.com/forum/id-1581001/intel-future-chips-news-rumours-reviews/page-69.html#21201310
Clearly, you say you would believe something if Intel made a statement about it.

I stated I will wait for verifiable sources rather than unverifiable "moles", and that makes me an apologist.
I did not call you an apologist for that statement or anything about Charlie; I even mentioned it would be prudent to be skeptical of him and his sources.
I clearly point to the statement found in this post.

Again, I'd much rather we get back on topic and abandon this tit-for-tat discussion for something more technology-based.
 
Intel is not going to lie in any legally actionable way. They will get creative with the truth and/or lie by omission (tricky to prove).

Any company would, though. Intel has just hired the best people in the world to do that 😀

Cheers!
 


That's not what I meant. I mean that Charlie has been known to post for posting's sake with information, as in that last article, that is not 100% verifiable by an outside source. If he had some link where Intel made the statement, or if others were reporting the same thing without linking back to his article, it would be 1000x easier to believe what he is stating is true. I am not saying I would believe Intel; I am saying that if they provide the information, it's easier to trust that what the writer of the article says is probably true.

It doesn't take an insider to see Intel's 10nm is not doing as well as it should be. I could easily write an article, take some rumors, claim I have a mole (it would be even easier for me since I live in Chandler), and state that's what's happening.

As I said, it all comes down to different views. I don't respect Charlie or his writing because of the way he goes about it.

Now, if TH had written the same thing, I would trust it more, mainly because, as I said, TH tends to wait for verifiable evidence before posting.

Either way I am done with this discussion.
 


He was wrong about 14nm. And 22nm. And 45nm. And so on. But if you proclaim doom and gloom for long enough, eventually one of your predictions will end up being correct.
 
Intel has no chance in servers and they know it
Numbers straight from the horse's mouth
Aug 7, 2018 by Charlie Demerjian

Servers – The New Spin Battleground:

Why is this relevant? Mainly because 10nm is now 4 years delayed (Note: That means a 6-year shrink cadence, not a 4-year one), so badly that multiple generations of Intel server products are delayed too. Intel's Purley, aka Skylake-EP/SP, was a solid chip betrayed by a 3x price increase. OEMs are reporting that sales are awful, mainly due to the fact that the volume SKUs are TCO underwater compared to their Broadwell-EP predecessors. What did Intel do to fix this product mess? Forced customers onto Purley. If you don't think this will have long-term effects… well, they are already visible if you know where to look.

Then we come to Cascade Lake, the successor to Purley. It brings quite literally nothing to the table; it is a minor bug fix to Purley and nothing more. OK, with the Meltdown and Spectre patches it will slow down a bit, but there is nothing really new in Cascade Lake. TDP goes up from ~160W for mainstream Purleys to ~200W for mainstream Cascades, which is how they get the very modest performance increases.

Couple this to some, but not all, of the features promised for Purley and you have Cascade Lake. No more cores, no more memory channels, no more PCIe lanes, and nothing to close the yawning gap to AMD’s Epyc. Performance does go up though, but less than the TDP increase as a percentage. How much? 6-8% on a per-socket basis meaning Cascade will still be TCO underwater compared to 2015’s Broadwell-EP.

Luckily Intel has a cunning plan there too: raise prices from Purley's ~$13,000 to ~$20,000. No, that isn't a joke; a 6-8% performance boost almost completely due to TDP raises comes with an ~$7,000 price increase. Did we mention AMD's Epyc, which is about 15% slower on a per-socket basis, costs less than 1/4th as much? And has more PCIe lanes, more memory channels, and more cores, though it does take more energy. Over the service lifetime, SemiAccurate feels safe in claiming that an Epyc box won't consume very much of the $15,000+ delta, per CPU mind you, in electricity, even at the high rates in some countries.
Cooper Lake:

Cascade is not due out until Q4/2018 for the hyperscalers and Q1 for the rest of the world, a ploy that shattered Intel partners' trust the last time the company played this game. According to the latest leaked Intel roadmap, Cascade won't start its volume ramp until ~1Q before AMD's monster Rome CPU. (Note: The leading edge of the boxes is for PRQ, not for volume release, so add at least 1Q to the times.)

It won't be a fair fight. Why? Rome will beat Cascade by more than 50% in per-socket performance, likely tie or win on a single-threaded basis, and more than double Cascade's core count. Please note that by more than 50% we don't mean a little more, we mean a lot more; think abusive rather than hair's-width margins.

There are Cooper Lake SKUs that can close the gap a bit; the 3-die, water-cooled, 350W Cooper-AP that requires new infrastructure will roughly halve the performance gap at a much higher price and significantly lower TCO. How bad is Cooper Lake? Normalized to Purley it is a bit less than 40% faster. This may seem like enough to hold the line, but Cooper is not set to come out until about a year AFTER Rome, will still have six memory channels, fewer PCIe lanes, and all the rest. And that is for the monster 3-die, new-socket version; the mainstream Cooper Lake won't even reach those uncompetitive heights. Intel putting Cooper Lake out is nothing more than a desperation play.

Ice-ing On The Spin:

Cooper Lake is due out in early 2019, at the moment, and is going to be followed by Ice Lake, which SemiAccurate exclusively told you was a mid-2020 product. Guess what? Ice Lake is slower than AMD's Rome. Significantly slower. That is OK though, because AMD will have Milan out at the same time as Ice Lake, and Milan raises the bar by solid double-digit percentages once again. Ice? Intel is claiming that it will raise the bar by less than 20%, but that is before the process changes from the old 10nm to the new 10/12nm take a bite out of the gains. SemiAccurate hasn't seen the new numbers in enough detail to say much, other than that performance will go down from the current goals.

SemiAccurate has laid out a pretty stark picture of Intel's performance and market competitiveness over the next 3-4 years. We are highly confident in the information presented because the majority of it comes from Intel's internal outlook and documentation that was shown to us. They know they have no chance in their most lucrative core market, and are trapped between raising prices to keep margins up and cratering market share. Either way they lose, because they aren't close in performance.

https://www.semiaccurate.com/2018/08/07/intel-has-no-chance-in-servers-and-they-know-it/

It's a long article, there is more information to be found by using the URL above.
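
One back-of-envelope check of the electricity claim near the top of that piece: even a generous per-socket power penalty barely dents a ~$15,000 price delta over a server's service life. The wattage, lifetime, and rate below are my own assumptions for illustration, not figures from SemiAccurate or either vendor.
[code]
# Rough sanity check: can Epyc's extra power draw eat a ~$15,000 price delta?
# Every input here is an assumption for illustration, not a vendor figure.
extra_watts = 50        # assumed extra draw of the Epyc socket at full load
hours_per_year = 24 * 365
service_years = 4       # assumed service lifetime of the server
usd_per_kwh = 0.30      # deliberately high electricity rate

extra_kwh = extra_watts * hours_per_year * service_years / 1000
cost = extra_kwh * usd_per_kwh
print(f"{extra_kwh:,.0f} kWh extra -> ${cost:,.0f} over {service_years} years")
# ~1,752 kWh -> ~$526; even doubled for datacenter cooling overhead, that is
# an order of magnitude below the quoted $15,000+ per-CPU price delta.
[/code]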
 
There's a simple "trick" Intel can pull off to increase core count without significantly changing the design of, well, the core: less cache, more cores, a bit more TDP. This is thinking monolithic design though... AMD has taken a radically different approach to this, so it's not really apples to apples.
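
As a rough sketch of that cache-for-cores trade under a fixed die-area budget (every figure below is invented for illustration; none are real Skylake numbers):
[code]
# Hypothetical die-area budget showing the cache-for-cores trade described
# above. All area figures are invented for illustration, not real Intel data.
die_area_mm2 = 400.0
uncore_mm2 = 100.0         # assumed fixed area: mesh, memory/PCIe controllers
core_area_mm2 = 9.0        # assumed area of one core plus private caches
l3_mm2_per_mb = 1.5        # assumed area of 1 MB of shared L3

def cores_that_fit(l3_mb: float) -> int:
    """Cores left after the uncore and the chosen L3 take their share."""
    free = die_area_mm2 - uncore_mm2 - l3_mb * l3_mm2_per_mb
    return int(free // core_area_mm2)

for l3_mb in (60, 40, 20):
    print(f"{l3_mb} MB L3 -> {cores_that_fit(l3_mb)} cores on the same die")
[/code]
Shrinking the L3 buys a handful of extra cores at the cost of cache per core.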

Maybe that is what Charlie is alluding to? Even if Intel can increase the core count to match AMD, they will sacrifice IPC in the process, one way or the other.

So this begs the question: what's more important in the current Server world? Per-core performance or moar coars?

Intel won't lose the performance crown for individual cores, but losing another important metric must hurt in a growing market for sure...

Well, time to speculate peeps!

Cheers!
 


It's interesting, as AMD once criticized MCM and said monolithic was the way to go. I think Intel should go back to MCM, though. It's obviously better for TDP, and it's easier to cram more cores in. Omni-Path should allow them to communicate very quickly.

That said, Charlie needs to keep up on things better. Purley is not the previous generation. Purley is the platform, which includes the chipset, etc.; it is the new one, including Omni-Path. Skylake-SP was the first CPU on the Purley platform, with Cascade Lake being the refresh, although Cascade does have one thing to set itself apart from Epyc, and that's Optane DIMMs. The 8-socket part can take as much as 6TB of Optane, which is quite an advantage: if you can load up the system and use it for storage, you can eliminate a pretty big bottleneck. Of course, it depends on a lot of factors.

He is also predicting some pretty major performance gaps. He has done this in the past, but I will wait to see both sides before jumping to conclusions. There is a lot to a server, and the CPU is just one part.

I would also say that per-core performance is more important, but it still helps to have more cores per socket: the more cores per socket, the fewer total sockets needed. It's not a 100% win either way.
 


Neither; performance per watt dominates. Server loads do tend to be more sensitive to core count, however, but let's not pretend Xeons don't exist or anything.
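
To make the metric concrete, here's a minimal sketch of how a performance-per-watt comparison shakes out (both parts and all their numbers are hypothetical, just to illustrate the trade-off):
[code]
# Performance per watt for two hypothetical server CPUs.
# Throughput units and wattages are made up purely for illustration.
systems = {
    "many-core part": {"throughput": 100.0, "watts": 180.0},
    "fast-core part": {"throughput": 90.0, "watts": 205.0},
}

for name, s in systems.items():
    perf_per_watt = s["throughput"] / s["watts"]
    print(f"{name}: {perf_per_watt:.3f} units of work per watt")
[/code]
The part with slower individual cores can still come out ahead on this metric.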
 