Intel: 10nm Is Not Dead (It's Getting Better)


They actually shipped low-margin laptop chips for the Chinese market first (Q4 2017, IIRC). That's likely because the dies were small and the clock speeds could be low.

It was a similar story with Intel's 14 nm - they launched with lower-clocked dual-core Broadwell laptop chips. The big, high-margin server chips didn't ship on 14 nm until much later.
 

You're confusing his general doom & gloom with the specific predictions he makes about product launches/delays, code names, features, specs, etc.

He's like a gossip columnist, but instead of celebrity gossip, he's all about the secrets of Silicon Valley.


Then you obviously haven't read nearly as much of his coverage as you're letting on. He covers AMD at least as much as anyone else, reporting on their management shakeups, layoffs, bad decisions by their board, as well as product leaks.
 

They also run a foundry business, which is how their 10 nm delays cost someone dearly.

https://semiaccurate.com/2018/08/30/update-to-intel-custom-foundry-10nm-customer-meltdown/
 
Regarding Charlie, he certainly comes across as having a colorful personality. That said, his material is most of the time spot-on, way before the rest of the industry picks up on the rumors. Charlie is very knowledgeable about the industry and its plans long before everyone else gets a clue; after all, his forums are full of people who work at the companies he writes about. Plans change, situations change, markets change, and sometimes his predictions don't pan out, but I see his writing as great, informed insight into where things are going months or years before any other outlet covers it. I read his stuff as "most likely true from where we're standing now".

In general, that's how I see the SemiAccurate forums as well. There are people there with a level of knowledge (in the PC hardware space) and insight unmatched by any other forum, as many of them actually develop the things they are talking about. That's the place I trust the most with predictions, as they are very educated predictions that are more likely to materialize than your ordinary rumor mill's. Sure, they are wrong and get surprised at times (since many initiatives are successfully kept hush-hush), but less often than anyone else. I surely wouldn't attack their credibility, apart from stating the obvious: their predictions might not turn out to be true due to unexpected events or unexpected outcomes of predicted plans. They are distinctly different from news sites, even great ones like Tom's Hardware or AnandTech, which are great at reporting on released components; SA is speculating, but based on really good knowledge.
 

Why do you assume he has evidence? What if it's just good contacts? A lot of the news is based on testimony from reliable sources, rather than a smoking gun of some kind.

He didn't always have a paywall, but everyone ripped off his stories. He ensnared them a few times by intentionally reporting incorrect information (like code names) and seeing which news sites it showed up on, but even after he named and shamed the guilty, it still didn't stop.

I wish he would remove the paywall on older stories, so we could actually see his previous predictions in detail.


Blame the business model of tech news. Most sites rely on advertising and manufacturer-provided review samples. If you report their inside dirt, that will dry up in a hurry. Also, he's been in the industry and now the news business for long enough to cultivate some deep contacts. Few of the journalists who cover tech actually worked in the thick of it.



Not really. All you said was that you remember him predicting something about Nvidia failing. I'm not saying he never said such a thing, but I think you probably didn't understand when he's being hyperbolic. He has a certain sardonic tone that takes a bit of getting used to.


He had plenty of coverage of AMD's woes. You can search his site, if you don't believe me.


Hey, if you'd rather believe Intel's PR department, that's on you.
 

The 10nm CPUs that Intel supplied to select OEMs do not have a working IGP, so they aren't going to be used in laptops in remotely meaningful quantities; that would require adding a graphics solution that costs more on its own than faster existing AMD and Intel CPUs with an IGP. They are primarily meant for embedded devices where display output either isn't necessary or is handled by custom devices or self-refresh panels.

I doubt Intel is shipping those by choice. More likely, it is using the design as a test vehicle to refine 10nm and is selling viable chips at a loss to recover some of the R&D costs.
 
Are we all forgetting that Intel would take a huge SEC penalty if the info they put out today is provably false? I'd accept the reports if Intel had put out something equivocal or vacillating, but they put out unequivocal info. Granted, they're stuffed to the gills with cash, but the board of directors would have their hides if the company suddenly faced a huge SEC sanction for false or misleading info.
 
I have said it before and I will say it again: yes, Intel has had 10nm process problems, resulting in delays and yield issues. But the true problem with 10nm is that the Core architecture, at the clock speeds their current chips operate at, has reached a thermal/node wall. The first 14nm products were delayed, and then it took two generations to get back the clock speeds seen at 22nm; I'm talking about reasonable max overclocking with decent air or liquid cooling.
Essentially, the chip acts as a heat pipe. The smaller node means smaller copper on the front end, smaller mid-level vias, and smaller bumps. Silicon does not conduct heat well, so while power consumption perhaps drops by 30-40%, the cross-section of that heat pipe (the non-silicon parts) is reduced by more than 60%. Now that Intel has pulled out all the stops with the 9xxxK parts, including a soldered TIM (STIM), they are at a thermal limit, as shown by an overclocking wall at 5.1-5.2 GHz. To clock from 4.4 to 5.1 GHz, you have to double the supplied power.
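For what it's worth, the direction of that last claim checks out under the classic dynamic-power model, P ≈ C·V²·f. A minimal back-of-the-envelope sketch, with illustrative (not measured) voltage values:

```python
# Rough check of the "double the power" claim, using the classic
# dynamic-power model P ~ C * V^2 * f. The voltages below are assumed
# values for illustration, not measurements; leakage (which rises
# steeply with voltage and temperature near the wall) is ignored.

f_base, f_oc = 4.4, 5.1      # core clock, GHz
v_base, v_oc = 1.20, 1.40    # core voltage, volts (assumed for illustration)

ratio = (v_oc / v_base) ** 2 * (f_oc / f_base)
print(f"estimated power ratio: {ratio:.2f}x")  # ~1.58x from dynamic power alone
```

Dynamic power alone gets you to roughly 1.6x; leakage and the extra stability voltage needed past 5 GHz plausibly account for the rest of the claimed 2x.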
10nm will not support six- or eight-core processors clocked at 5 GHz using Intel's Core architecture, not without a chiller. This is really bad news when it comes to their higher-core-count server parts. So yes, they will release 10nm, and it will work especially well with dual- and quad-core low-power processors such as laptop chips and below. Yes, they may release desktop i5s, i7s, and i9s, but those high-power, high-core-count parts will be delayed, and they will clock at lower all-core speeds, at least for the next two generations.
Perhaps single-thread performance will remain the same or even increase a little, but this architecture has reached the end of the thermal and node-shrink road. With IPC gains of less than 3% over the last two generations, there is little hope for improvement there. AMD's 7nm Zen will release next year, but Intel will not be able to counter it with a higher-clocked, improved Core architecture. I don't see Intel leaving this rut until it has a new architecture.
 

It is absolutely possible for Intel to be making headway toward getting 10nm to market while having to limit the number of chips it makes on it due to low production volume, ultimately skipping 10nm for most mainstream products. And investors can't do much if Intel hits genuinely unexpected brick walls.

I wouldn't be surprised if we end up with a Broadwell-style situation where 10nm desktop CPUs do get launched in late 2019 but are priced out of most of the market and remain nearly unobtainable through their entire market life.
 


I see this as the most realistic scenario. The problem with holding Intel to any word on 10nm, legal or otherwise, is that they define the nodes themselves. When they finally get to "7nm", I bet it will be based on the aggressive 10nm node they have delayed for the last five years.
 
Unless Intel was leaving the chip business, there is no way they were going to abandon 10nm completely. Because if they can't get 10nm down, it's not like they'd instantly skip to 7nm and have zero problems.

Throwing out 10nm would leave them with 14nm+++++, which would definitely fall behind AMD, whose 7nm process is due to start next year. No way would Intel want to lose the single-thread IPC crown, especially after losing the multi-core crown to AMD. They'd have to become the "value" chip and give up tons of profit.
 

Even if Intel did absolutely nothing, it'd still take years for the market to shift away from Intel in large enough numbers for Intel to worry about. If Intel can't meet 14nm demand today, it won't be able to meet 10nm demand any time soon either, which means Intel's inventory will remain sold-out for quite some time to come. Companies don't drop prices on products that are perpetually out of stock.

Right now, a good chunk of AMD's sales are simply due to people and companies being unable to obtain the Intel chips they want at a remotely reasonable price. Based on Zen 2 rumors, you can expect AMD's next-gen chips to be roughly even with Intel's, and that usually isn't enough to convince people to switch vendors, so AMD will still be the underdog through simple market inertia even if it pulls slightly ahead. It'll take at least another two years of AMD further (im)proving Zen/+/2/2+ and itself as a company before enough people consider AMD their first choice and Intel has to begin worrying about it.
 
As to Charlie's credibility, I have not read his stuff, so I can't comment on his record specifically. But I will say that claiming he isn't credible simply because his record isn't 100%, or because he earns a living through subscription fees to read his full stories, is wrong-headed. If you get into the business of making predictions, you're going to be wrong eventually; that is why most news sources avoid it. And if your writings are critical of the very people you're writing about, you can't rely on their advertising money, hence the subscriptions. His actual record, and any bias therein, is completely fair game.

As to the statement and its SEC implications, that is a red herring. Look at the Tesla situation: the CEO blatantly and outright lied, with the apparent intent of manipulating the company's stock, and all they got was basically a slap on the wrist. As for Intel's statement from their PR arm, it takes only a couple of folks in an office continuing to review design concepts or production plans for one day beyond the date of the statement to put them in the clear.

Now to the crux of the story. It would be hard to blame Intel for looking to walk away from the 10nm node, especially with their rival moving to 7nm already. I'm sure they would rather keep working on it and fulfill their originally stated goals, but with as many delays as they've had, and with further issues seemingly still cropping up, the better approach would likely be to keep working on whatever aspects can be transferred to the 7nm node and focus on that. Spending more time on 10nm runs the risk of delaying any future work on 7nm for a prolonged time, giving AMD a path toward stepping an entire node or more ahead of them in R&D. If I were Intel, I wouldn't want to simply bank on AMD stumbling to get back a big edge. I would want to keep pressing my advantages where I can.
 
If Intel did kill 10nm, they would have to write off all their fab investment in 10nm. I would guess Intel will find some way to not do this, even if the fab utilization isn't as high as they predicted. Even that lower utilization would probably show up as an impairment.
 
Neither one fits. Why is Tom's Hardware trying to push all this RTX news? Seriously, the RTX is like 10% more horsepower than the GTX series; it's not really that great of a deal, and Linus Tech Tips already debunked the hype. Plus, there are really no RTX games to compare the RTX with, so... keep your 1080 Tis a little longer, folks; the new cards are really not worth it!
 


Because that "7nm" is about as good as Intel 10nm would be.

https://www.semiwiki.com/forum/content/7602-semicon-west-intel-10nm-gf-7nm-update.html

That is, if Intel gets 10nm out at all rather than moving to a different process node.



If you haven't read his articles or been around for the history, it's not easy to see why I don't take his word for it; and I am not the only one, plenty of people see him the same way.

My only complaint about the paywall is this: he is making massive claims that can cause a lot of issues for said company, and, as I stated before, he is for some reason the only one who seems to have any contacts inside Intel, yet to people who won't pay him he offers nothing more than the assertion that he heard it from a source. That makes it hard to find those statements credible for such big news.

Just to give a similar example, WCCFTech is seen as a rumor mill. They will post any rumor they see about a product. Hell, even on the same day they will post one rumor, then another that contradicts the previous one. They just want the clickbait. Sometimes the rumors are correct. Should we trust them?

We have had forum members that show obvious bias to one team or another. Should we trust their information?

My point is that if someone in the news shows some bias, then you cannot take their word at face value. That's how I see it with Charlie. He is not willing to freely provide proof for a massive claim, and Intel instantly denied it. It's hard to say he is correct.
 

Nah, Intel will get 10nm to work since it can't get to 7nm without solving whatever problems it is already having with 10nm first and once 10nm is sorted out, Intel will use the living daylights out of what 10nm capacity it can copy-exactly in the fabs that have been upgraded. However, due to 10nm being so late, I could imagine many fabs that were originally targets for 10nm migration but not upgraded yet skipping to 7nm instead.

It costs billions of dollars to tool up a chip fab for a smaller process, even Intel would have a hard time writing that off with no results to show for it. Intel's 10nm may never reach full-scale production across multiple fabs like 14nm has but it will happen at a still significant scale simply because Intel is too many (tens of) billion dollars deep into it.
 


Intel is already retooling Fab 42 for 7nm. I think they said it's a $7 billion investment.
 
Intel WILL come out with some process they will call "10nm". There is no way in hell they will say it's canceled and call the next process 12nm or something else.

That said... the 10nm process they have been having problems with for three years now is almost certainly dead. They have likely canceled all the parts they were having trouble with and relaxed the process to make it work. They will still call it 10nm, but it's very likely not the same 10nm they have been talking about for the past few years.

They can't afford to screw around anymore. They need to get what works out the door and drop the stuff that isn't working. I'd be shocked if this is not exactly what they are doing.
 

You seem preoccupied with desktop/laptop, but AMD's prize is actually cloud. And there, it has a large enough price advantage that simple economics should be pulling away a significant amount of business from Intel.

Somewhere, I read Intel's new cloud chief quoted as saying he wants to keep AMD's cloud market penetration below 20% in 2019. Given that they were basically at 0% in 2017, that speaks volumes.
 

Intel's 10 nm is not EUV-based. They could use EUV in whatever follow-on process and just call that 7 nm, to match what others are doing.


One of the first lessons they teach in business school (I'm told) is the concept of a sunk cost. Once you conclude that an approach is not workable (or profitable), you stop investing in it. It's said to be "throwing good money after bad".

Intel already dropped its CEO and reorganized. So it seems like they've probably cut everyone whose fate was too deeply entangled in the old plan. That should clear the way for them to do what's right for the business.
 

Okay, then it sounds like we agree.

And my point was that cost is actually a big driver for that market. AMD's multi-die strategy gives it an unassailable cost advantage over Intel.