The future of chipmaking and fabs?

one-shot

Distinguished
Jan 13, 2006
1,369
0
19,310
Intel wouldn't be known as the dinosaur, the company would just outsource to a Fab that can economically produce the chips. Intel wouldn't run itself into the ground by building Fabs that wouldn't return a profit. C'mon, JD, you know that. It's just a change in the industry that has to happen. Moore himself predicted the trend of growing costs of Fabs relative to the cost of microprocessors. All is well, Intel isn't going anywhere.
 
It's what the article was saying: if this continues and there are no more combined fab/chipmakers, then that's what I mean. Intel may be the only one left, and that's not so far off. Once Intel goes fabless, it'll mean what I said.

It'll be interesting, if this continues, to see whether Intel holds onto its fabs or outsources. It appears there is a change underway, and for as much crap as AMD took for doing this, it seems they were just ahead of the pack.

I think 450 mm wafers and the coming smaller processes will reduce the number of fabs needed to begin with, even with all the projected growth in putting chips in everything, which Otellini hinted at in the article. So, no more "can't produce enough for demand," and none of that "once GF really gets going, they're going to stick AMD in the back of the bus" thinking.

To me, whoever gets to the next level using whatever we're going to use instead of Si will have the advantage. That's going to take a lot of money, much more than we've seen in the past, and for anyone holding fabs, it just may turn out to be prohibitive.
 

wuzy

Distinguished
Jun 1, 2009
900
0
19,010
No matter how the economics spin, the end result is that whoever has the lower-cost chip at around the same performance is the one that wins.
Surely a company as big as Intel will adapt to changes when the time is right.
 

caamsa

Distinguished
Apr 25, 2006
1,830
0
19,810
Other companies may be able to compete against Intel at the low end, but at the high end Intel rules.

You need cutting-edge technology/design, and that is where Intel is in the lead.


"Others, such as AMD, have spun off manufacturing completely (although AMD’s decision had much to do with a lack of cash after it bought ATI, a maker of graphics chips, for $5.4 billion in 2006)".

I think AMD is still in the game, but they had better come up with a better design or they will remain at the lower/budget end.
 
But as we near this transition and GF gets better, the gap will close on whose costs are lower. GF has the funding, as do Intel and TSMC; everyone else? Maybe Samsung as well, but that's it.
For one, there won't be a need for all the others, and even the list in the article, plus what I've mentioned, may be too many.
All I'm saying is, the day may come when the Intel we know now, fabs intact, changes.
 
I'd add that as SoCs become more and more common, outsourcing becomes more likely at Intel.
If Larrabee doesn't pan out for graphics rendering, that too could drastically change Intel's overall plans. If it's merely good enough but can't keep up with traditional GPUs, they'll have to look elsewhere for a solution. I know it's all maybes, but it is all very possible. This in itself could put Intel behind, and outsourcing would be sped along at a greater pace, or as the need arises.
This of course depends mostly on AMD, and partly on Nvidia, once Fusion happens, in both HW and SW solutions.
Here's what I'm talking about; this explains it much better than I can:

http://venturebeat.com/2009/05/22/interview-stanfords-bill-dally-leaps-from-academia-to-the-computer-graphics-wars/

Here's a quote:

"However you package it, the PC of the future is going to be a heterogeneous machine. It could have a small number of cores (processing units) optimized for delivering performance on a single thread (or one operating program). You can think of these as latency processors. They are optimized for latency (the time it takes to go back and forth in an interaction). Then there will be a lot of cores optimized to deliver throughput (how many tasks can be done in a given time). Today, these throughput processors are the GPU. Over time, the GPU is evolving to be a more general-purpose throughput computing engine that is used in places beyond where it is used today."
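A toy sketch (my own illustration, not from the interview) of the latency-vs-throughput distinction Dally is drawing: a chain of dependent steps can only go as fast as a single core, while a pile of independent items can be spread across many cores.

```python
from concurrent.futures import ThreadPoolExecutor

def latency_bound(x, steps=100):
    # Serial dependency chain: each step needs the previous result,
    # so only single-thread speed (latency) helps; extra cores don't.
    for _ in range(steps):
        x = (x * x + 1) % 1_000_003
    return x

def throughput_bound(items):
    # Independent items: total completed work per second (throughput)
    # is what matters, so the work spreads across many workers --
    # a stand-in for the many small cores of a GPU.
    with ThreadPoolExecutor(max_workers=4) as pool:
        return list(pool.map(latency_bound, items))
```

A "heterogeneous machine" in Dally's sense pairs a few cores tuned for the first pattern with many cores tuned for the second.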

If this becomes "the" change, then along with the higher costs of fabbing, Intel has to pursue both and hit their mark. I'm not saying they won't, or can't, but it could change the way they do things, especially, like I said, if LRB doesn't deliver.
 
jaydee, in your anti-Intel mantra, you are overlooking Intel's highly diversified, HUGE R&D budget. Think back to when the NetBurst P4s hit the power and thermal wall. They looked around, noticed their Israeli R&D shop was doing some relevant work, and, almost overnight, the Core 2s arrived.

As far as AMD's concerned, they have some pretty good engineers - unfortunately hamstrung by less-than-stellar management.

 
OK, I admit it was harsh, mainly because a lot of people jumped on AMD for doing what everyone is doing, though AMD was forced to do so, as I'm sure the others are also.

It's possible Intel can hold onto its fabs, but I've a feeling many will be gone or closed. The Intel we know today won't exist. That's a better and likely truer statement.

I shouldn't bash Intel for AMD's failings, but people sure bash AMD, even though this trend isn't exclusive to AMD and most likely will include Intel as well. Better?
 
True, but here's where they have real problems. They'd be competing against GF, TSMC (to which they're already outsourcing), etc., all of which have serious backing, mainly through governments, with which Intel truly can't compete. Eventually that may force Intel's hand here, which is the point of my thread, as well as the article.
 

wuzy

tbh I'd rather Intel fall flat on its face with x86. Without M$ support for x86 they'd have nothing; they'd be nothing.
I'm a supporter of ARM, and it's quite obvious to me that the performance/watt (and power management) advantage held by ARM is way beyond what Intel can cook up using x86 for small CE devices.
Although WinCE and *nix have been running on ARM for ages, recent rumors have emerged that M$ might port full-blown Windows to ARM in a year or two.
ARM will always have the advantage over x86 in the small CE market.

All that's left for Intel is the desktop and selected server/HPC markets running x86(-64). I want to see someone come up with a royalty-free architecture and a major software vendor like M$ supporting it; then it'll really be over for Intel. [/day dreams]

Anywhoo, back to topic...
 
wuzy, here's where you forgot something. x86-64 was AMD's idea. Intel was going to do something different for a 64-bit instruction set, but AMD was first to market with a 64-bit instruction set, and MS told Intel they were not going to support two different instruction sets.

And for the consumer and business market, I cannot see MS doing anything different.
 

wuzy

Ah yes, I did forget about that. :lol: Killing x86(-64) would mean bye-bye AMD as well. Whereas Intel still has its fabs, AMD has none now, whoops.

But still, wouldn't it be great to have a free market like the one ARM is enjoying now in the small CE market, and have that same idea of a free market applied to desktops/workstations as well? [/dreams on]
 

jennyh

Splendid
The most interesting part of that was this:

"In 1966 a new fab cost $14m. By 1995 the price had risen to $1.5 billion. Today, says Intel, the cost of a leading-edge fab exceeds $6 billion."

We can pretty much assume that 28nm, 22nm and below are going to cost even more. Even if they all go to one fab with 450mm wafers, the cost of progress is going to hit a $ barrier soon enough. TSMC built two gigafabs at a cost of $8bn-$10bn each... on revenues of $10bn. What? That can't go on much longer, yet there is no other way to put more transistors in the same space.

edit, i did of course mean billions, not millions.
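Putting rough numbers on that trend (my own back-of-the-envelope arithmetic, taking "today" as 2009, around when this was posted):

```python
def cagr(start, end, years):
    # Compound annual growth rate between two cost points.
    return (end / start) ** (1 / years) - 1

early = cagr(14e6, 1.5e9, 1995 - 1966)  # $14m (1966) -> $1.5bn (1995)
late = cagr(1.5e9, 6e9, 2009 - 1995)    # $1.5bn (1995) -> ~$6bn (2009)

print(f"1966-1995: {early:.1%}/yr, 1995-2009: {late:.1%}/yr")
```

Even the slower recent rate (roughly 10% a year) doubles the cost of a leading-edge fab about every seven years, which is why so few firms can keep building them.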