IBM and partners including AMD first to 22nm

For Intel, not TSMC. My point is, like I said, more and more are catching up. All the comments about IBM, whether they're going to sit on it or deliver, are just that: speculation. These are TSMC's, ATI's and nVidia's plans.
 
Interesting, but I do not think Intel will allow them to be the first to debut actual CPUs on the market. We shall see, although what significance it will have at that point in the future is uncertain at best.

>Interesting, but I do not think Intel will allow them to be the first to debut actual CPUs on the market.

Why? Intel allowed AMD to be the first to debut native quad core processors on the market.

IBM is much larger than Intel, by the way.

 


IBM and AMD first at 22 nm, challenge Intel’s manufacturing lead

Hardware

By Wolfgang Gruener

Monday, August 18, 2008 16:49

Yorktown Heights (NY) – IBM and its chip development partners made a stunning announcement today, apparently beating Intel to the successful production of the first functional 22 nm SRAM cell. 22 nm processors are still three years away, but IBM’s news is a good sign that chip manufacturers will be able to scale to this new level by the end of 2011. It appears that, for the first time in several decades, Intel may have to put some extra time into its research and development efforts to make sure it can keep its manufacturing lead at 22 nm and beyond.

SRAM chips are typically the first semiconductor devices to test a new manufacturing process, serving as a precursor to actual microprocessors. The devices developed and manufactured by AMD, Freescale, IBM, STMicroelectronics, Toshiba and the College of Nanoscale Science and Engineering (CNSE) were built in a traditional six-transistor design on a 300 mm wafer and had a memory cell size of just 0.1 μm², which compares to Intel’s 45 nm SRAM cell size (the test chip that preceded today’s 45 nm processors) of 0.346 μm².
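The gap between the two cell sizes quoted above can be roughed out with a quick back-of-envelope calculation. This is only a sketch using the article's figures; the density function ignores array overhead and peripheral circuitry, which eat into real-world density:

```python
# Reported 6T SRAM cell sizes from the article, in square micrometers.
ibm_22nm_cell_um2 = 0.100
intel_45nm_cell_um2 = 0.346

def bits_per_mm2(cell_um2):
    """Idealized bit density: cells per mm^2 (1 mm^2 = 1e6 um^2),
    ignoring sense amps, decoders and other array overhead."""
    return 1e6 / cell_um2

ratio = intel_45nm_cell_um2 / ibm_22nm_cell_um2
print(f"22 nm cell: {bits_per_mm2(ibm_22nm_cell_um2):.2e} bits/mm^2")
print(f"45 nm cell: {bits_per_mm2(intel_45nm_cell_um2):.2e} bits/mm^2")
print(f"~{ratio:.2f}x more bits in the same area")  # ~3.46x
```

In other words, the demonstrated 22 nm cell packs roughly 3.5 times as many bits into the same silicon area as Intel's shipping 45 nm cell.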

A 22 nm chip is still two generations away, and AMD has yet to catch up with Intel at 45 nm. Intel presented its first 32 nm SRAM cell wafer in September of last year and is not expected to show 22 nm SRAM cells for at least another year, while the first 32 nm CPU prototypes could be shown at IDF this week.
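The two-generation gap can be put in perspective with ideal geometric scaling. The textbook rule of thumb, which is an assumption here since actual node names and achieved densities diverge from it in practice, is that each full node is roughly a 0.7x linear shrink, i.e. about half the area per transistor:

```python
# Ideal scaling between the nodes named in the article.
# Assumption: node names track minimum feature size, which is
# only approximately true for real processes.
nodes_nm = [45, 32, 22]

for a, b in zip(nodes_nm, nodes_nm[1:]):
    linear = b / a              # linear shrink factor
    area = linear ** 2          # area shrink factor
    print(f"{a} nm -> {b} nm: {linear:.2f}x linear, {area:.2f}x area")
```

This prints shrink factors of about 0.71x/0.51x for 45 to 32 nm and 0.69x/0.47x for 32 to 22 nm, which is why each node roughly doubles transistor density.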

IBM said that it is on track with its 32 nm process and promises that it will use a “leading 32 nm high-K metal gate technology that no other company or consortium can match.” IBM did not provide further details to substantiate this claim. Intel, for its part, has been using its own high-K metal gate technology since the introduction of the 45 nm Penryn processors in late 2007.

While we are far from actual 22 nm and 32 nm products, it is clear that IBM and its partners are turning up the heat on Intel. For the first time in decades, there could actually be an interesting race towards a new production node.

http://www.tgdaily.com/content/view/38941/135/
 
^IBM is right that currently no other "consortium" can match their 32 nm high-K process, but Intel is not a consortium; they are a single company.

TSMC may hit 40 nm, but they are using a different process, and how long will that process hold out? Will it be able to reach 40 nm without leaking more power and producing more heat?

GPUs already run very hot, at 60-80°C under load, so how well will this process node hold up?
 
Currently, the threshold, or the highest I've seen, is over 100°C! It'll hold, but there are always cooling solutions out there. I'd add that even though you'll see them at 40 nm, it doesn't mean they'll run cooler or use less power, not without diminishing performance. GPUs haven't even reached that level yet. As far as server-related GPUs go, the performance per watt is so much higher that it isn't even a question. Who knows? Everyone says 22 nm could be the end, but GPUs haven't incorporated HKMG yet, and may never need to. Time will tell.
 



They just don't play well with others. As soon as they got the money, they began to crap on everyone. Unfortunately, IBM couldn't just stop x86-based boxes. The world would be a better place if Intel wasn't such a... a... I can't even describe them. It takes a lot of money to make CPUs, and Intel takes full advantage of that fact. I wonder why the companies that created the LCD, the TV tube or the EFI for cars aren't like Intel.

All of those businesses enjoy healthy competition.
 




The process has very little to do with heat production. That's down to MHz and the amount of active logic. The process can be tuned for better transistor performance, but any 40 nm process will be approximately the same for the same circuits. As HiK shows, it's more about materials than process (the same companies sell everyone their litho equipment). It just sounds like you can't stand anyone but Intel making any kind of ICs, especially if they're good at it.
 
^Um but the owner of the patent gets royalties. Like Blu-Ray. Sure a lot of companies actually produce the players but Sony is the main company and gets royalties for each unit/movie sold.

Same goes for the other items you listed. And in the CPU industry it's no different. The only problem is that it costs more to produce and research a CPU than it does to just make a new TV. Most of those use technologies developed by another company; the DLP HDTVs, for example, use DLP chips made by TI.

But hey, if you want to try to break into a market where there are only 2 dominant players while having to keep up with ever-changing technology, go for it. But think of this: there are only 2 GPU makers, ATI and nVidia... essentially it's the same as CPUs with AMD and Intel...
 



So you mean I missed the big battle between the EFI, LCD and TV tube makers. Crap. That would have been good. Especially since there still seem to be multiple players, with no price wars designed to hurt competitors while doing nothing for consumers.

Business is business. There were more than two GPU makers and there still are; they are just in different markets. There used to be something like five x86 makers, but Slot 1 killed most of them. Intel purposefully left the VX chipset just to see who could afford to make a chipset. Hell, ATi used to be on EVERY Intel mobo (Rage Pro, anyone?). Then Intel decided they didn't want to share that either, and graphics have been the worst ever since.
 


But like you said, business is business. Intel is still making gobs of money and meeting their market's requirements; if their GPCPU does even 10% better than their current IGP, they will have a winner with respect to market demands/needs. All in all, being first to any node is more bragging rights than anything; it's the player that can leverage their production capacity and IC design at that node that will win. Frankly, IBM and their console chips don't spell leverage, just cost reduction on their part.

Word, Playa.
 



I'm not really sure what point you're trying to make. Aren't you tired of hearing horror stories about the treatment of OEMs and the general contempt Intel seems to have for every company in the PC/server business? Only MS has been as bad in terms of product tying, FUD and abuse of monopoly.

I know I am. I just want fair and open competition. If the other guy is better, you don't threaten chipset shipments to keep them down. You don't join a non-profit consortium and crap on its product while extolling your own, after you turned down the makers' request.

That's too much for me. I would support a boycott of Intel just for the XO debacle. I can't even express how low I think that was.
 
^No, but I get sick of people talking about Intel and MS that way and trying to make other companies such as AMD out to be angel companies when it's not true.

Was it not wrong for AMD to say no to the little OEMs when they tried to buy some of AMD's X2 CPUs? Yes, it was wrong. Do you talk about that? No, you only talk about Intel.

IBM is a freaking company out for one thing and one thing only: your money. AMD wants it too. Don't bring up BS when you know AMD would have done the same damn thing as every other company out there, and that's do whatever gets you more money to expand and create more, and then sell more for what? More money.
 


*non-angry tone*

OK, let me get this out in the open: I frankly don’t care anymore about the politics in the industry; it’s all the same and has been like this since before I could read. Fact is, Baron, all I want to do is talk about the technical aspects, the ins and outs of the technology, the why-x-works-better-than-y type of deal. That’s all that interests me now, hence me generally never even saying boo to you or anyone else, simply because the thread content doesn’t interest me.

Hell, if we could have a civil conversation about coding, I would be happy, but the only fellow that ever engaged me on such topics is at XCPU's, and their forums are still buggy, so I don’t post there.

So this is how it is, Baron: you want to talk the politics of the industry, I am all for it, but I won't actively try to engage in that conversation. When the conversation gets into the ins and outs of the IC hardware we all adore, by golly, son, I'll be there with my blue hat on and my green shirt (yes, I am aware I have nothing to cover my shame, and darn it, I likes it that way).

*smile*
Word, Playa.
 


Baron, I have to call you on this. The process means literally everything for heat production (well, aside from what kind of design you are going for). While you are absolutely correct that heat grows (supralinearly) with switching speed, it is most certainly NOT true that a set of transistors from different companies will give you the same thermals.

How, then, do you explain the greater thermals on the AMD chips, clocked at the same speed as intel chips? At 65nm? Where "materials" don't come into play?

I hate to break it to you, but *materials* ARE a fundamental part of the process. You'll also find (if you ever become an FSE and work on the litho tools) that the tools are only *part* of the story; it's how the company sets them up, and the recipes are a VERY CLOSELY GUARDED SECRET. You wouldn't believe how many times they tell the engineers NOT to leave screens unattended, or let the FSEs go wandering about. The tools are almost secondary; it's the setup that counts.

And then you're disregarding all of the strain/stressors that go into improving performance at a lower voltage (to give you better thermals). Or the changes in gate oxide thickness (modulated thermals). Or dopant levels. Or channel lengths. Or contact resistance. I could go on and on. Each and every one of these is a finely tuned piece of the overall process we call "45 nm".

To say that frequency is the sole driver of heat production is a complete fallacy.

And if you're not convinced, go check out the IEDM data, and note how you see some transistors with a higher switching speed, but lower leakage.
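The point above can be made concrete with the standard first-order CMOS power model: dynamic power P_dyn = α·C·V²·f plus a leakage term P_leak = V·I_leak. The formula is textbook, but the capacitance, activity factor and leakage current below are made-up illustrative numbers, not any vendor's actual figures. Voltage is a knob the process largely sets, and because it enters squared, two processes at the same frequency can land on very different thermals:

```python
# First-order CMOS power sketch. All parameter values are
# illustrative assumptions, not measured data.
def chip_power_watts(v_dd, f_hz, c_farads, alpha=0.15, i_leak_amps=5.0):
    """Dynamic power (alpha * C * V^2 * f) plus static leakage (V * I_leak)."""
    p_dyn = alpha * c_farads * v_dd ** 2 * f_hz
    p_leak = v_dd * i_leak_amps
    return p_dyn + p_leak

# Same clock, two hypothetical processes: one needs 1.3 V,
# a better-tuned one hits the same frequency at 1.1 V.
f = 2.4e9      # 2.4 GHz
c = 1.5e-7     # effective switched capacitance (illustrative)
p_a = chip_power_watts(1.3, f, c)
p_b = chip_power_watts(1.1, f, c)
print(f"1.3 V process: {p_a:.0f} W, 1.1 V process: {p_b:.0f} W")
```

With these assumed numbers the 1.1 V process dissipates roughly 71 W against about 98 W for the 1.3 V one at the identical clock, which is the sense in which voltage, gate stack and leakage tuning, all process properties, drive thermals alongside frequency.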