Intel Makes 22nm 3-D Tri-Gate Tech for Ivy Bridge

Status
Not open for further replies.

aaron88_7

Distinguished
Oct 4, 2010
609
0
19,010
I'm sure 3-D will also equate to three times the price. I have a feeling the cheapest Ivy Bridge will be $500....

I'll still buy it though... as long as you get what you pay for!
 
G

Guest

Guest
3-D thin transistors mean more and more transistors can be fabricated together; low operating voltage means low operating temperature. This is the revolution in the semiconductor and fabrication industries.
A 3-D revolution in semiconductors.
Intel
 

kronos_cornelius

Distinguished
Nov 4, 2009
365
1
18,780
IBM has similar technology, so AMD has options to license the 3-D gate. I like AMD because they don't have the brute-force cutting-edge tech, so they are forced to think more intelligently about design (the Athlon and multi-core were the first instances). That is why I think AMD will ultimately go with the ARM architecture. No matter how small the chip gets, if an x86 part uses 1 nW at 10nm, ARM can use 0.1 nW at that size because it has a better design. And Intel does not have a monopoly on innovation, so IBM will catch up a few months after.

Intel will have a tough time staying on top despite this innovation. Still, Intel is a very impressive company, having stayed #1 in its business longer than IBM or Microsoft did.

 

silverblue

Distinguished
Jul 22, 2009
1,199
4
19,285
I do agree that AMD has their backs up against the wall when it comes to coming up with something to combat this; however, the size of your R&D budget isn't the be-all and end-all. Remember that AMD bought NexGen, whose technology went into the K6 and K7, and NexGen was fabless (just like ARM, and now AMD).
 

rantoc

Distinguished
Dec 17, 2009
1,859
1
19,780
[citation][nom]kronos_cornelius[/nom]No matter how small the chip gets, if an x86 uses 1nwatt at 10nm, ARM can use .1nwatts at that size because it has a better design.[/citation]

The word "better" should be used with care here. ARM is better from a power perspective, making it great in hardware like phones and tablets. x86, on the other hand, is better in performance, making it ideal in workstations, desktops, etc. It's clear you care more about power consumption than performance, but that doesn't make the rest of the world feel the same way.

It's fun when people compare x86 vs. ARM; it's like comparing a Toyota Prius to a V12-powered street racer. Both have their pros and cons, and both are designed for different uses. The question is what the hardware manufacturer wants to achieve. I'm personally in it for performance, but that doesn't make me scream that x86 is better designed. As stated above, it's designed for one purpose while ARM is designed for another, and both do great at what they're designed to do!
 

pelov

Distinguished
Jan 6, 2011
423
0
18,810
[citation][nom]rantoc[/nom]The word better should be used with care here, Arm is better from a power perspective making it great in hardware like phones & tablets. The x86 on the other hand is better in performance making it ideal in workstations, desktops ect. Its clear you care more about power consumption than performance but that don't make the rest of the world feel the same way.Its fun when people compare x86 vs arm, its like comparing a Toyota Prius vs a v12 powered street racer car. Both have their pros and cons, both designed for different uses. The question is what the hardware manufacturer wants to achieve. I'm personally in for performance but that don't make me scream that x86 is better designed, as stated above - its designed for one purpose while arm is designed for another, both do great in what their designed to do![/citation]

That's true, but what happens when the two big x86 companies try to build a small pocket rocket? Or do we start with a Prius and build an electric sports car? The two markets are separate and they have different needs. But as of right now, the x86 chips they're producing and have in the pipeline have an absolute mountain to climb when it comes to power consumption due to architectural design. Regardless of how much you shrink them, the RISC/ARM stuff will have a big advantage in wattage consumed (especially at idle, where ARM parts consume tenths of a watt).

The fact is I don't need to run F@H or play a 3D DX11 game on my phone or tablet. And I have a fear that the focus on the mobile market will ultimately cost us, the PC crowd, in the end. Both Intel and AMD will be pinning their hopes on x86 dominance from mobile to server, but that sort of approach will mean that performance isn't as sought after when designing a chip. "It's great, but is it mobile?" As a PC enthusiast, that's not the sort of question I'd like a CEO to be asking.
 

zak_mckraken

Distinguished
Jan 16, 2004
1,592
0
19,780
[citation][nom]Twist3d1080[/nom]Just a question to all the Intel fanboys and AMD haters, why waste your time commenting on things you don't like? Normal people stay away from things they dislike, yet you guys show up in droves to display your illogical emotions. Did AMD steal your girlfriend? Sleep with your mom? Wreck your car? It just seems weird thats all. Not a fanboy, more of a observer of peculiar human behavior.[/citation]
Welcome to the internet. To the left, you have the typical troll, spewing nonsense in the hope of baiting an actual human being into arguing with him. To the right, you have your average fanboy, who will mostly post delusional comments about his BFF company. Kinda like a woman defending her abusive husband because "she can change him".
 

verbalizer

Distinguished
All this speculation and supposed 'reading between the lines' to come up with all these possibilities is getting ridiculous..
Only time and actual product releases + benching will tell.
I am patiently awaiting...
 
AMD's joint venture with IBM paid off with SOI, but I guess that's nowhere near as spectacular as Tri-Gate. I guess AMD should have made some kindergarten infomercial like this one.

@fazers_on_stun
The Athlon XP clocked lower than Intel but was able to keep up with P4 performance, and that could only be achieved by architectural optimization: clock for clock, the Athlon XP was able to get more relevant instructions executed than the P4. But I do agree with you that AMD has fab issues (the capital required to fab silicon is ridiculous, and it's the main reason why there are only a handful of silicon fab plants out there; then again, ARM seems to be doing quite nicely and they don't have a fab plant to their name). The fact that AMD was able to keep their graphics cards' performance on par with Nvidia's while remaining at 45nm, without growing the die size, has to be a testament to the kind of architectural optimization AMD can do. As for the 4-core thing, last I checked I don't see a decent i5 with fewer than 4 cores....

There's a white paper I recall reading a year or two ago - I think JimmySmitty originally linked to it - that shows SOI's benefits decrease as the node size shrinks. IIRC, SOI wafers cost some 30% more than strained silicon, as there are extra processing steps involved.

I agree with you that the Athlon was faster than NetBurst even at slower clock speeds - as I mentioned a few posts above, it was the branch mispredictions that caused the P4 to stall out while flushing all those pipeline stages (32 or so, IIRC). The Athlon had much shorter pipes and was able to recover much more quickly from a bad branch. Bulldozer has pretty long pipes, but probably much improved branch prediction, so it shouldn't suffer from the stalling-out effect.
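The flush cost described above can be sketched with some toy arithmetic (the branch frequency, misprediction rate, and pipeline depths here are illustrative assumptions, not measured figures for any real chip):

```python
# Toy model: effective CPI = base CPI + branch_freq * mispredict_rate * flush_penalty,
# where the flush penalty is roughly the number of pipeline stages
# that must be refilled after a mispredicted branch.

def effective_cpi(base_cpi, branch_freq, mispredict_rate, pipeline_depth):
    """Average cycles per instruction, including misprediction stalls."""
    return base_cpi + branch_freq * mispredict_rate * pipeline_depth

# Assumed figures: ~20% of instructions are branches, 5% mispredicted.
long_pipe  = effective_cpi(1.0, 0.20, 0.05, 31)  # NetBurst-style deep pipe
short_pipe = effective_cpi(1.0, 0.20, 0.05, 10)  # shorter Athlon-style pipe
print(f"deep pipe:  {long_pipe:.2f} CPI")   # 1.31
print(f"short pipe: {short_pipe:.2f} CPI")  # 1.10
```

Even with identical prediction accuracy, the deeper pipe pays roughly three times the misprediction tax, which is why better branch prediction matters so much more for a design like Bulldozer.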

Interestingly enough, both Intel and AMD seem to be using a number of the ideas introduced with the P4 NetBurst architecture in Sandy Bridge and Bulldozer. So although NetBurst got a bad name for underperforming and overheating :D, it actually had quite a lot of merit - it was just some serious flaws, like too-weak branch prediction, that brought it down.
 

pelov

Distinguished
Jan 6, 2011
423
0
18,810
Yeah. It can often be those 'failures' that provide significant advancements.

Some fruitful convo...
http://arstechnica.com/business/news/2011/05/intel-re-invents-the-microchip.ars?comments=1#comments-bar

With shrinking progressing so rapidly, it was inevitable that FinFET designs would be utilized. So it is a huge step for Intel and CPUs, but it's certainly not the only advancement, and Intel isn't (and certainly won't be) the only one making it.

The performance increase from 22nm is much larger at lower voltage, but the architecture itself still bears the burden of x86 and is too power-hungry for phones.
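The low-voltage angle follows from the standard dynamic-power relation for CMOS, P = C·V²·f; a quick sketch with illustrative (not Intel-published) numbers shows why a modest voltage drop pays off quadratically:

```python
# Dynamic switching power in CMOS scales as P = C * V^2 * f,
# so lowering the supply voltage cuts power quadratically.

def dynamic_power(capacitance, voltage, frequency):
    """Dynamic switching power in watts (C in farads, V in volts, f in Hz)."""
    return capacitance * voltage ** 2 * frequency

# Illustrative numbers: same switched capacitance and clock,
# supply dropped from 1.0 V to 0.8 V.
p_high = dynamic_power(1e-9, 1.0, 3e9)
p_low  = dynamic_power(1e-9, 0.8, 3e9)
print(f"{(1 - p_low / p_high) * 100:.0f}% less switching power")  # 36%
```

A 20% voltage reduction cuts switching power by 36%, which is why a transistor that keeps switching reliably at lower voltage matters so much for mobile parts.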
 

srgess

Distinguished
Jan 13, 2007
556
0
18,990
The main reason why AMD will go ARM is that they know they can't compete with Intel anymore in x86. If I were Intel (and I bet they're thinking the same) I would do the same and open an ARM division to compete with AMD, lol. I don't think Intel will let their major competitor stay on top.
 

pelov

Distinguished
Jan 6, 2011
423
0
18,810
[citation][nom]srgess[/nom]main reason why amd ill go arm is because they know they cant compete intel anymore into x86. If i was intel and i bet they think about the same they can do the same open an arm divison to compete amd lol. I dont think so intel will let their major competitor be the top.[/citation]

They won't. Furthermore, people have been talking about FinFETs for nearly 20 years, because they saw the same dilemma that Intel/AMD/TSMC/Samsung/ARM and every big chip manufacturer knows: you can't keep shrinking. Eventually you hit a theoretical wall (somewhere between 15-10nm for CMOS, and slightly lower for the gate channels), which means that a waffle-shaped FinFET design is necessary. Intel, of course, will claim it isn't a FinFET, and AMD will claim they've already been making FinFET-based chips for a while now. Either way, the idea is well over 15 years old, and the general design has been around since the early part of this century.
 

pelov

Distinguished
Jan 6, 2011
423
0
18,810
Err, forgot to add: the big innovation here is the manufacturing process itself. It allows Intel to sidestep the inevitable issues, at least momentarily. This can be applied to RISC/CISC, ARM, x86, whatever you'd want.
 

lamorpa

Distinguished
Apr 30, 2008
1,195
0
19,280
[citation][nom]JohnnyLucky[/nom]I wonder how this compares to the new carbon nano technology that was recently announced [that won't be viable for about a decade][/citation]
 

fir_ser

Distinguished
Apr 7, 2011
739
0
18,980
This is an impressive technology that Intel is going to be using in its upcoming Ivy Bridge and the other architectures after it.
But the main reason it is going down that route is that it wants to compete with ARM, so Intel needed a revolutionary solution to reduce power consumption and elevate performance, and here comes the 3-D Tri-Gate transistor, as Intel calls it.
 

fir_ser

Distinguished
Apr 7, 2011
739
0
18,980
In the video, Mark Bohr, the Intel Senior Fellow, gets stuck as a 100nm man after the shrinking machine breaks. Hopefully this 3-D transistor technology isn't as futuristic as the shrinking machine, so it won't fail and will be more reliable.
 
G

Guest

Guest
Old tech with a new spin... bravo on pulling off another bell-bottom comeback. (Anyone ever use a stereo headphone jack? Same principle, just at a smaller scale.)
 