Is Moore's Law Still Relevant?

Guest
IndignantSkeptic... A single transistor is not a computer. It takes a number of transistors just to do a simple task like adding two values together. Even to invert a single bit you need at least two transistors, and to do an "AND" operation you need six (2 for the inverter, 4 for the NAND).
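To put rough numbers on it, here's a quick Python sketch using textbook static-CMOS gate counts (these are common approximations, not figures from any real cell library):

[code]
# Rough static-CMOS transistor counts per gate (textbook approximations;
# real standard-cell libraries vary).
GATE_TRANSISTORS = {
    "NOT": 2,    # 1 NMOS + 1 PMOS
    "NAND2": 4,
    "NOR2": 4,
    "AND2": 6,   # NAND2 followed by NOT
    "OR2": 6,    # NOR2 followed by NOT
    "XOR2": 8,   # transmission-gate style; other designs use 6-12
}

def circuit_transistors(gates):
    """Sum the transistor counts for a list of gate names."""
    return sum(GATE_TRANSISTORS[g] for g in gates)

# A textbook 1-bit full adder: 2 XORs, 2 ANDs, 1 OR.
print(circuit_transistors(["XOR2", "XOR2", "AND2", "AND2", "OR2"]))  # 34
[/code]

(An optimized custom adder cell gets that down to around 28 transistors, but the point stands: even adding two single bits takes dozens of them.)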
 

ProDigit10

Distinguished
Nov 19, 2010
Well, I can't say I fully agree with the transistors being called 3D; first of all, they were always manufactured in 3D, only now they are a bit more extruded than previous designs.

What I see coming may be REAL 3D transistors that could conduct electricity on a vertical as well as a traditional horizontal plane.
That way CPUs won't be cookie-shaped flat rectangles, but cubes!

The technology is not there yet to make such things on a mass scale; however, I would not be surprised if Intel or IBM have been able to manufacture a chip with two or more vertical layers of transistors on top of each other. The hardest part would probably be aligning the top layer with the bottom; but then, plenty of technologies we take for granted today were once thought impossible, or at least very difficult and unaffordably expensive to manufacture, many years ago.


I believe we have reached a point where the average human being is no longer searching or hoping for the fastest computer, but for the one with the best energy efficiency, because computers today have reached a level where most people can do what they want with them without feeling limited by slow hardware.
It's much the same direction the automotive industry is going: what matters more today is having environmentally friendly devices that save money in the long run, rather than powerful mammoths in the bedroom that suck thousands of dollars of electricity per year, even when just browsing online!

That is why I said that Moore's law is going to be over very soon. Especially since Intel's newer technology offers greater performance at lower power consumption, we might assume that most people could do with processors built on this technology with FEWER transistors, instead of more, since performance increased by 20-35% on these newer technologies.

The few customers who will want transistor counts to keep increasing will be web servers (cloud, search engines, etc...), rendering farms, and businesses large and small, like those that convert multimedia such as movies and audio on a regular basis.

The end user is going to interact more and more with cloud servers, and aside from playing an occasional game, will probably have more than enough CPU processing power for the jobs he needs to do!
 
[citation][nom]tirinti[/nom]Intel have been making pathetick AMD style CPUs for few yeara. I had 3.4GHz Pentium 4 6 years ago. Now I have just bought i7 2600K ...[/citation]
If the i7-2600K is so pathetic, why didn't you keep your old P4?
 

ProDigit10

Distinguished
Nov 19, 2010
Many people here complain that transistor count is not the same as CPU speed improvement, but we need to understand that back when Moore's law was coined, all the way up to the early 90s when out-of-order processors were becoming the new norm, transistor count was a good way to benchmark a CPU; and in some ways it still is.
With Core i processors that have integrated graphics, though, the transistor count of the whole processor also includes the transistors of the GPU. So in some sense Intel 'cheated' Moore's law (which is not really a law, but a rough prediction based on a mathematical extrapolation).
Intel knew the CPU did not need that many transistors, and that people would not buy computers with CPUs at a 100-140W TDP plus an additional 100-200W for the graphics card.
That's why they incorporated the GPU into the CPU, so that the transistor count of the whole unit went up, but not the overall power consumption. And when using a plug-in graphics card, roughly a third of the CPU's transistors would be deactivated, resulting in ~20% less power draw for the CPU (rough estimates based on guesswork).
 

ProDigit10

Distinguished
Nov 19, 2010
[citation][nom]jprobes[/nom]Sounds like someone needs to divide their drug stash by 0 and take a vacay....[/citation]
lol!
Funniest comment I've read in a while!

The other guy clearly is a noob!
 

therandomuser

Distinguished
Apr 11, 2011
[citation][nom]rescvs[/nom]"However, Moore's Law has also been somewhat abused as a marketing tool to justify new processors and force innovation into a tight pair of shoes, that was not always the best choice, such as Intel's Netburst products that turned out to be a dead end and almost brought the company down to its knees"


That's very inaccurate, yes Intel lost some marketshare and mindshare at the time, but it was in NO WAY in any financial trouble. It was INTEL it could have been selling cow's turd as processors and the vast majority of people would still choose intel procs over AMD's counterparts.[/citation]

Yes, Intel would have suffered more if it weren't for their advertising campaigns and their history. Along with IBM, they were pioneers of the computing industry. Their reputation and the megahertz myth are what kept AMD from coming out on top. Intel's innovation pushed through that storm to bring us the Core series, while AMD lost steam and became the Luigi of processors: second string, but still quite the powerhouse at other tasks. Think about it, most of the top supercomputers are powered by Opterons.

[citation][nom]tirinti[/nom]Intel have been making pathetick AMD style CPUs for few yeara...[/citation]

Yeaaaaahhhh, no. Go back to your magical oldschool P4, and we'll be over here outside the bomb shelter playing our nice games and computing our newest problems.
 

cbxbiker61

Distinguished
May 20, 2007
Just to clear things up amid all of the FUD: Intel didn't "invent" tri-gate transistors. They were developed in the lab with funding from DARPA; research started over 10 years ago. Tri-gate transistors were then placed in the public domain. Do a little research on "tri-gate transistors" rather than "3D transistors" (Intel marketing spin) if you want to know the history.
 
Guest
All, I've worked in the semiconductor fabrication business for 28 years, doing everything from designing cleanrooms to helping keep the process tools running. There is a great deal more to this than is readily apparent (like most manufacturing).

There are two big reasons to reduce the size and power consumption of transistors: one is to improve capability, the other is to make smaller dies with the same capability, which cuts cost. One exposure tool can cost on the order of $100 million, and you have to have a bunch of them to make a fab. So you have to continually reduce manufacturing costs while reducing power consumption and improving capability if you are going to stay viable.

Demands for capability keep rising, and as long as they do, companies like Samsung, TSMC, IBM and Intel need to keep improving their offerings. That is why manufacturing technologies keep changing: silicon-on-insulator, 3D transistors and exotic film materials are all necessary. If everyone were happy to keep playing 'Pong' on a black-and-white screen, we wouldn't need to do this. That doesn't seem to be the way things work, though...
 

DavidC1

Distinguished
May 18, 2006
"The tri-gate transistor will enable Intel again to double the transistor count by building 3D structures and creating more transistors in the same area space. You may interpret Moore's law that Intel is cheating a bit as you could consider Moore's paper to be simply referring to 2D structures and the even area they occupy."

The press are complete idiots, Tom's Hardware included. They aren't getting 2x density by going tri-gate; they are getting 2x density by going to 22nm. Tri-gate's benefits are different. I'd be ashamed if I were the author recycling such wrong info. Seriously, know the meaning of r-e-s-e-a-r-c-h?
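For the record, the arithmetic behind that correction fits in a few lines of Python (node names are marketing labels, so treat this as a first-order estimate):

[code]
# First-order density gain from the 32 nm -> 22 nm shrink alone.
old_node_nm, new_node_nm = 32.0, 22.0
area_scale = (old_node_nm / new_node_nm) ** 2
print(f"~{area_scale:.2f}x transistors per unit area")  # ~2.12x
[/code]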
 

rpgplayer

Distinguished
May 27, 2010
[citation][nom]tirinti[/nom]Intel have been making pathetick AMD style CPUs for few yeara. I had 3.4GHz Pentium 4 6 years ago. Now I have just bought i7 2600K wchih is stupid Athlon style name for i7 3.8GHz... just 400MHz faster than i have 6 years ago. I had also pathetic 3.33GHz Core2 Duo E8600. It was slower than my P4, but just a litle bit and I bought it because of new motherboard features. Where are times when CPU were known by its speed 386DX 40MHz, 486DX2 66MHz, Pentium 100MHz, Pentium 200MMx, Pentium II 300MHz, Pentium III 450MHz, Petium III 700MHz, Pentium 4 1.6GHz, Pentium 4 2.8GHz and finally Pentium 4 3.4GHz... then stupid consumers started complaining about power consumption. I wish nVidia started making x86 CPUs. they would made 20GHz 1kW TDP CPU just like they do theier GeForce GPUs.[/citation]


You do realize that the Core architecture was about 30% more efficient than the Pentium 4, i.e. the Core 2 could do about 30% more work per clock than the Pentium 4 could. And the "i" series is about 15% more efficient than the Core architecture, with the same again for Sandy Bridge. In other words, your Pentium 4 would have to clock around 5.5GHz to do the same amount of work that a 2600K does at factory clocks.
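Taking those per-clock multipliers at face value (they are rough forum figures, not measured IPC data), the arithmetic looks like this in Python:

[code]
p4_to_core2 = 1.30   # Core 2: ~30% more work per clock than a Pentium 4
core2_to_i = 1.15    # first "i" series: ~15% over Core 2
i_to_sandy = 1.15    # Sandy Bridge: ~15% again

base_clock_ghz = 3.4  # i7-2600K factory clock
equivalent_p4_ghz = base_clock_ghz * p4_to_core2 * core2_to_i * i_to_sandy
print(f"P4 would need ~{equivalent_p4_ghz:.1f} GHz per core")  # ~5.8 GHz
[/code]

That lands a little above the ~5.5GHz quoted, and it only covers a single core; the 2600K has four of them.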
 

DavidC1

Distinguished
May 18, 2006
[citation][nom]rpgplayer[/nom]you do realize that the Core architecture was about 30% more efficient than the Pentium 4's were... i.e. the Core2 could do about 30% more work per clock than the Pentium 4 could. And the "I" series is about 15% more efficient than the "Core" architecture. Same for the Sandy Bridge. In other words your Pent IV would have to clock around 5.5Ghz to do the same amount of work that a 2600K could do at factory clocks[/citation]

The Core microarchitecture, used in Core 2 generation CPUs, was 30-40% faster than the Pentium 4 even at a frequency deficit. Per clock, Core 2 CPUs were twice as fast as the Pentium 4, or anything else based on the Netburst architecture. A 3.4GHz 2600K with 4 cores would be at parity with an 8-9GHz Pentium 4 with 4 cores.
 
Personally I think Moore's Law is given too much credit and importance. I don't think Moore intended it to be quoted this much when he made the observation in 1965, either. It merely serves as a reminder that we can push technology to the brink every ~2 years.
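Written as a formula, the observation is just exponential growth with a ~2-year doubling period; a quick Python sketch (the starting count below is an approximate figure for a 2011 quad-core die, not an exact one):

[code]
def projected_transistors(start_count, years, doubling_period=2.0):
    """Moore's observation: transistor count doubles every ~2 years."""
    return start_count * 2 ** (years / doubling_period)

# ~1 billion transistors on a 2011 quad-core, projected 10 years out:
print(f"{projected_transistors(1.0e9, 10):.1e}")  # ~3.2e+10, if the cadence holds
[/code]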
 
Guest
One of the points I am missing here is the development of non-copper interconnects. I have read articles about nanophotonics at IBM that could be the next step in the race for speed. Can anyone comment on these developments and how they are going to impact the CPU market?
 

zinabas

Distinguished
Apr 17, 2009
Am I the only one who thinks Moore's law is completely wrong when looking at what AMD graphics do to Nvidia graphics? AMD uses 20% fewer transistors than Nvidia (guessing, can't remember the exact figures) and usually does more with them.
 

razzb3d

Distinguished
Apr 2, 2009
Overlapping transistors in a 3D structure would cause the chip to dissipate heat at much lower rates than a 2D structure would. The base transistor layer will heat up the upper layers, so the top layer will be the hottest. The most you can do with this architecture is build a dual- or triple-layer transistor structure and use more heat-conductive materials for the CPU cover. As for power consumption, I doubt that stacking the same number of transistors on the same fab process will lower consumption. As far as I understand, this technology just allows for stacking and smaller die size, but it complicates fabrication and lowers heat-dissipation efficiency.
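A crude series thermal-resistance model makes the stacking concern concrete; the numbers in this Python sketch are purely illustrative, not real package data:

[code]
def layer_temperatures(powers_w, r_layer_k_per_w, t_sink_c=40.0):
    """powers_w[0] is the layer closest to the heatsink.

    Heat generated in the upper layers must conduct through every
    layer below it, so the top of the stack runs hottest.
    """
    temps, t = [], t_sink_c
    for i in range(len(powers_w)):
        heat_through = sum(powers_w[i:])  # watts crossing interface i
        t += r_layer_k_per_w * heat_through
        temps.append(t)
    return temps

print(layer_temperatures([20.0, 20.0], 0.5))  # [60.0, 70.0] deg C
[/code]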
 

razzb3d

Distinguished
Apr 2, 2009
Also, a smaller die dissipates heat a lot more slowly than a large die does (with the same number of transistors and the same manufacturing process), because the same power has to flow out through less surface area.
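That is just power density; a two-line Python check with assumed numbers (95 W is a typical desktop TDP of the era, and the die areas are illustrative):

[code]
power_w = 95.0
for area_mm2 in (216.0, 150.0):  # a larger die vs. a shrunk one
    print(f"{area_mm2:.0f} mm^2 -> {power_w / area_mm2:.2f} W/mm^2")
# 216 mm^2 -> 0.44 W/mm^2
# 150 mm^2 -> 0.63 W/mm^2
[/code]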
 
Guest
razzb3d: That's an excellent point.

Intel routinely over-hypes their technology; the colossal failed projects known as Terascale, Larrabee, Itanium and the 10GHz Pentium 4 come to mind. I think Intel hires Fox News writers for their PR department: it's mostly overhyped claims to appeal to their base, even if they never come to fruition.
 

Burodsx

Distinguished
May 31, 2009
"Of course, there is the question of the relevance of Moore's Law today. Do you really care whether Intel can keep this 2-year cycle going for another 10 years? Probably not - and (enthusiast) consumers probably care less than they did 10 years ago."

Darn right I care. Better manufacturing processes have allowed us better prices and better parts. So keep on pushing Moore's law!
 
Part of the problem with Moore's "Law" is that it isn't a law by any stretch. It was a series of observations, in which he noticed that a transistor shrank to about 0.7x its previous linear size every 18 months. We are just silly enough to treat an observation of something that happens reliably as a law "that cannot be broken"; it can and will be broken eventually.
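That 0.7x figure is exactly where the doubling comes from: squaring the linear shrink roughly halves the area per transistor. A one-liner in Python:

[code]
linear_shrink = 0.7
area_per_transistor = linear_shrink ** 2                 # ~0.49x
print(f"density gain: {1 / area_per_transistor:.2f}x")   # ~2.04x per shrink
[/code]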
 

Kamab

Distinguished
Aug 16, 2010
"Overlapping transistors in a 3D structure would cause the chip to dissipate heat at much lower rates than a 2D structure would. The base transistor layer will heat up the upper layers, so the to layer will be the hottest. The most you can do with this architecture is build a dual or triple-layer transistor structure and use better heat conductive materials for the CPU cover. As for power consumption, I doubt that stacking the same number of transistors on the same fab. process will lower consumption. As far as I understand, this technology just allows for stacking and smaller die-size, but complicates fabrication and lowers heat dissipation efficiency. "

This is my understanding as well. Intel's most important IP is their microfab processes and it's where they spend a huge amount of money in R&D.

http://www.eetimes.com/electronics-news/4213295/Intel-to-build-new-Arizona-fab-
 

bit_user

Polypheme
Ambassador
I think the reason Intel didn't bill their 3D transistor as the realization of that vision is that it's not. When you look at their new 22nm chips, they'll still be a 2D layout of components. The only thing Intel did was improve the design of the most basic building block, in a way that happens to involve more layers of material (there are already multiple layers of insulators and conductors).

And yes, how densely you can pack the components *does* matter. It means shorter distances, which allows components to communicate more rapidly. That would be the benefit of going to a true 3D design. But as it also brings many challenges, I don't expect the industry to go there until they've hit a wall on feature size.
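To see why distance matters, compare a clock period against signal travel distance. Even at the vacuum speed of light, which is a hard upper bound since real RC-limited on-chip wires are far slower, the budget per cycle is small (a rough Python estimate):

[code]
c_mm_per_s = 3.0e11   # speed of light in mm/s
clock_hz = 3.4e9      # e.g. an i7-2600K
print(f"{c_mm_per_s / clock_hz:.0f} mm per clock cycle, at best")  # ~88 mm
[/code]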
 

bhaberle

Distinguished
Nov 15, 2008
[citation][nom]tirinti[/nom]Intel have been making pathetick AMD style CPUs for few yeara. I had 3.4GHz Pentium 4 6 years ago. Now I have just bought i7 2600K wchih is stupid Athlon style name for i7 3.8GHz... just 400MHz faster than i have 6 years ago. I had also pathetic 3.33GHz Core2 Duo E8600. It was slower than my P4, but just a litle bit and I bought it because of new motherboard features. Where are times when CPU were known by its speed 386DX 40MHz, 486DX2 66MHz, Pentium 100MHz, Pentium 200MMx, Pentium II 300MHz, Pentium III 450MHz, Petium III 700MHz, Pentium 4 1.6GHz, Pentium 4 2.8GHz and finally Pentium 4 3.4GHz... then stupid consumers started complaining about power consumption. I wish nVidia started making x86 CPUs. they would made 20GHz 1kW TDP CPU just like they do theier GeForce GPUs.[/citation]
You are forgetting that GHz is not the only thing. There is also core count, Hyper-Threading, additional instruction sets, a more efficient manufacturing process, and other factors.
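As a crude sketch of why GHz alone misleads, throughput on parallel work scales with roughly the product of those factors; the numbers in this Python model are made up for illustration, not benchmark results:

[code]
def relative_throughput(cores, clock_ghz, ipc, smt_bonus=1.0):
    # Crude model: parallel throughput ~ cores x clock x work-per-clock.
    return cores * clock_ghz * ipc * smt_bonus

p4 = relative_throughput(cores=1, clock_ghz=3.4, ipc=1.0)              # baseline
i7 = relative_throughput(cores=4, clock_ghz=3.4, ipc=1.7, smt_bonus=1.2)
print(f"~{i7 / p4:.1f}x the P4 on well-threaded work")                 # ~8.2x
[/code]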
 