Moore's law is finished

gourdo_1

I apologize for the long rant, but I felt compelled to say something. If nothing else, just to get it off my chest…

I think home desktop computing is in the toilet -- at least from a growth perspective. Full and utter commoditization is beckoning.

I upgrade my computer every 2-3 years, usually when I can see a significant improvement in performance moving from an existing system to a new one at a reasonable price point. For example, in the past I've moved from a 486-DX100 to a P-75, then to a P-233, Celeron-500, PIII-1GHz and finally to a P4-3GHz. Each move was 2-3 years apart. I could get solid performance gains and utilize new applications, making the move worth it for me. Now I've had a P4-3GHz chip for over a year and there is not a single compelling reason to perform a significant (mobo/memory subsystem/CPU) upgrade anywhere on the near horizon, even a year out. Intel seems to have all but given up on improving raw CPU performance at any rate near what it has delivered over the last decade. Now they seem mired in a struggle to re-brand their CPU cores frequently enough so that the folks who don't read Tom's Hardware frequently (the vast majority of regular folks) will think there are all kinds of new and exciting products available now versus last year.

This reeks of commoditization. Take the low-end boombox market, for example. Do you think Sony, Panasonic, Technics, Aiwa or a host of other throw-away brands have done anything to improve FM reception quality, sound quality or CD playback in the last 20 years? Barely a thing. Many cheap amplifiers from the 70s can go head-to-head with your typical boombox. But I bet you can't even find the boombox you bought 6 months ago in any store today. Why? Because when companies realize that their product is a commodity, they start focusing on glitz and glamour, marketing campaigns and fluff to distract you from the fact that their product hasn't improved in years; only the appearance seems more up-to-date. I sure don’t know what I would do if my stereo didn’t have a little animated spaceship LED landing pad…
So back to CPUs and desktop computing as a whole, this is where I think things are headed: toward a future in which we’re all stuck at 4-5GHz, but who cares, because my case doubles as a neon dance club for my pet gerbil.

What gives? Have we run up against physical limits, or the limits of the x86 architecture? There are 3.4GHz chips on the market, and 3.8GHz chips on the roadmap, representing 10 and 20% performance gains on top of what I already have. That's a barely noticeable gain in day-to-day use, and certainly not worth the price tag they're putting on them. What happened to the days of the 486-33 and 486-50, when the difference in performance between two successive chips was over 50%? Intel now charges nearly $100 more for the imperceptible 0.2GHz performance gain of a P4-540 vs. P4-550 (http://www.sharkyextreme.com/guides/WCPG/article.php/10705_3402941__3). And these are not even the high-end chips. Have you seen AMD's Opteron pricing schedule? It starts out reasonable, but at the high end? What a joke. Yes, I know this pricing based on supply and demand has been going on for a while, but it stinks. The new marketing/numbering schemes just add to the confusion, so that no one who doesn't frequent sites like Tom's has any chance of figuring out what they're getting. Brilliant marketing by the CPU makers, but consumers lose in the end.

So, will 64-bit chips be our performance savior? Hah! We don't have an OS or applications that remotely take advantage of them, and besides, does a proper implementation even improve performance that much over existing 32-bit architectures? I think it's a marginal improvement at best for most applications. 16-bit to 32-bit was a big jump and led to much improved system stability, thanks to the better memory architecture. 32-bit to 64-bit is barely even necessary except for large simulations and high-end multiprocessing environments.

By the looks of it, even cutting edge CPUs are going to become cheap commodities within a decade, and the focus will be on custom chips for custom needs. We're already seeing some of that with GPUs for graphics, Sound DSP processors, Network accelerators, MPEG compressors, etc... All designed to offload work from the new bottleneck at the core: the CPU.

Gordo
 
Now that's off your chest, let's pick at your rant...

Intel now charges nearly $100 more for the imperceptible 0.2GHz performance gain of a P4-540 vs. P4-550
News flash: companies have been doing that for years. Remember when they did it for 100MHz jumps (original P4s), 50MHz jumps (P3s), and even smaller jumps the further back you go.

Now they seem mired in a struggle to re-brand their CPU cores frequently enough so that the folks who don't read Tom's Hardware frequently (the vast majority of regular folks) will think there are all kinds of new and exciting products available now versus last year.
THG isn't the only site people read, but I understand what you're getting at. That's another thing most companies do. Look at GM, Ford and Chrysler. Aside from cosmetics, there is no difference between a GMC 1500 and a Chevy 1500, or a Plymouth Neon and a Dodge Neon. I believe Intel's new naming scheme only applies to their new LGA 775 design, which qualifies as new technology. The only reason I think they're doing that is because they realize they can only go so far with the NetBurst architecture, and they don't want consumer shock if they release a processor based off the P-M clocked a few GHz lower than their older product (that's my opinion, of course).


By the looks of it, even cutting edge CPUs are going to become cheap commodities within a decade, and the focus will be on custom chips for custom needs. We're already seeing some of that with GPUs for graphics, Sound DSP processors, Network accelerators, MPEG compressors, etc... All designed to offload work from the new bottleneck at the core: the CPU.
Just look at the Pentium Pro. They're pretty cheap. We're getting more specialized chips because software is getting more complex. One component of a computer can only do so much. No CPU, whether RISC or x86, can run the specialized tasks that graphics cards, sound cards, etc. handle today at acceptable performance. If you want that, then I'll send you my old 486 and some software with it.

Even if computers are cheap commodities (when they hit Wal-Mart they became just that), who cares? You can buy a cheap, slow one or you can go all out. I'm sure a little old lady couldn't care less that she has an Intel Celeron processor running at 2.0GHz with a 100MHz FSB quad-pumped to 400MHz and 256MB of DDR memory running at 100MHz (200MHz DDR) with a CAS latency of 3.0, any more than she cares about the 350ci OHV V8 with a 2-barrel carb and single-plane intake that drives her '78 Buick Electra 2-2-5. If it does the tasks you ask of it, good. If it doesn't, then it's time for her to get something better.

Then there will always be the enthusiast: people like you or me who get fast processors and graphics cards and tons of memory. Sticking with the car comparison, the people who buy fast cars and mod the engines, get dual exhaust and all that hoopla.

P.S. - Moore's law has been dead for a while. It's been "revised" already but I don't know if they'll do it a third time.

-----------------------
oh, it's a nice day. TO EAT CHILDREN!!!
 
You failed to mention Moore's law in your post.

GOD BLESS AMERICA
 
Yeah, but he's talking about exponential growth and the increasing speed of processors. Ultimately I believe Intel and AMD will stop creating silicon processors. The speeds will be too high; the chips would be very unstable. This has a lot to do with the Singularity. I believe that after silicon-based chips become dangerous (i.e. too hot, too unstable), quantum processors are going to make a big entrance. And THAT will be a very, very big breakthrough in science. What I'm trying to say is that, the way technology doubles every ten years, I'd say in about 50 years, maybe less, maybe more, personal computers will be obsolete. And I'm not talking out of my ass.

----------------

So.. Who's hungry?
 
we’re all stuck at 4-5GHz, but who cares, because my case doubles as a neon dance club for my pet gerbil.
LOL!

We seem to have hit a wall in CPU manufacturing. Now they are just trying to keep this old technology around for as long as possible, hoping people will keep making very small upgrades at very high prices. Why should they make a 5.0GHz chip right now, when they know someone might upgrade from a 3000+ to a 3500+? Plus, the person who buys the 3500+ will still buy the 5.0GHz chip later.

My Desktop: http://Mr5oh.tripod.com/pc.html
Overclocking Results: http://Mr5oh.tripod.com/pc2.html
 
I just upgraded from a 1.2GHz Athlon after 3.5 years, and in between I upgraded the graphics card once, from a TNT2 to a GeForce4. I was happy with the performance all 3.5 years and was able to play almost all of the latest games over the past few months with good performance (1024x768, 16-bit color, medium details). Finally got the new system, and aside from games, I can't think of a reason to need an upgrade again for a looooong time. And even then, I wonder how long the processor will hold up and whether I will just need to upgrade the GPU.

Do you think the P4 3.2 will last another 4 years in gaming with 1-2 graphics card updates?


P4C 3.2GHz @ 800MHz Northwood / ABIT AI7 / 1GB Corsair XMS-Pro CL2 PC3200 / 160GB Seagate SATA 7200rpm 8MB cache / BFG-Tech Nvidia GeForce 6800GT 256MB / Antec Sonata case w/Antec TruePower 380W PSU
 
Do you think the P4 3.2 will last another 4 years in gaming with 1-2 graphics card updates?
Well, no one can say for sure, but I would guess that it will. I used a 1.2GHz for quite a while with a 32MB TNT2. Eventually I upgraded and was running a 2100 XP with a GeForce Ti 4600. The 4600 ran Doom 3 with no problems at medium detail, and even performed decently at 800x600 with high detail.

Games are becoming more and more dependent on the GPU, more so than the CPU. I think this trend will become even more evident over time, so you may end up just buying a new graphics card in a couple of years.

My Desktop: http://Mr5oh.tripod.com/pc.html
Overclocking Results: http://Mr5oh.tripod.com/pc2.html
 
I wish graphics cards were updateable. I don't see this happening, as it makes no sense for the GPU manufacturers.

I think the main thing holding my Ti 4600 back was that it wouldn't support the new DirectX 9 stuff. It would have been nice to be able to just flash the BIOS on the card and add new capabilities.

However, I think this is just a dream, as I was forced to upgrade. Why would a manufacturer want to make cards upgradeable when they could get me to buy a new card?

My Desktop: http://Mr5oh.tripod.com/pc.html
Overclocking Results: http://Mr5oh.tripod.com/pc2.html
 
That's why PCI-EX was invented.

The day Microsoft will make something that doesn't suck is the day they'll start making vacuum cleaners.
 
Maybe I'm just lucky, but I've never lost a CPU. Even my old 120MHz P5 still works. My three-year-old 1.3GHz T-Bird is still in my back-up PC. How long overclocked CPUs last is still up for debate.

Abit IS7 - 3.0C @ 3.6GHz - Mushkin PC4000 (2 x 512MB) - Sapphire 9800 Pro - TT 420W Pure Power
Samsung 120GB ATA-100 - Maxtor 40GB ATA-100
Sony DRU-510A - Yellowtail Merlot
 
By the way, Moore's original paper was an observation about the exponential growth of the number of components that could be put on a single chip. It really didn't say, or predict, anything about frequency or performance levels.

There's a link to the original paper here: http://www.intel.com/research/silicon/mooreslaw.htm

I especially like this line "The electronic wristwatch needs only a display to be feasible today." He also predicted that integrated circuits would lead to "home computers" and "personal portable communications equipment." Things were a bit different in 1965. Who knows where we'll be in another 40 years.
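Just for fun, the doubling curve he described is easy to check with a few lines of Python. The ~64-component starting point and the one-year doubling period below are my own ballpark reading of the commonly quoted figures, not numbers straight from the paper:

def components(year, base_year=1965, base_count=64, doubling_years=1.0):
    # Components per chip under a fixed doubling period (assumed values).
    return base_count * 2 ** ((year - base_year) / doubling_years)

for year in (1965, 1970, 1975):
    print(year, round(components(year)))
# 1965 -> 64, 1970 -> 2048, 1975 -> 65536,
# right around the famous ~65,000-components-by-1975 projection.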
 
Firstly, thank AMD for being there, because without them the drive for speed would be even more stagnant than it is now. Think Windows with no Linux/Unix/Mac OS, etc.

You touched on the reason in your post: the physical limitations of silicon are being reached, which is why we have had lower gains in recent times. All the talk at the moment is about dual-core/dual-chip systems, because the thermal output of the latest offerings from AMD and Intel is getting ridiculous.

If you take the popular version of Moore's law, "processing power doubles every 18 months", then we might still be near that if you look at the whole PC subsystem, including memory, mobo and CPU. Although we haven't seen the raw clock speed increases, we have seen 1MB cache, DDR3 memory, 3.4GHz, and HyperTransport plus Hyper-Threading. And if you take gaming into account as well, we have made significant gains in recent times, with PCI Express giving cards an additional boost.

You also have to look at supply and demand. In days gone by, even a good PC was still slow by today's standards, but now, as far as office machines used for general web browsing and Word go, we have reached a peak in speed. When I click on Word it loads instantly, it saves instantly and it spell-checks instantly. Why do we need more PC speed? You can't get any faster than instant!

So it follows that if business does not need to upgrade as often, the drive to get faster chips will slow down to a certain degree as well. I think when there are massive gains to be made, there will be massive increases in performance, but while the market is accepting the current CPU speed and not upgrading, Intel and AMD would be foolish to pour cash into R&D in the same quantities as they did in the 90s, when they would not be able to sell the resultant chips in huge quantities. And as far as gaming is concerned, we all accept that the GPU has a bigger impact than the CPU, so there is not the drive even for gamers to upgrade like there was before on the CPU front.

Fear not, however, for the next few years could see something very special in computing terms. Once we have found a suitable replacement for silicon, or found a better way to utilise it, the speeds will take off again. After all, we already have the extra speed in supercomputers; now we just have to make it small and affordable.

4.77MHz to 4.0GHz in 10 years. Imagine the space year 2020 :)
 
By the way I forgot to mention my sig.

I had an old 8088 in the early 90s that ran at 4.77MHz (even slower than a SNES/Mega Drive), and last year Tom clocked a P4 to 4GHz. In pure clock speed terms that more than represents doubling every 18 months: the roughly twelve years in between hold only eight 18-month periods, and eight doublings from 4.77MHz would only get you to about 1.2GHz. :)
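If you want to sanity-check that, here's a rough Python calculation. The dates are my own assumptions (call it 1991 for the 8088 and 2003 for the 4GHz overclock), so treat the output as ballpark only:

import math

# How many doublings did raw clock speed actually make, versus how many
# 18-month periods were available? Dates below are assumed, not exact.
start_mhz, end_mhz = 4.77, 4000.0   # 8088 vs. overclocked P4
years = 2003 - 1991                 # assumed span

doublings_made = math.log2(end_mhz / start_mhz)  # ~9.7
periods_available = years / 1.5                  # 8.0

print(f"{doublings_made:.1f} doublings in {periods_available:.0f} eighteen-month periods")
# ~9.7 doublings where the 18-month rule predicts only 8, so raw clock
# speed actually beat "doubling every 18 months" over that span.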

4.77MHz to 4.0GHz in 10 years. Imagine the space year 2020 :)
 
I'm afraid Moore's law is dying. I see the physical limitations of current technology and the x86 architecture being reached. That's why Intel renamed their processor numbers and introduced Dothan. That's why AMD is offering 64-bit extensions with the new A64 processors. I don't think we will see 10GHz processors anytime soon.

------
A64 3400+
1GB PC 4000 Kingston HyperX
Asus K8V basic BIOS 1004
PNY GeForce 6800 GT 256MB DDR3
63,524 Aquamarks
 
I don't think you people even know what Moore's law is.

GOD BLESS AMERICA
 
Sure we do. "What can go wrong will go wrong"

Maybe you are confusing the "what goes up must come down" law

"for every action their is an = and opposite reaction" Sir Isac Newton

If I glanced at a spilled box of toothpicks on the floor, could I tell you how many are in the pile? Not a chance. But then again, I don't have to buy my underwear at Kmart.
 
Maybe you are confusing the "what goes up must come down" law
I think that's due to the law of gravity (w.r.t. the 1st law of motion), not the 3rd law (for every action there is an equal and opposite reaction) as you said.

In any case, Isaac Newton would commit suicide after seeing these days' Chinese movies.

Moore's law was, I think, in relation to transistor count; I'm not sure how much it emphasizes speed. Also, Intel has developed 65nm processes, and Japan discovered a method to create flawless silicon carbide chips.

MOORE'S LAW LIVES!

:evil: Futile is resistance, assimilate you we will. :evil:
Hard work has a future payoff. Laziness pays off now.
 
Sure we do. "What can go wrong will go wrong"
Isn't that Murphy's Law?


My Desktop: http://Mr5oh.tripod.com/pc.html
Overclocking Results: http://Mr5oh.tripod.com/pc2.html
 
I wish I had a dollar for every time in the past 10 years I've heard that "we've reached the speed limits of x86" and "Moore's law is dead".

It may be true, it may not be true, but I know the end of speed increases in CPUs is not in sight. Dual core (really it should be called multi-core, because they're talking about having up to 4 or more cores per chip) is going to usher in a new era. And it will dovetail nicely with 64-bit: when you have 4 or more processing cores, the ability to access large amounts of RAM becomes critical, since 32-bit addressing tops out at 4GB. What good is a 4-processor system with 2 gigs of RAM? Not nearly as much as it would be with 8 gigs.
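To put a number on that RAM ceiling, here's a back-of-the-envelope Python sketch; it's nothing but powers of two, no vendor specifics assumed:

# A pointer of N bits can name at most 2^N distinct bytes, which is the
# hard ceiling on addressable memory regardless of core count.
for bits in (32, 64):
    gib = 2 ** bits / 2 ** 30
    print(f"{bits}-bit addresses -> {gib:,.0f} GiB addressable")
# 32-bit -> 4 GiB; 64-bit -> 17,179,869,184 GiB (no practical limit today).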

I've been following computers since 1982, when I got my first Commodore VIC-20. Development and advancements are tied to the economy; I'm convinced of that. And right now the economy is not so hot, so we're in a lull. Everything ebbs and flows in life, and right now we're in an ebb.

New advances in UI, coupled with new "killer apps", are going to drive the next round of advancements, but it won't come until after the lull has finished. The global economy, including our own, needs to pick up before we're going to get into the flow again. That's my opinion. I first heard people saying Moore's law was about to die in 1994, then again in 1996, and so on and so forth.

Today I have a dual Opteron 248 system that laughs at all those past predictions. And when AMD releases the dual-core upgrades for it next year, I'll be jumping on that bandwagon too, assuming business holds up and they deliver on their promise.

Yeah, we've been in a lull, but it won't last forever.

To err is human; to really foul things up requires a politician.
 
Moore's Law is far from dead.

Moore speaks of cramming more and more components onto integrated circuits; he's not just referring to processors, though the processor is one example of what he was talking about. That being said, with the introduction of multi-core, we will be cramming more and more transistors into a single chip, and there is no end in sight for how many cores can be on-die. Also, as circuitry breakthroughs are made, who knows what the future holds, but one thing is for certain:
Moore's law is nowhere near dead.
 
Why does Moore's law matter anyway? All it is is a guy predicting what will happen in the future. As long as he's around he can make his law come true; being a co-founder of Intel, he can make it true.


All I want a CPU to be able to do is load Windows instantly. It can open Word instantly, so why not make it load Windows instantly too? If you think about it, if Intel made their CPUs load Windows instantly, everyone would buy them (except AMD fanboys). A 3GHz P4 paired with a decent graphics card can run any game at a decent frame rate. The human eye can't tell the difference in anything above 24fps, so why get an AMD that can do 75fps more when you can't tell?

All I'm saying is: why don't the CPU manufacturers improve performance where you will notice a difference, e.g. loading up Windows?
 
*Adjusts bicycle helmet* Duh, look at teh shiny thing on my desk!


Seriously, loading times depend a lot less on the CPU than on the HD, memory, etc. And the eye can tell the difference above 24 FPS. 24 is the number used in movies, and it isn't the maximum FPS the eye can see, but rather the minimum for motion to look smooth. I can tell the difference between 24 and 60. Above that, not really.
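The gap is easier to see as frame times. A quick Python conversion (the frame rates below are just the ones we've been throwing around):

# Time each frame spends on screen at a given frame rate.
for fps in (24, 60, 100):
    print(f"{fps:3d} FPS -> {1000 / fps:5.1f} ms per frame")
# 24 -> 41.7 ms, 60 -> 16.7 ms, 100 -> 10.0 ms. Going 24->60 shaves
# 25 ms off every frame; 60->100 shaves under 7 ms, which is one way to
# see why differences get harder to notice at the high end.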

The day Microsoft will make something that doesn't suck is the day they'll start making vacuum cleaners.
 
Well, he no longer works for Intel, so it is Intel who has to keep his law true by cramming up the chips. They tried to tell everyone that they weren't having problems pumping up the clock, but looking at Prescotts drawing more than 100W of power, it seems likely Intel was struggling a lot. So they HAD to get the 65nm stuff out.

:evil: Futile is resistance, assimilate you we will. :evil:
Hard work has a future payoff. Laziness pays off now.