AMD vs. Intel: Strange Conspiracy Theory?

Status
Not open for further replies.

smartkid95

Distinguished
Sep 2, 2009
This is a strange one that I have been wondering about, looking into, and observing for a long time. I was told by a source that is obviously not dependable (but humor me) that Intel CPUs are no match for AMD, due to the theory (it's ridiculous, but humor it) of intentional replacement: the CPU is supposedly designed to get slower over time, which is why Intel never releases CPUs on the same motherboard platform like AMD does and instead changes platforms constantly to get more money. I know this is a lame conspiracy, but I want to get some honest opinions about it. Is there any truth to it at all, or is it trash? The biggest question is that I want to know the history of AMD vs. Intel, because it was also claimed that over time AMD CPUs will work better than Intel's, and that AMD has always been better. Intel fanboys rejoice, because I'm ready for anything; I have strong feelings, so don't be afraid to attack me. AMD fanboys are welcome to have their say as well. Have at it.



As a side note, I'm not really swayed either way as a fanboy of either company; my decision is made by the almighty dollar, and so far AMD has consistently been the better value. Enjoy, and feel free to flame or comment on the post about this on my blog too.
http://rokk-itscientistblog.blogspot.com/
 
Solution
Complete garbage. CPUs don't get slower over time. Operating systems, however, can get slower over time, but that is easily remedied by reinstalling the OS.

randomizer

Champion
Moderator
I vaguely remember hearing the same thing about AMD CPUs years ago. It's utter rubbish. CPUs will eventually degrade, but they don't get slower; they just won't run at high overclocks any more and require you to "slow" them (drop the overclock) to maintain stability. I had this happen with an Athlon Thunderbird, but it took 8 years.
 
I recently did a reformat on my P4 system that I gave to my dad; it still runs rather well for what it is. The old PII in my basement (that's a Pentium II) runs ancient games as well as it ever did.

The conspiracy theory is just that, a conspiracy theory. There would be no way to make the chips slow down over time; they have a hard enough time getting the silicon right as it is, and it's not worth the extra R&D to make self-degrading chips when most people end up buying new systems every few years anyway.
 

sanchz

Distinguished
Mar 2, 2009
CPUs don't get slower over time; software applications get more demanding on CPUs and PCs over time, so "new" software runs slower on "older" hardware than "old" software did.
Basically, if you've got a Pentium 4 with Windows XP and 2004-class software, you'll have a relatively "fast" PC. However, if you've got that same Pentium 4 with Windows Vista and current software, the PC will seem much slower.
 

verndewd

Distinguished
Mar 27, 2009


CPUs will get slower (what rando said) over time due to electrons moving through the metal; it's called electromigration. Electrons slowly weaken circuit pathways by pushing metal atoms in the direction they travel, creating a pathway that can carry fewer electrons, and that will in fact slow the ability to compute by limiting current density. The hoopla here is the claim that Intel has perfected deposition to the point where it can engineer a set, or shorter, lifespan.

smartkid95, a person has got to understand deposition to understand why this claim sits high on the improbability scale. Sure, it's possible, but it's exceedingly improbable. Metals are deposited onto the chip in plasma and acid vapors, and the hope is that they can be deposited an atom at a time (some metals simply do not break down uniformly to single atoms, like platinum; last I checked the dispersion was 1-5 nm, which is excellent, but get too many 5 nm pieces in one place and you have a defect). We're talking about an extensively complex process with many phases. Sure, they could design thinner interconnects, but at the cost of more dead chips per wafer?

Deposition is a law of averages for the most part; that's why there is speed binning, OC binning, and so on. Deposition varies from one part of the wafer to the next. If you knew what little I know about deposition, you'd know your buddy was told the wrong stuff.
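For anyone curious why a degraded chip misses timing instead of somehow "computing slower," here is a back-of-the-envelope sketch. Every number in it (wire length, width, load capacitance) is an assumption picked purely for illustration, not data for any real Intel or AMD process:

# Back-of-the-envelope sketch (illustrative numbers only, not any real process):
# electromigration voids shrink an interconnect's cross-section, which raises its
# resistance and therefore its RC delay. The chip doesn't "compute slower"; the
# signal just arrives later, so it misses timing at the original clock.

RHO_CU = 1.7e-8      # resistivity of copper, ohm*m (bulk value; thin films are worse)
LENGTH = 100e-6      # assumed wire length: 100 micrometers
WIDTH = 45e-9        # assumed wire width: 45 nm
THICKNESS = 90e-9    # assumed wire thickness: 90 nm
LOAD_CAP = 1e-15     # assumed load capacitance: 1 fF

def rc_delay(area_loss_fraction):
    """RC delay of the wire after voids have eaten away part of its cross-section."""
    area = WIDTH * THICKNESS * (1.0 - area_loss_fraction)
    resistance = RHO_CU * LENGTH / area
    return resistance * LOAD_CAP

fresh = rc_delay(0.0)
aged = rc_delay(0.30)   # assume 30% of the cross-section lost to voids
print(f"delay grows by {aged / fresh:.2f}x")   # ~1.43x: same logic, later signal arrival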
 

That won't cause the CPU to run slower. It will cause it to be less stable at a given clockspeed. As long as the CPU is running at its original clockspeed, it will perform identically to the way it did when it was brand new.
 

verndewd

Distinguished
Mar 27, 2009

When the current-carrying capacity is lessened, you have to lower the voltage and underclock to make the chip stable, once the metal can no longer carry enough electrons to support the designed speed.

Hence, SLOWED :pt1cable: You can split hairs all day, but electromigration over time does cause a CPU to degrade and forces the user to SLOW the clock speed and LOWER the voltage. If it takes a given current density to maintain the designed speed, and 20 years pass on a constantly running machine so that you have to downclock, then the chip is technically slower than its designed speed. So yes, it does slow the CPU down, because it can no longer carry the current needed to compute at the designed speed.
 

verndewd

Distinguished
Mar 27, 2009

Haha, yeah, Intel has cornered the deposition market and can design a 5-year chip while keeping good yields. They use smurf juice to break the deposition metals down so there is no deviation from the 1 nm atom size, AND they have smurf spirits direct the atoms in the deposition chamber so that the deposition is perfect across the wafer.

Now if they could only remedy the defects from dielectric blowouts.
 

amdfangirl

Expert
Ambassador


So, when your computer slows down by 1 MHz, it's time to call...

Ghostbusters

:)
 

And how long will that take?

Case in point: last summer I discarded my old, still-working P233MMX (too old even to give away). This is the one where I experimented with motherboard jumper settings, mounted an S370 HSF, and ran it for more than 6 years at 333 MHz.

Now, admittedly, I wasn't using it much over the past 3 years, but it was still running at 333 MHz when I dumped it.

Electromigration is a problem if a chip operates at abnormally high voltages for extended periods of time. But if you stick to the recommended maximum voltages and temperatures then, short of a catastrophic failure, the PC will become obsolete long before any such problems appear.
----------
Overclocking since 1978 - Z80 (TRS-80) from 1.77 MHz to 2.01 MHz
 


+1. Electromigration is an exponential process, not linear. When you operate in the design range of voltages and currents, electromigration is essentially nonexistent for non-defective CPUs of any brand.
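For reference, the usual way to put numbers on this is Black's equation, in which lifetime falls off as a power of current density and exponentially with temperature (textbook form only; nothing below is a figure for any particular Intel or AMD part):

\[
\mathrm{MTTF} \;=\; A\, J^{-n} \exp\!\left(\frac{E_a}{k_B T}\right)
\]

where J is the current density, T the absolute temperature, E_a the activation energy of the interconnect metal, k_B Boltzmann's constant, and A and n (commonly around 2) empirical fitting constants. At stock voltage and temperature, J and T stay low enough that the predicted lifetime is far beyond the useful life of the part, which is why the effect is essentially nonexistent in practice.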
 

smartkid95

Distinguished
Sep 2, 2009
Other than the S1156/S1366 stupidity that still baffles me, AMD and Intel have had pretty much the same number of sockets and socket changes in any given span of time.

Mods, please enter this one in the dumbest-post-of-the-year award contest.



YESSSSSSSSS!!!!! lol, I knew this was a dumb post before I posted it.
 

verndewd

Distinguished
Mar 27, 2009



The variables in time-related degradation are numerous, but the ideas are simple:

1. current density
2. heat
3. optimal deposition and manufacturing
4. time

A stock chip can conceivably run for 20 years before degradation forces downclocking, and that depends on optimal temperatures and manufacturing.
An overclocked chip will degrade faster regardless of cooling, because there are more electrons moving more metal atoms. The higher the OC, the faster the death of the chip; add more heat and it's exponentially faster, since metal atoms migrate faster at higher temperatures.
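To put very rough numbers on that, here is a small sketch using Black's equation; the activation energy, exponent, temperatures, and current-density ratio are all assumptions chosen for illustration, not measurements of any real chip:

# Rough illustration using Black's equation, MTTF ~ J^-n * exp(Ea / kT).
# Every constant here is an assumption for illustration, not a measurement of any real CPU.
import math

K_B = 8.617e-5   # Boltzmann constant in eV/K
E_A = 0.9        # assumed activation energy for the interconnect metal, in eV
N = 2.0          # assumed current-density exponent

def relative_mttf(j_ratio, temp_c):
    """Electromigration lifetime relative to an assumed stock chip at 60 C and current density 1.0."""
    stock = 1.0 ** -N * math.exp(E_A / (K_B * (60.0 + 273.15)))
    stressed = j_ratio ** -N * math.exp(E_A / (K_B * (temp_c + 273.15)))
    return stressed / stock

print(relative_mttf(1.0, 60))   # 1.0   -> the stock baseline
print(relative_mttf(1.2, 75))   # ~0.18 -> a modest overvolt/overclock cuts the modelled lifetime to under a fifth

Even a large relative reduction like that only matters if the absolute lifetime was anywhere near the time you actually keep the chip, which is the earlier point about staying within design parameters.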

Also, shrinking nodes mean less metal in the pathways, so under the same conditions as larger nodes there is in fact less lifetime to be had, because the rate of metal breakdown is roughly constant. So whereas 90 nm could last 10-20 years, 40 nm will theoretically last half that time, if the shrinkage of the metal interconnects is proportional to the gate-length shrinkage. That is more theory out of ignorance than fact from my perspective, but it's given some credence by the doubling of transistor count with each die shrink, so IMO it's a decent hypothesis that everything on the die shrinks proportionally, leaving less metal for the electrons to move. Operating voltages have come down some, but only by tenths of a volt on mainstream chips: http://www.cpu-world.com/info/id/AMD-K7-identification.html

Overclocking is a life shortener, but as you said, staying within certain parameters negates that, since most chips don't stay plugged in for more than 2-4 years. Almost everyone upgrades along with OS requirements, and those who don't may run a PC for decades, but they are also the least likely to overclock. Mild OCs can hold up for a long time or can fail in months, depending on the deposition quality, or fab quality if you prefer.
 
Just so that everyone's clear on this, the CPU will NOT slow down on its own. Electromigration may make the CPU less stable at a given clock speed, but that's going to show up as hangs or crashes, not as a slower CPU. The CPU will gamely attempt to keep marching to the beat of the clock no matter what.

The SOLUTION to the instability caused by electromigration may be to reduce the clock speed, and that WILL slow the CPU down. But that's due to the fix, not the cause.
 

verndewd

Distinguished
Mar 27, 2009


Splitting hairs. The fact is that the throughput of electrons is lessened, or the current density is lowered, and to keep the chip operating you need to adjust for that by slowing it, because it cannot slow itself to match. Whereas if the adjustment were automatic, aging would in fact slow the CPU, since it would adjust itself to whatever frequency keeps the clock stable against the available current.

On another note: hypothetically, running a few hundred MHz below the stable speed will lengthen the lifespan of the chip, because fewer electrons pass through the metal and therefore move less of it. In the end, as long as electrons move through metal, metal moves with the flow of electrons.
 
There's a very practical difference - people reading this thread need to understand that it's completely useless for them to be worried about "slowdowns" and start running performance benchmarks to see if electromigration is affecting their old systems. Even if they do have issues due to the problem, that's not the symptom they're going to see.

FUD spreads way, way too easily on the Internet, and I can just see this thread as a source of bogus "Is your CPU running Slow? Maybe you're suffering from Electromigration!" posts.
 


Agree. As you and other people have said, you're more likely to replace the CPU because programs' CPU requirements keep rising than because of this.
 

verndewd

Distinguished
Mar 27, 2009


FUD indeed. As I said, if machines self-adjusted for age, it would be the same as age causing slowed computation. Technically, what I said is factual, and you couldn't disprove it on your best day; the science says as much. The only hole you can poke in my facts is that machines don't self-test and adjust for current-density throughput, so you have to adjust manually for electron wear. Electromigration is a fact of life and happens in every circuit regardless of make; it happens in house wiring.
 

verndewd

Distinguished
Mar 27, 2009

I never said otherwise.

I am 100% correct:
technically, over time, EM causes slowed computation,

because computation to spec requires spec current density.

The more you OC, the more you limit spec performance over time.

Electrons move metal... fact!!


I never said that you'd kill your system in any particular amount of time; I simply stated the facts. And in fact, no conclusive study has been published on overclocked lifetimes,
but extensive research was done by Toshiba and other companies on EM and other electrical and heat-related failures.

Why hasn't any lifetime paper been published to guide people in designing an overclock around a chip's lifespan?

Because hardly any overclocker keeps a chip long enough for it to matter, nor do most PC-based work users.

Where it does matter is for the used-chip buyer who gets stiffed at a premium for a long-overclocked chip, expects to overclock it at length, and cannot, because that headroom was used up by the guy who charged too much to offload his used garbage onto others.

There is an underlying rip-off element when extreme gamers unload taxed merchandise for more than it's worth.

That last part is my opinion, and having shortened another person's PC's life by overclocking it in the past and then selling it, I can say that as a person of conscience it didn't sit well with me, so I made good on the dead mobo.

Extreme freaks just don't like their dirty laundry hitting them in the face on eBay, but they more than deserve it. eBay noobs should stick to the never-overclocked stuff and make the extreme overclockers eat their own lunch.
 
What you said was: "CPUs will get slower (what rando said) over time due to electrons moving through the metal; it's called electromigration."

What I think you really meant is "you have to run them at slower clock speeds to keep them stable," but someone who interpreted the first statement literally would not get that meaning from it. It's not just semantics; it actually is a different meaning, and I think it's important to be clear about it.
 