AMD vs. Intel Strange Conspiracy theory?


smartkid95

Distinguished
Sep 2, 2009
This is a strange one that I have been wondering about, looking into, and observing for a long time. I was told by a source that is obviously not dependable (but humor me) that Intel CPUs are no match for AMD because of a theory of intentional replacement: the idea (it's ridiculous, but humor it) is that the CPU is designed to get slower over time, which is supposedly why Intel never releases CPUs on the same motherboard platform like AMD does and instead changes sockets constantly to get more money. I know this is a lame conspiracy theory, but I want to get some honest opinions about it. Is there any truth to that at all, or is it trash? The bigger question is the history of AMD vs. Intel, because it was also claimed that over time AMD CPUs will work better than Intel's and that AMD has always been better. Intel fanboys, rejoice, because I'm ready for anything; I have strong feelings, so don't be afraid to attack me. AMD fanboys are welcome to have their say as well. Have at it.



As a side note, I'm not really a fanboy of either company; my decision is made by the almighty dollar, and so far AMD has consistently been the better value. Enjoy, and feel free to flame or comment on the post on my blog about this too.
http://rokk-itscientistblog.blogspot.com/
 
Solution
Complete garbage. CPUs don't get slower over time. Operating systems, however, can get slower over time, but that is easily remedied by reinstalling the OS.
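If anyone wants to test the "CPU gets slower" claim for themselves, a crude way is to time a fixed, deterministic workload now and again in a year. Here is a minimal Python sketch (the workload and iteration counts are arbitrary choices for illustration, not any standard benchmark):

# Minimal repeatable timing sketch (arbitrary workload, not a standard benchmark).
# Save today's number and re-run in a year: if the CPU itself were "designed to
# get slower", this figure would creep up even on a freshly reinstalled OS.
import time

def workload(n=2_000_000):
    # Fixed integer arithmetic loop: deterministic, independent of disk and OS state.
    total = 0
    for i in range(n):
        total += (i * i) % 97
    return total

runs = []
for _ in range(5):
    start = time.perf_counter()
    workload()
    runs.append(time.perf_counter() - start)
print(f"best of 5 runs: {min(runs):.3f} s")

A bloated OS install feels slower in day-to-day use, but a purely CPU-bound loop like this stays flat across reinstalls, which is the point above.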
So much inertia, though. And the early adopters of any new architecture just aren't willing to give up their favorite games. :)

amdfangirl, even you are a case in point. On another thread, you posted the specs of your new computer. Your computer was ... decidedly "average". But half of your budget went into a really nice graphics tablet. (My brother-in-law in New Hampshire has a one-man graphics shop, and he'd be envious.)

Basically, your PC is a peripheral for your tablet. Now, my question:
Would you upgrade to an entirely new platform if you couldn't get drivers for your tablet?
 
I rest my case.

Until my brother-in-law pretty much completely migrated to the internet, all but one of his systems were running WinXP. He had one system still running Win95 because he had a specialized printer with no driver support past Win95.
 

amdfangirl

Expert
Ambassador
^2

I'd end up doing that. I still use Windows ME for some special tasks...

^ Hehe, not that simple. x86-64 is the bridging point.

I reckon the x86 instruction set still has a long way to go. People just don't want to move architectures. Unless we follow Apple's approach to architecture changes, I don't think we'll properly move away from x86. It gets harder with every computer that gets built, because the user base keeps growing. It's not impossible, just costly, and the first wave of new CPUs will probably be slower at x86 emulation.
 
Yep, that's exactly what killed Itanium for the desktop market. If it ran x86 programs as fast as native x86 processors, there's a good chance it would have been a viable alternative to AMD's x86-64 extensions.

It's a no-win scenario. You need "x" transistors to implement x86, and "y" transistors to implement a new 64-bit architecture, and you'll have to compete with someone who'll put out a chip with the same die area that uses "x + y" transistors to build an even faster x86 chip.
 


Actually they already have such an incandescent bulb on the market - those 135V bulbs, some 15V higher than the standard 120V bulbs (here in the US anyway). The builder in my old house put 135V bulbs in all the bathroom fixtures and when we sold the house some 15 years later, not one had burned out despite some being used maybe 5-6 hours a day.

Of course, they give off more of a golden light than a standard bulb due to lower filament temp, but most fair-complexioned women find that attractive in the bathroom :D.
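For rough numbers, the usual incandescent rerating rules of thumb say lamp life scales with roughly the 12th-13th power of the voltage ratio and light output with roughly the 3.4th power (textbook approximations, not measurements of any particular bulb). A quick Python estimate of why those 135V bulbs held up so well on 120V:

# Rough incandescent rerating estimate (rule-of-thumb exponents, illustrative only).
V_RATED = 135.0    # design voltage of the bulb
V_APPLIED = 120.0  # actual supply voltage

LIFE_EXPONENT = 13.0   # life scales roughly as (V_rated / V_applied) to the 12th-13th power
LUMEN_EXPONENT = 3.4   # light output scales roughly as (V_applied / V_rated) to the 3.4th power

life_multiplier = (V_RATED / V_APPLIED) ** LIFE_EXPONENT
relative_output = (V_APPLIED / V_RATED) ** LUMEN_EXPONENT

print(f"life: ~{life_multiplier:.1f}x rated life")        # roughly 4-5x
print(f"output: ~{relative_output:.0%} of rated lumens")  # roughly two-thirds

The lower output and cooler filament are also why the light shifts toward that golden color.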
 


I think that, in effect, SIMD could be described as non-x86 hardware grafted onto x86 processors. So perhaps that is another way to gradually wean the masses off x86 - supplant increasing portions of it with far more efficient code & hardware.

Or instead of some 32nm 8-core monster, make it 6 x86 cores plus 2 cores of a completely new architecture, selling the latter as 128-bit or some such marketing extravaganza to lure in the customers :D. If the new arch in the hybrid CPU is significantly faster than the x86 portion, then gradually you can wean the public off x86. But there's way too much inertia to try and just force the market to change overnight, which is sort of what Intel tried with Itanium.
 

It's not the "on" time that kills things like light bulbs or electron tubes. It's the current flow in the OFF-ON cycle caused by the very low resistance of a cold filament.

Years ago, I worked for Illinois Bell Telephone. We had vacuum tube electronics that had been running constantly for 15 - 20 years - high reliability electronics by Western Electric in an air conditioned building and never turned off.
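To put an illustrative number on the cold-filament point above, assume a 100 W / 120 V bulb and a roughly 10:1 hot-to-cold resistance ratio for tungsten (the exact ratio varies from bulb to bulb):

# Illustrative turn-on inrush estimate (assumed 100 W / 120 V bulb,
# assumed ~10:1 hot-to-cold resistance ratio for the tungsten filament).
V = 120.0       # supply voltage, volts
P_HOT = 100.0   # steady-state power, watts

R_hot = V ** 2 / P_HOT     # ~144 ohms with the filament glowing
R_cold = R_hot / 10.0      # cold tungsten is far less resistive
I_hot = V / R_hot          # ~0.83 A steady state
I_inrush = V / R_cold      # ~8.3 A for the first instant after switch-on

print(f"steady-state current: {I_hot:.2f} A")
print(f"turn-on inrush:       {I_inrush:.1f} A (~{I_inrush / I_hot:.0f}x)")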
 

sanchz

Distinguished
Mar 2, 2009
I think backwards compatibility could be addressed by embedding x86 core(s) into the first generations of [insert new architecture here], or by some sort of software emulation, etc. (I mention emulation, but I don't know whether it's possible or cost-effective; there's a toy sketch of the idea below).
Actually, there is an increasing number of ARM-based netbooks, so...

If there were to be a transition to a better architecture, there should be REAL and TANGIBLE benefits from the change, so that Microsoft would make an x86 Windows and a [insert new arch here] Windows version, just like 32-bit vs. 64-bit.

Otherwise, x86 relies on graphene (100GHz speeds) or DNA-computing....
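On the software-emulation point: the overhead is easy to see in even a toy interpreter, where every guest instruction costs a fetch/decode/dispatch round trip on the host. The Python sketch below uses a made-up four-opcode machine, not real x86, purely to show where that per-instruction cost comes from:

# Toy instruction-set interpreter (hypothetical 4-opcode machine, NOT real x86).
# Every guest instruction pays a fetch/decode/dispatch cycle on the host, which is
# why naive software emulation runs far slower than native execution.
def emulate(program):
    regs = {"a": 0, "b": 0}
    pc = 0
    while pc < len(program):
        op, *args = program[pc]        # fetch + decode
        if op == "mov":                # mov reg, immediate
            regs[args[0]] = args[1]
        elif op == "add":              # add dst, src (register + register)
            regs[args[0]] += regs[args[1]]
        elif op == "dec":              # decrement register
            regs[args[0]] -= 1
        elif op == "jnz":              # jump to target if register is non-zero
            if regs[args[0]] != 0:
                pc = args[1]
                continue
        else:
            raise ValueError(f"unknown opcode: {op}")
        pc += 1
    return regs

# Sum 1..5 with a countdown loop; each iteration re-pays the dispatch overhead.
prog = [
    ("mov", "a", 0),
    ("mov", "b", 5),
    ("add", "a", "b"),   # pc = 2: top of the loop
    ("dec", "b"),
    ("jnz", "b", 2),
]
print(emulate(prog))     # {'a': 15, 'b': 0}

Real emulators cut this down with dynamic binary translation, but the first generation of any new architecture still tends to run legacy x86 code noticeably slower than native hardware, which is exactly the Itanium problem mentioned earlier.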
 


Actually, since filaments literally evaporate with time and temperature (molecular energies follow a distribution, so the hottest outlier atoms sit close to the melting or evaporation point), and since resistance is inversely proportional to cross-sectional area, a thinner spot in the filament becomes more resistive, runs hotter, and evaporates even faster (there's a crude numerical sketch of that runaway below). IOW, the filament burning in two and creating an open circuit is the most common failure mode in incandescent bulbs anyway. Next up would probably be the cycling on and off as you stated, but more from the mechanical stress caused by repeated temperature cycling between ambient and glowing white-hot.

IIRC halogen bulbs work by introducing a halogen gas into the bulb, which redeposits the evaporated tungsten (or whatever metal the filament is made of) back onto the filament. So the filament can operate at a higher temperature and/or last longer.

Ordinary inert fill gases such as nitrogen, used in most bulbs, just keep the evaporated metal atoms from being deposited all over the inside of the bulb, forcing them to rise to the top of the bulb instead. That's why you get the small darkish circle at the top of a bulb that's been used for a long time.
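That thin-spot feedback loop (thinner section, higher local resistance, more local heating at the same series current, faster evaporation, thinner still) can be sketched as a crude iteration in Python. The constants are arbitrary and chosen only to show the runaway shape, not to model a real filament:

# Crude positive-feedback sketch of a filament hot spot (arbitrary units and
# constants, not a physical model). The filament is a series circuit, so the same
# current flows everywhere; a thinner segment has higher resistance, dissipates
# more power (P = I**2 * R), runs hotter, and evaporates faster.
I = 1.0             # filament current (arbitrary units)
area = 0.95         # cross-section of a slightly thinned segment (healthy = 1.0)
EVAP_RATE = 0.003   # evaporation per step per unit of dissipated power (made up)

for step in range(1, 201):
    resistance = 1.0 / area        # R ~ 1 / cross-sectional area
    power = I ** 2 * resistance    # local heating in the thin segment
    area -= EVAP_RATE * power      # hotter spot evaporates faster, thinning further
    if area <= 0:
        print(f"filament opens (bulb burns out) at step {step}")
        break
    if step % 50 == 0:
        print(f"step {step}: thin-segment area = {area:.3f}")

The decline starts slowly and then accelerates, which matches the way a bulb seems fine for years and then fails all at once.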
 

bgd73

Distinguished
Feb 18, 2008



Exactly. There are factors of wear, PSU issues, and cooling for chipsets, clock crystals, on and on and on. To localize it and target the CPU is rather silly. Hard drives can even get magnetized by an old CRT... and still no slowdowns.
I believe in high hours, many, many days of run time. I have a Prescott 3.4E at 23,000 hours as we speak. It is only now getting the OS it has been calling for over the last year: Vista Premium.
The specs and discipline of whatever either one is plugged into determine the longevity; even the case has something to do with slowdowns over time.
I have run both since 1999: AMD and Intel, nvidia combos and ATI, VIA chipsets, etc. The CPU has actually always been the smartest piece in the system, in my own history... no slowing down.
 

Based on observation, I'd say that's first. An incandescent light almost always fails the instant after turn-on.
 


I'd say that's the ultimate failure mechanism, but it comes into play after the first process - evaporation - has weakened that particular spot over a period of time.

In college, I once lived in an apartment with a cheap electric rangetop in the kitchen. Near the end of my year's lease, I turned on one of the burners and instead of gradually heating up uniformly across the heating element, it developed a white-hot spot and then exploded, sending molten metal onto the linoleum floor and leaving burnt spots on same. The apartment owner tried to blame me for the damage, saying it was something I must have done. So I did some research and showed him the facts about resistance heating element failure modes, and he finally relented and refunded my security deposit.

However in the above case I'd say it was the mechanical stress caused by repeated heating & cooling cycles that caused the failure - rangetop elements don't get anywhere near the temperatures that incandescent light filaments do, which is probably a good thing considering my cooking skills at the time :D.
 


LOL - well I guess light-bulb science is a bit nerdy :). However studying failure modes is an important part of engineering, as it helps you understand the fundamental processes at work and design around their weaknesses. Or else switch to a new principle entirely, such as fluorescent or LED lighting.

What I find interesting is how LEDs have pumped up the emitted lumens something like 50x in the last 20 years. I just bought a 3-pack of LED flashlights for $20, each running off 3 AAA batteries, with 40 lumens output on bright - uncomfortable if you look into the beam. In contrast, my incandescent 4-D-cell flashlight emits maybe 20 lumens with new batteries. Of course, the incandescent bulb emits most of its energy in the infrared portion of the spectrum, whereas the LED probably emits in just 3 wavelengths - in the red, green & blue portions of the spectrum.

Compared to the dim red, yellow & yellow-green LEDs available 20 years ago, quite remarkable progress IMO.
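For what it's worth, a 50x gain over 20 years works out to a modest-sounding compound rate; the Python lines below just rework the rough figures quoted above (guesses from the post, not datasheet values):

# Back-of-envelope rework of the LED figures above (rough guesses, not data).
improvement = 50.0   # claimed lumen improvement factor
years = 20.0

annual_rate = improvement ** (1.0 / years) - 1.0
print(f"~{annual_rate:.0%} per year compounds to {improvement:.0f}x over {years:.0f} years")

led_lumens = 40.0           # 3x AAA LED flashlight on its bright setting
incandescent_lumens = 20.0  # 4x D-cell incandescent flashlight, fresh batteries
print(f"roughly {led_lumens / incandescent_lumens:.0f}x the output from far smaller batteries")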
 

Upendra09

Distinguished
LEDs, I have heard, aren't all that great. They have some kind of a sweet spot that you have to hit to get lots of light and efficiency.

But once you go over a certain wattage or output, the LED dies quickly, uses more energy, and gives out less light.

And there is a new type of light that was on Tom's site a month ago or so.
 


Sure. That's why you see a dark film on the inside of the glass bulb.

And we're getting farther away from the original topic.
 

arjun_jamil1

Distinguished
Aug 11, 2009
Light bulbs... at their peak complexity...
Seriously, people...
You guys are like: "Look, there's a penny on the road..."
"No, no, don't touch that, it has nickel, which changes its quantum structure; the azimuthal number doesn't follow the Bohr-Bury rule, and the photonic yield isn't following its periodic functions... I dunno how that affects you... but don't touch that..."
 