In Pictures: 16 Of The PC Industry's Most Epic Failures


belardo

Splendid
Nov 23, 2008
3,540
2
22,795
[citation][nom]DrChips[/nom]This list is so terribly short-sighted... No graphics card in that list failed as absolutely as the 3DFX Voodoo 3/4/5. What about the Cyrix 586 and 686 CPUs?[/citation]
The Voodoo3 sold very well; it's not a failure - even though it's an updated Banshee (a "Banshee II") with SLI abilities, and it lacked 24-bit output (unlike the TNT2s). The Voodoo 4/5... fail. Too little, too late. Also, 3Dfx burned themselves by cutting off all their partners... who all went to Nvidia and then attacked 3Dfx.

Cyrix 586... I had one. The C486 chips were fine... but after that, not so much. I don't think they count as an epic fail because they never became mainstream enough for many people to remember them. Best logo and name, though.

HD-DVD. It didn't fade away (as a poster said). It didn't fail, either; HD-DVD simply lost. It had more of a tongue-twister name ("do you mean DVD or HD-DVD?" issues), held less data (30GB vs. 50GB), and had fewer partners... Toshiba and MS... many HD-DVD players were made by... Toshiba.

As someone with enough experience in format wars, I predicted the death of HD-DVD a year in advance and did my part to help :)

Blu-Ray has a limited life-span, though. I'm not seeing streaming replace Blu-Ray; there is WAY too much data, and with the ISPs' data caps, don't plan on BR going away soon. And the format can already handle 4K video... which means even bigger file sizes.

The replacement for Blu-Ray will be solid-state flash media like SD cards. Once they can make 64GB for $1-2, bye-bye disc drives. It also means no more players - just a converter box with a slot.


IBM used to make GREAT drives... but then came the "DeathStar" series of mistakes. Oh well. Today, there are only 2-3 players left on the field. :(
 

belardo

Splendid
Nov 23, 2008
3,540
2
22,795
[citation][nom]blazorthon[/nom]Bulldozer is a failure, but it is NOT a "fake" eight core CPU. It has 8 cores and 8 FPUs. ~ Bulldozer wasn't AMD's take on the Netburst concept of horrible IPC, moderate clock rates. AMD's high latency caches really don't help the problem either.
There are some situations where the eight core FX's will beat everything besides the i7s. The FX-8120 and FX-8150 will beat all i5's in highly threaded integer workloads.[/citation]
The BD design is pretty much Hyper-threading / Netburst with long pipelines. And yeah, just LIKE the P4, give it encoding/decoding jobs and it'll out-run an i5... But it's not a TRUE 8/6/4-core CPU - the modules share resources, and other technical issues kill performance.

Again, there is a design FAIL when your "8-core CPU" is slower than your own OLD 4-core CPU. Come on, all the tech sites had to write pages just to explain it. Let's go over the numbers (from Tom's Hardware):

3DS Max (render flyby 1440x1080) - default clocks on all.
i7 2600 = 2:37
AMD PII-X6 = 2:50
AMD 8150 = 3:10
i5 2500K = 3:14 (wow, those doubled cores killed the i5! By 4 seconds?)

Note: the i7-2600 = $300, $30 more than the FX "8-core" 8150.

Photoshop CS5
AMD 8150 = 1:24 (almost twice as fast as the X4 980)
i7 2600 = 1:34
i5 2500K = 1:37
AMD PII-X6 = 1:56

Adobe Premiere (H.264 sequence)
i7 2600 = 1:10
FX8150 = 1:13
i5 2500K = 1:23
AMD PII-X6 = 1:27

In the above 3 tests… the FX “8-core” CPU is slightly faster than the i5-2500K. But keep in mind the FX8150 costs $270 while the i5-2500K is a $180~220 CPU.
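
To put that price/performance in plain numbers, here's a rough Python sketch using the times and street prices quoted above (I'm assuming $200 as the midpoint of the i5-2500K's $180~220 range - ballpark math, nothing scientific):
[code]
# Back-of-envelope value check for the three timed tests above.
# Times are in seconds (lower is better); prices are the street prices
# quoted in this post ($200 assumed as the i5-2500K's $180~220 midpoint).
chips = {
    "i7-2600":  (300, [157, 94, 70]),   # 3DS Max, Photoshop CS5, Premiere
    "FX-8150":  (270, [190, 84, 73]),
    "i5-2500K": (200, [194, 97, 83]),
}

baseline = sum(chips["i5-2500K"][1])    # i5-2500K as the reference
for name, (price, times) in chips.items():
    total = sum(times)
    print(f"{name}: {total}s total, {baseline / total:.2f}x the i5-2500K, ${price}")
# The FX-8150 comes out ~8% faster than the i5-2500K here, for ~35% more money.
[/code]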

In LAME (WAV to MP3)
The FX8150 is slower than the AMD X4 and X6 CPUs.

WinZIP (650MB / 3970 files)
The FX8150 is slower than the AMD X4 and X6 CPUs.
In fact, the AMD X4 980 (even the X4 975 at $150) is faster than the i7-920, meaning the previous AMD tech is more competitive with the i5 CPUs than this turkey. AMD would have done better to simply die-shrink Stars… it would have been cheaper and easier than bothering with BD.

WinRAR
The FX8150 is slower than the i5-2500K, but faster than the X4/X6 CPUs.

Visual Studio 2010 (compile)
The FX8150 is slower than the AMD X4 and X6 CPUs.
Seriously, if someone owns an AMD X4 960~980 CPU, there is very little reason to “side-grade” to the $280 FX-8150.

Wait, earlier I said that for encoding and decoding, BD should really fly… I was wrong.

iTunes (CD conversion)
i7 2600 = 1:03
i5 2500K = 1:05 (This is why most people simply buy the i5)
AMD PII-X4 = 1:24 (3.7GHz)
AMD PII-X6 = 1:27 (3.3GHz)
FX8150 = 1:42 (3.6GHz)

Again, explain why anyone would spend nearly $300 on a CPU that is THAT slow?!

With the FX series of chips comes a word I have NOT heard in a long time… throttling. Yep, the FX throttles when it gets too hot… a word that used to apply ONLY to Pentium 4s.
My friend who has had his FX6100 for a week has already experienced some throttling.
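
If you want to check for it on your own box, here's a minimal Python sketch (Linux-only - it just polls the "cpu MHz" lines in /proc/cpuinfo and flags big drops; only meaningful under sustained full load, since power saving also downclocks idle cores; an illustration, not a real monitoring tool):
[code]
# Minimal Linux-only throttling check: poll the reported core clocks and
# flag large drops below the observed peak. Only meaningful under full
# load, because power saving also downclocks idle cores.
import re
import time

def core_mhz():
    with open("/proc/cpuinfo") as f:
        return [float(m.group(1))
                for m in re.finditer(r"cpu MHz\s*:\s*([\d.]+)", f.read())]

peak = max(core_mhz())
for _ in range(60):                      # watch for about a minute
    freqs = core_mhz()
    peak = max(peak, max(freqs))
    if min(freqs) < 0.85 * peak:         # >15% below peak => possible throttle
        print(f"possible throttling: {min(freqs):.0f} MHz vs {peak:.0f} MHz peak")
    time.sleep(1)
[/code]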

Games:
F1 2011
i5 2500K = 86fps
i7 2600 = 87fps
AMD PII-X4 = 61fps
FX8150 = 61fps
AMD PII-X6 = 56fps

WOW: Cataclysm
i5 2500K = 100fps
FX8150 = 74fps
AMD PII-X4 = 74fps

DIRT 3 (1920x1080)
i5 2500K = 114fps
FX8150 = 108fps

DIRT 3 (1280x720)
i5 2500K = 150fps
FX8150 = 123fps

Skyrim (1920x1080)
i5 2500K = 65fps
FX8150 = 42fps

StarCraft II (1920x1080)
i5 2500K = 165fps
FX8150 = 114fps

For gaming, the nearly $300 CPU loses out to Intel’s $150~$220 CPUs (i5-2400 through i5-2500K - and note I can buy a 2500K for $180 off the shelf at the Microcenter down the street).
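
Run the same ballpark math on the gaming numbers (averaging the six results above, and using $180 for the Microcenter 2500K vs. $270 for the 8150 - rough assumptions, but they make the point):
[code]
# Average fps and fps-per-$100 across the six game results above.
games = {  # (i5-2500K fps, FX-8150 fps)
    "F1 2011":        (86, 61),
    "WoW: Cataclysm": (100, 74),
    "DiRT 3 1080p":   (114, 108),
    "DiRT 3 720p":    (150, 123),
    "Skyrim 1080p":   (65, 42),
    "StarCraft II":   (165, 114),
}

for idx, (name, price) in enumerate([("i5-2500K", 180), ("FX-8150", 270)]):
    avg = sum(fps[idx] for fps in games.values()) / len(games)
    print(f"{name}: {avg:.0f} fps average, {avg * 100 / price:.0f} fps per $100")
# i5-2500K: ~113 fps avg (~63 fps/$100); FX-8150: ~87 fps avg (~32 fps/$100).
[/code]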

Why would anyone spend more for less? I’d understand it more if it was INTEL with the crappy chip, à la the Pentium 4. AMD is lucky for every FX chip it sells.

If it were a TRUE 8-core CPU, it should consistently beat any 4-core CPU on the market. But it's no better than yesterday's AMD X4 in many benchmarks. Hence, Bulldozer = FAIL.

The FX8150 should be labeled as a 4-core chip and sell for $130. The A8-3850 is a better value at $135… it's slower, cooler, and has built-in graphics.

I'm not happy about AMD's screw-up on this. As much as I prefer the underdog on the CPU side of things... my money is not going to be wasted on a Bulldozer.
 
[citation][nom]belardo[/nom]The BD design is pretty much Hyper-threading / Netburst with long pipelines. ~ Again, there is a design FAIL when your "8-core CPU" is slower than your own OLD 4-core CPU. ~ Hence, Bulldozer = FAIL. ~ my money is not going to be wasted on a Bulldozer.[/citation]

You make some good points and some bad ones. For some of the comparisons, you complain about BD's performance, but you need to realize that those programs are single-threaded... You also fail to mention how this is proven by the Phenom II X6s being outperformed by the Phenom II X4s in the same tests... It's unrealistic to expect an eight-core chip to out-perform there. Granted, the difference is much larger than it should be, so it's slower than it should be even with this justification, but you make it seem as though the chip only failed because it's Bulldozer, and that's wrong.

CPUs with more cores don't beat CPUs with fewer cores and higher per-core performance in lightly threaded applications, and that's a fact.
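
That's just Amdahl's law at work. A quick Python sketch with an illustrative number (a game that scales to roughly two threads has a parallel fraction around 0.5 - an assumption for illustration, not a measurement):
[code]
# Amdahl's law: with parallel fraction p, n cores cap the speedup at
# 1 / ((1 - p) + p / n). Past a point, only per-core speed matters.
def amdahl_speedup(p, n):
    return 1 / ((1 - p) + p / n)

for cores in (2, 4, 6, 8):
    print(f"{cores} cores: {amdahl_speedup(0.5, cores):.2f}x")
# 2: 1.33x, 4: 1.60x, 6: 1.71x, 8: 1.78x - the last four cores buy ~11%.
[/code]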

Bulldozer fails at gaming because games are not multi-threaded, not because of how bad it is. Games tend to max out at two threads, so Bulldozer never had a chance; it's beaten by i3s at stock. Even a $60 SB Pentium can keep up with the FX-8150 in gaming at stock frequencies because of this. For gaming, the FX-4100 can keep up with the FX-8150, and it's about one-third the price of the 8150, so if you must compare FX to Intel CPUs in gaming, choose the 4100 instead.

If you want to compare FX-8xxx processors with Intel, then you should use the 8120, not the 8150. Just increase the 8120's multiplier and it's basically an 8150; there's not much binning going on between the two anyway.

There are ways to make FX CPUs better. If you take the FX-8120 and disable one core from each module, you get around 10% more single-threaded performance and it uses less power, meaning more overclocking headroom and less throttling. This doesn't make FX fast enough to compete with Intel, but it does make it at least as good as a Phenom II, with more cache, a better memory controller, and improved floating point thrown in.

I'll agree with you that AMD would have been better off with a Stars die shrink, but it's worth noting that a die shrink would not be enough to compete with Intel right now. AMD needed a new architecture, and they failed at that. If AMD had taken Stars and made some serious changes, then they may have done better. Phenom II has something like 30-40% worse IPC than Sandy, and Bulldozer has something like 40-50% worse IPC than Sandy at stock, so AMD would need a huge improvement just to compete with Sandy, let alone Ivy and Haswell.
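
To see why a die shrink alone wouldn't cut it, treat single-thread speed as roughly IPC x clock. Using the midpoints of my rough ranges above (~35% and ~45% IPC deficits - assumptions, not benchmark data), normalized to Sandy Bridge:
[code]
# Single-thread speed ~ IPC x clock. IPC figures are midpoints of the
# rough ranges above (assumptions, not measurements), normalized to Sandy;
# clocks are the stock base clocks.
chips = {
    "i5-2500K (Sandy)": (1.00, 3.3),
    "Phenom II X4 980": (0.65, 3.7),
    "FX-8150":          (0.55, 3.6),
}

for name, (ipc, ghz) in chips.items():
    print(f"{name}: {ipc * ghz:.2f} relative single-thread speed")
# ~3.30 vs ~2.41 vs ~1.98: even a clock advantage can't close that gap.
[/code]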

I'm also not happy about AMD's screw-up, and I don't intend to waste money on Bulldozer, but I'm not going to be ridiculous about it, and I intend to give it what little credit it deserves. Seriously, you compared an 8-core CPU to a quad-core CPU in dual- and single-threaded applications and complained about the results? Do the same thing with Gulftown hexa-cores against Sandy Bridge quad-cores and see what happens: the quad-core will win in everything that can't use more than four threads.
 
@blazorthon: afaik, disabling whole modules is possible on fx cpus, but not single cores. i think disabling 1 module or undervolting creates extra thermal headroom for the remaining modules and improves performance.
amd does have a 'stars' die shrink - llano (including the athlon ii x4 631 etc).

fx (zambezi) has found a cult following among some people.
it's still a fail. amd overhyped it. results were disappointing. sales are undisclosed. now, there are new things to look forward to - trinity, piledriver, ivy bridge, more southern islands, kepler etc.... the future looks good. :)
 
[citation][nom]de5_roy[/nom]@blazorthon: afaik, disabling whole modules is possible on fx cpus, but not single cores. i think disabling 1 module or undervolting creates extra thermal headroom for the remaining modules and improves performance. amd does have a 'stars' die shrink - llano (including the athlon ii x4 631 etc). fx (zambezi) has found a cult following among some people. it's still a fail. amd overhyped it. results were disappointing. sales are undisclosed. now, there are new things to look forward to - trinity, piledriver, ivy bridge, more southern islands, kepler etc.... the future looks good.[/citation]

Llano doesn't count; it was changed in ways that make it under-perform even the Phenom IIs it was based on. Take a Phenom II, include the core improvements made in Llano, give it the improved memory controller from Llano/FX, make a few improvements here and there (especially cache latency), and it would be much better than BD and even better than Phenom II.

[citation][nom]bodyknight[/nom]No it was not. At least not compared to late (Barton) athlon-XP. Northwood was a perfectly fine CPU.[/citation]

Northwood was not perfectly fine; the first few Northwood chips were slower than the latest P3s. The P4 went on to become even worse compared to the competition, and to the Pentium M when it came out.
 

belardo

Splendid
Nov 23, 2008
3,540
2
22,795
@ de5_roy : After BD... whatever. We'll see it when it happens... I'm not holding my breath. I'll get on with a stupidly-named Intel high-performance / low-power part.

@ blazorthon : You didn't need to QUOTE my entire post, especially SINCE you responded right after me. It's a jumbled mess that sucks up screen space. Example: YOU said that the FX-8150 is not a failure... but you actually said it is one.

[citation][nom]blazorthon[/nom]Bulldozer fails at gaming because games are not multi-threaded, not because of how bad it is. ~~ AMD needed a new architecture and they failed at that. ~~ I don't intend to waste money on Bulldozer, but I'm not going to be ridiculous about it and intend to give it what little credit it deserves. [/citation]
First, I edited your quote down to something manageable that relates to my response. At least twice, you said Bulldozer "fails" at various things and that you wouldn't buy it. Hence, FX-8150 = FAIL.

[citation][nom]blazorthon[/nom]For some of the comparisons, you complain about BD's performance, but you need to realize that those programs are single threaded...
It's unrealistic to expect it to out-perform something like that.
~There's ways to make FX CPUs better. ~~This doesn't make FX fast enough to compete with Intel, but it does make it at least as good as Phenom II with more cache and a better memory controller thrown in and improved floating point.[/citation]
(A) I did take the single-threaded performance into account... due to its long pipeline and latency, it sucks in that department as well. Yes, BD is somewhat better than Stars - especially in a direct comparison with the AMD A8-3850 CPU.

(B) Yes, it's VERY realistic to expect a $280 CHIP to PERFORM like a $280 CHIP, and not be sub-standard next to a competing $150 CHIP - or even a $150 older model from the same brand. Hint: this is why the GeForce 5200 & 5600 sucked compared to the GeForce 4200 / 4600... the older cards cost less, yet were faster than the new ones. The FX8150 performs like a $140 quad-core CPU; I don't care if it has 8 cores or 24 cores *IF* its performance is crap.

(C) I really don't care about the WAYS to make the FX8150 a better chip. OC / dropping cores... okay, what's the point?! I'm supposed to bother changing settings every time I do something specific on my computer - much less anyone else? One of the KEY points of the Core i# and FX chips is the ability to AUTOMATICALLY shut off cores and speed up the remaining units.

Out of the box or overclocked, the owner of an FX8150 has a slower-performing part. No customer has ANY business having to TWEAK a $280 CPU in order for it to out-perform its $150 older sister! This is the failure. It would be fine if the FX were faster than the X4-980 in every way, but it's not.

And you should note that the FX8150 is a 3.6GHz CPU and the X4-980 is a 3.7GHz CPU... so again, it's sometimes faster, sometimes slower... Most users shouldn't have to deal with such garbage... and that's how I compared the CPUs: at stock speed. In some tests the X6 at 3.3GHz is slower because of its lower clock rate, but in others its 6 REAL cores are able to out-perform the FX8150.
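
Normalize my iTunes numbers from earlier by clock and the per-clock deficit is obvious (a single-threaded test and rough math only - nothing more):
[code]
# Per-clock comparison from the iTunes (single-threaded) results above.
runs = {  # (seconds to finish, stock clock in GHz)
    "PII X4 980": (84, 3.7),
    "FX-8150":    (102, 3.6),
}

x4_rate = 1 / (84 * 3.7)                # work per GHz-second, X4 baseline
for name, (secs, ghz) in runs.items():
    rate = 1 / (secs * ghz)
    print(f"{name}: {rate / x4_rate:.2f}x the X4's per-clock speed")
# The FX-8150 lands around 0.85x - ~15% less work per clock than the old X4.
[/code]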

An educated buyer will KNOW that the $70~140 cheaper i5-2400/2500 CPU will be mostly faster and only sometimes slightly slower than the 8150. And if they spend $20 more than the cost of the 3850, they'll always have the faster CPU.

All the FX8150 had to do was equal the performance of the i5-2500K (give or take slightly) and sell for $180~200. It does neither. If it did, it would be something I would buy - or at least consider.

 

bodyknight

Distinguished
Nov 6, 2009
14
0
18,510



I think you are confusing Willamette and Northwood. And the Pentium M came out over a year after the Northwood core.
 

raedwulf

Distinguished
Mar 8, 2010
17
0
18,510
Smell-o-vision? I would definitely get that! You know, provided some good games added support for it and it didn't cost more than, say, a good 750W PSU or thereabouts. But otherwise, it would be amazing... well, most of the time. I don't think the more child-oriented games need to bother adding support. But imagine DA or Skyrim, or MW or BF. The smells of blood, gunpowder, and steel infiltrating your awareness as you fight for your very life. Exciting stuff.
 
@belardo

I never said FX wasn't a fail. What I did say is that there are better ways to compare it against Intel. No, I don't care how you do it: taking a CPU that can't even beat its own $110 version, comparing it instead of that $110 version, and then claiming it's junk because it fails in a place where it never could have succeeded, is wrong. Oh yes, Bulldozer is a fail, but at least use proper comparisons to come to that conclusion.

Bulldozer's cores aren't really better than Stars. Llano's Stars cores are actually a little better than the Phenom IIs' at the same clock speed. The reason Llano performs like an Athlon instead of a Phenom is its cache problems, among other things.

You still think that just because the 8150 is a $280 chip, it should outperform CPUs that have better per-core performance than the 8150 in single- or lightly-threaded work. Well, the Phenom II X6s are more expensive than the X4s that beat them in the exact same places - do you have a problem with that too? Quad- and six-core Bulldozer CPUs should be the ones compared against the Phenoms here; using the eight-core chips for this comparison is pointless and misguided. Bulldozer is a fail here because its per-core performance, even with its quad- and six-core chips, is inadequate.

So, you don't care about ways to make the chips better? You don't want to change anything? Well, that also describes overclocking pretty darn well, so I hope you don't mind overclocking. Should we have to do it? No, but it's not as bad as you make it out to be. And where did this idea come from that you would ever need to change the settings again? I suppose there could be good reason to do that if you did highly threaded work, but then it wouldn't be just a gaming computer. Is this a fail? Yes, I'll admit to that. However, anyone buying Bulldozer without knowing what they are getting into, such as the cult-like following you mentioned, should have done more research before wasting their money. I'd never pay for the garbage that is FX, but I would like to have one just to play around with it and see what I can get it to do.

To be honest, the 8150 is just an 8120 with a higher multiplier - not even really a better-binned chip. It would be best to ignore it and use the 8120 instead, since the difference is almost $100. They overclock to the exact same range of frequencies. That way you get the 8150's performance in the $200 range, so that's part of what you wanted; now we just have to wait and see if AMD can get a decent CPU out.

In conclusion, is Bulldozer a fail? It obviously is, but I'd rather do everything I can to give the CPU a fair chance before calling it one.

[citation][nom]bodyknight[/nom]I think you are confusing Willamette and Northwood. And the Pentium M came out over a year after the Northwood core.[/citation]

You're right, I did confuse Willamette with Northwood. Either way, it was not too good either. Left at stock speeds, it was the only time the P4 was competitive with the Athlons. Overclock them, though, and Northwood would slowly lose stability, forcing you to either live with the instability or keep reducing the overclock until the chip died a few weeks or months later. I think it was called Sudden Northwood Death Syndrome or something like that.
 

JonnyDough

Distinguished
Feb 24, 2007
2,235
3
19,865
[citation][nom]blazorthon[/nom]Maybe, but what would dictate what smells are coming out? If I'm in a FPS game, would having olfactory stimulus make it any better? I admit that sometimes such a device providing such an experience may be better than not having it, but I don't think it would be a huge success no matter how well implemented it is.[/citation]

I guess that depends on the level of realism you want. Having a slightly bad yet easily tolerable odor would suffice. However, smelling rotting carcasses is not something anyone really desires. What if you are fighting in, say, a cereal factory? Or what if you could smell diesel (without the effects of a headache, etc., that you get from actual diesel)? I think it would be kind of cool to smell a tank roll by, or catch a whiff of fish and ocean salt before seeing the docks or beach. It would add a whole new realism to the game. What if you were to enter a room and smell the fresh scent of an alien presence, signaling to you that you are in danger?
 
[citation][nom]JonnyDough[/nom]I guess that depends on the level of realism you want. Having a slightly bad yet easily tolerable odor would suffice. However, smelling rotting carcasses is not something anyone really desires. What if you are fighting in, say, a cereal factory? Or what if you could smell diesel (without the effects of a headache, etc., that you get from actual diesel)? I think it would be kind of cool to smell a tank roll by, or catch a whiff of fish and ocean salt before seeing the docks or beach. It would add a whole new realism to the game. What if you were to enter a room and smell the fresh scent of an alien presence, signaling to you that you are in danger?[/citation]

It may improve the experience of some games/movies/etc., but I'm not sure about it. It might distract too much, and I don't know that I'd really want to smell anything in some games. It's an odd concept that I'd try out in a heartbeat, and it most certainly could add a whole new level of realism, but I'm not sure how it would work out.

We should also look into the circumstances surrounding its failures. Why has it failed several times? Can it even be done? If it's a realistic goal, does it simply need more advanced technology than we have now?

This raises so many questions, I guess it's time to go to Google and work my way from there.
 

olaf

Distinguished
Oct 23, 2011
430
1
18,795
Have to agree, the LGA cooler is a big pile of Donkey S. Worst cooler-holding mechanism EVER; it inspires no confidence when paired with the bigger heatpipe coolers.
 

diesel_travis

Distinguished
Feb 6, 2012
2
0
18,510
[citation][nom]blazorthon[/nom]Not all of their cards were failures.[/citation]
No, but 3dfx as a company was ultimately a failure as a result of their final cards.
 

coldmast

Distinguished
May 8, 2007
664
0
18,980
The article doesn't mention that the Pentium 4s were beaten by much lower-clocked Pentium IIIs.
The Pentium 4 was always a bad architecture.
 
[citation][nom]coldmast[/nom]The article doesn't mention that the Pentium 4s were beaten by much lower-clocked Pentium IIIs. The Pentium 4 was always a bad architecture.[/citation]

That was only true of the early Willamette chips. Everything after them beat the P3s. Of course, this was only through sheer clock-speed differences; Netburst was still a horrible architecture.
 

lp231

Splendid
There is a glitch in the picture article. If you click on "read more" and move to the next image, it will show the whole text but still give a "read more" link. Once you click on that, it's actually "read less". :D
 

ta152h

Distinguished
Apr 1, 2009
1,207
2
19,285
[citation][nom]belardo[/nom]Er... no. RD-RAM failed because they screwed everyone. Intel had their hand in RD-RAM and pushed it down everyones throats. It caused production problems for SDR & DDR memory which resulted in overall price increases, but still far cheaper than RD-RAM.The overall performance difference between RD-RAM and DDR was minimal for the earlier P4s. DDR2 only got faster as RD-RAM faded away. May RAMBUS die painfully.And NO #2. The first P4s were SLOWER than P3s. Intel promoted the P4 as "they will get faster as clock rates ramp up" - the Netburst design was made for high clock-rates, not performance. Hence, unless the P4/Netburst was doing a single hard job - such as rendering or encoding video/audio (and nothing else) it was a Piece of sh** CPU that sold for very high prices. A buddy way back then had a P4 1.4 or 1.6Ghz, it was easily on par with my P3-900Mhz. And AMD beat the P4 up and down the street. Even in the later days of crapburst, a $250 AMD at 2.2Ghz was better for general/gaming than the $1000 P4 3.6~3.8Ghz Extreme Edition CPUs. Then lately, AMD took a page out of Intel and did netburst 2.0 on their latest CPUs... which is why many AMDers have jumped to Intel.Then intel shoved RD-RAM into their P3 line, which was stupid since it offered NO performance improvement, so they tacked on the MTB onto the crappy 810 boards - which were unstable and performed slower than the BX-boards it was supposed to replace. Hence, VIA grew quickly to fill the void as intel was busy with law-suits and recalls until the i815 boards came out - the last great P3 board. RD-RAM offered nothing that DDR couldn't handle at 1/4 the price. At one of my offices, we still have a 2.4Ghz P4 Dell with RD-RAM I can't wait to toss. Its the slowest POS in the office.Cyrix: actually made some good CPUs. They took care of the bottom-end market. Their 486DX chips were quite good. I sold hundreds of them in our little PC shop. Never had one fail / return. The Cyrix DX/4 100 could easily OC past the intel/AMD and were SUPER cool... ie: an intel would burn your finger, while the cyrix was barely warm. not bad. But Cyrix could NOT make anything to compete against the Pentiums and rightfully died. Man, they made various junk hybrid thingies.[/citation]

No offense, but you're an idiot.

Look at the benchmarks of the Pentium 4 at 1.5 GHz versus the Pentium III at 1 GHz. The Pentium 4 easily wins, and that's the speed it was released at. It also grew to 2 GHz, while the Pentium III got to 1.1 GHz on that same technology. So, sorry, you're wrong.

Intel dominated AMD after they ramped up clock speeds, and particularly with the Northwood. At least for a while. Prescott ended it, and gave AMD overall superiority.

RDRAM was faster than DDR, and cost the same by the time Intel moved away from RDRAM. Simple as that. It was improved with XDR and XDR2, but since Intel had left it for political reasons, they were not used. It didn't really go away.

By the time RDRAM went away, DDR was not cheaper. They were very close in price. RDRAM was faster, though.

i815 was a great chipset? Do you know anything? It was slower than the BX, and constrained to 512MB. It was decent, but not great. It was outperformed by the i840, and the Apollo Pro 266T.

You're also wrong in a later post about the i840 and i820. The MTH could be used on both chipsets, and the i820 was not just an i840 with it. Please, stop being an ignorant expert. The i840 was dual-channel and was intended for workstations and multi-processor configurations. The i820 was used mainly with RDRAM, but it was single-channel, and it was their mainstream chip. Both could use an MTH.
 
They could add Bulldozer to that list.
I use that IDE-SATA converter just to have quick access to an old HDD. You can find them paired with a power supply, which makes a decent temporary external docking solution for IDE drives. It works well that way.
 
[citation][nom]house70[/nom]They could add Bulldozer to that list. I use that IDE-SATA converter just to have quick access to an old HDD. You can find them paired with a power supply, which makes a decent temporary external docking solution for IDE drives. It works well that way.[/citation]

... Bulldozer is on the list, Tom's saved the latest fail for the last page.
 

compton

Distinguished
Aug 30, 2010
197
0
18,680
I personally own a Killer 2100 NIC. It functions fine, and there was this one time when I thought it helped my gaming. But I was probably really drunk at the time.

If I'm honest, the only problem with Killer NICs is their price. I got my 2100 for $70, so it's basically priced like 2x Intel PCIe GbE NICs. But the Realtek, Marvell, Broadcom, and of course Intel onboard equipment is all pretty good.
 
[citation][nom]compton[/nom]I personally own a Killer 2100 NIC. It functions fine, and there was this one time when I thought it helped my gaming. But I was probably really drunk at the time. If I'm honest, the only problem with Killer NICs is their price. I got my 2100 for $70, so it's basically priced like 2x Intel PCIe GbE NICs. But the Realtek, Marvell, Broadcom, and of course Intel onboard equipment is all pretty good.[/citation]

At $70 it's not a huge problem, but I would like to see them go under $55 USD before I even consider trying one out. GbE NICs can be bought for well under $40; some good ones were $20-30 last I checked. Considering the negligible gains of the Killer NIC over the much cheaper alternatives, should it really be priced much higher than they are? To be honest, I'm not sure having a processor on board the NIC matters much unless you go for higher speeds than GbE. Maybe 10, 40, or 100 GbE would benefit more from something like this than 1GbE.
 