In Pictures: 16 Of The PC Industry's Most Epic Failures


ChromeTusk

Distinguished
Jun 10, 2010
338
0
18,790
Having those Iomega Zip/Jaz drives was nice for multimedia projects in school. I was even able to run Starcraft from an external Zip 100 in the lab :p . IMO, the CD-RW is what truly killed it off.
Regarding the G15, I hope Logitech develops a USB 3.0 version with a color LCD that works with more programs/games before the paint rubs off my keys. :hello:
Bulldozer... :sweat: a Phenom deja vu.
Gaming network cards sound good to some people unfamiliar with networking, but once traffic hits a router, it's out of your PC's hands.
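A minimal sketch of how to check this for yourself (Python; the gateway IP and remote hostname are placeholders, and it assumes a Unix-style ping): compare the round-trip time to your own router with the round-trip time to a host out on the internet, and see how little of the total latency is ever inside your PC or its NIC.

```python
# Minimal sketch: compare round-trip time to the local gateway vs. a remote
# host to show where latency actually accumulates. The gateway IP and remote
# hostname are placeholders; it assumes a Unix-style `ping -c`.
import re
import subprocess

def avg_rtt_ms(host: str, count: int = 5) -> float:
    """Ping `host` and return the average round-trip time in milliseconds."""
    out = subprocess.run(
        ["ping", "-c", str(count), host],
        capture_output=True, text=True, check=True,
    ).stdout
    # Summary line looks like: "rtt min/avg/max/mdev = 0.412/0.530/0.713/0.101 ms"
    match = re.search(r"= [\d.]+/([\d.]+)/", out)
    if not match:
        raise RuntimeError(f"could not parse ping output for {host}")
    return float(match.group(1))

if __name__ == "__main__":
    gateway = avg_rtt_ms("192.168.1.1")   # first hop: your own router
    remote = avg_rtt_ms("example.com")    # a host out on the internet
    print(f"Gateway RTT : {gateway:6.2f} ms")
    print(f"Remote RTT  : {remote:6.2f} ms")
    print(f"Latency added beyond your PC/NIC: {100 * (1 - gateway / remote):.0f}%")
```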
 

garywpalmer

Distinguished
Feb 7, 2012
1
0
18,510
AMD is setting themselves up for ridicule with the name "Piledriver". "Pile" is a synonym for hemorrhoid. Boy oh boy... AMD marketing dudes... get a clue!
 

belardo

Splendid
Nov 23, 2008
3,540
2
22,795
ZIP disks themselves and most of the drives were fine for their time. I had the blue SCSI version... which was super fast compared to the printer-port version. The internal IDE versions were quite good as well. Keep in mind that since the blue Zip drives were external, they were also subject to more abuse from being moved around and from things falling on them.

They were replaced once CD-RW technology became cheap. Back in the mid '90s, we only had 1.44MB 3.5" floppies. The 100MB Zip disk was badly needed... the drives were never more than $200, far cheaper than the early $800+ CD-R drives (which I used to own) with their $10 blank CD-R discs and a high rate of failure. :(
 
G

Guest

Guest
Intel Heat Sink Pushpins (2004-Present) - That's my #1 right there - HORRIBLE DESIGN!!!
 

dwedit

Distinguished
Feb 7, 2012
1
0
18,510
All of these pale in comparison to Miniscribe, a hard drive manufacturer from the mid 80s. In 1989, they literally shipped out bricks in boxes instead of hard drives, just so they could claim they had met their sales targets. Not "brick" as in broken hardware, but brick as in "brick wall".
 

belardo

Splendid
Nov 23, 2008
3,540
2
22,795
[citation][nom]TA152H[/nom]No offense, but you're an idiot. Look at the benchmarks of the Pentium 4 at 1.5 GHz versus the Pentium III at 1 GHz. Pentium 4 easily wins, and that's the speed it was released at. It also grew to 2 GHz, Pentium III got to 1.1 GHz on that same technology.[/citation]
You appear foolish for calling me an idiot right off the bat. Perhaps there is a matter of opinion or incorrect information, as it's been 12 years since Intel pooped out the P4. AMD had won the 1GHz CPU title and Intel designed the P4 for SPEED to race to 2GHz, which didn't matter since it was still slower than AMD in general.

Netburst was always crap, and until they moved off Socket 423 (a screw-the-customer socket) to new chips, the price/performance of the P4 was crap. Intel themselves pushed the P4 by telling people to buy it today because it would get better in the future, never mind the ZERO upgrade path.
For general work and gaming with Windows 98 (the dominant OS at that time), the PIII was as fast or faster for a much lower price in many areas. Yes, the P4 killed everything when encoding MP3s or raytracing... that's it.

[citation][nom]TA152H[/nom]RDRAM was faster than DDR, and costed the same by the time Intel moved away from RDRAM. Simple as that. It didn't really go away.By the time RDRAM went away, DDR was not cheaper. They were very close in price. RDRAM was faster, though. [/citation]
It should have been faster, but the latency of RD-RAM killed the performance (just like the latency of Netburst and AMD's Bulldozer). By the time RD-RAM reached acceptable pricing, it was already dead and faster DDR was on the market. But I did something YOU did not: I went back in time, as THIS website and Anandtech have a long history of articles.

I checked reviews comparing RD-RAM and the alternatives. First though – let's remember that 512MB of RD-RAM cost about $300~400 more than 512MB of SDR/DDR memory.

Content creation: (P4 1.5Ghz)
58.2 = RD-RAM
58.0 = DDR (VIA Chipset)
52.7 = SDR-RAM (i845) {because of latency, it was suspected that Intel had crippled the i845 so it wouldn't be competitive against the i850/RD-RAM}

Office Productivity: (P4 1.5Ghz)
140 = RD-RAM
134 = DDR (VIA Chipset)
131 = SDR-RAM (i845)

Office Bench 2001 (P4 1.5Ghz / Time in Seconds – lower = better)
23.07 = RD-RAM
23.24 = DDR (VIA Chipset) {again, slightly slower, but not by an amount a human could notice}
33.07 = SDR-RAM (i845)

3D Studio MAX R4.02 {time in minutes / lower = better}
22.22 = RD-RAM
22.75 = DDR (VIA Chipset)
23.15 = SDR-RAM (i845)

3DAquaMark (640x480)
39.2 = RD-RAM
40.7 = DDR (VIA Chipset)
34.0 = SDR-RAM (i845)

Let's compare games: UT99 (640x480) FPS (higher is better)
110 = AMD Athlon 1.2GHz
100 = P4 1.5 GHz (i850 board w/RD-RAM)
100 = AMD 1.0 GHz
098 = P3 1.0 GHz (i815 board)
098 = P4 1.4 GHz (i850 board w/RD-RAM)
094 = P3 1.0 GHz (i820 board w/SD-RAM)

Expendable (640x480) in FPS
124 = AMD Athlon 1.2GHz
111 = AMD 1.0 GHz
101 = P3 1.0 GHz (i840 board w/RD-RAM)
098 = P3 1.0 GHz (i815 board)
097 = P3 1.0 GHz (i820 board w/SD-RAM)
094 = P4 1.5 GHz (i850 board w/RD-RAM)
088 = P4 1.4 GHz (i850 board w/RD-RAM)

SYSMark 2000 benchmark
245 = AMD Athlon 1.2GHz
233 = P3 1.0 GHz (i815 board)
221 = AMD 1.0 GHz
213 = P4 1.5 GHz
204 = P3 1.0 GHz (i820 board w/RD-RAM)
203 = P4 1.4 GHz

These numbers show that (A) the P4 sucked coming out of the gate. It was a VERY expensive chip with VERY expensive memory that did NOT blow the much cheaper P3 or AMD out of the water... in many cases, it was at the bottom of the pack. Yeah, once the P4 hit 2.0GHz, it was faster than any P3... (B) Gaming/bottom benchmarks: the AMD and P3 (815) came with SD-RAM, the P4s with RD-RAM PC800. Which SHOWS that the P4 with RD-RAM was slower than the SD-RAM systems in many cases, especially against the AMDs.

Yes, RD-RAM had 2-4x the memory bandwidth of any of the P3/AMD SD-RAM systems, but in the real world, the latency built into RD-RAM and Netburst negated all the gains.
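To put numbers on that, here is a quick sketch (Python; the figures are copied straight from the lists above, nothing new) that works out how small the RD-RAM advantage actually was over the much cheaper memory configurations:

```python
# Quick tally of the RD-RAM vs. DDR/SDR gaps quoted above (P4 1.5 GHz systems).
# A positive % means RD-RAM was faster; all values are copied from the lists above.
benchmarks = {
    # name: (rdram, ddr_via, sdr_i845, lower_is_better)
    "Content creation":       (58.2,  58.0,  52.7,  False),
    "Office productivity":    (140,   134,   131,   False),
    "Office Bench 2001 (s)":  (23.07, 23.24, 33.07, True),
    "3D Studio MAX R4 (min)": (22.22, 22.75, 23.15, True),
    "3DAquaMark 640x480":     (39.2,  40.7,  34.0,  False),
}

def rdram_advantage(rdram, other, lower_is_better):
    """Percent advantage of the RD-RAM setup over the other memory configuration."""
    return (other / rdram - 1) * 100 if lower_is_better else (rdram / other - 1) * 100

for name, (rd, ddr, sdr, lower) in benchmarks.items():
    print(f"{name:24s} vs DDR: {rdram_advantage(rd, ddr, lower):+5.1f}%   "
          f"vs SDR: {rdram_advantage(rd, sdr, lower):+5.1f}%")
```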

[citation][nom]TA152H[/nom]i815 was a great chipset? Do you know anything? It was slower than the BX, and constrained to 512MB. It was decent, but not great. It was outperformed by the i840 ~~ You're also wrong in a later post about the i840 and i820. The MTH could be used on both chipsets, and the i820 was not an i840 with it. Please, stop being an ignorant expert.[/citation]
Yes, the 815 was a great chipset; Intel made 6 versions of that set. It completely outsold their 820/840 chipsets. It's the only one they had that competed against the VIA chipsets.

Again, I use data and experience for my opinions. Look at this old Tom's article: http://www.tomshardware.com/reviews/beefed-bx,207.html (I've been coming to this site since before it became TomsHardware, as it started out as a Voodoo/graphics site).
The wonderful BX chipset was quite old and not designed for the newer PIII CPUs; the board makers managed to OC the chipsets and pushed them to their limits. BX lacked AGP 4X. BX lacked native USB and 133MHz memory. OCing the AGP bus on the BX was problematic.
BX was on par with the 815, 1-2% faster or slower depending on the game/application.


Wow, 512MB was a problem? What year are we talking about? Oh yeah, 2000! The main OS sold at that time was Windows 98SE. Anything more than 512MB on Win9x was a waste. Typical systems had 128MB, maybe 256MB for a power user. 128MB cost about $100 back then.

WinXP didn't start to take hold until 2002 (released Nov 2001), and 256MB was the minimum a WinXP system should have, though 512MB was fine for many users. By 2002, DDR support for AMD and the P4 was standard, with boards that supported 2GB (i845x)... so the 815 memory limitation was NOT an issue, since it was Win9x-era tech released in 2000. As I showed above, the 815 was faster than the 820 (SD-RAM) and sometimes faster than the 840 – but overall, the differences were minor.

So again, only an idiot would buy RD-RAM for the Pentium 3 platform. RD-RAM deserved to die and RAMBUS as a company deserves to be out of business. RD-RAM = fail and they deserved it because it was crap technology.
 
@belardo
Although Rambus probably deserves to go out of business, they didn't and are still around. They aren't doing much, but they still make money from lawsuits. Now that they have great products, XDR and XDR2 memory, not many people want to do business with them. Otherwise, I'll agree with your comment. P4 sucked coming out and pretty much sucked after coming out. For a short while it battled with AMD until Athlon 64 came out and tore through P4.
 

lysinger

Distinguished
Nov 26, 2010
101
0
18,690
It seemed to me that RAMBUS was done in by arrogance on the part of the business owners who created enemies of those who controlled the computer hardware industry, who in turn put them out of business. Those systems were very quick in their day though. I keep a couple around the shop for grins.
 
[citation][nom]lysinger[/nom]It seemed to me that RAMBUS was done in by arrogance on the part of the business owners who created enemies of those who controlled the computer hardware industry, who in turn put them out of business. Those systems were very quick in their day though. I keep a couple around the shop for grins.[/citation]

Like I said right above you, Rambus is not out of business yet. They are patent trolls, getting patents and then suing companies who used the covered technologies even before the patent's filing. This is how they get their money, because most companies refuse to do business with them even though they now have superior memory products.
 

lysinger

Distinguished
Nov 26, 2010
101
0
18,690
[citation][nom]dwedit[/nom]All of these pale in comparison to Miniscribe, a hard drive manufacturer from the mid 80s. In 1989, they literally shipped out bricks in boxes instead of hard drives, just so they could claim they had met their sales targets. Not "brick" as in broken hardware, but brick as in "brick wall".[/citation]

Don't forget the Kalok IDE hard drives that fried if the IDE cable was inserted in reverse, or just fried for no good reason. I had a 120MB one that I kept on the shelf just to show people one that still worked.
 

belardo

Splendid
Nov 23, 2008
3,540
2
22,795
[citation][nom]blazorthon[/nom]@belardo I never said FX wasn't a fail. What I did say is better ways to compare it against Intel. No, I don't care how you do it, comparing a CPU that doesn't beat the $110 version of it instead of the $110 version and then claiming it's junk because it fails in a place where it never could have succeeded is wrong. Oh yes, Bulldozer is a fail, but at least use proper comparisons to come to that conclusion. [/citation]
I don’t know what you want. Intel and AMD compete. So perhaps the better way to compare FX is to Core2? Okay, fine. FX is better than any Core2 CPU… AMD rocks!

What proper comparison are you looking for? It's a current product that can only be compared to Intel's current products. It cannot be compared to a Core 2 CPU because Intel has stopped making Core 2-era CPUs, AFAIK, and if there is anything left, it'll be a bottom-end $45 chip.

The FX 8120 or 8150 doesn't beat the Core i5-2400 in price, performance, heat or power. It doesn't do well against the PII consistently either, and those are being phased out.

I don't care if you or AMD call it an 8-core CPU, or how well it can perform a single thread or multiple threads... in all ways, it's substandard.

[citation][nom]blazorthon[/nom]Bullozer's cores aren't really better than Stars. Stars are actually a little better than Phenom IIs at the same clock speed. [/citation] AMD should not have bothered with BD then. Just make a good general-purpose CPU for a good price. The A-chips perform well enough for more than 80% of the people out there.

[citation][nom]blazorthon[/nom]You still think that just because the 8150 is a $280 chip that it should outperform CPUs that have better performance per core than the 8150 in single or lightly threaded work.~~ So, you don't care about ways to make the chips better? You don't want to change anything? [/citation]
That is obvious... a $280 CPU should outperform a $150 competing CPU (whether AMD or Intel) no matter what. I don't care how it does it; the FX could have "100 cores" if it performed like a $280 CPU. There is a reason why a Ferrari costs $100,000 and a Ford Focus does not. The Focus is a fine car, does very well – it will do 110mph, etc... If Ford sold it for $50,000, then it becomes a crappy $50,000 car rather than a fine $18,000 car.

The performance of the FX8150 shows its value to be around $130~160.

[citation][nom]blazorthon[/nom]Well, that also defines overclocking pretty darn well so I hope you don't mind overclocking. Should we have to do it? no, but it's not as bad as you make it out to be. And where is this idea that you would need to ever change the settings again? I suppose there could be good reason to do that if you did highly threaded work, but then it wouldn't be just a gaming computer. [/citation]
Most people are not 20-year-old kids who have nothing better to do than spend hours or weeks tweaking their computers. Did that, took pictures and moved on. Person A can spend $220, drop in an i5-2500K CPU and be done, or spend 1 minute OCing it and be done. Person B can spend $280, then spend time OCing and tweaking to maybe equal the performance of Person A, who did nothing.

When I talk about making changes, I'm responding to YOU: should a user really go into the BIOS/whatever to tweak the CPU a few times a day depending on WHAT they are doing? The design of the NEW CPUs is that they change their core config/speed depending on the workload.

[citation][nom]blazorthon[/nom]but I would like to have one just to play around with it, see what I can get it to do.To be honest, the 8150 is just a 8120 with a higher multiplier, not even really a better binned chip. It would be best to ignore it and use the 8120 instead since the difference is almost $100. ~~Is Bulldozer a fail? It obviously is, but I'd rather call it a fail after doing everything I could to give the CPU a fair chance before I called it a fail.[/citation]
Go out and buy one then... but websites like this one are here to provide the data that many people don't have the ability to get themselves via funding/business/access to products. So I don't have to buy a bunch of $100~300 CPUs to know the performance of those products. It's nice that the 8120 is $80 less; its default clock rate is 500MHz slower, but that doesn't change much.

I have revisited some reviews. There are situations in which the FX-81xx shows promise, where it can even hang with the i7-2600. But then there are the gaming results, which put the FX below an old PII X4 chip. Even the FX-4100 can be faster than the 8100 in some games. That remains the problem... if the FX were always faster than the PII chips, it would be a much more attractive chip. That one little thing could have made a difference in how the FX is thought of.

Looking at the performance in games vs productivity, it seems that AMD has a problem with how their CPUs communicate with the north bridge and GPU. After all, the only people who really need $150+ CPUs are those who do PS/video/gaming. Otherwise, a typical modern $60~100 CPU runs Windows 7 (and Win8) just fine. I run a POS E2160 CPU in my notebook; it's good enough... not great.

Perhaps Piledriver and a new chipset & socket will provide the memory bandwidth that AMD needs? Has AM3 reached its limits? When Piledriver comes out, Intel will be replacing Sandy Bridge...

[citation][nom]blazorthon[/nom]You're right I did confuse Williamette with Northwood. Either way, it was not too good either. Left at stock speeds, it was the only time P4 was competitive with the Athlons. Overclock them though, and and northwood would slowly reduce in stability, forcing you to deal with stability problems or slowly reduce it's overclock until it died a few weeks or months later. I think it was called Sudden Northwood Death Syndrome or something like that.[/citation]
As shown, a Netburst CPU running at 4GHz would still underperform compared to a 2.6~3.0GHz AMD Athlon. OC can only get you so far. And for those who tend to keep their hardware for more than 6 months... longevity is more important. And then when you have throttling issues, you can forget about performance.

Again, looking at additional FX performance data, it's not as bad a chip as some make it out to be. But its price and its performance against previous AMD CPUs hurt it. Once the PIIs are gone in the next few months, the Stars (A-series) CPUs will remain and will easily be slower than BD. If I remember the last roadmap right, socket FM2 will come out this year and both future FX and A chips will fit it. Meaning current Socket FM1/AM3+ boards are coming to EOL. Thus, perhaps add 10~15% performance to the AMD platform.

In April, Ivy Bridge comes out with up to a 20% performance increase over SB... Ouch.
The i5-3570K is a 3.4GHz/77W part with a projected MSRP of $225. This would put it above today's i7-2600K.

Side note: Intel's model numbers get worse (ugh). There are 6 versions of i5-3XXX chips out the door, yet the numbers are all over the place, such as:
3300
3450
3470
3550
3570 plus S, K and T versions.

I still wish AMD all the luck and talent they need. We need competition.
 
G

Guest

Guest
How many of these were "exposed" at the time? I had several IBM disks with the squeak of death and there was nothing in any news article I read at the time. Hindsight reporting is easy; there are very few tech news organisations willing to report on vendor trends as they happen, for fear of lawsuits or losing advertising revenue. I wonder if Tom's Hardware is brave enough to report on the current Nvidia/Windows 7 GTX 500-series driver restart issue which is afflicting multiple users, or on the several SSD vendors who have system restart issues, or will I only read about these in a few years' time?
 
@belardo
So the FX-8150, even though it isn't better at gaming than the much cheaper FX-4100, should be compared against Intel's i5s and i7s in gaming, somewhere it's obviously useless, since you refuse to do what it takes to make it a better CPU for such a comparison. By your logic, why don't I say that the 10-core Xeons suck? Sure, there are cheaper Xeons with fewer cores that outperform the 10-core Xeons in gaming, but the 10-core Xeons cost more, so we should use them as a comparison instead. Do the 10-core Xeons not perform like multi-thousand-dollar chips because they're being used in a situation where they're at a disadvantage? Their cost is justified when they are used in different situations.

8 core FX chips are better gaming CPUs after disabling 1 core from each module, a quick and easy BIOS setting to change. Definitely faster than overclocking as far as you can go. For their cost, they perform well in multi-threaded applications without disabling one core from each module. And really comparing FX to Core 2? To be honest, I think Core 2 has better IPC than FX without the disabling of the cores.

Bulldozer performs where it should for its prices in multi-threaded environments. The FX-8120, at $190 with the AMD rebate, is even cheaper and outperforms the comparably priced i5s in highly multi-threaded applications. Overclock it a little and it's even better. If you want to use it for gaming, then disable the four cores and overclock it further; it's now as good as or better than a Phenom II. Still not as good as the comparably priced i5s in gaming, but it's better than the Phenoms that way. It would have similar IPC to a Phenom II with the cores disabled and would overclock further.

It wouldn't lose anything besides the integer cores when you disable them, either. That means that well-threaded floating point work will be the same; it's just integer work that would change. Should we need to disable the cores to get it better than its predecessors in gaming? Obviously not, but since it does need to be done to outperform them, buyers might as well do it.

Under these circumstances, FX would always outperform Phenom II in gaming. With the cores enabled, it will always outperform PII in highly threaded work. Both ways, it will always outperform PII in FP work. Is Bulldozer a fail? Yes, it shouldn't need such things done to make it as good as or better than its predecessor. It should be much faster than it is regardless of the situation it's in. It should also use less power. While I'm at it, it should also have lower-latency cache, but so should PII.

I should buy an FX just because I'd like to play around with one? Well, I'd like to play around with a DP Xeon board that's capable of overclocking and some liquid nitrogen, but that doesn't mean I'm going to go buy that stuff just to play with it.

You're right about how AMD shouldn't have bothered with Bulldozer, or at least not with it as it is now. Bulldozer's only saving graces are having two more cores and greatly improved floating point over the Phenom IIs. It belongs in servers, that's about it.

Northwood never came close to 4GHz, and the top Athlon XP (not the Athlon; Northwood competed against the XP) was usually slightly behind the top Northwood. Then the Northwood P4 HT came out and it was well beyond the Athlon XP; this went on until the arrival of the Athlon 64 and FX. Once those two series came out, this performance advantage was all over, although Intel's Gallatin P4EE, which came out just before the 64s did, was still the winner in some benchmarks, but not even close to the winner in value.

You think that Bulldozer has performance problems because of the difference between productivity and gaming performance? Productivity is generally much more threaded than gaming, so it's natural that it would do better at that than at gaming on an 8-core processor. If floating point plays into that too, then there's another reason: Bulldozer's floating point is almost as good as Sandy Bridge's. PII's FP is deplorable in comparison. There could be a problem in the CPU's communication with the GPU and northbridge, but your stated reason for this isn't convincing.

I don't see AMD's FX and Heterogeneous computing platforms merging into one socket. Sure it's possible, but it seems unlikely. Even if this did happen, how would it increase memory bandwidth? The AM3+ platforms can support high-speed memory already, as can the FM1 platforms. Bandwidth can't be improved unless we get better memory here.

Ivy Bridge is not supposed to be 20% faster than Sandy; it's supposed to be 10-15% faster. This is Intel's own claim, not mine. According to current benchmarks, those numbers are somewhat inflated, but close enough. Of course, Ivy's biggest advantage will be its power consumption. It will overclock better than Sandy Bridge unless it has some as-yet-unknown problem(s).
 

belardo

Splendid
Nov 23, 2008
3,540
2
22,795
[citation][nom]blazorthon[/nom]@belardo So the FX-8150 ~~ should be compared against Intel's i5s and i7s in gaming, somewhere where it's obviously useless since you refuse to do what it takes to make it a better CPU for such a comparison.[/citation]
What ARE YOU talking about? Refuse to do what? The FX is a performance chip; $200/$280 means *IT* gets compared against $200~300 Intel chips. No AMD chip comes close to the i5-2500 or higher in gaming. So I and everyone else are supposed to disregard that aspect of the chip?

People use their computers for many things: browsing, gaming, work, music, porn, whatever. When a person drops $200 on an i5-2500, they KNOW for a fact it's faster than the i7-920 in every type of benchmark. That cannot be said about the FX.

Nobody compares a $15k car to the $200,000 car. They only compare all other $200,000 cars together. If the FX wants to price itself like a big boy, it has to perform like one.

[citation][nom]blazorthon[/nom]8 core FX chips are better gaming CPUs after disabling 1 core from each module, a quick and easy BIOS setting to change.[/citation]
Again, that is STUPID. A person has to reboot and go into the BIOS every time they switch from gaming to productivity?! Why bother with such a useless product that must have functions disabled for performance? (This is why many gamers disabled HyperThreading on the P4, since it hurt performance. Hmmm... sounds familiar.) Gee, or a person can get an i5-2400 for $150 and not screw around with BIOS settings.

[citation][nom]blazorthon[/nom]Productivity is generally much more threaded than gaming so it's natural that it would work better than gaming on an 8 core processor.[/citation]I looked at the productivity benchmarks (single and multi-threaded) - again, the FX8150 is all over the place, usually near the bottom and below i5-2400.

Okay, and agreed about Ivy Bridge... as with all new tech, I expect new models/versions to be faster - but until I see the results, I take the marketing with a grain of salt. The FX chips were disappointing. But the boxes look really cool. AMD says Piledriver is supposed to be 20~30% faster than BD at the same clocks... so we'll see.
 
Vista RTM was, indeed, a huge failure - Microsoft basically sold (at a high price! I still feel for those unlucky bastards who forked over for the Ultimate Edition and ended up with a few themes and other worthless extras) an unfinished product - especially compared to WinXP, which (having gotten a complete rewrite in SP2) was mature and well finished. However, WinXP's RTM had been such a long time ago that people had forgotten how it sucked at first, too - even though hardware manufacturers had had time to tune their drivers on Windows 2000.

However, Vista itself isn't so much of a failure anymore. Case in point: I recently had to reinstall Vista on a laptop which had been running Linux ever since 2007 (due to a bad BIOS bug, recent versions of the kernel "lose" the hard disk). When I first had it in hand (it was a single-core Celeron + 512 MB RAM + ATI IGP-without-dedicated-RAM bargain-bin POS), it took 24 minutes (I timed it. Twice) from power-button push to the desktop appearing (let's not mention its responsiveness at that point - you must add 5 minutes to start a web browser).

First, removing all the manufacturer's software cruft (Norton, toolbars, etc.) and replacing it with lighter, more streamlined stuff (or getting rid of it altogether) shaved 15 minutes off booting, and a browser would open in less than a minute - still long, but quite some progress. Updating drivers, then installing SP1, SP2 and other updates shaved another 4 minutes off booting, and the browser would start in under 30 seconds. Disabling UAC didn't yield much extra performance, so I left it on - it's still safer than going back to full admin rights.

An extra 512 MB of RAM reduced boot and app launch times further; the computer now boots in under 2 minutes and the browser opens in under a dozen seconds.
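A quick tally of the boot-time figures quoted above (a sketch only - the numbers are the ones reported in this post, nothing re-measured):

```python
# Rough tally of the Vista boot-time improvements described above.
# Figures are as quoted in the post; the final RAM upgrade is reported
# as "under 2 minutes" rather than an exact saving, so it is listed separately.
baseline_min = 24
steps = [
    ("Remove OEM cruft (Norton, toolbars, ...)", 15),
    ("Update drivers, install SP1/SP2 + updates", 4),
]

remaining = baseline_min
print(f"{'Stock install':45s} -> ~{remaining} min to desktop")
for label, saved_min in steps:
    remaining -= saved_min
    print(f"{label:45s} -> ~{remaining} min to desktop")
print(f"{'Add 512 MB RAM':45s} -> under 2 min (as reported)")
```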

Was Vista a bad product? Absolutely. Did it get better? Yes, definitely - the difference from 7 affects only those PCs with less than 1 GB of RAM, due to the RAM savings afforded by 7's return of GDI acceleration (i.e. compositing taking place in video RAM instead of system RAM) - it thus has no impact on gaming, and hardly any impact on browsing, since most current browsers use Direct2D acceleration (and are thus not impacted by WDDM 1.0's limitation on GDI).
 

nottheking

Distinguished
Jan 5, 2006
1,456
0
19,310
Ah, I so remember many of these. I too remember the excitement over the concept of large-size rewritable media, back before home networking and broadband became more commonplace: >1.38 MB files were an ordeal to transfer, (and impossible until I learned about spannable archives) and the epic failure rate I got with floppy disks had me often carrying around multiple redundant disks with my school-work on them. The advent of cheap CD-Rs made life so much easier.

As for RDRAM, I really think that it was the price that did it in; that was why I'd dreaded it. It's quite possible that had the Nintendo64 not given the format a leg up, (as 500 MHz in 1996 was a jaw-dropping speed) it might not have had the standing to catch Intel's eye for the Pentium III and later 4.

The sad news is that even today, the Death Star tradition seems to continue; I made the mistake once of thinking that it might've ended after IBM sold the division to Hitachi, but to date that marked the only HDD failure I've personally suffered in an otherwise-lucky experience.

As others have said, a look at the Pentium 4 isn't complete without "PresHot;" while a lot of news was made over NetBurst's significantly worse per-clock performance compared to the older P6 architecture (one which I recall Tom's gave a rather prescient look over not long before Intel announced their reversion to a spin-off of P6 for Core), many benchmarks DID show an improvement for the Pentium 4 overall. However, it was the Prescott core that marked the beginning of the end there. Perhaps it wasn't so much the chip's then-incredible TDP of 89+ watts, but rather the infamous case where, in spite of Intel's bold claims that it would hit 4.0 GHz in 2004, Prescott showed the same issues and an inability to run stably at even 3.8 GHz. To date, I don't recall Intel ever having released a commercial CPU that clocked in at 4.0 GHz or higher by default.

As for the Itanium, I think that it was one of those ideas that was great on paper, but just never really worked out in practicality. It might still have a future had Intel's development of it not been lagging so badly behind their mainstream; it wasn't until 2010 that Tukwila brought the series into the 65nm range, when at the same time Westmere brought Nehalem/Core i3/5/7 into the 32nm arena.

The opposite end involves the Radeon HD 2900XT and AMD Phenom; both were good architecture ideas, but had lousy execution that utterly failed to live up to the hype. A 512-bit memory interface should've resulted in amazing power, but as the 256-bit 3870 showed, that much memory bandwidth was rather superfluous. I'm wondering if Piledriver will show Bulldozer to be much the same way here; though it's a shame that AMD already used up what was possibly the best code-name for an 8-core CPU. (Orochi)

The PhysX card, as I always felt, was pointless to begin with; unlike a video card, it occupied what I saw as a non-market, at least from an engineering perspective. Video cards rely on a lot of specialized, fixed-function hardware (texturing units in particular, but also ROPs and now tessellation units) to accomplish a lot of their functionality, and their stream processors are sufficiently different in both design and demand that integration into the CPU core proper would simply not make sense. Additionally, there was a bit of physical hardware, namely the extra ports, that necessitated a separate expansion card. A "Physics Processing Unit" fit none of those justifications: there was no fixed-function circuitry there, which means that all its general-purpose circuits could be incorporated into another chip. And a GPU's SIMD/vector-centric design matched up perfectly; the targeted applications (games and 3D rendering) likewise would've needed a GPU anyway, so it was a guarantee that the user would have a GPU there nonetheless. For the price and amount of use needed, even simple CPU physics often sufficed; this was demonstrated in some hacked versions of Ageia's showcase game CellFactor, which permitted the originally PPU-only game to run on the CPU alone... with only a modest hit to the framerate. (A later patched version included an "official" no-PPU mode, which apparently included a hefty, artificial framerate cap; perhaps by using x87 in lieu of SSE.)

Since you mentioned the KillerNIC, why not mention how someone took this progression of "needless expansion cards" to its logical conclusion with HIS's iClear Video Signal Noise Reduction Card? One thing's for sure: while the idea is a failure, I'm pretty sure HIS is making a lot of money.
 

ChromeTusk

Distinguished
Jun 10, 2010
338
0
18,790
Regarding all the comments about Win-ME and Vista, I view them as beta versions for Win-XP and Win7. Microsoft wants to push a new OS onto us, but they WILL NOT catch all the problems before official release. Now add in the different hardware configurations Dell, HP, Asus, and other companies want to put it on along with customers upgrading older PCs. Lessons are learned (hopefully) and fixed, then the next version works even better: ME --> XP; Vista --> 7. With that said, I will likely skip Win8 and wait for its successor.
 
[citation][nom]ChromeTusk[/nom]Regarding all the comments about Win-ME and Vista, I view them as beta versions for Win-XP and Win7. Microsoft wants to push a new OS onto us, but they WILL NOT catch all the problems before official release. Now add in the different hardware configurations Dell, HP, Asus, and other companies want to put it on along with customers upgrading older PCs. Lessons are learned (hopefully) and fixed, then the next version works even better: ME --> XP; Vista --> 7. With that said, I will likely skip Win8 and wait for its successor.[/citation]

If you want to view ME and Vista as betas, fine... But by that logic I'd look at Vista as a beta, while ME was something else. XP wasn't too great either until SP2 came out, so I'll call it the beta there.

Besides, you're skipping NT and 2000 in there too. 2000 was a pretty good OS.
 

belardo

Splendid
Nov 23, 2008
3,540
2
22,795
[citation][nom]ChromeTusk[/nom]Regarding all the comments about Win-ME and Vista, I view them as beta versions for Win-XP and Win7.[/citation]
That would be incorrect. ME is part of the Win9x family. It's really Windows 98SE with added functions, some of which ended up in XP. Nowhere near a beta of XP. XP is derived from the NT family - it's not an MS-DOS-based OS, supports NTFS, etc. But NT didn't become a usable platform until Win2000. NT4 and older were nightmares, nothing more.

Vista being a beta of Win7... not really, but I can see how it could be viewed that way. There wasn't much public testing, and it offered little over XP. It did many things badly, and computer operation was virtually the same as with XP. Also, the Vista taskbar looks like a black version of XP MCE's blue taskbar. MCE = Media Center Edition, which has a new skin that could have been added to all versions of XP.
Also, MCE was the OS to get for XP... it's like a $30-cheaper version of XP Pro, as it only lacked VPN.

Win7 SP0 is still better than any version of Vista.
 
G

Guest

Guest
I like how everyone's forgetting the onboard GPU on the Bulldozer chips.... That is the focus AMD is chasing. Sure, they're slow as bat**** and should be nowhere near a proper desktop rig, but when it comes to all-in-one PCs and laptops that don't have room for an external GPU, they kick arse compared to what is currently available. So fail??? NO! Reviewed in the wrong application they're a failure, but reviewed as a full-blown all-in-one chip they're the best. Unfortunately for AMD, the touch-screen all-in-one market is not expanding as fast as the laptop market (with regular desktop CPUs, i.e. over 13" monitors) is shrinking.
 
[citation][nom]Guest[/nom]I like how everyone's forgetting the onboard GPU on the Bulldozer chips.... That is the focus AMD is chasing. Sure, they're slow as bat**** and should be nowhere near a proper desktop rig, but when it comes to all-in-one PCs and laptops that don't have room for an external GPU, they kick arse compared to what is currently available. So fail??? NO! Reviewed in the wrong application they're a failure, but reviewed as a full-blown all-in-one chip they're the best. Unfortunately for AMD, the touch-screen all-in-one market is not expanding as fast as the laptop market (with regular desktop CPUs, i.e. over 13" monitors) is shrinking.[/citation]

... Bulldozer doesn't have an IGP; that's AMD's APUs you're thinking of. The current family of APUs is Llano and the upcoming family is called Trinity. FX chips do not have any on-die graphics. So yes, it's a fail. Besides that, many laptops have discrete video cards. The fastest of these is Nvidia's GTX 580M, followed by AMD's Radeon 6990M. According to the GPU hierarchy chart provided by Tom's, these mobile GPUs are somewhere around the Radeon 6870 and GTX 560 in performance.

Llano APUs are great against Intel laptops that don't have discrete graphics, so they win in the low end, but on the desktop side you can get an Intel Pentium and a Radeon 6670 for less than the top A8 Llano chips while being faster. Only the slowest of the Llano APUs have no competition from Intel, because Intel doesn't make any CPUs that slow, and discrete graphics cards that slow aren't made anymore either.

Your last sentence confuses me a little; I don't know what you were trying to say there. It seems like you think that laptops with over-13" screens have desktop CPUs. I hope I misinterpreted this, because that's wrong. Very few laptops have desktop CPUs anymore, although this was a much more common practice back when Intel didn't have proper mobile chips. You also think that the laptop market is shrinking; I'm quite sure that it isn't, unless you don't count the growing ultrabook market as part of the laptop market.
 


Zip drives had a high failure rate compared to other media, IDE/PATA adapters often don't work properly if at all, and the Phenom was a buggy, slow processor that was made even slower to fix its bugs. The Phenom IIs were great when they came out, but now they are old and yet still AMD's best, even though Intel has advanced well since they came out. Phenom IIs are comparable to Core 2.
 

MrBurns

Distinguished
Mar 25, 2003
232
0
18,680
I am surprised that no software product was included in this article. Maybe it is hardware-biased because it is from Tom's Hardware. There were really terrible software products, like Windows Vista, Windows ME and the lesser-known SoftRAM.

RDRAM didn't only fail because it was expensive and not that much faster than DDR SDRAM, but also because there was price fixing among the memory manufacturers; they sold their chips for less than the production cost for some time because they wanted to get Rambus out of the market to save on royalties.

The driver compatibility problem is also Microsoft's fault; if they made their driver model more backwards compatible, you might be able to use e.g. a Vista driver on 7. But I think one reason why Microsoft doesn't do anything about driver compatibility is that when some hardware doesn't work on the new OS, many consumers will buy a whole new computer, and in most cases with a new copy of a Microsoft OS.

(to be continued)
 

MrBurns

Distinguished
Mar 25, 2003
232
0
18,680
And you don't have to buy a motherboard with IDE if you really need IDE; you can also buy a PCI IDE controller card. You can get one for under $20, even with 4 channels and RAID (although you might not get the maximum performance for some RAID modes, because PCI is limited to 133 MB/s).
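A quick back-of-the-envelope sketch of where that 133 MB/s ceiling comes from, and why a striped array on a PCI card runs into it (Python; the per-drive throughput is an assumed, illustrative figure, not a measurement):

```python
# Where the ~133 MB/s PCI ceiling comes from, and why striping drives behind a
# PCI controller runs into it. The per-drive throughput is an assumed,
# illustrative figure for an IDE drive of that era, not a measurement.
pci_clock_hz = 33_330_000        # conventional PCI: 33.33 MHz
pci_width_bytes = 4              # 32-bit bus
pci_peak_mb_s = pci_clock_hz * pci_width_bytes / 1_000_000
print(f"PCI peak bandwidth: {pci_peak_mb_s:.0f} MB/s (shared by every device on the bus)")

drive_mb_s = 60                  # assumed sequential throughput of one drive
for drives in (1, 2, 3):
    raid0_mb_s = drives * drive_mb_s          # RAID 0 scales roughly with drive count
    capped = min(raid0_mb_s, pci_peak_mb_s)   # but the bus caps what you actually get
    print(f"{drives} drive(s) in RAID 0: ~{raid0_mb_s} MB/s possible, "
          f"~{capped:.0f} MB/s after the PCI cap")
```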

And Ageia PhysX wasn't that bad. It was a little expensive, but the main reason it failed was that game developers didn't adopt it quickly enough (this was a chicken-or-egg problem: game developers don't adopt new technology quickly because there are not many users, and not many users buy a card which only works with a few games). What many people forget is that PhysX was introduced in 2004, which was long before GPUs were PPU-capable.

Edit: can anyone please tell me why some other users can make posts which are much longer than the ones I can make? (I had to split my text because the length of my previous post is about the maximum I can post at once.) Do you need veteran status or something for longer posts?
 