Top 5 Worst CPUs Ever (In my opinion)

njsullyalex

Warning! I am writing this as if it were an internet article, so it will be long. I am doing this simply out of boredom. This is not a question, don't worry about solving anything. Also, understand that a lot of this is my opinion, and you are free to disagree, as many of you probably have (and use) CPUs that will be mentioned in this list. Also, no CPUs released prior to 2000 will be on this list. This is for your entertainment. Enjoy!

5. Intel Atom (2008-2011)

(Sorry, can't post pictures). The Intel Atom line of CPUs was originally released in 2008 for use in the small, lightweight, portable laptops known as "netbooks", which were supposed to be the future of mobile computing (until devices such as the iPhone and iPad came out). These machines generally contained Intel Atom CPUs, which were small, ultra-low-voltage x86 CPUs capable of running Windows. And by capable, I mean barely. The original Silverthorne and Diamondville chips were single-core, dual-thread CPUs, most of which didn't even have x64 support built in. Not that it really mattered: with barely the performance of a Pentium M, even Windows XP struggled to run on these machines, making them virtually unusable. My mom owned a netbook, and while I have good memories of playing Solitaire on it, that was about all it could do, as it struggled even to browse the internet.

More recent Atom CPUs have generally been better - for example, the Intel Atom x5-Z8300, a 4-core, 4-thread chip generally found in tiny PCs like the Intel Compute Stick, scores around 95 in the Cinebench R15 multi-core test (the minimum I would call 'usable' in Windows 10 is about 90). But that isn't the end of the Atom's issues. Back in 2017, it was discovered that C2000-series Atoms would self-destruct over time due to a design flaw, and systems with these CPUs installed would be bricked (they were soldered to the motherboards!). As a result, the Atom will live in infamy for its abysmal performance, its role in the failed netbook fad, and its self-destruction. If you want a low-power small PC, look for one with a far superior Intel Core M CPU onboard.

4. AMD E-Series Ultra Mobile APUs

If you thought the Atom was bad, then you haven't heard of the E-Series. Like Intel, AMD wanted to make a CPU that would work in low-power, low-end, small laptops without much hassle. AMD has been well known for its APUs, which integrate decent Radeon graphics into the CPU, meaning its chips had superior integrated graphics to Intel's. However, since a large part of the die on these CPUs was now taken up by the GPU, AMD had to compromise on CPU performance. For the A-series of APUs, this isn't much of an issue, as those CPUs are still not weak enough to bottleneck their onboard graphics, and they will actually outperform Intel CPUs in games on integrated graphics. On the E-series, however, the compromise went too far.

When the 1st gen of E-Series CPUs came out in 2011, their performance was beyond subpar: an E-300 scored a 39 in Cinebench R15 - well below the level needed to run even a 32-bit version of Windows 7 smoothly at the time! CPU performance was so bad that it bottlenecked the chip's own integrated Radeon HD 6310 graphics! For more information, check out this video by RandomGaminginHD: https://www.youtube.com/watch?v=QN_-0n4XvEQ . The last of these E-series CPUs to be released was the 9000 series in 2015, with the E2-9000 still scoring a subpar 84 in Cinebench R15, not much better than a decade-old Intel Pentium D. And this was on the eve of the release of Windows 10! How were they expected to run that?

On the bright side, the Bobcat architecture that many of these chips were based on is closely related to the Jaguar-based CPUs that power the Xbox One and PS4, which, say what you want, are not half bad gaming machines. Still, for AMD to think it could get away with a CPU that performs this badly, this recently, is a crime, and I feel terrible for the consumers who bought laptops with these CPUs expecting them to be good gaming machines because of their Radeon graphics.

3. AMD FX

You saw this one coming. In 2011, both Intel and AMD released their newest CPU architectures, each hoping theirs would outperform the other's. For Intel, this was Sandy Bridge. Sandy Bridge featured Intel's new "ring bus" interconnect, which delivered very good core-to-core latency and IPC that, at the time, was off the charts! Sandy Bridge CPUs outperformed anything and everything released before them, with the 4c/4t Core i5 2500K and 4c/8t Core i7 2600K going on to become legendary for their raw power and their ability to overclock to 5 GHz! Even today, Sandy Bridge still goes head to head with the latest CPUs and remains relevant in all of the latest games, which is why so many people still hold onto their old 2600Ks and pair them with GTX 1080s. If I were to make a top 5 best CPU list, Sandy Bridge could very well take first place because of how revolutionary and good it was (and still is!).

AMD also launched its latest line of CPUs that year, debuting the "Bulldozer" architecture and kicking off the FX series. FX pitched itself as having more cores than ever before on a desktop platform, with the high-end FX-8000 series having 8(!) cores and threads, topping the previous-generation Phenom II X6. It also released on the revised AM3+ socket. With more cores and threads, AMD sought to take the performance crown from Intel, but even in the early benchmarks, people noticed something very wrong: FX's IPC (single-core performance) was even worse than Phenom II's! As a result, in single-core applications (which were more common back then, compared to now when things are more multithreaded), FX simply could not keep up with Sandy Bridge, and while the FX-8000 series had good multi-core performance thanks to its 8 cores, it was still generally outperformed by the i7 2600K. This was due to a design flaw in Bulldozer: unlike Sandy Bridge (and every other CPU architecture before it), in which each core had its own fetch unit, FP scheduler, and L2 cache, Bulldozer had two cores share those three components within a "module". While this was supposed to increase overall throughput, it meant the paired cores ended up contending with each other for those shared resources, ruining their single-core performance.

However, this is not the end of FX's woes. AMD, to steal some thunder from Intel, released the FX-9590, an 8-core, 8-thread CPU that it pitched as "the world's first out-of-the-box 5 GHz CPU". Unfortunately, the 5 GHz was only a turbo speed, and most of the time it could not reach it. While Sandy Bridge was very power efficient and cool-running, FX tended to chug power and produce a scary amount of heat - so much, in fact, that the FX-9590 was too much for most AM3+ motherboards of the time, and it fried quite a few of them! That's just not supposed to happen. AMD failed to release any new desktop architecture from then on (though FX did get a small refresh with slight improvements), and Intel consolidated its market dominance, releasing small improvements to its Core series every year. It would not be until six years later that AMD would finally come back to compete with Intel, with the release of its Ryzen line of CPUs.

There is a silver lining to FX, though. FX still has good multi-core performance, so these chips can still be used for light workstation tasks. Their high core counts also make up for the lack of IPC in games, and while they might fall behind in the latest titles, they can still run most games out there smoothly enough when paired with a good GPU.
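To illustrate the shared-resource problem, here's a toy back-of-the-envelope model (my own illustrative numbers, not a real simulation or measured data): two threads running in one Bulldozer-style module, where each thread spends some fraction of its time on floating-point work that has to go through the module's single shared FPU.

```python
# Toy model of resource sharing in a Bulldozer-style "module" (illustrative only).
# Assumption: integer work runs on each thread's private integer core and overlaps
# freely, while floating-point work from both threads must serialize on the one
# shared FPU. None of these numbers are measurements.

def relative_thread_speed(fp_share):
    """Per-thread speed with a shared FPU, relative to having a private FPU.

    fp_share: fraction of a thread's work that is floating-point (0.0 to 1.0).
    With both threads active, each thread's FP portion waits behind the other's,
    so total time per unit of work grows from 1.0 to roughly (1.0 + fp_share).
    """
    return 1.0 / (1.0 + fp_share)

for fp_share in (0.0, 0.25, 0.50):
    print(f"FP share {fp_share:.0%}: ~{relative_thread_speed(fp_share):.0%} "
          "of private-FPU speed per thread")

# FP share 0%: ~100%  (pure integer code barely notices the shared FPU)
# FP share 25%: ~80%
# FP share 50%: ~67%  (FP-heavy code loses a big chunk of per-core performance)
```

The real story is more complicated (caches, the shared front end, Windows scheduling, and so on), but this is roughly why FX looked much better in embarrassingly parallel integer workloads than in the single-threaded and FP-heavy games of the time.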
Finally, AMD has surprisingly stuck it out with FX, and even to this day in 2018, AMD still has FX listed on its website, meaning it is still both promoting and selling it! To make things even better, once-high-end FX CPUs are now dirt cheap and can still be bought new, with the FX-8350, an 8-core, 8-thread CPU, selling on Amazon for just $90 - even cheaper than the lowest-end Ryzen 3 1200! To top it off, AM3+ motherboards and DDR3 RAM are also cheap and can be bought new, so FX isn't such a bad choice for those looking to build a new system on a tight budget who don't want to go used (though I would still definitely look into spending just a little more on the newer Ryzen platform).

2. Intel NetBurst (Pentium 4, Pentium D, Celeron D)

Sorry, Intel fanboys laughing about FX. Guess what? A little more than ten years ago, Intel made all the same mistakes as AMD, only worse! On its release in 2000, NetBurst wasn't that bad, and the Pentium 4 initially seemed like a decent improvement over the previous Pentium III... that is, until, just like with AMD many years later, it turned out that, matched clock for clock, the Pentium III outperformed the new Pentium 4! Intel achieved its performance gains by pushing the Pentium 4's clock speeds to ridiculous levels, which made it a power-hungry, hot-running monster (sound familiar?). It wasn't all bad, because the Pentium 4 also premiered Intel's simultaneous multithreading (SMT) technology, known as Hyper-Threading, which is still used in Intel's CPUs today and provides a noticeable performance boost by letting one physical core run two threads at once. However, there wasn't much else special about the Pentium 4... which became obvious when AMD released its newest CPU in 2003.

That year, AMD released a CPU that brought probably the greatest innovation to x86 CPUs since the release of the Intel 8086 itself: the legendary AMD Athlon 64. What does the 64 mean, exactly? This was the first consumer CPU to come with the x64 extension, making it the first 64-bit x86 CPU. AMD had done the impossible, making a 64-bit CPU with FULL COMPATIBILITY with ALL 32-bit x86 applications and no performance loss, and it paved the way for the 64-bit versions of Windows and applications that would take over the world of computing in the years to come. This was both a big blow and an embarrassment to Intel, which, with no better alternative, had to adopt AMD's technology in its own CPUs, and it will never live down the fact that every Intel CPU uses the "AMD64" instruction set. On top of that, even disregarding x64, the Athlon 64 generally outperformed the Pentium 4 while using less power and producing less heat. Then in 2005, AMD released yet another innovation that would change the face of computing: the Athlon 64 X2, the world's first consumer multi-core CPU. This was every bit as revolutionary as x64 and, like it, is found in every modern CPU.

So, how did Intel respond? Well, it did implement x64 in the Pentium 4 once it moved to the new LGA 775 socket. More infamously, though, in a rushed effort to release a CPU to compete with the A64 X2, Intel thought it was a good idea to glue two Pentium 4s together and call it a day. The result: the Intel Pentium D. The Pentium D was a nuclear reactor; it ran even hotter and drew even more power than the Pentium 4, making it almost scary to have in one's system! And even then, performance wasn't great, still lagging behind the A64 X2. Intel also released the embarrassingly bad Celeron D, a 1-core, 1-thread CPU with its L2 cache cut down; this dreadful chip performed roughly 50% worse than the already abysmal Pentium 4! I, unfortunately, had a Celeron D in my very first PC (to make matters worse, it ran Windows Vista!).

That alone would qualify NetBurst for this list, but we aren't done yet. Notice that Intel still held a market share lead over AMD despite AMD having the superior products. How did that happen? Well, Intel modified its compilers so that, in benchmarks, the Pentium 4 would "appear" to outperform the Athlon 64 and Pentium III. In addition, Intel infamously paid off system manufacturers to avoid using AMD CPUs, meaning general consumers were stuck with NetBurst in their systems with no AMD alternative!
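The clock-for-clock point is easy to see with a little napkin math: delivered performance is roughly IPC times clock speed, so a chip with lower IPC has to clock much higher just to break even. The IPC figures below are made-up, illustrative numbers, not measured benchmarks:

```python
# Napkin math: delivered performance ~ IPC x clock speed.
# The IPC values below are assumed for illustration, not measured data.

def relative_perf(ipc, ghz):
    return ipc * ghz

pentium3 = relative_perf(ipc=1.00, ghz=1.4)   # hypothetical Pentium III baseline
pentium4 = relative_perf(ipc=0.70, ghz=2.0)   # assume ~30% lower IPC per clock

print(f"Pentium III @ 1.4 GHz: {pentium3:.2f}")
print(f"Pentium 4   @ 2.0 GHz: {pentium4:.2f}")

# With these assumed numbers the older, lower-clocked chip is still level with the
# new one, which is why NetBurst had to keep chasing ever-higher clocks (and more
# power and heat) to post a win.
```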
These scandals showed that Intel was willing to do whatever it took to keep AMD from getting out in front, and Intel was lucky that AMD overspent on fabrication facilities and on its purchase of ATi, which helped keep Intel ahead, and that its next-gen Core 2 series performed far better than NetBurst and lives on today as one of Intel's all-time great CPU lines.

So, we have an even split between Intel and AMD so far, and interestingly, it seems that at one point or another, they have made similar mistakes. So who takes the crown for making the worst CPU of all time?

And the worst CPU of all time is:

1. AMD Ryzen








JUST KIDDING!!!!!!!!!!



And the real worst CPU of all time is:

1. Intel Kaby Lake

Sorry, Intel fans. I'm not kidding this time. On the surface, Kaby Lake is a perfectly good CPU architecture, with excellent IPC resulting in great single- and multi-core performance, and at launch, Kaby Lake prices were acceptable. However, once benchmarks came out, something became pretty clear: the high-end Core i7 7700K, when matched clock for clock against the previous-gen Skylake Core i7 6700K, performed identically. It turned out that Kaby Lake, unlike previous releases that delivered incremental IPC gains each generation, was not a new architecture at all: it was just a slightly overclocked re-release of Skylake! In other words, it was very nearly a scam, a lie from Intel, and it marked the beginning of their recent downward trend.

While most Kaby Lake CPUs barely differed from Skylake, there were a few new releases. The best CPU to come out of Kaby Lake was the Pentium G4560, which gained Hyper-Threading over previous Pentiums; at $60 it was excellent value for the money, and as such, it is exempt from this list. The worst CPU to release with Kaby Lake was the pointless i3 7350K. Like the G4560, it had 2 cores and 4 threads, but it cost double the price just to get an unlocked multiplier. It did not come with a cooler (an additional $20 or more for a decent overclocking one), and to take advantage of its overclocking potential, one had to buy an expensive Z170 or Z270 motherboard, completely negating its appeal as a budget CPU (rough cost math at the end of this section). Consumers were underwhelmed by Kaby Lake, and many chose to stay on their older platforms, as even the aforementioned Sandy Bridge users wouldn't see a huge performance improvement by switching to Kaby Lake. Let me remind you: the first-gen Core i7 (Bloomfield), which released way back in 2008 (a full decade ago!), was a 4-core, 8-thread CPU, just like the Core i7 7700K seven generations later (and the Bloomfield i7s actually still hold up decently even in 2018). This lack of innovation from Intel was plain lazy, and it became clear that Intel was exploiting its market dominance over AMD, who still hadn't released a major new CPU architecture since the aforementioned Bulldozer in 2011.

But AMD wasn't sitting idle for those six years, and in the spring of 2017, it released its Zen architecture and Ryzen line of CPUs, with the top-of-the-line Ryzen 7 parts featuring 8 cores and 16 threads, offering twice the core count of the i7 7700K for the same (or even lower) cost! And unlike Bulldozer, while Ryzen still lagged behind Kaby Lake in IPC, its IPC was dramatically improved and competitive enough to hold its own in single-core applications while achieving off-the-charts multi-core performance, even beating CPUs in Intel's $1,000 Broadwell-E HEDT line for a fraction of the price! AMD rapidly began to eat up Intel's market share, and Intel had a complete panic attack. So in the summer of 2017, Intel announced its new HEDT products, the Skylake-X and Kaby Lake-X lineups. While the new Core i9 of Skylake-X, with 18 cores and 36 threads, delivered astronomical multi-core performance, it ran far too hot, and at nearly $2,000 it was ridiculously overpriced compared to AMD's upcoming Threadripper 1950X, which featured a not-half-bad 16 cores and 32 threads at a much cheaper $1,000. Intel also released the completely pointless Kaby Lake-X lineup, consisting of rebadged desktop Kaby Lake Core i5s and i7s on an HEDT platform, at a higher price, with their iGPUs disabled.
What was a 4-core, 4-thread CPU doing on HEDT when 8 cores and 16 threads were available on the normal desktop platform? And in one final blunder, Intel released Coffee Lake, which was not a bad lineup in itself thanks to increased core counts across the board, but which did even more damage to Kaby Lake owners: Coffee Lake CPUs, despite using the same LGA 1151 socket as Skylake and Kaby Lake, were incompatible with 100- and 200-series motherboards and required a new 300-series motherboard to work. Kaby Lake buyers were not only shortchanged, but it was later revealed, through a bunch of BIOS mods, that Kaby Lake boards were TOTALLY CAPABLE of running Coffee Lake CPUs; it was more or less Intel's choice not to update 100- and 200-series boards. I pity all of the Kaby Lake buyers out there who fell for Intel's greed; it was not their fault, but Intel's, for misleading consumers into buying something that was not what it claimed to be. And for that, Kaby Lake may be the worst CPU lineup ever released.
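As promised, here's the rough cost math on the i3 7350K versus the G4560 at launch. The CPU prices follow the article's own figures ($60 and "double the price"); the motherboard and cooler prices are ballpark numbers I'm assuming for illustration, not exact retail quotes:

```python
# Rough 2017-era platform cost comparison (all prices are illustrative ballpark
# figures, not exact retail quotes).

g4560_build = {
    "Pentium G4560 (2c/4t, locked)":   60,   # price quoted in the article
    "Budget B250 motherboard":          60,   # assumed ballpark
    "Cooler":                            0,   # stock cooler in the box
}

i3_7350k_build = {
    "Core i3 7350K (2c/4t, unlocked)": 120,   # "double the price" per the article
    "Z270 motherboard (for OC)":       130,   # assumed ballpark
    "Aftermarket cooler":               30,   # none included in the box
}

for name, parts in (("G4560 build", g4560_build), ("i3 7350K build", i3_7350k_build)):
    print(f"{name}: ${sum(parts.values())}")

# G4560 build: $120 vs i3 7350K build: $280 - well over double the platform cost
# for the same 2-core/4-thread layout, before you even overclock anything.
```

Even if my assumed board and cooler prices are off by a bit, the gap is big enough that the unlocked i3 never made sense as a budget part.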


I hope you guys enjoyed the article. Feel free to leave feedback below, along with your thoughts on what you think the worst CPU ever made is!
 
As you stated, it was purely opinion and written out of boredom, and I found it to be a good read. I agree with you on Kaby Lake, and I would be an extremely upset Intel i7 owner had I not received a phenomenal deal on my setup.
Every CPU released on both sides of the fence was just a step in forward progress and learning for the manufacturers. So it's not really about how bad one was versus the other; it's about what they learned to progress forward. Those were all stepping stones on the pathway to the technology we have today. Sorry, I didn't read your write-up; IMO, it was a waste of time and space to even cover this subject. Honestly, who really cares.
 

njsullyalex



I understand why you didn't read it; I admit this is an overly long post. I was just thinking about it and wanted to get my opinion out somewhere, and I thought someone might find it interesting. I also agree that every bad CPU released would result in a better release later down the line. The Pentium 4 caused AMD to release the A64 and X2, and forced Intel to innovate with the Core 2 line. The FX line pushed Intel to release their legendary Sandy Bridge, and gave AMD a reason to really try hard with Ryzen. And Kaby Lake opened the door for Ryzen's success, and forced Intel to make its first big change since 2011 to its CPU line with Coffee Lake. (However, the Atom and E-Series are, in my opinion, inexcusably bad.)
 

njsullyalex




Glad you liked it! The i7 7700K is a beast of a CPU; don't feel too bad, since you got a good deal on it.
 

larrycumming

IMHO, the AMD E-series and Intel Atom were the worst. I owned both, and there is basically no second-hand market for these computers. Nobody buys them.

On the other hand, there are always takers for old i7-960s, i5-2500s, and even H55/H61 boards.


Kaby Lake wasn't a great improvement over Skylake, but it did move from DDR4-2133 to DDR4-2400 support, which helped the non-K variants.
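For a sense of scale, here's the theoretical peak-bandwidth math for that DDR4-2133 to DDR4-2400 bump, assuming a typical dual-channel setup with a 64-bit bus per channel (real-world gains will be smaller):

```python
# Theoretical peak memory bandwidth: transfer rate (MT/s) x 8 bytes per transfer
# per channel x number of channels. Dual-channel, 64-bit channels assumed.

def peak_bandwidth_gbs(mt_per_s, channels=2, bytes_per_transfer=8):
    return mt_per_s * bytes_per_transfer * channels / 1000  # GB/s (decimal)

for speed in (2133, 2400):
    print(f"DDR4-{speed}: ~{peak_bandwidth_gbs(speed):.1f} GB/s peak")

# DDR4-2133: ~34.1 GB/s, DDR4-2400: ~38.4 GB/s - about a 12.5% bump in theoretical
# bandwidth, which mostly matters for the iGPU and memory-bound workloads.
```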

Besides, there is always a second-hand market for Kaby Lake, unlike Atom/AMD E-series.
 


OP is probably too young to remember Cyrix. They were only really relevant for a brief period in the mid-1990s, and then they imploded, partly because their chips sucked at running Quake and other software-rendered 3D titles. The list that starts this thread is mostly relatively recent stuff, with only the NetBurst-era Pentiums being more than 10 years old.
 

stdragon



I think we can all agree that Netburst just sucked - period.

Personally, I'm waiting for the end of x86. I was never a fan of ARM, but I think a true open-standard successor might be RISC-V. Apparently it's well thought out to be future-proof and has huge backing from industry heavyweights. I'm rather excited to see how much adoption it gets.

https://riscv.org/membership/?action=viewlistings
 

njsullyalex




I am too young to remember Cyrix, though I did recently watch a video about it, so I understand how bad it was. I was not too young to remember NetBurst, though, as my first PC ever had the dreadful Celeron D! On a side note, I don't think x86 will ever really be phased out. ARM is becoming far more capable than ever before, but we have yet to see an ARM CPU that can compete on the level of a Core i7, and with most modern desktop applications designed for x86, it isn't going anywhere anytime soon. And at the moment, running x86 applications on ARM is in its earliest phases (I am aware of an HP tablet running Windows 10 on a Snapdragon, but it is slow, can only run 32-bit applications, and even then compatibility is super iffy). Also, I haven't heard much about RISC-V; maybe I should look that up.
 

stdragon

x86 will never go away; too much legacy support is needed. At best, it goes away everywhere except in emulated form.

Apple will be bringing CPU design in-house for OS X and moving away from x86. To what architecture, I'm not sure; possibly ARM. Though personally, I think that's a "moon shot" program even for them. It reeks of hubris, IMHO.
 

larrycumming



Well, the i7-4790Ks are still selling pretty well right now. They require big water coolers to come close to 5.0 GHz, though, but those chips sure lasted many years. Actually, on fleabay the Haswell-E chips are cheaper than 4790Ks; what gives?
 

larrycumming



Doubt that'll happen; Intel is far in the lead with AVX-512. They'll go AMD before they even think about ARM in iMac Pros.
 


Yet, somewhat curiously, the 7700K still typically out-frames (low, average, and peak FPS) all of the newest Ryzen 2600-2700X CPUs in 19 out of 20 games... and that does not really change when both are OC'd. Heck, it barely changes even when only the Ryzen is OC'd.