Why is FX8150 bad?

Because it is slow.

Simply put, the equation for CPU performance is as follows:

Performance = Clock Frequency * Instructions Per Cycle (IPC)

Instructions Per Cycle is lower in BD than in other AMD architectures, so at the same clock speed, a single BD core is slower than a single Phenom II core. And because BD isn't clocked significantly higher, it doesn't offer significantly better performance over Phenom II. The problem is, Phenom II is only about as good as a 9000-series C2Q, which is obsolete by today's standards.

There are other issues that have been discussed as well: high cache latencies, lack of software scaling, and a poor CMT implementation.

Long story short, BD's performance per clock is down, and that hurts performance more than the higher clock and extra cores help it.
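To put numbers on it, here's a toy sketch of that first-order model. The clock and IPC figures are made-up placeholders for illustration, not measured values:

```python
# First-order model: performance ~ clock frequency * instructions per cycle (IPC).
# All numbers below are hypothetical placeholders, not measured values.

def relative_performance(clock_ghz, ipc):
    """Work done per second, in arbitrary units."""
    return clock_ghz * ipc

phenom_ii = relative_performance(clock_ghz=3.4, ipc=1.00)  # baseline, IPC normalized to 1
bulldozer = relative_performance(clock_ghz=3.6, ipc=0.85)  # assumed ~15% IPC deficit

# A modest clock bump cannot cover a large IPC deficit:
print(round(bulldozer / phenom_ii, 2))  # 0.9 -> slower per core despite the higher clock
```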
 
I remember back in the day people would say an E8400 dual core at 4 GHz was better for gaming than a 3.4 GHz quad core... now people say a higher-clocked quad is better than a slower-clocked eight-core.

If you want the best FPS, go for Intel.
If you are budget-oriented (sub-$1000 build), then go for AMD.
Intel is faster, but you will pay for that; AMD is not horrible, and it gives you better value/features if your budget is low.
 


Well, remember IPC wasn't significantly different between C2D and C2Q, so in many cases, the faster clocked duo WAS the better option for gaming. IPC matters, especially since clock speed isn't advancing that fast...
 


It depends on context. Back when multi-core was new and becoming mainstream, no games were being developed to take advantage of them, so fewer faster cores often were much better than more slower cores, because any game would still be single threaded.

Today that is very different, and going forward a significant chunk of game logic can be offloaded to the GPU via OpenCL, so CPU-side game performance has essentially plateaued: pathfinding, collision detection, state awareness, and other frequently run non-visual game tasks are easy to write for massively parallel GPUs, so you will end up with the vast majority of a game running there.

It will be slow to happen, but as it slowly does, we won't see CPU demands for games rise by much, since anything new can be done GPU-side.
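As a tiny illustration of why those tasks map well to GPUs, here's a pure-Python stand-in for a broad-phase collision check (hypothetical data; on a GPU, each independent pair test would become one parallel work-item, e.g. in OpenCL):

```python
import math

# Hypothetical data: each pair test is independent (no shared state), which is
# exactly what makes broad-phase collision detection a good fit for a GPU.

def colliding(a, b, radius=1.0):
    """Circle overlap test between two (x, y) points."""
    return math.hypot(a[0] - b[0], a[1] - b[1]) < 2 * radius

objects = [(0.0, 0.0), (1.5, 0.0), (10.0, 10.0)]
pairs = [(i, j) for i in range(len(objects)) for j in range(i + 1, len(objects))]
hits = [(i, j) for i, j in pairs if colliding(objects[i], objects[j])]
print(hits)  # [(0, 1)] -- only the two nearby circles overlap
```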

In terms of Bulldozer, the 8150 is at a fine price point if you overclock it and don't care about power. It is inefficient in terms of power usage but its price point makes it more economical than a 2500 (soon to be 3570k) and all the new chip brings to the table is even more of the power inefficiency showing itself.

But if you don't pay your electric bill, you really don't need to care. So an 8 core bulldozer chip is "ok" because the i5 2500k / 3570k will only be 10% or so better in "most" games.
 
The best thing to do is read every OBJECTIVE review of the FX-8150 and try to understand the flaws that are pointed out and how each one affects performance. (If it is an objective review, there ought to be some good points as well.)

Simply asking the question on an open forum opens the door for a lot of "internet experts" to defend their personal preference, flaunt their limited knowledge with false logic, and attack differing opinions.

and EVERYONE with a brain knows that if the letters FX are in the name it HAS to SUCK!
:lol:
 
Stupid thread, deserves the rubbish that all the sheep contribute to.
agreed +1..... the bait was laid and the trap worked. So many threads about Intel vs. BD all over the forums that the OP couldn't have missed.
 
Interesting: the FX-4170 is one tier above the 8150 and two tiers above the FX-8120, -6100, and -4100. Just shows you how inconsistent BD really is for gaming.
 


It's still beaten by the 3+ year old i7 920 which tells you something. Such a pointless duplicate thread.
 


Yeah, but the Phenom II chips still fall more in line like they should; if I'm not mistaken, the X6 series was clocked lower than its X4 cousin, yet the X6 chips rank above most of the X4s.
 


Agree with this 100%.




Note: DX9.

DX10 onwards moved a LOT of the logic from the CPU to the GPU, which is why DX10/DX11 games tend to be bottlenecked by the GPU rather than the CPU. As more of the easily parallelized tasks get moved to the GPU, the CPU will matter less and less.

We even saw this trend in the BD review a while back: DX11 titles like Dirt 3 and BF3 showed clear GPU bottlenecks, with results essentially flat from the 2600K down to the lowly Pentium G series. The DX9 titles, by comparison, almost always saw increased performance as CPU power increased. That's because most of their processing is still on the CPU, not the GPU.

Hence, using a DX9 benchmark to try to prove a point about CPU/GPU scaling is silly. Run that same benchmark in DX10/DX11 and you get a totally different result.
 
In terms of Bulldozer, the 8150 is at a fine price point if you overclock it and don't care about power. It is inefficient in terms of power usage but its price point makes it more economical than a 2500 (soon to be 3570k) and all the new chip brings to the table is even more of the power inefficiency showing itself.

But if you don't pay your electric bill, you really don't need to care. So an 8 core bulldozer chip is "ok" because the i5 2500k / 3570k will only be 10% or so better in "most" games.


NOT!!!!!!!!!!!!!!!!!
 
 
As the proud owner of a system powered by an 8120 and another system powered by a 2500K, I can assure you that in day-to-day operations, you will not notice any difference between the two machines. In fact, the only time I noticed a real difference in performance was when I put an SSD in the 8120 box, but it inspired me to do the same thing for the 2500K, and we are back to no comparable difference.

Bulldozer is a very good chip. Granted, it's not a knockout when compared to SB, but then, it was developed using a fraction of Intel's R&D budget. Sure, go ahead and badmouth AMD as a failure because they aren't faster than their much larger competitor, and then sometime in the next few years, you will be complaining about how expensive computer processors have become since Intel no longer has any real competition. I'm rooting for the underdog, not because of any sense of loyalty, but because my ulterior motive is to see the power of competition keep prices within range of reason.
 
God almighty... I give up... but for me, having an 8-core chip that performs like a low-end 4-core chip and STILL costs more than the 4-core (the 8150) is not a good buy. Just because THAT chip is labeled as an 8-core and is the first "8 core" chip does not make it a great budget price!!! And it is not really an 8-core... in performance... If THAT 8-core were performing near Intel's 6-cores, like (at best)

http://www.newegg.com/Product/Product.aspx?Item=N82E16819116492

, and still at that price, then wow!! Yes, that is a good buy!!

But then again... we're all still comparing apples to oranges!!! These 2 chips are 2 different pieces of technology that may reach the same end, but they are still 2 different pieces of technology...

Can both play games? YES!!! Can both do it at the same level of efficiency? NO..... games are still more GPU-bound than CPU-bound mostly anyway...

Comparing the FX chips to the previous Phenom IIs... well, that still does not look good lol.....
 

The problem there is that it isn't a true 8-core. It has 8 integer cores but only 4 FPUs, and games rely heavily on FPU performance, along with many other applications. Not to mention the cache is slow. Those are the 2 main factors that make Bulldozer inefficient.
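A toy model makes the integer-vs-FPU asymmetry concrete (deliberately simplified: Bulldozer's shared FPU can actually split into two 128-bit pipes, so real behavior is messier than this):

```python
# Toy model of Bulldozer's module layout (simplified on purpose):
# 4 modules, each with 2 integer cores sharing 1 floating-point unit.
MODULES = 4
INT_CORES_PER_MODULE = 2
FPUS_PER_MODULE = 1

def usable_units(threads, workload):
    """How many hardware units can run in parallel for a given thread count."""
    if workload == "integer":
        return min(threads, MODULES * INT_CORES_PER_MODULE)  # up to 8
    if workload == "float":
        return min(threads, MODULES * FPUS_PER_MODULE)       # caps at 4
    raise ValueError(workload)

print(usable_units(8, "integer"))  # 8 -> integer code sees eight cores
print(usable_units(8, "float"))    # 4 -> FP-heavy code scales like a quad core
```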
 
It makes me laugh when I see people talk crap about this CPU without knowing anything about CPU architecture. I am a CPU designer (not x86 or x64, but ARM, and the same concepts apply). While it is true that Intel chips are faster clock for clock than AMD CPUs, this CPU could easily outperform the newest Intel Core i7 in many tasks. The problem is that there is barely any software support, if any at all. There are many important components, such as scheduling and queuing, that must have 100% support in the software. If an operating system and an application know how to fully schedule all the cores, and the task at hand can be broken into multiple threads, there is no way that this FX CPU would lag behind as it does in the tests. No way. Trust me, I do this every day. Because of the lack of software support, the FX chips are crippled in your system and their full power cannot be tapped. Throw in the right support in the OS and applications, and the 8-core chip will easily match the latest Core i7; I am willing to bet it will even outperform it in some tasks. That's a guarantee. Real engineers do not pay attention to benchmark tests, because if the test is not biased, it is usually done incorrectly. Of course, in real-world tests, Intel wins. Software optimization and OS support are built around Intel chips and schedulers optimized for the Intel CPU architecture. That's why all the tests favor Intel.

A group of friends of mine built a rig using an FX-8150, contacted AMD, and obtained programming manuals and technical specs for the FX chips. They took three weeks to write their own microkernel, based partially on Linux. Very basic stuff; support for the 8-core chip, among other things, was thrown in. In a real-world test with the custom OS, the chip outperformed an Intel Core i7-2700K by 6% on a task of generating encryption keys. This was just a proof of concept to show that with the right software support, this CPU would be significantly better. Get educated and start realizing that benchmarks are done incorrectly. You need to benchmark with the right OS and software support, not without it.
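For what it's worth, the scheduling point can be sketched with a toy placement model. The CPU numbering here is an assumption for illustration (logical CPUs 0-7, with pairs (0,1), (2,3), (4,5), (6,7) sharing one module), not verified topology:

```python
# Toy placement model: logical CPUs 0-7, where adjacent pairs share one module
# (an assumption for illustration, not verified topology).

def modules_used(placement):
    """Count the distinct modules a set of logical-CPU assignments touches."""
    return len({cpu // 2 for cpu in placement})

naive = [0, 1, 2, 3]   # packs two threads per module: shared resources contended
aware = [0, 2, 4, 6]   # spreads one thread per module: each gets a full FPU

print(modules_used(naive))  # 2 modules busy, 2 idle
print(modules_used(aware))  # 4 modules busy -> less contention per thread
```

A module-aware scheduler can make exactly this kind of placement decision, which is why OS support matters for this design.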

Ask yourselves this question: why is the iPad faster than the best Android tablets with quad-core chips? Because Apple's ARM chips are faster? No way. It is the software, and the optimization in the software, that makes the biggest difference.
 
And it makes me laugh when people cry that the chip isn't bad, it's the software.

Q. what good is hardware that can't run software?

A. none.

deal with it.
 


Exactly what I was thinking. Gotta love the people that cry the software is the problem. Next they're going to cry about an Intel conspiracy.