AMD A6 APU vs FX-4100 vs Phenom II 955?

HCGamer

Honorable
May 30, 2012
2
0
10,510
0
Hi, I'm building a new PC on a roughly $500 budget. Here's what I'm getting:
AMD Radeon HD 6770 1GB
4GB DDR3 RAM
Cooler Master case
a 500GB HDD and a DVD R/W
I still haven't made up my mind about the CPU (and the motherboard too, since that depends on the CPU). So which of these three is better: the AMD A6 (I don't remember the exact model number the seller mentioned, but its price is close to the other two), the FX-4100, or the Phenom II X4 955 (if I can find one, since it's discontinued)? Some might suggest an Intel i3, but I don't want a dual core even if it's slightly better than the three I mentioned, because games will surely need more than two cores in the future. I'll be using my PC mainly for internet and gaming.

Sorry if there are many typos; I'm posting from my phone.
 

willard

Distinguished
Nov 12, 2010
2,353
0
19,960
96
I'd probably go with the A6 and CrossFire it with the discrete card. I couldn't tell you which models are supported, but it's probably the best bang for your buck at such a low price point.

Also, don't skimp on your memory like that. The difference between 4GB and 8GB is only about $20. Is saving $20 really worth handicapping your system's performance like that?
 

HCGamer

Honorable
May 30, 2012
2
0
10,510
0

OK, I guess that's what I'm getting. Thanks.
 

dannylivesforher

Honorable
Sep 21, 2012
777
0
11,160
79
Hey dude, sorry for the late reply. I prefer the A6 because I have one: an A6-3650 at 2.6 GHz with a Radeon HD 6530D in the APU.
It's a better option if you don't have a graphics card right now, since the A6 can run games decently even without one. Later, you can get an HD 6450, HD 6570, or HD 6670 and CrossFire it with the APU graphics for even more performance.
 
You cannot CrossFire a 6770 with an A6 APU, and if you are getting a 6770 then the A6 is the worst choice of the three: it has the weakest CPU and a good built-in GPU you won't use. Of the three I would go with the Phenom II X4, but unless it's under two-thirds the price of an i3 I would go Intel 100%, as even Pentium G-series CPUs run games that can use four cores better than any of the CPUs you mentioned.
 

$hawn

Distinguished
Oct 28, 2009
854
0
19,060
49
If you're planning on the FX, wait for the Piledriver release in about a month or so. However, I'd really recommend an i3 for you.

The performance of an Ivy Bridge i3, which has Hyper-Threading, is at least as good as that of the Phenom/FX/A6 in games. The i3 will continue to be better than the AMD chips in gaming for the near future, plus it's also faster in general day-to-day stuff, while consuming around half the power of the AMD chips. It's really a no-brainer :) In the end, for games it's the GPU that matters most.

In case you haven't read,
http://www.tomshardware.com/reviews/gaming-cpu-review-overclock,3106-3.html :)
 

dannylivesforher

Honorable
Sep 21, 2012
777
0
11,160
79


Hey dude, what I said was the HD 6670, not the HD 6770.
 

sarinaide

Splendid
Jul 14, 2011
3,820
0
22,960
74


He said he doesn't want an i3 and gave his reasons. Yes, an i3 will shine in titles that are optimally coded for it, but in titles that actually work well with AMD's module design and older architectures, the FX-41XX and Phenom II 955 BE are still much better options.
 

rds1220

Splendid
If you really want to go with AMD, I would go with the Phenom II 955. Really, though, for the price I would put a little more money in and get an i3. You say you don't want a dual core because games will use more cores in the future, but you're fooling yourself with false hopes of future-proofing. There is no such thing as future-proofing. We aren't at the point yet where most games use more than two cores; most games use only two, and by the time we hit that point you'll probably be doing a new build anyway. The dual-core i3 will handle most games just fine, and in most cases it will outperform all the AMD CPUs you listed. I would just go with the i3 now and take the better performance.
 

sarinaide

Splendid
Jul 14, 2011
3,820
0
22,960
74
I posted tests I did with F1 2010/11/12, a game that scales well up to four cores. The Pentiums invariably struggle and can be outdone by older Athlon X3s and X4s; the i3's Hyper-Threading helps it a little, but it still struggles, while the FX-81XX and i7s sit comfortably around the 70 FPS mark. Games are moving toward multi-core optimization, and those that do it well will bring a dual core to its knees (Metro 2033).
 

rds1220

Splendid
Very few games will max out an i3. The only ones that will give it a hard time are Metro 2033, BF3 in multiplayer, and Skyrim. Other than those highly demanding games, the i3 can handle and outperform the APU and Bulldozer. Games are moving toward using more cores, but we aren't at that point yet. Like I said, if you think you're future-proofing, you're not; by the time we hit that point, all these CPUs will be obsolete.

In case you missed it, AMD still has nothing that competes with Intel until the third tier down.

http://www.tomshardware.com/reviews/gaming-cpu-review-overclock,3106.html
 

sarinaide

Splendid
Jul 14, 2011
3,820
0
22,960
74
I did post gaming benches yesterday showing games that scale across cores well, and the results were quite to the contrary, but it is tedious trying to get that message across, almost as tedious as going through the same rhetoric.

And in case you missed it, the OP says he doesn't want an i3.
 

rds1220

Splendid


:pfff: :pfff:
 

sarinaide

Splendid
Jul 14, 2011
3,820
0
22,960
74
It really doesn't help posting smilies, nor does it help posting just about every skewed website bench; it shows you to be handy with copy-paste, but not much good when it comes to actually doing anything yourself. I have worked as a freelance hardware tester and have tried to remain as objective about products as I can, to give choices to those who may want options outside the land of Intel which you now occupy. If you actually do the testing yourself, you will find what you said above to be correct in about 10% of instances, namely Crysis 2 and Skyrim (world is ending).

Why don't you try to look at things outside this Intel bubble you so affectionately cling to?
 

rds1220

Splendid


Ha, that's the pot calling the kettle black. Lol, skewed benchmarks? Why are they skewed, because they don't show your precious AMD Bulldozer in the perfect light? Because they show it for what it is: a crappy Core 2 Duo equivalent. I would never believe private benchmarks, especially not from a die-hard AMD fanboy like you; it's too easy to skew the numbers. You can keep trying to convince us that Bulldozer and the APUs are great, but the benchmarks from reliable sources don't lie. So you can keep doing your phony benchmarks and trying to convince us, but the only one you're fooling is yourself. I don't know what your fetish with AMD is, but I find it funny and entertaining.
 

willard

Distinguished
Nov 12, 2010
2,353
0
19,960
96

It makes sense to claim bias when a data point falls far outside the trend. Claiming bias when the industry overwhelmingly finds the exact same result is nothing but denial.
 

sarinaide

Splendid
Jul 14, 2011
3,820
0
22,960
74
But it doesn't show it overwhelmingly; it shows selective benches, and the results are somewhat dubious. I don't deny that Intel is possibly the better route, with more features and growing platforms, but the results are so badly done. I worked through a bunch with colleagues; about 25% of the results showed consistency, and the rest were way off the mark.
 

rds1220

Splendid


Exactly right on all points, that is the epitome of fanboyism.
 

willard

Distinguished
Nov 12, 2010
2,353
0
19,960
96

"Doesn't show overwhelmingly" and "doesn't meet the arbitrarily high bar I've set for it" do not mean the same thing. I read a lot of BD reviews, and I didn't see much in the way of compliments. I saw a lot in the way of "why is this eight core chip getting beaten by a four core chip clocked 500 MHz lower" and "wow, Bulldozer has a lower IPC than Phenom II?" The single threaded benchmarks were outright hysterical.

BD was a flop and Interlagos was worse, barely keeping pace with Intel's Xeons released twenty months before. BD is so obviously inferior that AMD had to market them using a value angle compared to a $1000 processor from the previous generation of chips, rather than dare compare it to the chips they priced it next to.

BD only excels in perfectly threaded applications, which are still somewhat rare for the average user. The shared FP scheduler gives the chip a Jekyll and Hyde personality with floating point heavy workloads (common for the average user) costing as much as 50% of the chip's performance.

Worked through a bunch with colleagues, about 25% of the results show consistencies, the rest were way off the mark.
I'm sorry, but you're going to need to cough up some credentials or post your data if you expect me to accept your casual dismissal of the entire body of evidence that BD is inferior. I've seen this bias defense from AMD fanboys before. In no case have I been convinced that there's a coordinated conspiracy among dozens of independent websites to suppress favorable benchmarks for AMD, or to only use benchmarks which somehow favor Intel.

If AMD's chips were faster than Intel's, the benchmarks would show it. BD just didn't live up to the hype.
 

sarinaide

Splendid
Jul 14, 2011
3,820
0
22,960
74
If your criterion for a flop is not beating prior generations or cheaper chips of the current gen, then look no further than 2011, with benches claiming an i7-3770K to be better than a 3960X in most tests; does that make the 3960X a flop? No, it doesn't. Sure, a few benches and games are better optimized for AMD's archaic K10 architecture, but most benches were also done in October 2011, and since then a series of patches, hotfixes, and revisions to most apps have actually shown improvements, some significant.

While I didn't at any point say better, they are certainly not across-the-board slower than the Phenoms, and certainly not 55% slower than Intel as some have said. I will say IPC is around 12-15% lower, and that is against a company with all the wealth and around six years of work on its architecture. I was against AMD releasing BD when they did; what was released was a product well below the initial engineering specs.

Power and heat: while that is not changing a great deal with AMD, they are compensating for the loss of IPC with higher clocks. The FX-8150 was, on engineering specs, supposed to be a 4.2 GHz chip, up to 4.5 GHz on Turbo Core; at those clocks the difference in performance is significant.

I wasn't one who put expectations on BD, so I don't regard it as a flop: perhaps underwhelming and badly marketed, but that is history. BD was step one in a new architectural direction for AMD, so to expect step one to be exceptional would be overambitious.
 

willard

Distinguished
Nov 12, 2010
2,353
0
19,960
96

Now that's interesting, because the 3770K wasn't released until late April of 2012, and benchmarks were only leaked a week or two before that. Also, I'd say not beating your own chips from previous generations, or cheaper chips of the current gen, is a pretty goddamn big flop.

The 3770k isn't a flop because it's an incredibly fast chip that mops the floor with every other mainstream chip ever made. Now why don't you post a "BD isn't a flop because..." and try not to make me laugh. Here, I'll get you started.

BD isn't a flop because it cost significantly more than the 2500k at launch while performing significantly worse in the overwhelming majority of applications?

BD isn't a flop because its TDP is almost twice as high as competing chips?

BD isn't a flop because the shared FP scheduler cripples performance on the most common workloads?

BD isn't a flop because it offers better value than chips with legendarily poor value marketed to people with more money than sense?

BD isn't a flop because the chips were so bad compared to similarly priced chips that AMD had to cut the price twice in the first year of sales?

BD isn't a flop because it gets beaten in single threaded benchmarks by AMD's own chips from the previous generation?

BD isn't a flop because it gets beaten in multi threaded benchmarks by chips with half as many cores?

BD isn't a flop because Interlagos was barely able to keep pace with Intel's nearly two year old offerings operating at a significantly lower TDP, and was dominated in 100% of benchmarks by the SB based Xeons?

Sure a few benches and games are better optimized for AMD's archaic K10 architecture but most benches were also done October 2011
And again, rationalization of your untenable belief that Bulldozer isn't as slow as every benchmark ever done has shown. A rational man looks at a pile of benchmarks showing that A is slower than B and accepts that A is slower than B. You're trying to blame everything but A for its slowness. This is commonly known as denial.

It's not even a question of optimization anyway, it's a question of a flawed design prioritizing server workloads over desktop workloads, dramatic reduction in IPC over the previous generation, a failure to significantly reduce TDP and inability to produce a chip that can compete on any level with Intel's top offerings. Even the Windows 8 threading optimizations (which are the largest performance gain to be had and by a large margin) only had a minor impact on performance, and only in some cases.

Optimization might make things better, but no amount of software changes can fix what's wrong with the hardware. If your workload is floating point heavy then BD is garbage and there's nothing you can do about it. If your workload doesn't parallelize well, then BD is garbage and there's nothing you can do about it. AMD's solution of "throw more cores at it" looks good on paper but fails to deliver in almost every way.

Like I said, Bulldozer is good at exactly one thing. Perfectly parallel integer math. This is an exceptionally rare type of workload, present almost exclusively in rendering and conversion software.

While I didn't at any point say better they are certainly not across the board slower than Phenoms, and certainly not 55% slower than Intel as some have said.
I said they lose 50% of their performance in FP heavy workloads, which is true. The modules are incapable of scheduling floating point work on both of its cores simultaneously. Half of your cores sit idle during floating point work, thus you lose half of the performance. Nobody said they were 55% slower than Intel. Bulldozer is 50% slower than itself when presented workloads that Intel's chips have no problems whatsoever with.
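Taking that claim at face value, the arithmetic can be sketched as a toy model (an illustrative simplification of the shared-scheduler argument above, not a cycle-accurate account of the chip; the constants and function are mine):

```python
# Toy model of a 4-module / 8-core Bulldozer-style chip.
MODULES = 4
INT_CORES_PER_MODULE = 2   # two integer cores per module
FP_UNITS_PER_MODULE = 1    # one shared FP scheduler/unit per module

def effective_cores(workload: str) -> int:
    """Cores that can make progress at once under this simplified model."""
    if workload == "integer":
        return MODULES * INT_CORES_PER_MODULE   # all 8 cores busy
    if workload == "float":
        return MODULES * FP_UNITS_PER_MODULE    # only 4 can issue FP work
    raise ValueError(f"unknown workload: {workload}")

print(effective_cores("integer"))  # 8
print(effective_cores("float"))    # 4, i.e. half the chip idle
```

On this model, a purely floating-point workload gets half the parallelism of an integer one, which is where the "50% slower than itself" figure comes from.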

I will say IPC is around 12-15% slower
It's actually about 17% lower IPC than Phenom according to these benchmarks:
http://www.pcper.com/reviews/Processors/AMD-FX-Processor-Review-Can-Bulldozer-Unearth-AMD-Victory/FX-versus-Phenom-Perf-0

A nearly 20% cut in IPC is massive.
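As a rough sanity check on what that deficit means, single-threaded performance can be approximated as IPC times clock speed (a simplification that ignores memory, turbo, and workload effects; the clock figures below are illustrative, not from the review):

```python
# Normalize Phenom II's IPC to 1.0; Bulldozer ~17% lower per the link above.
phenom_ipc, bd_ipc = 1.00, 0.83
phenom_clock = 3.7   # GHz, illustrative Phenom II clock

# Clock Bulldozer would need just to match the Phenom single-threaded:
needed = (phenom_ipc * phenom_clock) / bd_ipc
print(f"BD needs ~{needed:.2f} GHz to match a {phenom_clock} GHz Phenom")
```

On this crude model, matching Phenom takes clocks in the mid-4 GHz range, which is consistent with AMD leaning on higher clocks to claw the deficit back.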

and that is against a company with all the wealth and around 6 years working on their architecture.
So Intel making a good product prevented AMD's engineers from being competent and not losing 20% of their single threaded performance? I'm sorry, I just don't follow.

Bulldozer's IPC being lower has not one goddamn thing to do with Intel. It's a flawed design that AMD took from start to finish. Unless Intel was in there sabotaging designs, then you should probably stop trying to blame AMD's abject failure in IPC on Intel.

I was against AMD releasing BD when they did, what was released was a product well below the initial engineering specs.
As a result of the flawed design of the Bulldozer module, not due to being rushed. This kind of high-level design gets finalized years in advance. AMD made their bed, and now they've got to sleep in it. Another year wouldn't have changed anything. You also have to consider how the market changes while they wait: sure, they could have made Bulldozer way better by waiting another couple of years, but then they'd still be delivering 2011 technology in 2013. The longer you wait, the worse the chip looks, because your competitors have had more time to produce products that don't suck.

Hell, I doubt AMD could have made a 2500K competitor even if they'd delayed. Even if they meet all the goals they have for Piledriver, it will still be slower than a 2500K in numerous workloads because of the underlying flaws in the architecture. The Bulldozer module itself is the problem. For it to perform well you have to feed it perfectly threaded integer workloads, which, as I've explained over and over again, are rare for the average user.

Power and heat, while yes that is not changing a great deal with AMD, they are compensating the loss of IPC with higher clocks
Big deal. They failed at IPC so they were forced to increase clock speed, which increased heat. This is a bad thing. You're basically saying "Yeah, this part of the chip sucks, but at least they went and sacrificed performance elsewhere to make it suck a tiny bit less!" Intel's chips run cooler and faster each generation, AMD's lose performance and run just as hot as before.

I wasn't one that put expectations on BD, so for that I don't regard it a flop, perhaps underwhelming and badly marketed but that is history
So underwhelming performance and grossly misleading advertising to try to cover it up doesn't constitute a flop?

BD was step one in a new architectural direction for AMD so to expect step one to be exceptional would be over ambitious.
How about expecting it to be adequate? Not worse than the previous generation? A step forward, not backward? Not priced higher than chips which mop the floor with it in practically every test?
 

sarinaide

Splendid
Jul 14, 2011
3,820
0
22,960
74
Well, from what I have heard about Steamroller, the module design is completely different: take a die chart of a Thuban, add two cores, give each core multiple FPUs of its own so it can process two instructions per core, plus a front end similar to what Haswell will have. I don't really believe AMD put much time into Piledriver. Having seen the official ES chips, performance is better, yes, but nothing is really fixed on power and heat.

From what is said of Steamroller, AMD is still not giving up on mid-4 GHz standard operating parameters while reducing power and heat; I am sure that to achieve that they will need better transistors.

I would like to address the "certain workloads" point. Most of the FX-81XX chips I use go into builds for private clients running professional systems with heavy calculations per second, which Zambezi handles pretty well. In gaming terms, the FX-4170 does one thing the other FX chips don't: it holds consistent frame rates. I believe a lot of this is down to being late; many programs are not going to make scheduler and compiler changes this late in the game. Windows 8 and beyond, we will have to wait and see what happens.
 
