AMD Piledriver rumours ... and expert conjecture

Status
Not open for further replies.
We have had several requests for a sticky on AMD's yet-to-be-released Piledriver architecture ... so here it is.

I want to make a few things clear, though.

Post questions or information relevant to the topic, or they will be deleted.

Post negative personal comments about another user ... and they will be deleted.

Post flame-baiting comments about the blue, red, or green team ... and they will be deleted.

Enjoy ...
 

so ... for gaming, get the slowest one available ... 😴

[attached benchmark chart: 51138.png]


... ok ...

As for disabling cores: it's the same idea as turning off HT. AMD doesn't call it HT, it's CMT, and the only way to disable it is to turn off every even-numbered core.

HT is well known to cause FPS drops in games, so it is generally recommended to DISABLE HT for daily use and gaming.
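On Linux, the "turn off every even-numbered core" approach can be sketched with the standard sysfs CPU-hotplug interface. This is only an illustration: whether the even-numbered logical CPUs really are the shared half of each CMT module is my assumption, not something stated in the thread, and the helper names here are mine. Check `/sys/devices/system/cpu/cpuN/topology/` before relying on the numbering.

```python
# Illustrative sketch (Linux-only, needs root): take one logical CPU per
# CMT module offline via the sysfs hotplug files. The even-numbered-sibling
# assumption and the default core count are placeholders.
from pathlib import Path

def even_cores_to_disable(total_cpus: int) -> list[int]:
    # CPU 0 normally cannot be taken offline, so start at 2.
    return list(range(2, total_cpus, 2))

def set_cpu_online(cpu: int, online: bool) -> None:
    # Writing "0" hot-unplugs the CPU; "1" brings it back.
    Path(f"/sys/devices/system/cpu/cpu{cpu}/online").write_text("1" if online else "0")

def disable_even_cores(total_cpus: int = 8) -> None:
    for cpu in even_cores_to_disable(total_cpus):
        set_cpu_online(cpu, False)
```

Re-enabling is the same write with "1", or simply a reboot.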

Yeah, people said the same nonsense about Intel chips, ESPECIALLY WHEN HT WAS NEW.
 

That's what I wanted/needed to see, but that's only one chart. Got more?

Now my question is: how is it the slowest possible, when the Bulldozer FX line had the FX-6100 as the slowest?

Thanks for the reply.

 

But Piledriver chips are selling out everywhere? My local store sold out in a DAY.

The APU is a great CPU though; I think that's where AMD should focus its marketing. There are already some laptops out there with the Trinity architecture.
 

That's AnandTech's article, and I said the slowest available, in reference to the newest chips.

If you want the slowest possible, go back to an 8088.

In BF3 I'd take a 6100 over the 4100 any day. Real-world testing has the 4100 pegged at 100% CPU usage pushing 40-60 fps, while the 6100 sits at 80% with minor spikes to 100% pushing 60-90 fps in MP games, on the same system with both chips running at 4.7 GHz.
 

Wow, you're suggesting an i3 build and you're on AMD's payroll? That's deep. :)

Seriously though, why do you feel that way?
Maybe our definitions of "strong HTPC" are a little different?
 


More L2 cache, and the extra cores are there to offload any other tasks that would otherwise slow it down. A little something called multitasking.
 

That might be an answer, but it's not the answer to what I'm trying to figure out. :lol:
Again, not talking about multitasking, talking about gaming...
Hurry and catch up.

If you look at the chart I posted, it shows the FX-6100 as the worst chip.
Now seeing its replacement (the FX-6300) do so much better is interesting, kinda.
 
http://img688.imageshack.us/img688/1397/indexdo.png
It's a simple concept: the 6100 had an extra module but retained the 95 W TDP, so it would only seem logical that clock speed would be lower, thus lower scores. When the 6100 was $50 more than the 4100, it was a bad deal. Now, with the 6300 being $10 more than the 4300, that's a bargain. When you need more cores, you have them. When you need higher clocks and better single-threaded performance, disable a module, go for higher clocks, and attain them easily. Simple.
 
http://www.legionhardware.com/articles_pages/amd_fx_8350_and_fx_6300,6.html


When testing with Just Cause 2 at 1920x1200 using the GeForce GTX 580 graphics card we see that there is just a 7% performance difference between the slowest and fastest processors. This leaves very little room for improvement and therefore we were not surprised to find no real difference between the AMD FX-6300 and FX-6100. That said the FX-8350 was found to be 3% faster than the FX-8150, likely due to the 11% bump in clock speed.

As a result the FX-8350 was able to deliver 1fps more than the Core i5-3470, though we should point out that it was only just able to match the performance of the Phenom II X4 980.


This time we find that the fastest processor is 11% faster than the slowest when testing with The Witcher 2. That said the AMD Piledriver FX processors were unable to improve on the performance set by the Bulldozer processors and as a result the FX-8350 was 6% slower than the Core i5-3470.


This time when testing with Crysis 2 we again saw almost no difference in performance when comparing the new Piledriver FX processors to their Bulldozer counterparts. This time there was a 13% performance difference between the fastest and slowest processor, while the FX-8350 was 5% slower than the Core i5-3470.

Therefore if we break down our testing we found that the FX-8350 was 7% faster than the Core i5-3470 in our synthetic benchmarks, 13% faster in the application tests, 7% slower in the encoding benchmarks and 3% slower in the gaming tests. This meant that overall the FX-8350 was just 2.5% faster than the Core i5-3470.

However the FX-6300 was 17% faster than the Core i3-3220 when comparing synthetic performance, 44% faster in the application tests, 30% faster in the encoding benchmarks and just 2% slower when comparing gaming performance. This means that overall the FX-6300 was 22% faster than the Core i3-3220, and given the similarity in pricing this is a good result for AMD.
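For what it's worth, the quoted overall figures work out to a plain arithmetic mean of the four per-category deltas. That averaging method is my inference from the numbers, not something the review states:

```python
# Check that the overall percentages quoted above are the simple average
# of the four category deltas (synthetic, application, encoding, gaming).
def overall_delta(deltas_pct):
    return sum(deltas_pct) / len(deltas_pct)

fx8350_vs_i5_3470 = [7, 13, -7, -3]
fx6300_vs_i3_3220 = [17, 44, 30, -2]

print(overall_delta(fx8350_vs_i5_3470))  # 2.5, matching the quoted 2.5%
print(overall_delta(fx6300_vs_i3_3220))  # 22.25, matching the quoted ~22%
```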

Of course there is that little issue of efficiency that AMD’s aggressive pricing strategy can’t really erase. When compared to the Core i3-3220 the FX-6300 system consumed 86% more power to deliver just 22% more performance on average.
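Putting those two percentages together gives a rough performance-per-watt comparison (my arithmetic, not the review's):

```python
# +22% performance at +86% power draw, relative to the Core i3-3220.
perf_ratio = 1.22
power_ratio = 1.86
perf_per_watt = perf_ratio / power_ratio  # ~0.656

print(f"{(1 - perf_per_watt) * 100:.0f}% less performance per watt")
```

So by these numbers the FX-6300 delivers roughly a third less performance per watt than the i3-3220, which is the efficiency gap the review is pointing at.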

Bottom line: the new Piledriver FX processors provide a quick and cost-effective upgrade for those still using lower-end K10 hardware. That said, for those using high-end Phenom II X4 and in particular X6 processors, there still isn't a lot to be gained from the upgrade, despite how cheap these new processors might appear. The real issue for AMD has to be that, for those building a new system from the ground up, the Ivy Bridge range is still a more attractive option thanks to its much better single-threaded performance and vastly superior efficiency.
 

:pfff:

Gaming and multitasking can be concurrent. Who knows, maybe scheduling got better, so the six-core model can be used closer to its capability.
 

That chart is primarily ordered by clock speed.

The 6100 is 3.3 GHz (3.9 turbo), the 4100 is 3.6 GHz (3.8); oddly enough the 8120 is 3.1 GHz (4.0), so there is some play in turbo with that particular game, but it's not 100% apparent.

The chart seems faulty though, as the 8120 and 8150 are tied (3.1 GHz (4.0) vs 3.6 GHz (4.2)).

 


AFAIK, "Turbo" is not just a base and a top speed; they introduced more intermediate states (and the logic to handle them).

The "max turbo" is what you can get with one core in use, and base is what you get with all cores heavily taxed. So, in short, you get intermediate "mid turbo" speeds in between, depending on how taxed the CPU is. That's what I remember, but I could be wrong.
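That description amounts to turbo being a step function of how many cores are loaded. A toy sketch of the idea; the bin clocks below are made-up placeholders, not AMD's real tables:

```python
# Illustration only: effective clock as a step function of loaded cores.
# Bin values are hypothetical; real turbo tables vary per model and also
# depend on thermal/power headroom, which this sketch ignores.
TURBO_BINS_GHZ = {1: 4.2, 2: 4.2, 3: 4.0, 4: 4.0}
BASE_GHZ = 3.6

def effective_clock(loaded_cores: int) -> float:
    """Return a turbo bin for light loads, the base clock otherwise."""
    return TURBO_BINS_GHZ.get(loaded_cores, BASE_GHZ)
```

The point is just that single-threaded work sees something near "max turbo", while an all-core load settles at base.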

Cheers!

EDIT: Forgot something.
EDIT2: I got an email from the Microcenter in Tustin CA. They have some 8350s in there. Too late for me I guess, haha.
 

I am on no payroll. Personally, I don't have a definition of a "strong HTPC"; I see no point in having a lot of processing power in an HTPC.
 

Converting movie formats would be one, and being able to play the occasional game on your "HTPC console".
 

Well, I would think most people with HTPCs also have workstations, so why not convert videos on your workstation? And generally, playing games on the HTPC is nice but not worth it for demanding games. The integrated graphics will allow a decent gaming experience in games like Orcs Must Die 2 or Torchlight 2, and even Call of Duty, but if you are going to play something demanding like BF3, it's probably best not to play it on the HTPC.
 
For me, an HTPC is simply a PC I put together using mostly used parts. For example, my current HTPC is a stock-speed C2D E6600, passively cooled by an old Tuniq Tower, in an HTPC case I bought that looks more like a large A/V component than a PC case. When I upgrade to Haswell next year, the C2Q Q9450 in my primary rig will go into the HTPC. The E6600 was in my primary rig before I upgraded to the Q9450.

However, I may ultimately decide to rebuild the HTPC around lower-power parts, but that may not be until after my new build next year, since I will also likely buy a new laptop. I don't wanna spend too much money on new electronics in one year. What will likely happen is that whatever Haswell CPU I buy next year will go into my HTPC in 2015 when Skylake comes out. By that time my Q9450 will be seven years old, which means I will also be able to ditch the nVidia 9600GT graphics card.
 