AMD's Piledriver And K10 CPU Architectures Face Off


Shumok

Honorable
Aug 19, 2013
Who cares if the IPC on Phenom II is better when the FX architecture can clock high enough to outperform it at a lower wattage? IPC is completely and totally irrelevant in this case. Achievable performance, performance per watt, cost... these are the relevant parameters.
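For what it's worth, the trade-off this post is arguing about can be put in rough numbers. A minimal sketch, assuming the usual approximation that single-thread performance scales with IPC × clock; the IPC and clock figures below are purely illustrative placeholders, not measured values from the article:

```python
# Rough model: relative performance ~= IPC * clock (GHz).
# Both sets of figures are illustrative assumptions, not benchmarks.
def perf(ipc, clock_ghz):
    return ipc * clock_ghz

phenom_ii = perf(1.00, 3.7)   # higher IPC per core, lower attainable clock
fx = perf(0.85, 4.6)          # lower IPC, but clocks high enough to pass it

print(phenom_ii, fx)
```

Under these made-up numbers the FX chip comes out ahead despite the worse IPC, which is the poster's point: only the product of the two matters for delivered performance.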
 


You SUUUURE about that?
 

rantoc

Distinguished
Dec 17, 2009
Why not add a true enthusiast platform with enthusiast-grade dedicated graphics, just for comparison, so everyone gets a feel for the difference between them? It would certainly be interesting, as well as enlightening to a lot of people, judging by several of the comments.

Perhaps a full test of low-end, mid-range, and high-end. Just to show the spectrum of hardware configs that are available to buy now.
 

Shumok

Honorable
Aug 19, 2013
Who cares if the IPC on Phenom II is better when the FX architecture can clock high enough to outperform it at a lower wattage? IPC is completely and totally irrelevant in this case. Achievable performance, performance per watt, cost... these are the relevant parameters.
 
If people are willing to accept that games can be perfectly enjoyable on settings just a little lower than "UltraMaxOhWOW," all of these CPUs can provide an enjoyable gaming experience.
 

ojas

Distinguished
Feb 25, 2011

Lol, when did I say it improves overclocking? I said Windows 8 is affecting the (accuracy of the) overclock. That's it. I'm calling into question using Windows 8 for any article/benchmark where a BCLK overclock is involved.

Um, I only remember reading that Trinity wasn't affected, so I'm not sure making a blanket statement about all AMD's CPUs is a good idea...

EDIT:

Apparently it doesn't affect anything if your OC is done via hardware.
http://www.extremetech.com/computing/164209-windows-8-banned-by-worlds-top-benchmarking-and-overclocking-site
 


The problem I have with that statement is that, these days, badly optimized or just plain CPU-intensive games that require an OC out of ANY mainstream CPU to reach playable FPS are becoming more and more common. And in MOST of these cases, per-clock/IPC performance is what matters most...

Hard to suggest AMD in these situations...
 
Agreed, except that the poor performance of crappy software isn't the fault of the hardware. There are so many enjoyable games out there that don't have this problem (even if some are older) that it should be easy to boycott the junk.
 


:lol: :lol: :lol: Fair enough, BUUUT some of that junk is pretty popular, as in a-few-million-copies-within-the-first-week popular.
But yeah, I do not play those titles most of the time anyway, so I guess MEH...
 

ingtar33

Illustrious


yep.

Now you guys just need to stop using two of the three titles you rank your CPUs with in your CPU rankings... as both of them are "crappy software" which is harming the AMD results. Skyrim uses x87 code, while StarCraft is famously optimized for just two cores (badly coded). Stick Crysis 3 and something else in there.
 


But that would make the results biased? :heink:
 

ingtar33

Illustrious


I guess... I mean, I can get their reasoning. All three are popular. One is basically representative of a non-multicore-optimized title, the second is representative of crappy code, and the third is basically representative of a game designed for a number of cores. That said, Crysis 3 is designed well for as many cores as you can throw at it, which is probably the way most games will be designed going forward, and the number of modern games that use x87 code is fewer than 5 (I can think of 3, actually...), which makes it a bit silly to use as a bench... besides, AMD CPUs don't even support x87 code.
 


Bold text - I think not, but that is just my opinion...
 


Second time you have said it; saying it twice does not make it any closer to the truth...
Go read again!
 

ingtar33

Illustrious


I can see where you're coming from. I'm basing it on the fact that the next-gen game systems will have different core counts. That leaves developers with two choices: build for 4 cores, or build for as many cores as you have available. I expect they'll mostly do the latter, as they'll be trying to squeeze every last drop of computing power out of those Jaguar cores. I can see crappy, cheap games/titles being built for 4 cores, but I expect most games will be coded a bit better.

That said, I can understand why you'd be skeptical; it's not like the gaming industry is ever logical or all that dependable... it might be another 5 years before we see games make use of more than 4 cores at the pace the gaming industry evolves, so you could be right.



Mmm... seems more like wishful thinking. I mean, I know the 1100T at 4.0 GHz is about the same speed as an FX-6300 at 4.6 GHz. An 1100T at 4.5 GHz (possible) is like a 6300 at 5.2 GHz (very rare). Now, it's not like you can find the 1100T anywhere anymore, but in theory, if I were the owner of such a chip, I would see zero reason to upgrade to Piledriver.
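Those two pairings are at least internally consistent. If an 1100T at 4.0 GHz matches an FX-6300 at 4.6 GHz, that first pairing implies a per-clock ratio, and carrying it over to 4.5 GHz lands almost exactly on the quoted 5.2 GHz figure. A quick sanity check, using only the clocks from the post (the assumption being that performance scales linearly with clock at fixed IPC):

```python
# Implied per-clock ratio from the first pairing:
# the FX needs 4.6 GHz to match the 1100T's 4.0 GHz.
ipc_ratio = 4.6 / 4.0          # ~1.15x K10 IPC relative to Piledriver

# FX clock needed to match an 1100T pushed to 4.5 GHz:
fx_equiv = 4.5 * ipc_ratio     # ~5.17 GHz, i.e. the "6300 at 5.2ghz" claim

print(ipc_ratio, fx_equiv)
```

So under the poster's own numbers, a roughly 15% per-clock deficit is what Piledriver would have to out-clock, which is why a well-overclocked Thuban leaves little headroom to chase.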

 

sincreator

Distinguished
I agree 100% with the last paragraph of your last post, ingtar33. I was looking at the Intel and AMD articles as a way to see how badly someone may or may not have needed an upgrade, and what option would be best from both camps, really. Now, I'm only talking about newer games that can use more than 4 cores, and most games are starting to lean that way. IF the x6 Phenom IIs put up high enough numbers, there would be no reason at all to consider a current FX chip as an upgrade. Maybe that is exactly why they didn't include the numbers/tests. Users of such chips would only consider Intel's offerings an actual upgrade to what they currently have. I really want to see those numbers, good or bad.

PS: I am totally biased here, since I currently run an x6 1090T @ 4.0 GHz 24/7 and I've been considering a CPU+mobo upgrade myself. If current FX chips won't add anything, I want to know; then I'll go with an Intel i5/i7 instead. I don't want to waste money and get no difference/upgrade at all. lol.

 

ingtar33

Illustrious


Having played extensively with both an i5 system and an older Phenom II 965 BE system, I can assure you it's basically impossible to tell them apart unless they're sitting next to each other and you have a stopwatch in hand.

About the only upgrade path I see from an 1100T is to a 6-core hyperthreaded Intel Extreme Edition, and it's doubtful even then that you would feel a big enough difference to justify the expense.
 


That's relative. I noticed a decent performance jump; TRUE, not in general use, but in gaming, yeah, a decent improvement.
 

ingtar33

Illustrious


Depends on your monitor. Most people are running at 1080p and 60 Hz, and at that point there is basically no difference.
 


Using 1080p 60 Hz... :D Like I said, it's a relative thing.
Same story with 30 FPS: most people say you cannot see a difference above 30 FPS, which is just not true...
 

ingtar33

Illustrious


Haha... very true. It generally depends on the title. I can't tell the difference between 15 FPS and 60 FPS in Civ 5; it looks identical to me. Of course, there is almost no actual movement in Civ 5, so I guess that makes sense. Meanwhile, in an FPS, anything slower than 40 FPS starts to feel choppy and jumpy.
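There is a simple arithmetic side to why low frame rates feel so different across genres: frame rate hides per-frame latency. A tiny converter, assuming nothing beyond the definition that frame time in milliseconds is 1000 / FPS:

```python
def frame_time_ms(fps):
    """Milliseconds available to produce each frame at a given frame rate."""
    return 1000.0 / fps

for fps in (15, 30, 40, 60):
    print(f"{fps:>3} FPS -> {frame_time_ms(fps):.1f} ms per frame")
```

At 15 FPS each frame sits on screen for nearly 67 ms, versus about 17 ms at 60 FPS; in a nearly static game like Civ 5 that gap is invisible, while fast camera motion in a shooter exposes it immediately.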
 


Yep, in LoL, for example, my tolerance is fine until around 46 FPS.
 

cmi86

Distinguished
Sep 29, 2010
Why is it that whenever something (anything) is written about AMD, so many people jump right to bashing it for not being as fast as the Intel offerings? It's half the effing price; get over it and appreciate the fact that it offers 75% of the performance at that price.
 

RedJaron

Splendid
I would've liked to see the Intel results compared in the individual pages, not just summed up at the end. I think it would've made for a much more detailed and informed comparison.

Also, I'm a bit torn on your hardware setup methodology. I understand using a high-end GPU removes any CPU bottlenecks, but you specifically used a budget-priced CPU cooler and only included budget-priced CPUs. Using a GPU that costs more than the rest of the system combined seems a bit odd. Yes, it's nice for benchmarking numbers, but I doubt it's indicative of actual usage experience if the whole system is a "budget build." A budget CPU that barely squeaks by with a 7970 may be on the wrong side of "just good enough" with a less capable GPU.

Finally, anytime OCing is such an important part of a story, I think it reasonable that the "Value" calculations include the cost of aftermarket cooling (regardless of Intel or AMD). Really, this would hit AMD more, since Intel's stock coolers (while not great) are good enough for mild overclocks. Most AMD stock coolers seem to be borderline loud even at stock speeds.
 