AMD's Trinity APU Efficiency: Undervolted And Overclocked

Thanks for the article, Chris, especially love the undervolting part!

It would be great if you could give us an estimate of how far Trinity can be pushed while running at 1.275V, to match the power consumption of stock settings, but hopefully with better performance (is 200 MHz doable? That would be a ~5% speedup).

In other words: how much safety margin did AMD build into Trinity in order to hit the desired TDP and ensure stability at the same time? 5%, more, less?
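For what it's worth, the back-of-the-envelope math is just the frequency bump divided by the stock clock (a rough sketch in Python, assuming the A10-5800K's 3.8 GHz stock clock; the 200 MHz bump is the hypothetical figure above):

    # Rough speedup estimate for a fixed-voltage overclock.
    # Assumes the A10-5800K's 3.8 GHz stock clock; the 200 MHz bump is hypothetical.
    stock_mhz = 3800
    bump_mhz = 200
    speedup = bump_mhz / stock_mhz
    print(f"~{speedup:.1%} higher clock")  # ~5.3%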
 
[citation][nom]willard[/nom]Are there really this many people reading Tom's Hardware that think you can overclock the i3s?[/citation]

Really? How do you propose to overclock the i3 by any significant amount?
 
[citation][nom]ojas[/nom]nope, they won't. intel bins chips. 4.2 max would be my guess.[/citation]


I beg to differ. Yes, they are binned, but they run at ridiculously low TDPs. Open them up to 1.4 V+ vcore and they would scream. That just doesn't fit Intel's current model of efficiency. My old E8600 would flirt with 5 GHz. I'm sure these little dual-core CPUs could really impress if given the chance.
 
[citation][nom]looniam[/nom]nice at cherry picking a benchmark to spin your tale. how about quoting the article you picked that from? AMD A10-5800K & A8-5600K Review: Trinity on the Desktop, Part 2. there is NO WAY a trinity will use less power than an i3. if you fully read the article it explains why. also lets see about actually undervolting an i3 also: Undervolting i3 2120 (the only "fun" there is). not impressed with an undervolt of 1.280 when an i3-2120 can be stable @ 0.8 idle, 1.08 volt stressed and uses less than 40 watts when testing with prime95 on a desktop. don't get me wrong, love how trinity is coming along, but lets keep the bias down please.[/citation]
What is with you? I simply quoted the idle power consumption numbers. I have no need to look at anything else if all I was comparing was idle power consumption. I don't see how it is biased. You, on the other hand, pull the topic to something completely different and then accuse me of being biased. Either you are looking for a way to find a fault in my argument because of your own bias and failing, or you lack comprehension skills.
I really just picked what is, in my opinion, the most credible site with benchmarks on idle power consumption. I've seen a few other sites which show the same data.

http://www.xbitlabs.com/articles/graphics/display/amd-trinity-graphics_11.html
Trinity beating the i3 at idle power.

http://www.pcper.com/reviews/Processors/AMD-A10-5800K-Performance-Preview-Trinity-Desktop
Trinity beating the i3-2105 at idle.

http://hexus.net/tech/reviews/cpu/46005-amd-a10-5800k-trinity-apu/?page=6
Trinity beating the i3 at idle.

http://www.guru3d.com/articles_pages/amd_a10_5800k_apu_preview,8.html
Trinity beats the i7 at idle, but there's no i3 data.

http://www.hartware.de/review_1534_7.html
The same again, but in German.


That, including my original source, is six sources with the same data. There's really no relevance in looking at load power, as that's not my point, and I clearly said why in my original post. You keep calling me out for bias when it is clearly you who is biased. Why post about i3 undervolting when all my data is about stock performance?
 
Finally, AMD has shown good numbers and real competition against Ivy Bridge with its Trinity APUs. I hope the same for the new Vishera line.
 
[citation][nom]spekkie[/nom]In the preview it says that the new Piledriver APUs have a base clock of 200 MHz like all previous AMD CPUs, including the FX Bulldozer cores. In that case Piledriver provides a 15% increase in performance over the Bulldozer cores, but now it seems that the Piledriver APUs have a base clock of only 100 MHz. But since a higher base clock is generally preferred over a higher multiplier, I wonder if the stated 15% performance increase still holds...[/citation]
The base clock should have been listed as 100 MHz in the previews.
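Either way, the advertised frequency is just the base clock times the multiplier, so the BCLK/multiplier split by itself doesn't change the rated clocks or the claimed 15% gain (a minimal sketch in Python; the 38x multiplier is my illustrative assumption for the A10-5800K's 3.8 GHz):

    # Core clock = base clock (BCLK) x multiplier.
    # Illustrative numbers: 100 MHz BCLK x 38 = 3.8 GHz; a 200 MHz BCLK
    # would simply pair with a 19x multiplier for the same result.
    bclk_mhz = 100
    multiplier = 38
    core_mhz = bclk_mhz * multiplier
    print(f"{core_mhz / 1000:.1f} GHz")  # 3.8 GHz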
 


Lovely. I don't have time to point out everything wrong with your references, such as that the Hexus review has the AMD A8-3870K on a mini-ITX board whereas the i3 is on a full-sized ATX board, and it only beats the "system" idle by 2 watts; very negligible.

The other tests/reviews were for gaming performance; to use that data for a comparison of low power consumption is, for lack of a better term, completely ludicrous. Or did you fall asleep during every "how to do scientific testing" lecture? I doubt I am the one with a comprehension problem, and I would highly suggest you refrain from any personal comments, lest you'd care for a full-flame reply.

The article showed lower power consumption for the i3 relative to Trinity, and I showed a screenshot backing that up in response to you posting unrelated data trying to say otherwise; that's what's up with me.


cheers
 
[citation][nom]jaquith[/nom]No debate. Then buy the cheapest CPU and there's zero need to OC. I researched the AMD APU's for an HTPC extensively, and I would have bought one if they could game in HD. Instead I ended-up needing a discrete GPU whichever CPU I researched. Blur from a lower i.e. non-native resolution especially on a 60" LED/LCD would have looked awful. In my case I settled for an Intel i5 and AMD 7000 series GPU, vSync (including adaptive vSync) on a 120Hz monitor wasn't an issue. Both Intel & AMD = win. Whoever can solve that (APU/iGPU) shortcoming and deliver 40-50+ FPS in HD will sell a lot of CPU's, but the discrete GPU market will suffer i.e. including AMD...something to think about.[/citation]

Well, everything AMD is doing is a progression of steps toward full HSA after Excavator; it is no secret that is what AMD is gearing toward. From Bulldozer to Excavator it's architecture refinement; when the CPU part is capable of handling a high-end Radeon core, which AMD expects by 2015, it could change the computing landscape, and by then HSA will be realistic.

What APUs do show is that watered-down Piledriver modules are very capable of delivering combined CPU-GPU performance that beats the fastest x86 processor on the market, and that also shows how far ahead AMD is in the iGPU stakes. While we can bounce off the walls about AMD's gamble on integer operations vs. floating point being a case of too early, in a year's time we'll move more into OpenCL and multiprocessing, perhaps see a 20% performance gain in BD, and all of a sudden things are not so bad.

Anyone who really took the time to read the article on Tom's not that long ago about AMD's Fusion future will know that if AMD took on Intel in a direct IPC showdown, Intel's superior resources would put AMD out of commission. Instead they are getting into the future of computing and are well equipped to outmaneuver Intel in that area due to superior CPU-GPU integration.

Basically, in short: treat the APU for what it represents, instead of harping on AMD's dependence on this chip.
 
[citation][nom]americanbrian[/nom]Umm, WHY DIDN'T YOU SHOW THE GAME BENCHMARKS WITH THE OVERCLOCKED GPU SETTINGS!!! I can't be the only one who was waiting for the money shot of what is the difference in performance when you clock up from 800Mhz to >1000Mhz. SUCH AN OVERSIGHT. UNFORGIVABLE![/citation]

The article does mention two-day coverage and that gaming benchmarks would be forthcoming.
 
In the end, then, both Intel and AMD are offering you an experience. Which one do you pick?

Intel gives you great performance in productivity and content creation apps, with a fantastic thermal envelope. But any aspiration for gaming necessitates discrete graphics, putting you in the $200 range.

AMD counts on a “good enough” showing in x86-based applications and ample 3D muscle to play a number of modern games at mainstream resolutions. In exchange, you’re asked to accept comparatively high power use. But it’s a price point below what Intel charges for its neutered Core i3-3220 that swings favor toward the A10-5800K for enthusiasts on a strict budget.

Quoted from the last page of the review.

... And now I question myself: why does this idea remain in people's minds? Why is Intel necessarily better at productivity? In the direct comparisons we saw, most of the time the victory went to the APU solution (except with LAME, iTunes, and a tiny 8-second delay in Acrobat). The bottom line of this review states the exact opposite. I don't get it. Seriously.


In general, though, it's a great review; this data is invaluable for me. Besides studying, I assemble computers and give support to people looking for PCs. The power analysis is quite interesting, and I agree with your conclusions. Keep up the excellent data crunching; I really appreciate it.

Cheers
PS: No, I'm not bashing Intel. I know for a fact how well their CPUs perform in the tasks they're designed for; it's just that, overall, people have the false understanding that Intel is, hands down, superior at all times. This review proves the opposite in many, many aspects, though not all. But that's it; people need to see that.
 
Well, it certainly looks like a win for AMD with their aggressive pricing. I don't know that the average consumer will know a thing about power consumption or even care when looking at price for performance. They will see two machines; one may be slightly lower in cost but includes a capable graphics solution. Let's face it, most folks getting these chips aren't looking to run BF3 at the highest settings.

I am doing a Llano build for my sister; she is only a casual gamer but is interested in the ability to do more. I explained the Dual Graphics capability to her and she was very excited about it, even if she didn't completely understand it all. What she does know is that for the money she is getting a faster and more powerful machine than she's ever had before, and that is exciting. I think even the average consumer will believe that a $100 quad-core is better than a $130 dual-core. Even the A10 at $130 looks more attractive than Intel's offerings.

And I would buy the AMD A10 without hesitation. I think this is the best news I've heard from AMD in quite some time, even if I could wish for better efficiency or IPC.
 
It just doesn't seem right when it's a higher-end quad-core AMD chip against a dual-core Intel chip,
and yet the quad-core AMD doesn't necessarily blow it away and even loses one or two benches.
So now I'm trying to decide whether this article makes the Ivy Bridge chips look better, or the AMD chips the same overall disappointment their predecessors were/are.

So I take it that any i5 from the i5-2320 up will basically win in all the benches shown (excluding the IGP).
I'm also sticking with my Phenom II X4 980 BE @ 4.2 GHz a little longer before I pass it down
(with a dedicated GPU).

C'mon, Piledriver... let's see you in action.
 
[citation][nom]luciferano[/nom]Why buy a locked i5 if you can buy an unlocked i3 and not only match it in highly threaded performance, but also beat it in single/dual-threaded performance greatly? That might hurt Trinity, but it would hurt Intel even more with the budget enthusiast market. Besides, there's still the IGP advantage in Trinity.[/citation]
Because all desktop i5s are quad-core. You can't replace a full core with a second thread on a core. You get FAR less performance out of a double-threaded core than 2 single-threaded cores, all other things being equal.
 
[citation][nom]sarinaide[/nom]Blah, blah, blah.
Basically in short treat the APU for what it represents, instead of harping AMD's dependance on this chip.[/citation]
Well, I live in the here and now, and I have learned that 'what will be' is all smoke and mirrors.

Listen, I truly want AMD to be strong, no, very strong, if not to leapfrog Intel; only in that way do we all win. I sure don't want Intel's 'put it on the back burner' approach. After the FX, Intel either mothballed or delayed several CPUs.

Again, in the real world, it is what it is:
Gaming: AMD APU vs Intel HD Graphics in HD (1920x1080); conclusion: neither.
Gaming: AMD vs Intel + discrete; Intel, but i5 or i7.
Synthetics: AMD FM2 vs Intel i3; conclusion: neither.
Simple real-world tasks: AMD FM2 vs Intel i3; conclusion: flip a coin.

Further, about all of this banter over 'wattage': the differences just in the motherboards by themselves can account for a 25 W+ variance; examples: high-end ATX ~55 W, regular ATX ~38 W, or ITX ~30 W. So claiming a CPU wattage is at best 'guessing' and by no means conclusive. Frankly, you can have fluctuations of ~10 W when running any PC even at idle; I'm looking at mine now, 144 W~155 W, just typing this response.

Also, pushing the A10-5800K to ~1.45 V-1.50 V requires an aftermarket HSF (or it'll throttle), and the cost of 'AMD's Asetek-designed closed-loop liquid cooler' as noted in the article, e.g. the Kühler H2O 620, adds $57 (the Kühler H2O 920 adds $90), so $130 + $57 = $187, up to $220+. Then compare, same money, dollar-for-dollar, the A10-5800K plus the Kühler H2O to an Intel Core i5-3330 or Intel Core i5-3570K Ivy Bridge.
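Laid out dollar-for-dollar (a quick sketch in Python using only the prices quoted above; the i5 comparison prices are left out since they vary by retailer):

    # Total cost of the overclocked A10-5800K setups mentioned above.
    # CPU and cooler prices are the figures from this post.
    a10_price = 130
    coolers = {"Kuehler H2O 620": 57, "Kuehler H2O 920": 90}
    for name, extra in coolers.items():
        print(f"A10-5800K + {name}: ${a10_price + extra}")
    # -> $187 and $220, which is i5-3330 / i5-3570K money.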
 

Further, regarding all of this banter over 'wattage' and the differences: see above.

Try it for yourself (High End, Standard and ITX) - http://www.thermaltake.outervision.com/

I have a wattage meter on my UPS and it bounces around like a bunny; all PCs do. The same goes for benchmarks: if anyone re-runs a test 5 times, they'll get 5 different results. Any results within ±2%~±3% (or a potential variance of 4%~6%) are considered within the margin of error; it's a fact of any testing. Unless you 'cherry-pick', that is; then your credibility goes out the window.

Solution: run multiple tests, post all results, and ideally average them all, plus note unusual variances (a rough sketch of what I mean below).
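Something like this is all it takes to sanity-check a set of runs (a rough sketch in Python; the run values are made-up placeholders and the 3% band is the margin-of-error figure mentioned above):

    # Average repeated benchmark runs and flag any run outside a +/-3% band
    # around the mean. The values below are made-up placeholders (watts or FPS).
    runs = [144.0, 149.0, 152.0, 147.0, 155.0]
    mean = sum(runs) / len(runs)
    margin = 0.03
    unusual = [r for r in runs if abs(r - mean) / mean > margin]
    print(f"average: {mean:.1f}, unusual runs: {unusual}")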
 
oneblackened:

"Because all desktop i5s are quad-core. You can't replace a full core with a second thread on a core. You get FAR less performance out of a double-threaded core than 2 single-threaded cores, all other things being equal. "

What are you talking about? Only half of the Core i5s are quad-core; the other half are dual-core with HT enabled. Whoever told you all Core i5s were quads was very mistaken.
 

Nice link, thanks for that.
 
I'm confused; were you using the GPU in the APU for all of the tests? If so, how high could you overclock without using the GPU? That would probably free up 30 watts or so, more if the GPU is overclocked...
 
oneblackened:

"Because all desktop i5s are quad-core. You can't replace a full core with a second thread on a core. You get FAR less performance out of a double-threaded core than 2 single-threaded cores, all other things being equal. "

What are you talking about? Only half of the Core i5s are quad-core; the other half are dual-core with HT enabled. Whoever told you all Core i5s were quads was very mistaken.
Duh what?!

Intel consumer desktop processors (few exceptions only and typically low power variants {T} or older LGA 1156):
i3's are 2-physical cores + 2 Hyper-Threaded / 4 Threads
i5's are 4-physical cores + 0 Hyper-Threaded / 4 Threads
i7 are 4~6-physical cores + 4~6 Hyper-Threaded / 8~12 Threads ; 6-cores examples are some i7's on LGA 1366 and LGA 2011.

See - http://www.intel.com/content/www/us/en/processor-comparison/compare-intel-processors.html?select=desktop
 
[citation][nom]jaquith[/nom]Headline - Which is the 'Best' Dysfunctional CPU? Gaming: AMD APU vs Intel HD Graphics in HD (1920x1080); conclusion neither. Synthetics: AMD FM2 vs Intel i3; conclusion neither. Simple Real World Tasks: AMD FM2 vs Intel i3; conclusion flip a coin. There's no debating the AMD APU devastates Intel HD Graphics, but since the vast majority of folks now have HD monitors (1920x1080) neither is playable running the vast majority of games. Therefore, you still need a discrete GPU to obtain acceptable frame rates, and no the following isn't acceptable in HD: AVP 14 vs 8, DiRT3 32 vs 16, or Metro 2033 22 vs 16, etc. Rendering, folks looking into 'rendering' would choose neither CPU. Transcoding, okay where's the Quick Sync? If 'Time' in any way shape or form 'is' important then you need to compare up front money to long term gains and when you're looking into an extra $100 or extra $1000 just for a CPU then 20%~100%+ gains, reduced time, starts to add up very quickly. However if your time is valueless ($0) or your patience is limitless or your use is very basic then pick the cheapest CPU get a AMD Sempron 145 or Intel Celeron G530. This is why I always shake my head when the 'comparisons' are so narrow and alternative CPU's are mysteriously missing. 'Value' must take into consideration the CPU's 'Function', your personal 'Time' & 'Money' and 'Cost' in as close to a real world 'Environment' (conditions) for your use. Example, (any CPU here in) vs i7-3930K Cinebench R11.5 3.x vs 11.x or a 300% gain in rendering time. Other encoding x264 HD Video i7-3770K 7.x vs 15.x or a 200% gain. My point is IF these benchmarks mean anything to you -- then I must assume so does your time, and once you factor in Time vs Money all of these CPU's in this article are Dysfunctional. Therefore, what niche are these CPU's meant for the answer is simple, General Home Use and if Gaming is important then in the real world HD (1920x1080) you'll still need a discrete GPU with any CPU today and then Intel is the clear choice for you.[/citation]

AMD's graphics performance advantage does matter, because you can simply drop the resolution if you can't play at the settings you want. Regardless, AMD has Dual Graphics, where you can just grab a cheap graphics card such as a 6670 or a 6570; even without that, you can simply use better memory than was used in the 1080p gaming tests, and that, with GPU overclocking, could do the trick for 1080p on the A10s. Besides, people looking into such low-end options probably wouldn't mind playing at a resolution below 1080p anyway, not that that's an excuse for anything.

I really can't argue with your view on rendering because I'm not very experienced with it. If someone were doing it professionally, I'd think that if higher-end hardware would help tremendously, then that is what would be used, except maybe for some hobbyists and other non-professional uses. Maybe some forms of it would be helped by the low-latency connection between the APU's CPU and IGP, but otherwise, something much higher-end would probably be more than worth the money, especially if how much rendering you do determines your income, where spending a few hundred or even a few thousand dollars on better hardware might pay off in a mere few days or weeks.

 
[citation][nom]Justposting42[/nom]oneblackened: "Because all desktop i5s are quad-core. You can't replace a full core with a second thread on a core. You get FAR less performance out of a double-threaded core than 2 single-threaded cores, all other things being equal." What are you talking about? Only 1/2 of the Core-i5's are quad core, the other half are dual core with HT enabled. Whoever told you all Core-5's were quad's was very mistaken.[/citation]
Ummm, look at Newegg. They are all quad-core. The dual cores are the mobile variety. We're looking at desktop models here. Please stop being a pain in the butt now.
 
[citation][nom]oneblackened[/nom]Because all desktop i5s are quad-core. You can't replace a full core with a second thread on a core. You get FAR less performance out of a double-threaded core than 2 single-threaded cores, all other things being equal.[/citation]

If an unlocked i3 can hit around 5 GHz, then it could easily match a locked stock i5 in highly threaded performance while beating it greatly in single- and dual-threaded work, the same advantage that it has over stock AMD CPUs.
 
"jaquith "

"Duh what?!

Intel consumer desktop processors (few exceptions only and typically low power variants {T} or older
LGA 1156):
i3's are 2-physical cores + 2 Hyper-Threaded / 4 Threads
i5's are 4-physical cores + 0 Hyper-Threaded / 4 Threads
i7 are 4~6-physical cores + 4~6 Hyper-Threaded / 8~12 Threads ; 6-cores examples are some i7's on LGA 1366 and LGA 2011.

See - http://www.intel.com/content/www/u [...] ct=desktop"

see the link provided below

http://en.wikipedia.org/wiki/Intel_Core#Core_i5

2 core i5's include;
Core i5-23xxT
Core i5-34xxT
Core i5-6xx

Core i5-2xxxM
Core i5-2xx7M
Core i5-3xx0M
Core i5-3xx7U
All "Arrandale" Core-i5's

Thanks for the negatives, you dumb SOBs! Freaking learn your specs before you jump to conclusions!!! Just because it's a Core i5 doesn't mean it's a 4-core CPU. Stupid as$holes!
 


"jaquith "

"Duh what?!

Intel consumer desktop processors (few exceptions only and typically low power variants {T} or older
LGA 1156):
i3's are 2-physical cores + 2 Hyper-Threaded / 4 Threads
i5's are 4-physical cores + 0 Hyper-Threaded / 4 Threads
i7 are 4~6-physical cores + 4~6 Hyper-Threaded / 8~12 Threads ; 6-cores examples are some i7's on LGA 1366 and LGA 2011.

See - http://www.intel.com/content/www/u [...] ct=desktop"

see the link provided below

http://en.wikipedia.org/wiki/Intel_Core#Core_i5

2 core i5's include;
Core i5-23xxT
Core i5-34xxT
Core i5-6xx

Core i5-2xxxM
Core i5-2xx7M
Core i5-3xx0M
Core i5-3xx7U
All "Arrandale" Core-i5's

Thanks for the negatives you dumb SOB's! Freaking learn your specs before you jump to conclusions!!! Just because it's a core-i5 doesn't mean is a 4 core CPU. Stupid as$holes!

It's even on Intel's own website! You referenced a website that counters YOUR claim that all Core i5s are quad-core... :facepalm:

http://ark.intel.com/products/65703/Intel-Core-i5-3470T-Processor-3M-Cache-up-to-3_60-GHz

wow... just WOW.

HE (jaquith) STATED THAT IN THE BEGINNING..
 