AMD A10-7800 APU Review: Kaveri Hits the Efficiency Sweet Spot


tiger15

Reputable
Aug 19, 2014
You are stressing power efficiency.
What about comparing those numbers with other offerings? (Intel?)
 


Maybe the new consoles lack CPU power (even with 8 cores, the 1.6 GHz/1.75 GHz clocks cripple them), but their GPUs are far more powerful than any existing APU's.
The PS4's GPU has cores like the 7870's and the Xbox One's are like the 7790's; in other words, more powerful than the 512-core R7 in today's best APU, the A10-7850K.
 

Cryio

Distinguished
Oct 6, 2010
Wait. You can now CrossFire the A10-7850K with GPUs other than the 240 and 250X?

I have a friend with a 7850K and a 260X and he's dying to know if he can CrossFire.

"I see no point in buying a processor that emphasizes on-die graphics and then adding a Radeon R7 265X. Yes, AMD officially recommends it and yes, we tried it out." Can I take this as a yes ?
 

gadgety

Distinguished
Dec 3, 2011
The A8-7600 seems to be the efficiency sweet spot in the Kaveri lineup, especially at 45 W. I'm trying to compare the A10-7800 with the A8-7600, although as far as I can tell just about ALL your tests were done at different settings (e.g. BioShock Infinite is run at the Medium quality preset rather than the lowest settings used in the A10-7800 test), so the comparison isn't straightforward. The A8-7600 is within 91-94% of the A10-7850K. One item that is comparable is video encoding in Handbrake, where the A8-7600 reaches 92.8% of the 7850K, whereas the A10-7800 reaches 95.7%. Price-wise, you'd pay a 63% premium for the A10-7800 over the A8-7600 to get an extremely minute performance advantage, around 3% or so.
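For anyone who wants to check that math, here's a rough back-of-the-envelope sketch. The prices are my assumptions, picked to be consistent with the ~63% premium above; the performance ratios are the Handbrake numbers from the review:

```python
# Rough price/performance check for the A10-7800 vs. the A8-7600.
# Prices are ASSUMED street prices consistent with the ~63% premium
# quoted above; "perf" is each chip's Handbrake result relative to
# the A10-7850K, as reported in the review.
a8_7600  = {"price": 95.0,  "perf": 0.928}
a10_7800 = {"price": 155.0, "perf": 0.957}

premium   = a10_7800["price"] / a8_7600["price"] - 1
perf_gain = a10_7800["perf"]  / a8_7600["perf"]  - 1

print(f"Price premium:    {premium:.0%}")    # ~63%
print(f"Performance gain: {perf_gain:.0%}")  # ~3%
```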
 

Drejeck

Honorable
Sep 12, 2013


Maybe the new consoles lack CPU power (even with 8 cores, the 1.6 GHz/1.75 GHz clocks cripple them), but their GPUs are far more powerful than any existing APU's.
The PS4's GPU has cores like the 7870's and the Xbox One's are like the 7790's; in other words, more powerful than the 512-core R7 in today's best APU, the A10-7850K.
Not accurate.
The PS4 GPU is a crippled and downclocked 7850 (disabling cores adds redundancy and means fewer dead chips).
The XB1 GPU is a crippled and downclocked R7 260X (same again), and like the 7790 it should have AMD TrueAudio onboard, but they could have changed that. This actually means that CPU-intensive and low-resolution games are going to suck, because the 8 cores are just Jaguar netbook processors.
The reality is that the PS4 is almost CPU-limited already and the XB1 is more balanced. Now that we've finished speaking of "sufficient" platforms, let's talk about the fact that a CPU from AMD and the word "efficient" appear in the same sentence.
 
I think you need to do a little more research. From "Reverse engineered PS4 APU reveals the console's real CPU and GPU specs": "Die size on the chip is 328 mm², and the GPU actually contains 20 compute units, not the 18 that are specified. This is likely a yield-boosting measure, but it also means AMD implemented a full HD 7870 in silicon."

The PS4 will be CPU-limited? Since they write the code/API against hardware that will remain the same for some 7-8 years, such a thing as "CPU-limited" does not exist, especially for a console that runs the majority of games at 1080p...
P.S. I agree with the downclocked part, since they need to save as much power as they can...
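As an aside, the CU-to-shader arithmetic behind that "full HD 7870" claim is easy to check; GCN packs 64 stream processors into each compute unit, so the counts convert directly (a trivial sketch):

```python
# GCN: 64 stream processors per compute unit (CU),
# so CU counts convert directly to shader counts.
chips = [
    ("PS4 (as specified)", 18),
    ("PS4 die (reverse-engineered)", 20),
    ("HD 7870", 20),
    ("HD 7850", 16),
]
for name, cus in chips:
    print(f"{name}: {cus} CUs = {cus * 64} shaders")
```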
 

blubbey

Distinguished
Jun 2, 2010


Maybe the new consoles lack CPU power (even with 8 cores, the 1.6 GHz/1.75 GHz clocks cripple them), but their GPUs are far more powerful than any existing APU's.
The PS4's GPU has cores like the 7870's and the Xbox One's are like the 7790's; in other words, more powerful than the 512-core R7 in today's best APU, the A10-7850K.
Not accurate.
The PS4 GPU is a crippled and downclocked 7850 (disabling cores adds redundancy and means fewer dead chips).
The XB1 GPU is a crippled and downclocked R7 260X (same again), and like the 7790 it should have AMD TrueAudio onboard, but they could have changed that. This actually means that CPU-intensive and low-resolution games are going to suck, because the 8 cores are just Jaguar netbook processors.
The reality is that the PS4 is almost CPU-limited already and the XB1 is more balanced. Now that we've finished speaking of "sufficient" platforms, let's talk about the fact that a CPU from AMD and the word "efficient" appear in the same sentence.

The PS4 is 1152:72:32 at 800 MHz; the 7850 is 1024:64:32 at 900 MHz or so (860 MHz at release?). It is not a "crippled 7850"; the 7850 is itself a crippled Pitcairn (20 CUs is the full-fat 7870; the PS4 has 18, the 7850 has 16). "CPU-limited" is very PC-oriented thinking; things like offloading compute to the GPU will help. No, I'm not saying their CPUs are "good", but they will find ways of moving that work onto the GPU.
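For reference, a quick sketch of the peak-throughput arithmetic behind those shader counts, using the standard shaders × 2 FLOPs × clock formula (clocks as given in the post above, including the hedged 860 MHz release clock; the 7870 line is my addition):

```python
# Peak single-precision throughput: shaders * 2 FLOPs per cycle (FMA) * clock.
def gflops(shaders: int, clock_mhz: float) -> float:
    return shaders * 2 * clock_mhz / 1000.0

print(f"PS4 GPU (1152 @ 800 MHz):  {gflops(1152, 800):.0f} GFLOPS")   # ~1843
print(f"HD 7850 (1024 @ 860 MHz):  {gflops(1024, 860):.0f} GFLOPS")   # ~1761
print(f"HD 7870 (1280 @ 1000 MHz): {gflops(1280, 1000):.0f} GFLOPS")  # ~2560
```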
 

silverblue

Distinguished
Jul 22, 2009
The A8-7600 seems to be the efficiency sweet spot in the Kaveri lineup, especially at 45 W. I'm trying to compare the A10-7800 with the A8-7600, although as far as I can tell just about ALL your tests were done at different settings (e.g. BioShock Infinite is run at the Medium quality preset rather than the lowest settings used in the A10-7800 test), so the comparison isn't straightforward. The A8-7600 is within 91-94% of the A10-7850K. One item that is comparable is video encoding in Handbrake, where the A8-7600 reaches 92.8% of the 7850K, whereas the A10-7800 reaches 95.7%. Price-wise, you'd pay a 63% premium for the A10-7800 over the A8-7600 to get an extremely minute performance advantage, around 3% or so.

Yes, but the A8-7600 has a 384-shader GPU. I suppose it depends on whether you want to use the GPU or not.
 

curtisgolen

Distinguished
Sep 30, 2010
"I see no point in buying a processor that emphasizes on-die graphics and then adding a Radeon R7 265X. Yes, AMD officially recommends it and yes, we tried it out." Can I take this as a yes ?

This needs to be explained more... There is a lot of people that would love to use a 260x let alone a 265x
 
"I see no point in buying a processor that emphasizes on-die graphics and then adding a Radeon R7 265X. Yes, AMD officially recommends it and yes, we tried it out." Can I take this as a yes ?

This needs to be explained more... There is a lot of people that would love to use a 260x let alone a 265x


Yes, chances are you can CrossFire them without a crash or anything, but it's a terrible idea. Instead of increasing your performance in games, your FPS would drop by more than half and your power consumption would increase greatly. It's never a good idea to CrossFire with the integrated graphics; it just doesn't go well.
 

atminside

Distinguished
Mar 2, 2011
I like that AMD is working hard on getting better at efficiency, but I am really disappointed that AMD has no future plans for an AM3+ or AM4 derivative for high-end CPUs. I love my Phenom II X4 955, old as it is, but with Intel being so expensive and AMD not having delivered a better platform to justify an upgrade, I have been stuck with my 790XTA-UD4 mobo and the 955. I hope AMD will come around and make plans for a new high-end or performance CPU, not just APUs.
 

LionD

Reputable
Aug 19, 2014
I agree with gadgety: the REAL sweet spot is the A8-7600. In all the tests I could find, it shows 90%+ of the A10-7800's performance; I mean GPU tests, never mind CPU. So the ridiculous cost of the A10-7800 just makes no sense.
 

LionD

Reputable
Aug 19, 2014
Yes, but the A8-7600 has a 384-shader GPU. I suppose it depends on whether you want to use the GPU or not.
Still, the non-synthetic GPU-related tests (gaming, OpenCL) show little difference between the A10-7800 and the A8-7600. In most cases it falls within 10% and NEVER reaches the theoretical 25%; not even 20%.
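A minimal sketch of where that theoretical figure comes from, assuming shader count is the only difference and both iGPUs run at the same clock (in practice, memory bandwidth tends to be the real bottleneck on these APUs, which is why the measured gap is smaller):

```python
# Theoretical iGPU gap from shader counts alone, clocks assumed equal.
a10_shaders, a8_shaders = 512, 384

# The A8 keeps 75% of the A10's shaders, i.e. a 25% theoretical cut...
print(f"A8 shortfall vs. A10: {1 - a8_shaders / a10_shaders:.0%}")   # 25%
# ...which is a 33% theoretical ceiling going the other way.
print(f"A10 headroom vs. A8:  {a10_shaders / a8_shaders - 1:.0%}")   # 33%
```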
 

falchard

Distinguished
Jun 13, 2008
AMD will probably go with a new naming convention if it makes a new desktop socket. More than likely, the days of not getting graphics on the chip are over; I think AMD is going to ship an APU from this day forward. So it probably means you should wait for the next APU architecture and not invest in this one. As we all know, these APUs are pretty much dual-cores, with two modules posing as almost four cores, and most applications will treat them as dual-cores. The next APU architecture will have complete cores and be more worthwhile to invest in.
The mobo I am looking at next is probably going to be a server board. When AMD's Socket G34 was released, there were a few desktop variants.
 

smoohta

Reputable
Aug 1, 2014
I agree with tiger15: these power consumption/efficiency graphs are interesting, but utterly useless without comparing them to other offerings.

Also, I was wondering whether you could expand on the HSA benchmark. It sounds very interesting, but you offer no information on what it actually does (except that it was originally provided by AMD)...
For example, how much data does this benchmark actually use?
Did you try increasing/decreasing the amount of data to see where HSA starts being effective?

Also, in HSA, comparing the processors by percentage seems pretty misleading (and is not how it's done in the other benchmarks)... Is it possible to add absolute measurements here?
 

icemunk

Distinguished
Aug 1, 2009
This makes me want a little FM2+ system even more: just a low-power-consumption box that can play some pretty decent games at reasonable frame rates. The biggest downfall for me right now is the lack of cheap FM2+ mITX boards; the cheapest I've seen sit in the $130-160 range, which is far too expensive. If I could get a cheap little $50-60 mITX mobo along with this APU in a little mini-ITX case at a reasonable price ($200-300), I would buy one today. I refuse to pay $150 for a mITX board, though.
 

curtisgolen

Distinguished
Sep 30, 2010
"I see no point in buying a processor that emphasizes on-die graphics and then adding a Radeon R7 265X. Yes, AMD officially recommends it and yes, we tried it out." Can I take this as a yes ?

This needs to be explained more... There is a lot of people that would love to use a 260x let alone a 265x


Yes chances are you can crossfire them without having a crash or something, but its a terrible idea to do it. Instead of increasing your performance in games, your FPS would drop by more than half and your power consumption would increase greatly. Its never a good idea to crossfire with the integrated graphics, it just doesn't go well.

hat is not true according to AMD. Could not find it on AMD website but read this. http://wccftech.com/amd-kaveri-dual-graphics-works-ddr3-memory-based-radeon-r7-gpus/
 

curtisgolen

Distinguished
Sep 30, 2010
"I see no point in buying a processor that emphasizes on-die graphics and then adding a Radeon R7 265X. Yes, AMD officially recommends it and yes, we tried it out." Can I take this as a yes ?

This needs to be explained more... There is a lot of people that would love to use a 260x let alone a 265x


Yes chances are you can crossfire them without having a crash or something, but its a terrible idea to do it. Instead of increasing your performance in games, your FPS would drop by more than half and your power consumption would increase greatly. Its never a good idea to crossfire with the integrated graphics, it just doesn't go well.

I would read this for more information: http://wccftech.com/amd-kaveri-dual-graphics-works-ddr3-memory-based-radeon-r7-gpus/
 


There have been numerous cases of CrossFire working between graphics pairings that AMD doesn't outright say you can CrossFire. For example, an A10-6800K APU can be CrossFired with an AMD 7730, 7750, or 7770. It isn't documented, but people tried it and it works, and that was between two completely different architectures.

The A10-7850K is GCN architecture just like the current GPUs, and I would be surprised if any of the GPUs currently made were incapable of CrossFiring with it. AMD just doesn't recommend or advertise it, because the discrete GPU quickly becomes too fast and outruns the iGPU, which results in major drops in performance.

So just because a company says you can't do something doesn't mean it's true. I guess 10 years ago you also believed you couldn't overclock an Intel processor, didn't you?
 