AMD A10-7800 APU Review: Kaveri Hits the Efficiency Sweet Spot


wiyosaya

Distinguished
Apr 12, 2006
Having picked up an A10-7850K for an HTPC build when they first came out, I doubt I would change my mind and go for the A10-7800 if I were building that same HTPC now, simply because on Amazon, at least at this time, the price difference is $6.00.
 

ta152h

Distinguished
Apr 1, 2009
Anyone else notice that Kaveri was within 5% of the IPC of Sandy Bridge at similar clock speeds...

How did we miss this when Kaveri came out?

Because people were smart enough to know what you're saying isn't true.

An i3 has two real cores, so if you want to compare a real Sandy Bridge with a Steamroller (which shares the FPU, but otherwise the cores are 'real'), you'd compare with an i5. Steamroller gets thrashed, and left to run off seeking professional help and counseling.

These are very badly designed processors. They have huge GPUs, but not enough memory bandwidth to feed them.

By the way, other sites did test the A8-7600 against the A10-7800, with the latter having better performance in the vast majority of benchmarks. Not by a lot though, in most cases.

But, in one sense the A8-7600 is a more balanced design, since the GPU is smaller and better matches the inferior memory controller on the chip. The bad part is that it's just a cut-down part, so the die is still huge and part of it is disabled. AMD would have been better off making a die this size natively and forgetting about the bigger ones, since they're so bandwidth-limited. Doing so would have made the chips significantly less expensive to make.

If they don't improve the lousy memory controller, throwing more transistors at the GPU isn't going to yield the kind of results they are looking for. Obviously, the new memory they are looking at will help, but they should also make an effort to improve the terribly inefficient memory controller so it gets somewhere in the neighborhood of where Intel is. Even in the same county. Or state. They're not even on the same planet, which makes you wonder why they don't look at Intel's design more closely. Even Intel parts several generations old use memory better than AMD's, so they've had time to copy it, or at least to get ideas on how to use bandwidth better in their own chips.
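To put rough numbers on that bandwidth gap, here's a minimal sketch, assuming DDR3-2133 in dual channel for the APU and R7 260X-class GDDR5 for comparison; the figures are illustrative assumptions, not measurements from the article.

Code:
# Peak DRAM bandwidth = transfers/s * bytes per transfer * channels.
# DDR3-2133 dual channel vs. a typical R7 260X GDDR5 setup (assumed figures).

def peak_bandwidth_gbs(transfer_rate_mt_s, bus_width_bits, channels=1):
    """Theoretical peak bandwidth in GB/s."""
    return transfer_rate_mt_s * 1e6 * (bus_width_bits / 8) * channels / 1e9

apu_bw = peak_bandwidth_gbs(2133, 64, channels=2)   # shared with the CPU cores
dgpu_bw = peak_bandwidth_gbs(6500, 128)             # discrete card's dedicated VRAM

print(f"APU, DDR3-2133 dual channel: {apu_bw:.1f} GB/s")   # ~34.1 GB/s
print(f"R7 260X-class GDDR5:         {dgpu_bw:.1f} GB/s")  # ~104.0 GB/s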
 
Still, the non-synthetic GPU-related tests (gaming, OpenCL) show little difference between the A10-7800 and A8-7600. In most cases the gap falls within 10% and never reaches the theoretical 25%, or even 20%.

That can be explained by Turbo. Since fewer shaders mean less heat, the turbo clock stays up longer, I'd say.

--

Why "Made in China", but "Diffused in Germany"? Also, I thought these were made by TSMC, which is Korean? Or GF, but with plants in China? o_O

Cheers!
 


Smart enough? How about you read the article a little closer before you start tossing insults around.

You obviously didn't read the article. In the ONLY single-threaded test run, the Kaveri was at 95% of the performance of the Sandy Bridge when controlled for the same clock speed. It takes a little math to work out (since they were benched at different speeds), but it works out to the Kaveri being just off the pace of a Sandy Bridge. Of course, once he got into the generic synthetics, that single-core parity vanished, which raises questions about his home-made benches.

I know I've seen benches in the past with Kaveri basically being on par with a Phenom II in IPC... and a solid 15%-20% slower in single-core performance than a Sandy Bridge... so I was a little surprised that the lone non-synthetic test resulted in such a small performance gap. That's all I was pointing out.


As an aside, these are not poorly designed chips... they are actually rather complex and fairly well designed, all things considered. The issue at play is that the focus of the Bulldozer design is both inspired and inherently flawed. Bulldozer was designed as a "modular" architecture; the idea was that the modular design would let AMD custom-build CPUs to fit specific clients' needs more cheaply. It worked so well they landed both next-gen consoles, because frankly... it was cheaper and easier for them to tailor a CPU for those clients than it was for anyone else, thanks to Bulldozer's modular design.

The problem is that with a modular, flexible design come trade-offs. One of them is that they simply can't compete with Intel's performance-tuned product line... furthermore, Bulldozer ended up with some crippling internal design faults that prevented it from scaling up as well as it scales down. In short, at lower voltages and lower clocks its IPC is actually higher than it is at higher voltages and clocks... the result is that it's far more competitive with Intel on the low end than on the high end.

Finally, anyone who talks about a Bulldozer-cored CPU as having "fake" cores needs to check their inner Intel fanboy at the door.

The engineering definition of a CPU "core" is that a core must have three parts:
1) instruction control unit
2) instruction execution unit
3) input/output unit

AMD's Bulldozer-family CPU cores have all of these parts; each module contains two separate cores, and each of those cores has its own scheduler (control unit), four execution units, and an I/O unit.

The confusion about the Bulldozer architecture comes from the floating point unit. For a long time, CPUs had no floating point hardware at all: early PCs used a separate math coprocessor chip (the x87) on the motherboard to handle floating point math, and it was only from the 486DX/Pentium era onward that the coprocessor was integrated onto the CPU die as the FPU. These units handle floating point (non-integer) math, which the integer pipelines are poorly suited for. Even now, the FPU is a largely separate block from the integer core on both AMD and Intel CPUs... in a way it's the progenitor of the whole APU concept, since a GPU is really just a highly specialized, massively parallel math coprocessor. With Bulldozer, AMD chose to place one 256-bit-capable FPU per module... that FPU is built from two 128-bit pipes, so it can serve both cores with 128-bit operations at once or combine the pipes for a single 256-bit AVX operation. THIS is the part that works like a gimped version of Intel's Hyper-Threading, in that a single shared FPU can juggle work from two threads at the same time.

The FX CPUs ARE, by every definition, proper 4/6/8 core CPUs. They just work a little differently than an Intel CPU... or even the older, retired AMD K10 architecture.


**As a disclaimer, I quoted the bit at the end from one of my own posts on another forum (overclock.net), written under the username azanimefan. It was not stolen from another poster; I just didn't feel like typing it all out again.**
 

Damn_Rookie

Reputable
Feb 21, 2014
Ingtar33, did you take into account the turbo speed of the Kaveri chip when doing your calculation to compare it to the Sandy Bridge chip (which doesn't have turbo mode)? Seeing as it's only doing a single threaded task, the 7800 should logically be running at 3.9 GHz, and with that taken into account, the performance is a fair bit away from the Sandy Bridge chip (based on the "wall construction" benchmark used, about 82% of the performance, if my calculations are correct).
 


I was going off the information provided in the article. The author made no mention of turbo modes (the i3 has them too), so I stuck to the specified clock speeds for the benching results. I assumed (granted, that can be dangerous) that the author turned off turbo mode for all the chips benched in the interest of fairness and accuracy, as turbo on the Intel and AMD chips works differently.
 

silverblue

Distinguished
Jul 22, 2009
AMD's cat cores do not utilise a modular architecture. The only thing the cores have in common is a shared L2, and that's between two to four cores depending on the model. In addition, Jaguar utilises an inclusive cache architecture, unlike any CPU AMD has made within the last 15 or so years, which use exclusive caching (potentially less total cache required, but more sensitive to cache speed and latency).
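A toy illustration of that exclusive-versus-inclusive trade-off, with made-up cache sizes rather than any particular AMD core:

Code:
# Exclusive: L1 and L2 hold different lines, so usable capacity is the sum.
# Inclusive: L2 also holds a copy of everything in L1. Sizes are examples only.

l1_kb, l2_kb = 64, 1024
print("exclusive usable capacity:", l1_kb + l2_kb, "KB")  # 1088 KB
print("inclusive usable capacity:", l2_kb, "KB")          # 1024 KB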
 

Damn_Rookie

Reputable
Feb 21, 2014

True, it wasn't specified whether the turbo mode of the APUs was left active, but if it was, and my calculations are correct, the performance would be a lot more in line with the 15-20% slower IPC you mentioned and expected. Just to note, I don't believe any i3 has a turbo mode, even in the mobile space (I guess Intel wants to keep it as a differentiator between the CPU lines).

If it was running at 3.5 GHz only, with turbo switched off, it would still only be at 91.4% of the single-threaded performance of the Sandy Bridge, so I don't think it quite reaches the 95% figure you quoted (again, presuming my calculations are correct. Anyone, please feel free to correct me! :)).

Based on the figures in the "Wall Construction" one thread test.

(100 / 3.5) * 3.1 = 88.57 = the relative performance of the 7800 at 3.1 GHz clock speed
(88.57 / 96.9) * 100 = 91.4% = the single threaded performance of the 7800 expressed as a percentage of the i3-2100 performance.
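For anyone who wants to re-run that arithmetic, here's a minimal sketch of the same clock normalization, using only the figures quoted in this thread (A10-7800 score 100 at 3.5 GHz base / 3.9 GHz turbo, i3-2100 score 96.9 at 3.1 GHz):

Code:
# Per-clock (IPC-style) comparison from the "Wall Construction" single-thread
# scores quoted above. Scores and clocks are the thread's figures, not mine.

def per_clock_ratio(score_a, clock_a_ghz, score_b, clock_b_ghz):
    """Per-clock performance of chip A relative to chip B."""
    return (score_a / clock_a_ghz) / (score_b / clock_b_ghz)

i3_2100 = (96.9, 3.1)
print(f"A10-7800 assuming 3.5 GHz: {per_clock_ratio(100, 3.5, *i3_2100):.1%}")  # ~91.4%
print(f"A10-7800 assuming 3.9 GHz: {per_clock_ratio(100, 3.9, *i3_2100):.1%}")  # ~82.0%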
 

CaptainTom

Honorable
May 3, 2012


Maybe the new consoles lack CPU power (even though they are 8-core, the 1.6 GHz/1.75 GHz clocks cripple them), but their GPU part is far more powerful than existing APUs.
The PS4's GPU has cores like a 7870 and the Xbox One has cores like a 7790; in other words, more powerful than the 512-core R7 in today's best APU, the A10-7850K.
Not accurate.
The PS4 GPU is a crippled and downclocked 7850 (disabling cores improves redundancy and means fewer dead chips).
The XB1 GPU is a crippled and downclocked R7 260X (as above) and, like the 7790, should have AMD TrueAudio onboard, but they could have changed that. This actually means that CPU-intensive and low-resolution games are going to suck, because the eight cores are just Jaguar netbook cores.
The reality is that the PS4 is almost CPU-limited already and the XB1 is more balanced. Now that we've finished speaking of "sufficient" platforms, let's talk about the fact that a CPU from AMD and the word "efficient" are in the same sentence.
No, what YOU say is not true. The PS4's GPU is almost exactly equal to a stock 7870, but the Xbox One is closer to a 7770 GHz Edition in terms of gaming performance than it is to a 7790.
 


Seems to work fine on my Piledriver.

~24 GFLOPS without AVX enabled.
(screenshot: IBT results table)
~50 GFLOPS with AVX enabled.
(screenshot: IBT results table)
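For context on those Linpack-style numbers, here's a back-of-the-envelope sketch of theoretical double-precision peak; the 4.0 GHz clock, four modules, and 8 DP FLOPs per module per cycle (two 128-bit FMAC pipes with FMA) are assumed example figures, not taken from the screenshots:

Code:
# Theoretical DP peak = clock * modules * FLOPs per module per cycle.
# Real Linpack/IBT runs only ever reach a fraction of this.

def peak_dp_gflops(clock_ghz, modules, flops_per_module_cycle=8):
    return clock_ghz * modules * flops_per_module_cycle

print(peak_dp_gflops(4.0, 4), "GFLOPS theoretical peak")  # 128.0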
 

silverblue

Distinguished
Jul 22, 2009


Seems to work fine on my Piledriver.

~24 GFLOPS without AVX enabled.
(screenshot: IBT results table)
~50 GFLOPS with AVX enabled.
(screenshot: IBT results table)

"Memory writes with the 256-bit AVX registers are exceptionally slow. The measured throughput is 5 - 6 times slower than on the previous model (Bulldozer), and 8 - 9 times slower than two 128-bit writes. No explanation for this has been found. This design flaw is likelty to negate any advantage of using the AVX instruction set." source: http://www.agner.org/optimize/blog/read.php?i=285
 

martel80

Distinguished
Dec 8, 2006
Sorry for the multiple posts; there was no indication that they were going through.
On the other hand, it can't be that hard to validate the uniqueness of every user's post and eliminate any duplicates. Come on, Tom's. It's like a single SQL query...
 


Apparently their hypothesis was just proved wrong.
 

Thorfkin

Distinguished
Dec 22, 2006
Your Kaveri gaming benchmarks seem flawed to me. Kaveri's gaming power comes from its DirectX 11 render engine. I noticed that all of your benchmarks are run at the lowest settings for each game. I use an A10-7850K and I never play with anything below high detail. Kaveri's strength is that it can handle higher detail without considerable performance loss. I run Skyrim at 1920x1080 at the high detail preset, and if it weren't for the fact that I have more than 150 mods installed, I could use Ultra detail without missing a beat. Granted, I do have my Kaveri heavily overclocked at 4.4 GHz. The fact that Kaveri can do this is a huge part of its value proposition.
 

lp231

Splendid
I'm probably going to get downrated for this.
AMD APUs bring budget-friendly PCs to those on a budget, but stop making them out to be actually worth buying, because they're not. Just look at those gaming benchmarks: to make the APUs attractive, the games have to run at the lowest settings. Who in the real world runs games at the lowest settings? I know I don't, and I bet none of you here do either. The other issue is the price: the AMD A10-7800 at Newegg costs $169.99, and the cheapest Intel Core i5 at Newegg costs $184.99, a difference of $15. If you were buying a CPU for yourself, would you get an A10? I doubt it. If a friend asked you to help them build a gaming computer, would you recommend an A10, or even think about an A10 as your very first CPU pick? The cheapest CPU I'd pick would be an AMD FX-6300. Seriously, AMD really needs to bring some focus back to their FX lineup, as many of us would still like a proper desktop CPU, and none of this "it's an APU, but you can pretend it's still a great desktop CPU" nonsense.
AMD A10 = those pedal cars that toddlers drive around in
AMD FX = a proper car

Links to the APU and CPU:
http://www.newegg.com/Product/Product.aspx?Item=N82E16819113372
http://www.newegg.com/Product/Product.aspx?Item=N82E16819116942
 

alextheblue

Distinguished
I agree with gadgety - the REAL sweet spot is the A8-7600. In all the tests I could find, it shows 90%+ of the A10-7800's performance - I mean GPU tests, never mind CPU. So the ridiculous cost of the A10-7800 just makes no sense.
One problem: The A8-7600 supports configurable TDPs. So when you're looking at benchmarks, keep in mind that they may be running it at 65W. At 45W the 7600 loses a good chunk of performance. With that being said, the reduced power consumption might still make it worthwhile for certain uses. I think it should have been included in the comparison, so long as they tested it at both TDPs. I believe the 7600 would be very competitive on the CPU front, but anything that stressed the GPU heavily would favor the 7800.
This makes me want a little FM2 system even more... just a low-power-consumption device that can play some pretty decent games at reasonable frame rates. The biggest downfall for me right now is the lack of cheap FM2 mITX boards. The cheapest I've seen are sitting in the $130-160 range, which is far too expensive. If I could have a cheap little $50-60 mITX mobo along with this APU, in a little mini-ITX case at a reasonable price ($200-300), I would buy one today. I refuse to pay $150 for a mITX board though.

The mITX boards are overpriced (mATX yields the best prices for FM2+ currently), but not THAT overpriced.

http://www.newegg.com/Product/Product.aspx?Item=N82E16813157464

$96. That's a pretty nice A88X model, too. Again though, I'd like to stress that if you don't need it to be super tiny, micro ATX options are great.
 


You are missing the point completely. They don't try to make the CPU seem more powerful than it is; they really just pitch it as being as fast as you need. The part they really advertise is the GPU side, and it is the fastest CPU with built-in graphics that money can buy. True, an Intel Core i5 isn't much more expensive, but what if someone is buying a gaming PC and doesn't have a lot of cash? They would need to spend at least $80 on a graphics card just to beat the graphics built into this APU. So now they are close to $100 more expensive, and that is before considering motherboards, which usually cost a little more for Intel.

Granted, you could get an Intel Core i3 or an FX-6300 CPU, which performs better, but that will cost $120; add an $80 GPU to get better graphics and you are $40 past this one for little gain in performance. For someone who is very tight on cash, that extra $40 is really important.

So stop being so narrow-minded and remember not everyone can afford to go out and buy an i7 and a GTX 780 Ti.
 

ta152h

Distinguished
Apr 1, 2009


Smart enough? How about you read the article a little closer before you start tossing insults around.

You obviously didn't read the article. In the ONLY single-threaded test run, the Kaveri was at 95% of the performance of the Sandy Bridge when controlled for the same clock speed. It takes a little math to work out (since they were benched at different speeds), but it works out to the Kaveri being just off the pace of a Sandy Bridge. Of course, once he got into the generic synthetics, that single-core parity vanished, which raises questions about his home-made benches.

I know I've seen benches in the past with Kaveri basically being on par with a Phenom II in IPC... and a solid 15%-20% slower in single-core performance than a Sandy Bridge... so I was a little surprised that the lone non-synthetic test resulted in such a small performance gap. That's all I was pointing out.


As an aside, these are not poorly designed chips... they are actually rather complex and fairly well designed, all things considered. The issue at play is that the focus of the Bulldozer design is both inspired and inherently flawed. Bulldozer was designed as a "modular" architecture; the idea was that the modular design would let AMD custom-build CPUs to fit specific clients' needs more cheaply. It worked so well they landed both next-gen consoles, because frankly... it was cheaper and easier for them to tailor a CPU for those clients than it was for anyone else, thanks to Bulldozer's modular design.

The problem is that with a modular, flexible design come trade-offs. One of them is that they simply can't compete with Intel's performance-tuned product line... furthermore, Bulldozer ended up with some crippling internal design faults that prevented it from scaling up as well as it scales down. In short, at lower voltages and lower clocks its IPC is actually higher than it is at higher voltages and clocks... the result is that it's far more competitive with Intel on the low end than on the high end.

Finally, anyone who talks about a Bulldozer-cored CPU as having "fake" cores needs to check their inner Intel fanboy at the door.

The engineering definition of a CPU "core" is that a core must have three parts:
1) instruction control unit
2) instruction execution unit
3) input/output unit

AMD's Bulldozer-family CPU cores have all of these parts; each module contains two separate cores, and each of those cores has its own scheduler (control unit), four execution units, and an I/O unit.

The confusion about the Bulldozer architecture comes from the floating point unit. For a long time, CPUs had no floating point hardware at all: early PCs used a separate math coprocessor chip (the x87) on the motherboard to handle floating point math, and it was only from the 486DX/Pentium era onward that the coprocessor was integrated onto the CPU die as the FPU. These units handle floating point (non-integer) math, which the integer pipelines are poorly suited for. Even now, the FPU is a largely separate block from the integer core on both AMD and Intel CPUs... in a way it's the progenitor of the whole APU concept, since a GPU is really just a highly specialized, massively parallel math coprocessor. With Bulldozer, AMD chose to place one 256-bit-capable FPU per module... that FPU is built from two 128-bit pipes, so it can serve both cores with 128-bit operations at once or combine the pipes for a single 256-bit AVX operation. THIS is the part that works like a gimped version of Intel's Hyper-Threading, in that a single shared FPU can juggle work from two threads at the same time.

The FX CPUs ARE, by every definition, proper 4/6/8 core CPUs. They just work a little differently than an Intel CPU... or even the older, retired AMD K10 architecture.


**As a disclaimer, I quoted the bit at the end from one of my own posts on another forum (overclock.net), written under the username azanimefan. It was not stolen from another poster; I just didn't feel like typing it all out again.**

Actually, you know nothing about this subject, so I'll help correct your misinformation.

First, you don't base IPC off of one benchmark. Ever. Sandy Bridge is way ahead of Piledriver in IPC. Virtually all benchmarks show this, but I can show you a benchmark that will show whatever I want.

Second, if you knew ANYTHING about Bulldozer and Piledriver you would know it's not a real core. But, you don't. You want to sound like you do, but you really don't know anything, huh?

Let's go further with this. A core in this processor is not a full core. It shares not only the FPU with another core, but also the decoders! It's also narrower than an Intel core, and than AMD's previous generation, but that part alone would just make it a weaker core, not a fake one. The first two points are what keep it from being a real core.

It's a piece of crap. That's why they lost market share. That's why they cannot even approximate the performance of an Intel chip of the same size, and why they compete against much smaller Intel CPUs. Sure, Intel charges more, but not because they have to; it's because they can. On top of this, these slow AMD CPUs use a lot more energy.

Even AMD is panning these miserable processors. It's a failed design that is being discontinued after Carrizo. That's not because it's good; it's because it's not. They point out that Kabini does as much work per clock cycle, yet is a third of the size. Ouch. That's the maker of Kaveri saying that. Hurts, huh?

So, let's not be stupid and try to talk about stuff we don't understand. You may still be in denial, but the company that makes the processor talks about it in those terms, and has decided to kill the design. The nonsense about the custom designs is pure fabrication (forgive the pun). There are no custom designs for this processor, and there never will be. Jaguar got those, and it's a much better design. Little wonder why it got into the consoles, while the failed Piledriver/Steamroller didn't.

It's not because it was a good design that it failed. It's because it's a failed design.
 

HeavenKidz

Reputable
Apr 21, 2014
What's with Metro Last Light? How did the i3 get into that test? I think the Intel HD 2000 graphics in the i3-2100 can't run Metro Last Light because it needs some PhysX driver or program that isn't available on the Intel HD 2000.
 

lp231

Splendid


All prices from Newegg

AMD APU
AMD A10-7800.................................$169.99
Team 8GB (2x4GB) DDR3 2400.....$75.99
Asus A78M-A...................................$67.79
Total.................................................$313.77
Sapphire R7-260X 2GB..................$119.99
Total with card................................$433.76

Intel Core i5
Intel Core i5 4440............................$184.99
Team 8GB (2x4GB) DDR3 2400....$75.99
Asus H81M-D Plus.........................$54.99
Total................................................$315.97
Sapphire R7-260X 2GB.................$119.99
Total with card................................$435.96
In terms of IGP performance in games, the AMD is the clear winner, but the games would have to run either at the lowest detail possible or at a lower resolution like 720p.
Once you add a graphics card, the Core i5 is clearly the better pick because it will dance around that AMD A10-7800. For some, a Core i5 may be out of their budget, which is why I mentioned before that the cheapest way to go is an AMD FX-6300.

AMD FX
AMD FX 6300...................................$119.99
Team 8GB (2x4GB) DDR3 2400.....$75.99
Asus M5A78L-M/USB3....................$59.79
Total.................................................$255.77
Sapphire R7-260X 2GB..................$119.99
Total with card.................................$375.76
Price difference between the AMD A10-7800 and AMD FX-6300 builds: $58
With the AMD FX-6300 you get six cores and a proper desktop CPU, so I'm not being narrow-minded. AMD's APUs do have their place, but when it comes to real computing power for games, they fall flat. All this pom-pom shaking about how good an APU is, is nothing but a waste of time. AMD needs to do something about their FX line. Sadly, they seem to have given up and right now only focus on their APUs. If they hadn't bought ATi, they probably wouldn't be here today; it's their graphics cards that are keeping them afloat.
For those that still want to build on this platform for a gaming PC, get an AMD X4 740, 750K, or 760K instead of an APU, because the money saved can be put towards a better video card.
AMD X4
AMD X4 760K........$89.99
Team 8GB (2x4GB) DDR3 2400.....$75.99
Asus A78M-A.................................$67.79
Total................................................$233.77
Sapphire R7-260X 2GB..................$119.99
Total with card.................................$353.76
AMD X4 740 with card...............$338.76
AMD X4 750K with card.............$343.76
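If anyone wants to sanity-check the totals, here's a quick sketch that just re-adds the Newegg prices quoted above (prices as listed in this post; they will have changed since):

Code:
# Re-total the builds listed above; each list is CPU/APU, RAM, motherboard,
# and optionally the R7 260X. Prices are the ones quoted in the post.

builds = {
    "A10-7800 (no card)": [169.99, 75.99, 67.79],
    "A10-7800 + R7 260X": [169.99, 75.99, 67.79, 119.99],
    "i5-4440 + R7 260X":  [184.99, 75.99, 54.99, 119.99],
    "FX-6300 + R7 260X":  [119.99, 75.99, 59.79, 119.99],
    "X4 760K + R7 260X":  [89.99, 75.99, 67.79, 119.99],
}

for name, parts in builds.items():
    print(f"{name}: ${sum(parts):.2f}")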

 