Battle At $140: Can An APU Beat An Intel CPU And Add-In Graphics?

For as much as I'm an AMD fan, the fact of the matter is still the same: Intel pretty much wins, but then again the difference is not much. What I would like to see is how well the A8 can play older titles like Oblivion or Unreal 3; I'm not interested in the newer titles, I think they are garbage and a waste of money. Also, how well does the A8 do in Windows XP?
 
I understand why people would build a cheap system for doing office work, or just surfing the internet, and I get that it's good to have built-in graphics in case your graphics card fails, but I'll never get why someone would waste money on an APU or low-end graphics card for gaming. Save $50 or $100 more and you could actually play some games at decent resolution with higher settings.
 
[citation][nom]Cleeve[/nom]WoW and MMOs maybe, but not demanding games like Metro 2033 or Battlefield 3. You'd need a 6670 GDDR5 for that at the very least. I won't use 4 GB because we'll end up with the same situation. Everyone will complain that AMD's integrated graphics isn't given enough RAM.[/citation]

Cleeve,

I cannot disagree. You'll always get complainers about this tidbit or that. But that doesn't necessarily negate the questions and points that some do bring up, even though they tend to get lost very quickly in the whining. I know you have done your best to answer the ones that haven't gotten lost. (It saved me from asking a few more questions, or from asking my following questions with a bad attitude, I hope.)

I too have some questions, which I will hopefully state in the voice of a student wanting to learn from their teacher, and not as some ungrateful, spoiled, know-it-all, smart*** jerk.

1. In the beginning you state the Intel CPU as 65 W and the GPU card as 66 W TDP... and later state their total power draw is 101 W vs. the APU's 100 W TDP. Is there an error in this, or have I missed something? My calculations would suggest 65 + 66 = 131. I think some points on this were already addressed and fixed, but this one may not have been.

2. In the process of overclocking the APU, was the FSB (not just the clock multiplier) and/or memory controller adjusted and tweaked? Or were those features the selected mobo did not support? (If not, those features can sometimes make a little bit of difference in performance.) If the mobo for Intel was a little more expensive, was there one for the APU that supported memory and FSB tweaks and still did not exceed the price of the Intel mobo? (Based on your statement that mobo price doesn't really affect performance, a little leniency maybe?)

3. What was the video memory setting for the shared APU/GPU system memory? Was it set to equal the memory on the discrete card? I think you'll agree that even your own testing has shown the amount of video memory can affect performance, although in this case, by how much is unknown to me since we are using system memory 100% to start with.


[citation][nom]cangelini[/nom]... With all of that said, Thomas is currently working on a memory round-up for FX. If you think an exploration of memory controller overclocking might be useful to the community, I can suggest that he do a comparison of several different architectures afterward? All the best, and thanks for reading, peroludiarom. The feedback is always read and appreciated =) Best, Chris[/citation]

Chris, I think that would be a great thing to do. It can at least answer some questions, though it may generate more to look into too. But it can be a good start for some solid research and empirical evidence on how well it works. I know we've seen memory overclocking done in the past. I remember the disappointment in one SBM where the part number on the RAM was the same, but it turned out to be slower RAM. (I think Crucial felt the bite on that enough, IIRC, to reintroduce the original, but with a modified part number and maybe a different price.) And that hurt the overclocking potential of the SBM build in question.
 
Quote from page 1: Interestingly, the discrete Radeon card includes 800 MHz DDR memory, and that's what we're using as system RAM, complementing the APU configuration.

Really? You used 800 MHz RAM in the AMD build? Hell, I'm surprised it performed as well as it did!
 


That was a total mistake on my part; I looked at the wrong TDP for the G620. It *is* 65 W.

I think I fixed that in the article but if you still see it let me know.



Frankly, I left the FSB out of it to keep it simple. If I were to do that I would have to overclock the Intel system to keep it fair, so I simply went with the multiplier. Same reason for the memory controller.

My objective in overclocking the APU was never to push it to extreme limits. It comes with an unlocked multiplier, so I wanted to show how it could be taken further relatively easily. Once again, if the target was hardcore overclocking I would have pushed the Pentium FSB and graphics card as well.




Actually, we've shown time and time again that the *amount* of video memory rarely has an impact except in extreme circumstances with super-high resolutions and high levels of AA, and usually only in games that have colossal texture sets set to their highest detail setting... none of which was the case here, at low detail and low resolutions. As such, I left it at the 'auto' setting where it grabs as much as it needs for the task at hand, although to be honest I'm not sure what the maximum amount of VRAM the A8-3870K's GPU side is allowed to reserve. But with 8 GB onboard it had a lot of headroom to play with.

Hope that helps. :)



 


800 MHz base clock = 1600 MT/s effective DDR3 data rate.

DDR = Double Data Rate :)
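
To make that arithmetic concrete, here is a minimal sketch of the usual DDR3 naming math. The figures are nominal values for a single 64-bit channel, assumed for illustration only, not anything measured on the article's test systems:

```python
# Minimal sketch: how an 800 MHz bus clock becomes "DDR3-1600" / "PC3-12800".
base_clock_mhz = 800            # I/O bus clock reported by CPU-Z-style tools
transfers_per_clock = 2         # DDR = Double Data Rate: two transfers per clock edge pair
channel_width_bytes = 8         # one 64-bit DDR3 channel

effective_rate_mts = base_clock_mhz * transfers_per_clock       # 1600 MT/s -> "DDR3-1600"
peak_bandwidth_mbs = effective_rate_mts * channel_width_bytes   # 12800 MB/s -> "PC3-12800"

print(effective_rate_mts, peak_bandwidth_mbs)   # 1600 12800
```

So the "800 MHz" in the quote is the real bus clock, and the advertised 1600 figure is just the doubled transfer rate.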
 
AMD led us to expect a lot out of Fusion, and while it delivers in the mobile space (very well, I must add), it just doesn't make sense on a desktop for gamers. Perhaps it would have been better pushed as an HTPC processor.
 
[citation][nom]Cleeve[/nom]That was a total mistake on my part, I looked at the wrong TDP for the G620. It *is* 65 W. I think I fixed that in the article but if you still see it let me know. Frankly, I left the FSB out of it to keep it simple. If I were to do that I would have to overclock the Intel system to keep it fair, so I simply went with the multiplier. Same reason for the memory controller. My objective in overclocking the APU was never to push it to extreme limits. It comes with an unlocked multiplier, so I wanted to show how it could be taken further relatively easily. Once again, if the target was hardcore overclocking I would have pushed the Pentium FSB and graphics card as well. Actually, we've shown time and time again that the *amount* of video memory rarely has an impact except in extreme circumstances with super-high resolutions and high levels of AA, and usually only in games that have colossal texture sets set to their highest detail setting... none of which was the case here, at low detail and low resolutions. As such, I left it at the 'auto' setting where it grabs as much as it needs for the task at hand, although to be honest I'm not sure what the maximum amount of VRAM the A8-3870K's GPU side is allowed to reserve. But with 8 GB onboard it had a lot of headroom to play with. Hope that helps.[/citation]

Cleeve, we still see on the first page CPU - 65 W and GPU Radeon 66 W, and usage under load *101 W*??? That has to be fixed, or are we missing something else?
 
Never mind Cleeve. He's just another Intel brown-noser.

This site is supposedly international, judging by the many different regional sites it comes in. And here in Denmark, you can get DDR3-1866 8 GB RAM for the EXACT difference in price between the motherboards used in the test, if you skip the DDR3-1600 in the first place. So the test should have been made with those parts. Pure and simple. Not to mention this "test" isn't made with a Pentium G620 at all as it claims. It's made with a Pentium G620T. A completely different part with a completely different price. Here, again in Denmark, the G620T is 30% more expensive (on the dot) than the G620.
 
After checking the prices on the Intel mobos and the effect of the faster RAM on gaming for the AMD machine, I am inclined to agree with Don.

Please ignore my previously critical comments ... I should have checked my facts before blathering on.

:)
 

In the Netherlands it is 20 euros more expensive.
You are completely right.
 


Oh! I see what your concern is.

OK, the explanation is simple: just because something has a maximum TDP doesn't mean it *HAS TO* reach that TDP. Different loads create different power requirements.

Also, sometimes manufacturers will list a TDP that's above the actual maximum. For example, the 2.6 GHz Pentium G620 has the same 65 W TDP as the 3.0 GHz Pentium G860. Is it reasonable to expect that they use the exact same power at different clocks? Not really. Intel obviously fits them under the same TDP blanket, but if the G860 is a 65 W part then the G620 should logically use less power under a full load.

The charts are valid; the G620 simply doesn't use all that much power.

There might be a super-high theoretical load that would drive it closer to that power usage, but our tests didn't push it that far.


[edit] I explained this backwards the first time around, so please excuse my brainfart:

Look at the stock power usage. The A8-3870K uses more power for CPU-intensive tasks.

This is no surprise; think about it: the APU has twice the execution cores and a higher clock speed than a Pentium G620.

But graphics power usage is slightly lower, which is about what we should expect considering the relatively close specs of the 6670 and the A8-3870K's integrated GPU.

This is all where it *should* be. The power results are spot on. [/edit]
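
For anyone still puzzling over the 65 + 66 = 131 question above, here is a rough back-of-the-envelope sketch of why a wall reading can sit near the article's roughly 101 W load figure even though the two TDPs sum to 131 W. Every component figure below is an illustrative guess, not a number measured for this article:

```python
# Hypothetical breakdown (illustrative guesses only): TDP is a thermal design ceiling,
# not the typical draw under a gaming load.
cpu_draw_w = 35        # guess for a Pentium G620-class chip in a game, well under its 65 W TDP
gpu_draw_w = 40        # guess for a Radeon HD 6670 in a game, under its 66 W rating
platform_draw_w = 15   # guess for board, RAM, drive, fans
psu_efficiency = 0.85  # guess for a budget power supply

wall_draw_w = (cpu_draw_w + gpu_draw_w + platform_draw_w) / psu_efficiency
tdp_sum_w = 65 + 66

# ~106 W at the wall in this guess -- the same ballpark as the debated ~101 W reading,
# and nowhere near the 131 W you get by adding the two TDPs together.
print(round(wall_draw_w), tdp_sum_w)
```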

 
[citation][nom]Cleeve[/nom]Oh! I see what your concern is. OK, the explanation is simple: just because something has a maximum TDP doesn't mean it *HAS TO* reach that TDP. The charts are valid; it simply doesn't use all that much power. There might be a super-high theoretical load that would drive it to that power usage, but our tests didn't push it that far.[/citation]

Completely wrong (and you know it). Under load means "under load". Run the test with the correct G620, and not the G620T, and it will read 137 W power consumption under load.
 


Wow. You don't seem to be getting it.

Let me make it clear: I *DO NOT HAVE* a 35 W 2.2 GHz G620T here.

I have a 65 W 2.6 GHz Pentium G620 here. Let me repeat, the CPU is 2.6 GHz.

A G620T is not 2.6 GHz, it's 2.2 GHz. The CPU I tested is 2.6 GHz, therefore I do not have a G620T.

I'm not sure how you would come to the conclusion that I'm wrong "and I know it". That makes no sense at all.

"Under load" does not = "Guaranteed Maximum TDP"



 
[citation][nom]saturnus[/nom]Completely wrong (and you know it). Under load means "under load". Run the test with the correct G620 and not G620T and it will read 137W power consumption under load.[/citation]

This made me lol. xD
 
[citation][nom]saturnus[/nom]Here too. And they could have used the difference on better memory which is known to bottleneck the Llanos. That would probably have pulled the Llano ahead in all tests, not just power consumption.[/citation]
You can find a cheap Intel-chipset motherboard that matches the cost of the AMD-chipset board, and you'll still end up with the same conclusion as this test's results.
 
I don't think I've ever seen a Tom's editor so clearly frustrated by the comments, but I side with them all the way.

Every single one of you talking about getting faster RAM for the APU can shut up. If you're gonna spend extra money on RAM on a budget, you might as well skip the APU and the RAM, and just get a discrete card with a low-end CPU.

It's like people who buy hybrids for the gas savings, when it will take them 5-8 years to make up the extra cost of the hybrid system, in addition to the fact that you can produce and drive a Hummer for 100,000 miles before you catch up to the environmental cost of producing the battery/hybrid system.

Seems silly, right? Well that's all you whiners.
 
[citation][nom]kinggraves[/nom]I'd really like to see how much you're going to pull out of the Intel system with a completely stable OC. If we're getting into changing the bus speeds and full stability isn't a concern, the AMD system could function on many FM1 boards at 133 MHz, bringing the "potential" to nearly 5 GHz (an unrealistic potential on air, granted) while also OCing the iGPU. No no, in terms of OC potential, the winner is obvious. I think people just wanted to see the results of how the better memory would have performed, even if it was a separate test bar and you had to note that config cost slightly more. It isn't as if using better memory would double the results. It IS worth noting, however, that your APU results are reduced by using that memory. I mean, as far as value goes, you could also get an A8 3850 for 10 dollars less and get nearly the same performance as the stock 3870. One of the graphics-disabled models like the 631 might have been interesting paired with a discrete too, I don't know if anyone ran that comparison yet. How would an A6 3500 fare in these waters for $80, considering the graphics side is fairly close to an A8 and the CPU is still a triple core?[/citation]

Even with a 3600 MHz CPU / 960 MHz GPU we have the A8-3870K hitting almost 300 W at load. Try getting a cheap cooler that can handle that. It doesn't matter if the A8 can go higher if allowed to, because it could ONLY hit such high frequencies in laboratory settings, if even there. It would take liquid nitrogen, at a minimum, to cool it like that. It might need liquid hydrogen.

On the other hand, the Intel system can get a higher overclock if allowed to because it has its heat spread across two different places (the CPU and the discrete video card's GPU), and Intel systems are already more efficient because Intel has better tech.

For example, an i5 or i7 can hit 5 GHz on air if you're lucky, but no AMD CPU can, because Intel is more efficient even though Intel is also faster. AMD's usual solution nowadays is to add more cores to get good multi-threaded performance. This doesn't mean that AMD has better technology; in fact it means the opposite, and that AMD is trying to compensate for their problems. There is nothing wrong with trying to fix something, but it is not enough to fix poor performance in most software, because most software doesn't utilize more than one or two cores. Go beyond four cores and you are pretty much left with mostly professional and server-oriented software.

Having many cores on a system that doesn't use them all very well offers rapidly diminishing returns from increased core counts, whereas improving the cores instead of adding more will always improve performance, unless there is a bottleneck other than the CPU cores.
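
A quick Amdahl's-law sketch makes those diminishing returns visible. The 60% parallel fraction below is an arbitrary illustrative assumption, not a measurement of any particular game or application:

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n) for parallel fraction p on n cores.
parallel_fraction = 0.6   # assumed for illustration only

def speedup(cores: int) -> float:
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

for n in (1, 2, 4, 8):
    print(n, round(speedup(n), 2))   # 1 -> 1.0, 2 -> 1.43, 4 -> 1.82, 8 -> 2.11
```

In this example, doubling from four to eight cores buys only about 16% more performance, which is exactly the kind of flattening described above.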

Personally, I think an A6 plus a discrete card would undoubtedly be better than an A8, so that is definitely something that should be looked into. Just don't think that the same applies to an A4. An A6 has 20% fewer shaders, so it is even slower than a Radeon 5550 (that is what the A8's graphics is based off of, not a 6550, because there is no such card anyway).

A lot of people seem to not understand something about TDPs and power usage. TDP is a very approximate number in many cases and should not be used to measure real power usage. Fully loaded scenarios are only going to happen when you do something like a burn-in benchmark test; otherwise you will probably never get even close.

A 65 W TDP Pentium will usually max out at under 50 W, normally closer to 45 W, despite having a 65 W TDP. TDPs are also given to processors that just barely don't fit within a lower TDP bracket. If the Pentium used only 34 W maximum, it would still be given a 35 W TDP; if it used only 40 W at the maximum, it would still get a 65 W TDP. This is done because a CPU/APU family has preset TDPs that a processor must fit into. A 50 W Pentium fits into the 65 W TDP bracket, but not the 35 W TDP bracket, so instead of making a new TDP bracket they simply called it a 65 W CPU. If the Pentium used much more than 65 W, then it could have been given a 95 W TDP instead. Intel has 35, 65, and 95 W for the desktop Sandy Bridge processors, and for most of their other processors. For the platforms that have CPUs going above 95 W, we have 130 W and the much rarer 150 W. If I remember correctly, only a few Xeons, the very top Core 2 Quad, and the top Netburst CPUs (Pentium 4 and Pentium D) have 150 W TDPs.
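
Here is a small sketch of that bracket logic, assuming the common tiers listed above. The wattages passed in are illustrative examples, not measurements of any specific CPU:

```python
# Round a chip's worst-case power figure up to the smallest standard TDP bracket that covers it.
TDP_BRACKETS_W = [35, 65, 95, 130, 150]   # common Intel desktop/workstation tiers

def assigned_tdp(worst_case_w: float) -> int:
    """Return the smallest bracket at or above the chip's worst-case draw."""
    for bracket in TDP_BRACKETS_W:
        if worst_case_w <= bracket:
            return bracket
    raise ValueError("exceeds every defined bracket")

print(assigned_tdp(34))   # 35  -> ships as a "35 W" part
print(assigned_tdp(50))   # 65  -> a ~50 W chip still gets the 65 W label
print(assigned_tdp(100))  # 130
```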

Further confusing this, some processors go beyond their TDPs when fully loaded by something like a burn-in benchmark instead of topping out at the TDP. AMD is supposedly notorious for this, but I offer no proof of it myself, and this claim should most definitely be taken with a grain of salt until someone else shows some proof, unless you look it up yourselves.

Granted, a Pentium probably uses more than 45 W when you do a burn-in test like Prime, but even if it only used 50 W or 60 W it would still get a 65 W TDP, because that is simply the way that hardware manufacturers decided to do things.

This stuff leads to great confusion; I understand why people are not getting it, and I hope this explanation helps.
 
[citation][nom]tourist[/nom]First, I am really glad Tom's did this article; it reiterated what most of us already know: Llano's CPU is weak compared to Intel, and that does not change when you add a discrete card. But how many will be sold with discrete graphics cards? Yes, anyone here would go buy a discrete card to play 1080p, but there is a difference between hardcore and casual gaming. I will certainly entertain the fact that casual gamers should not be here and should move to somewhere like PC Mag for their info. But it will not change anything. Look how the builds have been going down: 600 dollars, 500 dollars, 400 dollars, and now we are in the 300 dollar range. They are some of the most heated discussions, showing how important this segment is. My suggestion would be to start with a base configuration with IGP for both, then work up the line from there, retesting each time with games at 720/1080, using the 6570 and 6670 in dual graphics mode with 8 gigs of 1600 RAM. It might be a slideshow at times, but relevant info. Maybe even throw in the 3670K also? At each point configure memory clock and timing optimization. And if you are using only 4 gigs of RAM, why did you use a 64-bit OS instead of 32-bit? The specific answer I want to know is what equals an A8 3850 + 6670 from Intel? Is it a 620 + 6770, or maybe a 620 + 6750? Some would say it is irrelevant, but at this price point I think it applies. I still maintain the Llano is the better choice because it offers the most gaming performance in a base configuration and even up to a 6670 when using discrete, JMHO. Thanks in advance for answering my question[/citation]

1080p? That is a little ambitious for any APU + discrete GPU dual graphics combo. You might get away with 720p, but not at high settings and AA. The more intense games like Metro 2033 and BF3 need a Radeon 6770 to do 720p at maxed-out settings with AA, and that still isn't enough for great 1080p gaming until you get a Radeon 6870 or Radeon 6950. The GTX 560 Ti is comparable to the Radeon 6950, but the GTX 560 that is comparable to the 6870 uses the same amount of power as the GTX 560 Ti and is thus junk in comparison to the much lower-power 6870, at least according to previous Tom's articles.

Just because Llano is better than Intel without a discrete card, does that mean it is better than Intel overall? Sorry, but even entry-level gaming used discrete video cards up until Llano came along. When Llano first came out, it was okay for entry-level gaming, but it is showing weakness now. Trinity will undoubtedly give far greater performance, but it won't be cheaper. After Trinity comes along we might see Llano going down in price, but at that point we might also have games that are just too much for Llano anyway.

Llano with a 6670 will beat Intel with a 6670, but the Intel system will be a lot cheaper unless the buyer is stupid and gets an i3 instead of a Pentium G620 or Celeron G530. Llano without a 6670 is about equal in price to the Pentium G620 with a 6670, and the Pentium system is thus faster and more power-efficient, all without being more expensive. Go for the Celeron and you save another $20, and it undoubtedly isn't slow enough to bottleneck a Radeon 6670 (it is 200 MHz slower than the Pentium and has 1 MB less L3, which shouldn't make a difference in games but could in application work).

Go for the Celeron G530 and you can spend another $20 on the graphics card: get a 6750 instead of a 6670, or maybe Newegg will have a 6770 on sale instead. Still the same price as the AMD system, but the faster graphics card could close the power usage gap to be on par with a stock A8-3870K/3850. However, the huge performance difference in games would justify spending the same amount of money at the same power usage :)

Not that I'd be caught using a Celeron or Pentium myself, but money-tight gamers might love it.
 


A $140 A8-3870K plus a $70 Radeon 6770 = $210.

For that price, you could get a $70 Pentium G620 and a $140 Radeon HD 6790 with GDDR5 RAM that would totally dominate when it comes to gaming frame rates.
When more money is added, the Pentium gains a notable advantage.




I don't agree with you based on the results in this review, as far as building a system.

If you're buying a mobile platform that can't be upgraded, Llano rules with no competition; but when you're piecing together a system, the APU is a very hard sell if gaming is your goal.
 
I recently put together a budget build, and the price difference in AM3 vs. LGA 1155 MBs is what finally made me go for the AM3. The $$ I saved on the MB went for a better GFX card, and now my sub-$300 MB, CPU, memory, and GPU can play SCII on ultra @ 1080p.
 
1. Bla bla bla, in my country, bla bla bla. Use a different board. The only differences you'll find are efficiency (a tiny bit), features, and OC tolerance, and since the Pentium can't be OC'd, there's no point bothering about it. From all the Intel mobo reviews I've read, RAM speed remaining constant, I don't remember a difference of more than 2 fps.

2. The point of using the same RAM is to minimize variables. Fine, for the sake of testing: Don, please spend $40 on a 2x2 GB DDR3-1866 kit.

3. Athlon stocks are diminishing, which is probably why one wasn't used. Or maybe he didn't think it necessary. Though seeing that Llano is based on the Athlon architecture, I don't see the point of using a slower, cheap Athlon and expecting performance gains. Again, for the sake of a test, why not.

4. Power draw. Ambiguous at best for the Pentium. Idle wattage is believable irrespective of the -T model, as almost all SB CPUs (35 W+ models) idle around the same. Under CPU load, system draw shows an increase of 30 watts. I would not expect a dual-core Pentium at that clock to draw any more. According to AnandTech, load power consumption for this chip should be around 20 W more (assuming the 65 W part), so I think the power consumption is valid.

[citation][nom]sarcasm[/nom]Will they demo a real game this time? lol Anyway, I'd like to see an i5 for $140... Don't know where in the world you get the idea that Intel will price it anything below $180 like the Sandy Bridge i5-2300.[/citation]
Probably Microcenter, best guess I can make. I remember seeing a sub-$200 2500K listed there.

Edit: This post, poor thing, is slightly misplaced; it belongs on the 4th page, but my UPS failed me last night.

As a result, some of what I've posted is now a repetition of what others have said.

Also, I was just saying "for the sake of the test, retest" so that the AMD fans could wake up and face reality, but since you've mentioned it, I realise that you have better work to do than catering to tweenage fanboys who can say "APU" faster than they can think.

As always after a CPU article, I admire your patience, Don. 😀
 