Gaming At 1920x1080: AMD's Trinity Takes On Intel's HD Graphics


chesteracorgi

Distinguished
I doubt that any serious gaming enthusiast would constrain himself to an APU/IGPU for the foreseeable future. The advances in integrated graphics are impressive, but they still can't touch even low-to-mid-range discrete GPUs. My recent build for my grandson is limited to the IGPU on the i3-3225 until he shows an interest in progressing to higher-end gaming. If and when my grandkids start playing the likes of WoW, Skyrim, or Crysis, I'll take the plunge into an AMD 7000 or Nvidia 600. But even before we get there, we'll cannibalize my GTX 470 SLI setup to see if that provides sufficient horsepower.

Nevertheless, kudos to AMD on its Trinity platform, but I am interested to see how it competes with the IB platform once you add a discrete GPU.
 
+10 to THG for using the MP Holy Grail :)

What is amazing about the APU graphics is how they perform on older titles. Not all folks are interested in purchasing $50-$60 new games but are easily persuaded by $10-$20 bargains on the discount rack.

And as noted, not everyone is motivated by 1920x1080 at super detail. Dropping back to any lower 16:9 can greatly improve game play -- even for those who like higher settings.

And, not that I want to create extra work for you guys (!), but it would be interesting to see how Trinity stacks up against the *old* integrated Radeon HD4250 IGP -- just for snits and giggles on a few titles.

It's hard to judge how far we have come in a few short years without seeing where we were ...

 
A good article.

I'd like to see how the AMD A10s and the Intel CPUs with HD 4000 graphics go with a decent overclock.

If you could get about a 30% increase in framerate in the more taxing games, then I'd consider making up an SFF gaming box (well, a few actually, for network gaming) as an entry-level gaming machine.

Currently I use 3 X M405 Toshies for this sort of thing ... but we are restricted to the sorts of games that are pretty basic ... we are Freelancer addicts !!!

Our E450 Notebook is better ... but something like one of these might fit the bill a lot better.

I'd rather make up a SFF box with a decent monitor than use notebooks ... though the price on the notebooks has dropped a lot.

Anyway ... AMD's superior graphics will surely make Intel do some work on their replacement for the HD 4000 ... I expect a healthy performance boost in their next offering.
 

ojas

Distinguished
Feb 25, 2011
2,924
0
20,810
Sleeping Dogs! You must use Sleeping Dogs. It's a very HTPC-oriented title, as in, you'd want to run it from your PC (for the graphics) and hook up a controller (for the controls).
 

luciferano

Honorable
Sep 24, 2012
1,513
0
11,810
[citation][nom]americanbrian[/nom]I have to call out the review on choice of RAM as well. 1866 does not incur a significant price premium and would show the "TRUE" performance available to adopters of the AMD solution.It seems unfair that you would make this choice to bias the results to favor intel. When you tested the first i-series chips with triple channel memory you enabled that feature (correctly) as it is a feature of the hardware you are testing. Here there is a feature of the AMD hardware you have chosen to ignore. Not cool...[/citation]

To be fair, when Tom's did that for Intel's X58 platform, it was the very highest end consumer platform available. This is AMD's lowest end desktop platform where an entire computer based on it can be cheaper than a motherboard plus the lowest end CPU that was supported by X58 at launch (although to be fair, that's also related to prices on a lot of other components going down too, but still) and far cheaper than the high-end six-core CPUs for that platform.

Also, 1866 memory is not always as cheap as even 1600 memory. I think that you make a good point in that it should be cheap enough to have been used, but maybe it wasn't when Tom's got the hardware for these tests and maybe they simply didn't have anything else available. Tom's did other tests with RAM up to DDR3-1866 in some Trinity reviews, so maybe you can at least get a good guess of how it would help. It's a roughly 17% frequency jump and it probably increases bandwidth by a little under half of that, so maybe 6 to 8%, and performance is unlikely to increase by much less or much more than that. It would have been nice, but I don't think that it'd be a game-changer in performance. Maybe going up to 2133 would be more substantial coming from 1600.
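That back-of-the-envelope scaling can be sanity-checked with a quick sketch. The fraction of a bandwidth bump that shows up as frame rate is an assumption here (roughly tuned to the 6-8% guess above), not a measured figure:

```python
# Rough memory-scaling estimate for an IGP (assumption-laden sketch).
def perf_gain_estimate(base_mts, new_mts, transfer=0.45):
    """Estimate the gaming performance gain from a memory speed bump.

    transfer: assumed fraction of the frequency increase that shows up
    as frame rate (~0.45 picked to match the 6-8% guess; not measured).
    """
    freq_jump = new_mts / base_mts - 1   # e.g. 1866/1600 -> ~16.6%
    return freq_jump * transfer          # rough performance delta

print(round(perf_gain_estimate(1600, 1866) * 100, 1))  # ~7.5%
print(round(perf_gain_estimate(1600, 2133) * 100, 1))  # ~15.0%
```

By the same assumption, the 1600-to-2133 jump would be worth roughly double the 1600-to-1866 one, which is why the bigger step might be the more interesting test.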
 

luciferano

Honorable
Sep 24, 2012
1,513
0
11,810
[citation][nom]technoholic[/nom]i'm not impressed by this performance in a desktop machine although these trinities can be good for a laptop. AMD needs more steps forward to make these reasonable for desktop user, a couple of generations more[/citation]

Trinity uses Piledriver and is about 15% faster than Bulldozer per Hz, while also hitting higher frequencies at a given amount of power consumption than Bulldozer, an architecture that also happens to have an 8MiB L3 cache (even if a slow one), whereas Trinity has none. The graphics made a good improvement over Llano. With good memory, Trinity's A10s can probably match a Radeon 6670 DDR3 in gaming performance quite well, whereas Llano could only hope to get close. It's not a huge leap in CPU or GPU, but it's a pretty good improvement, and the actual desktop CPUs (Vishera) are coming out (hopefully) soon enough and should be faster than Trinity.
 

godfather666

Distinguished
Aug 10, 2011
132
0
18,680
Great idea adding The Witcher 2!
It's a good title to include because it's really demanding, even on low settings.

And if you use ubersampling, it can bring any system to its knees.
 
G

Guest

Guest
Small gripe: it's not a good idea to use "auto" settings in a benchmark, because the game may choose different settings for different hardware.
 

belardo

Splendid
Nov 23, 2008
3,540
2
22,795
Wouldn't 1280x720 be a more realistic setting for these systems? Upscaling on today's TVs is pretty good. A comparison between 720p and 1080p would have been helpful.

But as we see, across the board, AMD kills Intel in this area.
 
[citation][nom]Wisecracker[/nom]...What is amazing about the APU graphics is how they perform on older titles. Not all folks are interested in purchasing $50-$60 new games but are easily persuaded by $10-$20 bargains on the discount rack.And as noted, not everyone is motivated by 1920x1080 at super detail. Dropping back to any lower 16:9 can greatly improve game play -- even for those who like higher settings...[/citation]
This. Sure, this doesn't impress the "enthusiast" market, but we're a very small percentage in absolute numbers, and therefore also in total dollars available, even if we prefer $300+ graphics cards (and always will). Even if I have no desire to buy one, this kind of thing makes AMD's plan to focus on the APU look like a great business decision every time it comes up.
 

bawchicawawa

Distinguished
Dec 27, 2011
350
0
18,810
[citation][nom]mousseng[/nom]Keep in mind, though, that that's exactly what's going to allow AMD and Intel to advance their hardware faster than games will, as they were discussing in the article (first page of the interview). Look how far Fusion and HD Graphics have come over the past 3 years, and look how long the previous console generation lasted - if that trend is anything to go by, I'm sure integrated graphics could easily become a viable budget gaming option in the next few years.[/citation]

Of course, and look how far integrated graphics have come already. 23 FPS in BF3 at 1080p? On integrated? Sounds pretty beastly. I'm sure it will be a LOT better in 2014.
 

oomjcv

Honorable
Aug 4, 2012
15
0
10,510
According to this article: http://techreport.com/blog/23638/amd-attempts-to-shape-review-content-with-staged-release-of-info, AMD has had a say in this 'review'...
Is this in fact the case? To what extent was this article influenced, if so? If articles like this are to be published, I think it should be clearly mentioned.
I'm an AMD and Tom's Hardware fan and would prefer things like this not happen. I - and I assume other readers - expect reviews to be honest, transparent, and independent.
 
[citation][nom]americanbrian[/nom]I have to call out the review on choice of RAM as well. 1866 does not incur a significant price premium and would show the "TRUE" performance available to adopters of the AMD solution.It seems unfair that you would make this choice to bias the results to favor intel. When you tested the first i-series chips with triple channel memory you enabled that feature (correctly) as it is a feature of the hardware you are testing. Here there is a feature of the AMD hardware you have chosen to ignore. Not cool...[/citation]
They've talked about this in the past, and it basically came down to choosing not to use premium memory, because these chips are not meant for enthusiasts. These are budget systems, and budget buyers use budget parts.
 

luciferano

Honorable
Sep 24, 2012
1,513
0
11,810
[citation][nom]oomjcv[/nom]According to this article: , AMD has had a say in this 'review'...Is this in fact the case? To what extent was this article influenced, if so? If articles like this are to be published I think it should be clearly mentioned.I'm an AMD and Toms Hardware fan and would prefer things like this rather not happen, I - and I assume others readers - expect reviews to be honest, transperant and independant.[/citation]

I doubt that AMD has the power to make APUs perform better than they can, unless they slipped in a new driver without telling us. I see your point, but I don't see what AMD could have done to influence this except encourage Tom's to run the tests at 1080p or something like that. And if Tom's hadn't, they could simply have increased the settings at the lower resolutions and the results would have been pretty much the same, just without being able to say that AMD can perform at 1080p in many games even on its IGPs.

The settings have to be set so low that there's no room to pick settings that would overstate the performance differences. None of these IGPs has the performance to work with settings that shift things around, like comparing high-end AMD and Nvidia cards once with huge tessellation and once with huge AA to show the differences in performance characteristics. These IGPs simply look too weak for something like that.
 
I don't think anyone really expects IGP solutions to handle 1080p resolutions, and if you do, you're asking for disappointment. That said, it is encouraging how far Fusion is bringing integrated graphics, not only in gaming but in productivity, which is equally important for the market APUs target.

 

americanbrian

Distinguished
[citation][nom]luciferano[/nom]To be fair, when Tom's did that for Intel's X58 platform, it was the very highest end consumer platform available. This is AMD's lowest end desktop platform where an entire computer based on it can be cheaper than a motherboard plus the lowest end CPU that was supported by X58 at launch (although to be fair, that's also related to prices on a lot of other components going down too, but still) and far cheaper than the high-end six-core CPUs for that platform.Also, 1866 memory is not always as cheap as even 1600 memory. I think that you make a good point in that it should be cheap enough to have been used, but maybe it wasn't when Tom's got the hardware for these tests and maybe they simply didn't have anything else available. Tom's did other tests with RAM up to DDR3-1866 in some Trinity reviews, so maybe you can at least get a good guess of how it would help. It's a roughly 17% frequency jump and it probably increases bandwidth by a little under half of that, so maybe 6 to 8%, and performance is unlikely to increase by much less or much more than that. It would have been nice, but I don't think that it'd be a game-changer in performance. Maybe going up to 2133 would be more substantial coming from 1600.[/citation]

I was maybe a little harsh, in that they do not entirely ignore the fact that there is more performance available. However, if you bother to read the link posted in the review (which most people WON'T, which is what makes me call them out), you see that they have RAM available in their lab which they clock to 1866. It is there, sitting right in the lab.

And I don't know about where you are, but the cost difference for 8GB of DDR3-1866 vs. 1600 is about $5 US or £3. Nothing really.

From the linked benchmarks we see performance scaling in only one game (WoW). I take issue with them choosing 1600 for the rest of that review too. It is disingenuous in that an inexperienced person looking here for guidance may duplicate that choice when, for 1% of the total cost of the system ($5 of $500), they could realise an 8% total gain in gaming performance.

This is not really made clear and it is not fair to the average reader...
 

nice :D
steamroller: radeon 7790 with an integrated cpu! oh yeah!
i think nvidia has something like this in the works - like a gfx card with an arm cpu core.
 

godfather666

Distinguished
Aug 10, 2011
132
0
18,680
720p would have been really interesting because:
1. It's more realistic for these GPUs
2. It would be interesting to see if any CPU bottlenecks would show up
 

oomjcv

Honorable
Aug 4, 2012
15
0
10,510
[citation][nom]luciferano[/nom]I doubt that AMD has the power to make APUs perform better than they can unless they used a new driver without telling us. I see your point, but I don't see what AMD could have done to influence this except to encourage Tom's to do the tests at 1080p or something like that and if Tom's didn't, then they could simply increase settings at the lower resolutions and results would be pretty much the same, just without being able to say that AMD can perform in 1080p in many games even on their IGPs.The settings have to be set too low to use any settings that can overstate the performance differences because none of these IGPs have the performance to work with settings that change things around like if you were to do a set of comparisons of AMD and Nvidia high end cards, one with huge tessellation and one with huge AA, to show the performance characteristic differences. These IGPs look like they're simply too low weak to do something like that.[/citation]

Something went wrong with my link, here's the article I'm referring to: techreport.com/blog/23638/amd-attempts-to-shape-review-content-with-staged-release-of-info
 

luciferano

Honorable
Sep 24, 2012
1,513
0
11,810
[citation][nom]americanbrian[/nom]I was maybe a little harsh, in that they do not entirely ignore the fact that there is more performance available. However, if you bother to read the link posted in the review (which most people WON'T, which is what makes me call them out) you see that they have RAM available in there lab which they clock to 1866. It is there, sitting right in the lab. And I don't know about where you are but the cost for 8GB of 1866 vs 1600 ddr3 is about $5 US or £3 GBP. Nothing really. From the linked to benchmarks we see scaling in performance only in one game (WoW). I take issue with them choosing the 1600 for the rest of that review too. It is disingenuous in that an inexperienced person looking here for guidance may choose to duplicate that choice when for 1% of the total cost of the system ($5 of $500) they can realise an 8% total gain in gaming performance. This is not really made clear and it is not fair to the average reader...[/citation]

Hmm... Perhaps you were a little harsh, but maybe I wasn't harsh enough either, now that I've read and thought about this post.

Also, about pricing on 1866 memory,
http://pcpartpicker.com/parts/memory/#v=1500&z=8192&t=11&s=301600,301866&sort=a8

The cheapest, quality 8GB DDR3-1600 9-9-9-24 kit is $31 (after MIR, but still).
The cheapest, quality 8GB DDR3-1866 kit has 9-10-9-27 timings and is $42 (also after MIR).

Also, I only counted 1.5V kits and didn't check whether there are better prices on higher-voltage kits. I wouldn't want a kit rated above 1.5V; I don't know whether you mind going to a higher voltage, but I wouldn't, at least not with these APUs. One thing going for the 1866 kits is that they have a good chance of overclocking to DDR3-2133 without unreasonable timings, maybe even at the stock 1.5V or at least below 1.6V. The difference in performance between 1600 and 1866 isn't huge, but 1600 to 2133 is probably a greater boost. I've had better luck overclocking most 1866 kits to 2133 than getting 1600 to 1866 without voltage hikes or crap timings, and that could be more incentive.

Still, that price difference is well under (by percentage) the performance difference, so despite my semantic ramblings, you do seem to be correct.
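For what it's worth, the arithmetic behind that judgement can be spelled out in a quick sketch. The ~$500 build cost is borrowed from the earlier post, and the ~7% gain is the rough estimate from above, not a measurement:

```python
# Price delta of the 1866 kit vs. the 1600 kit, read against a whole
# build rather than the RAM alone (assumed figures, per the posts above).
kit_1600, kit_1866 = 31.0, 42.0   # USD, after rebate, quoted above
build_cost = 500.0                # assumed total system cost
est_gain = 0.07                   # assumed perf gain from 1600 -> 1866

premium_vs_ram = kit_1866 / kit_1600 - 1               # vs. the RAM itself
premium_vs_build = (kit_1866 - kit_1600) / build_cost  # vs. the whole build

print(f"{premium_vs_ram:.0%} RAM premium, {premium_vs_build:.1%} of the "
      f"build, for ~{est_gain:.0%} more performance")
```

So the premium looks steep against the RAM alone (~35%) but small against the build (~2%), which is the comparison that matters for the performance-per-dollar argument.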
 

luciferano

Honorable
Sep 24, 2012
1,513
0
11,810
[citation][nom]oomjcv[/nom]Something went wrong with my link, here's the article I'm referring to: techreport.com/blog/23638/amd-attempts-to-shape-review-content-with-staged-release-of-info[/citation]

I read the article, but all it said was that AMD was asking some sites to refrain from posting non-gaming benchmarks. It didn't say anything about skewing which benchmarks were used, and Tom's ran a review about four months ago that was full of non-gaming benchmarks anyway. This was just a follow-up with a gaming focus, as AMD asked for, and it shouldn't be held against Tom's, because Tom's had already done a CPU performance review of these desktop Trinity APUs.
 