AMD "Trinity" APU Models Release Schedule Details Leaked

The HD 7660D has 384 Graphics Core Next stream processors.

Hmm, ok. My A8 Llano has the 6550D with 400 shaders. Is that the same thing? I've clocked it up to 960MHz, which, funny thing, works great in many benchmark suites. A realistic clock speed that actually works in games without causing crashes and restarts is about 930MHz. Paired with DDR3-1600 memory, that's a 128-bit bus width, bandwidth of about 25.6 GB/s, a texture fill rate of 18.0 GTexels/s, and a pixel fill rate of 7.2 GPixel/s.

I can tell you right now that stepping up to DDR3-1866 doesn't make much of a difference at all that I can tell.
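For what it's worth, those numbers line up with the usual spec-sheet math, assuming the 6550D's 20 texture units, 8 ROPs, and a 128-bit dual-channel DDR3 bus (a rough sketch, nothing official):

```python
def memory_bandwidth_gbs(transfer_rate_mts, bus_width_bits=128):
    """Peak bandwidth in GB/s: transfers per second times bytes per transfer."""
    return transfer_rate_mts * 1e6 * (bus_width_bits / 8) / 1e9

def fill_rates(core_clock_ghz, tmus=20, rops=8):
    """Texture fill rate (GTexels/s) and pixel fill rate (GPixel/s)."""
    return core_clock_ghz * tmus, core_clock_ghz * rops

print(memory_bandwidth_gbs(1600))  # 25.6 GB/s with dual-channel DDR3-1600
print(memory_bandwidth_gbs(1866))  # ~29.9 GB/s with DDR3-1866, only ~17% more
print(fill_rates(0.9))             # ~(18.0, 7.2) at a 900 MHz core clock
```

That ~17% bandwidth bump from 1866 is probably why you can't see much of a difference on Llano.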

In a nutshell, and I'm just pulling this out of where the sun don't shine, from what I've read or seen, the jump from the current A8 to the upcoming A10 means playing DiRT 3 on medium settings instead of low. How far off am I? (Question for the experts.)

Maybe I just want to feel good about adopting the A8 today instead of waiting for the A10. The multitasking experience sounds like it will be much improved, and a more drastic difference than on the GPU side of things.
 


VLIW5 shaders (yours) and GCN shaders are pretty different; the 384 GCN shaders will beat the 400 VLIW5 shaders.

The Trinity A10s should be significantly faster than the Llano A8s in both graphics and CPU performance. We see conflicting reports on how much, but the lowest I've seen was 20-30%. 1866MHz memory does provide a benefit over 1600MHz, and it will probably provide a larger benefit with the Trinity A10s because they will have an even greater memory bottleneck. You definitely get better frame rates even on your A8 from using 1866MHz memory, but it might not be enough of an increase for you to notice it.

As for how far off you are, it's hard to say because it depends on several factors, not the least of which is the actual performance of these A10s once we see them reviewed. It will also depend on the game being played.
 
[citation][nom]blazorthon[/nom]Ivy Bridge HD 4000 graphics is supposed (as said by Intel) to be 60% faster than HD 3000 which puts it, at best, in the range of the Llano A4s and the Radeon 6450, not even close to the Llano A6s let alone the new A10s.[/citation]

Actually, if you look closely at the benchmarks, they compare to the Intel HD 2000, not the HD 3000. It's 60% faster than the HD 2000, which means it's only about 30% faster than the HD 3000. Not to mention that they are using the i7-3770 (which is around $300). Ivy Bridge is a step up from Sandy Bridge and will have some nice new features, but nothing close to what AMD will do. Of course, Intel just does what it wants, since consumers get herded in its direction anyway.
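Rough math behind that ~30% figure; the HD 3000's lead over the HD 2000 here is my own ballpark assumption, not an Intel number:

```python
# Intel's claim: HD 4000 is ~60% faster than HD 2000.
hd4000_vs_hd2000 = 1.60
# Assumed ballpark: HD 3000 is ~25% faster than HD 2000 (varies a lot by game).
hd3000_vs_hd2000 = 1.25
hd4000_vs_hd3000 = hd4000_vs_hd2000 / hd3000_vs_hd2000
print(f"~{(hd4000_vs_hd3000 - 1) * 100:.0f}% faster than HD 3000")  # ~28%
```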
 


They do have the new GCN cores... Please read at least the start of the article before posting.
 
Wasn't AMD's Trinity supposed to use the old VLIW4 architecture rather than Graphics Core Next? Also, I seriously doubt that running the Piledriver cores in Trinity at 3.80GHz is a very good idea. I can see plenty of overheating issues from that ... and it will most likely make aftermarket coolers mandatory!


-- To all the people arguing over whether this will "smash" Ivy Bridge, or Ivy Bridge will "smash" Trinity ... you have to remember that these are two very different products. The knowledgeable Ivy Bridge buyer will be concerned with performance most of all, especially in CPU-intensive tasks such as video editing, computer design, programming, etc.
-- On the other hand, a knowledgeable Trinity buyer will be considering budget, versatility, and power consumption as their main reasons for buying a system. The ability to go from office use in an economical power-saving mode to running a casual game would be their main requirement, and the one Trinity would solve best.

I, for example, will definitely be buying the Trinity A10 (if these GPU specifications are true) when it comes out, to power a media-center computer that can double as a low-power print/file server. This won't cause me to throw out my i5 2500/560 Ti workhorse (which I will probably also upgrade to Ivy Bridge), as the two computers will perform very different tasks ... and any heavy processing (such as encoding and compressing recorded television from the Trinity-based media center) I would offload to my Sandy Bridge workhorse.


The laptop market, however, is a completely different story. Compare a dual-core Ivy Bridge with poor graphics performance to a quad-core Trinity with good graphics: where neither is bought for raw CPU power, the knowledgeable buyer will choose the one with the lowest power consumption and the best graphics. In this instance Ivy Bridge may have met its match in Trinity ... but only time will tell.

 
[citation][nom]halls[/nom]Bummed to see AMD drop out of the enthusiast desktop race with Intel, but I have to admit it looks like they made a good decision. If those APUs have as much video processing power as it looks like they do it could be an amazing budget choice.[/citation]These will be nice for the HTPC crowd. You could easily build a competent HTPC with no discrete graphics using Trinity, especially if these really do have GCN and UVD3.
 
If those GPU specs are accurate, you're looking at roughly 75% of the raw performance of the recently released 7750, provided the memory can keep the stream processors fed.

That would put you at:
30fps in BF3 @ 1680x1050 (NO AA)
32fps in Skyrim @ 1680x1050 (NO AA)

Approximately 30 W for the GPU, leaving 70 W for the CPU.
The CPU is clocked nearly at the frequency of the FX-4170, which was a 125 W CPU.

It almost seems impossible to go from 125 W to 70 W just from Piledriver enhancements. That's a huge power savings in one tick. Impressive if accurate.
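Rough sketch of how an estimate like that falls out, assuming the shaders scale roughly linearly at similar clocks; the HD 7750 frame rates below are placeholders for whatever review numbers you trust, not measured results:

```python
# 384 Trinity stream processors vs. 512 on the HD 7750 -> ~75% of the raw
# throughput, assuming similar clocks and that memory bandwidth keeps the shaders fed.
scaling = 384 / 512
print(scaling)  # 0.75

# Placeholder HD 7750 results at 1680x1050, no AA (swap in real review numbers):
hd7750_fps = {"BF3": 40, "Skyrim": 43}
for game, fps in hd7750_fps.items():
    print(game, round(fps * scaling))  # ~30 fps in BF3, ~32 fps in Skyrim

# Power-budget guess: a ~100 W APU spending ~30 W on the GPU leaves ~70 W for the CPU.
print(100 - 30)  # 70
```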
 
[citation][nom]Cazalan[/nom]If those GPU specs are accurate you're looking at roughly 75% the raw performance of the recently released 7750, if the memory can keep the stream processors fed. That would put you at:30fps in BF3 @ 1680x1050 (NO AA)32fps in Skyrim @ 1680x1050 (NO AA)Approx 30Watt from the GPU leaving 70Watt for the CPU. CPU clocked nearly at the FX-4170 frequency which was a 125W CPU.Almost seems impossible to go from 125W to 70W just from piledriver enhancements. That's a huge power savings in 1 tick. Impressive if accurate.[/citation]

It's only 3.8GHz, and the 95W FX-4100 is at 3.6GHz, so it might not be quite as large a jump as you think. Considering Piledriver is supposed to have 20 to 30% higher IPC without increasing power usage, if not even more, it doesn't seem so far-fetched.

There's also the lack of L3 cache to think about, because losing it should also decrease CPU power usage.
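Putting rough numbers on that (the ~70 W CPU share comes from Cazalan's estimate above, and the 20-30% IPC figure is still only a rumor):

```python
# The clock bump over the 95 W FX-4100 is small:
print(f"{(3.8 / 3.6 - 1) * 100:.1f}% higher clock")  # ~5.6%

# The 125 W figure belongs to the higher-clocked FX-4170; measured against the
# 95 W FX-4100, the gap that Piledriver tweaks plus dropping the L3 cache
# would have to close is much smaller:
print(f"{95 - 70} W instead of {125 - 70} W")  # 25 W instead of 55 W
```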
 
I enjoy the idea of an APU. Most games don't utilize CPUs heavily anymore, so with a decent quad core you can reach high/max settings in most games.

APUs are what I would call stepping stones for people starting to game on PC, or just wanting lower-end titles such as WoW.
APUs are for the most part cheap and can play the games you want to play; later on, you can buy a discrete GPU and turn the details up! In most games the difference between low and medium is very significant, so an APU playing later games on medium at 720p would be fantastic.
 