28nm Trinity Successor Rumored To Debut in Q2 2013

Status
Not open for further replies.

back_by_demand

Splendid
Jul 16, 2009
4,822
0
22,780
0
The low end versions of these would be low enough TDP to use for nettops, but powerful enough for reasonably capable gaming. Obviously not very high end games, but totally ready for an all-in-one HTPC.
 

The_Trutherizer

Distinguished
Jul 21, 2008
509
0
18,980
0
[citation][nom]goodguy713[/nom]the release cycles are already too long ..[/citation]

Wut? Feels like just yesterday the first APU came out. What's the hurry these days? Nobody can afford to buy every shiny new toy anyway.
 

The Greater Good

Distinguished
Jan 14, 2010
342
0
18,810
7
[citation][nom]back_by_demand[/nom]The low end versions of these would be low enough TDP to use for nettops, but powerful enough for relatively powerful gaming. Obviously not very high end games but totally ready for an all-in-one HTPC.[/citation]

A lot of us tech guys (and girls) forget that not everyone needs the power that we do. Heck, sometimes WE don't even need it. This APU would meet the needs of most computer users and coupled with an SSD, would be great.
 

Maher90

Honorable
Mar 8, 2012
84
0
10,640
2
Trinity and "Richland" are the most recommended processors (APUs, I know) for anyone on a limited budget. In fact, I watched the upcoming A10 on YouTube and it seems good with low and some medium settings on BF3 and other things :D (although I don't know why they added DX11 if it's not really runnable at all :l?). I really like those APUs, but I guess Intel will succeed at everything; even if they made mini CPUs, as I like to call them, they'd win with those.
 

dudewitbow

Dignified
[citation][nom]A Bad Day[/nom]I thought APUs typically use the previous generation GPU architecture?[/citation]
The IGP in Trinity uses the 7xxx series, so it would make sense that the next-gen chip will use the 8xxx series. That also makes it highly likely that the next Radeon generation will be released before then as well.
 

supall

Distinguished
Jun 17, 2011
103
0
18,680
0
[citation][nom]goodguy713[/nom]the release cycles are already too long ..[/citation]

I fail to see how "a year" for a new Trinity APU is "too long".
 

werfu

Distinguished
Sep 27, 2008
54
0
18,630
0
[citation][nom]A Bad Day[/nom]I thought APUs typically use the previous generation GPU architecture?[/citation]

It makes no sense to use a previous-generation design in an APU, as the GPU part shares the chip's overall thermal envelope. You want to squeeze out the best performance per watt to leave the most thermal headroom for the CPU part, where it is badly needed.
 

shloader

Distinguished
Dec 24, 2001
231
0
18,690
4
[citation][nom]The_Trutherizer[/nom]Wut? Feels like just yesterday the first APU came out. What's the hurry these days? Nobody can afford to buy every shiny new toy anyway.[/citation]

More like yesteryear. Even then it was packing Phenom-derived cores while Piledriver was coming out. Granted, that was the best choice at the time, but APUs should be keeping up with the times from now on. Having an A8-3850, I see no compelling reason to scrap the motherboard just to upgrade the CPU, but I'm happy to see progress in this area all the same. When APUs meet DDR4, I think I'll step up.

Speaking of 'keeping up', the FX line needs to get the revised (fixed?) design too. AMD still hasn't given a compelling reason to upgrade from the 1090T.
 

A Bad Day

Distinguished
Nov 25, 2011
2,256
0
19,790
2
[citation][nom]dudewitbow[/nom]The IGP in trinity uses 7xxx, it would make sense that the next gen chip will use the 8xxx. This also brings a high probability that the next radeon generation should be released before then as well.[/citation]
[citation][nom]werfu[/nom]It makes no sense to use previous generation design in an APU, as the GPU part of it will take part of the global envelope. You want to squeeze out the best performance per Watt, to leave the most thermal capacity to the CPU part, where it is badly needed.[/citation]

http://www.tomshardware.com/reviews/a10-5800k-a8-5600k-a6-5400k,3224.html

Moreover, Trinity employs a newer graphics architecture than Llano. Instead of the VLIW5 arrangement, which also sat at the heart of Radeon HD 6800 and older GPUs, it utilizes the VLIW4 design that went into AMD’s Radeon HD 6900-series cards. Everything after the 6900s swapped over to Graphics Core Next, so VLIW4 isn’t a very prolific implementation. But it’s supposed to be more efficient. Naturally, then, we all want to see how Trinity’s on-die GPU compares to what came before.
I don't know why AMD would use an older architecture. Maybe the APU and GPU development cycles aren't synced, and/or the APU team has little time left by the time the new GPUs arrive.
 

sonofliberty08

Distinguished
Mar 17, 2009
658
0
18,980
0
Hope that Richland will beat Llano in every bench; the current Trinity still can't beat Llano in some benchmarks.
If not, I still think a die shrink and improvement of the Stars core would have been better than the new Bulldozer architecture.
 

verbalizer

Distinguished
May 28, 2010
2,930
0
20,960
96
[citation][nom]jryan388[/nom]As much as I like to bash bulldozer/piledriver, I think it's probably fine for most people... and with a great igp, it's better than intel...[/citation]
:/ - you must have lost your mind..
 
[citation][nom]dudewitbow[/nom]The IGP in trinity uses 7xxx, it would make sense that the next gen chip will use the 8xxx. This also brings a high probability that the next radeon generation should be released before then as well.[/citation]

The Trinity IGP uses die-shrunk Radeon 6900 series VLIW4 cores. The Llano IGP used Radeon 5000 VLIW5 cores, not even Radeon 6000 VLIW5 cores. They might be called Radeon 7000 and 6000 IGPs, but that's just because of their release times and some of their feature sets. For example, Trinity is supposed to have the Radeon 7000 VCE feature. However, it still is a 32nm die shrink of VLIW4, not a GCN implementation.
 
[citation][nom]Ninjawithagun[/nom]Too little, too late - AMD is done. I foresee AMD giving up on the CPU market and exclusively developing graphics cards only by end of 2013. AMD had a chance to keep up with Intel starting back in the mid-2000s. But, unfortunately thanks to extremely sloppy CEO management, they are no longer competitive within the CPU market. Intel is literally outclassing and outperforming AMD CPUs in every range of the CPU families. How sad it was to see AMD release its brand new Bulldozer CPU family, only to see it outperformed by Intel's 1st generation Sandybridge CPU family! Seeing a quad-core CPU with hyperthreading beat the pulp out of a true octa-core CPU is sad indeed.[/citation]

AMD wasn't competing well with Intel back in the days when superior AMD CPUs were being outsold by slower, more expensive Intel CPUs, because of Intel's illegal, monopolistic practices, which Intel is still being fined for to this day. AMD later developed sloppy management problems and still has them, but back then that was not their problem. Furthermore, Intel is not winning at everything. At any given price point, AMD easily wins in highly threaded performance, and when you get down to the very low end, Intel has nothing but dual-core CPUs that lack even Hyper-Threading Technology, so they come nowhere near AMD's highly threaded performance, or even AMD's quad-threaded performance.

Also, taking an FX-6100 or FX-8120 and disabling one core per module (or prioritizing one core per module and only using both for highly threaded workloads) gives them a significant boost in per-core performance while cutting power consumption even further. A roughly $170 FX-8120 that can compete with the non-K edition i5s in gaming performance, and the FX-6100 in the same situation at a lower price point (with up to triple-threaded performance), can be very competitive today, although Haswell will almost definitely outclass them both substantially.
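The one-core-per-module prioritization can also be approximated in software with CPU affinity instead of disabling cores in the BIOS. A minimal sketch, assuming the usual FX-8xxx numbering where the two cores of a module are adjacent (0/1, 2/3, 4/5, 6/7); `./game` is a placeholder binary:

```shell
# Pin a process to one core per module on an 8-core FX chip.
# Assumption: sibling cores within a module are numbered adjacently,
# so taking every even-numbered core selects one core per module.
CORES=$(seq 0 2 7 | paste -sd, -)   # builds the list "0,2,4,6"
echo "taskset -c $CORES ./game"     # the taskset command that would pin it
```

Each pinned thread then gets its module's shared front end and FPU to itself, which is where the per-core speedup comes from.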
 
[citation][nom]A Bad Day[/nom]http://www.tomshardware.com/review [...] ,3224.html I don't know why AMD would use an older architecture. Maybe it was because the APU and GPU developments aren't synced and/or the APU team has little time by the time the new GPUs arrive.[/citation]

An APU needs to be built from GPU and CPU cores that already work. They will be slightly modified so the two parts work together, but they will be mostly the same as preexisting implementations. You can't include GPU/CPU cores that aren't built, or nearly built, when APU design work starts, because the designers can't account for parts that don't exist yet. It would be like trying to use a Core 2 CPU in a P4 motherboard before Core 2 had even taped out.

So, AMD uses the best that they can for the time. When Trinity was being designed, GCN was not finished yet, but Cayman's VLIW4 was finished quite a while before Trinity started being designed and was the best that AMD had at the time. One die shrink later, it's probably about as energy efficient as a 28nm GCN GPU of similar performance would have been anyway, so the only major loss would probably be in compute performance.
 

pacioli

Distinguished
Nov 22, 2010
1,040
0
19,360
36
[citation][nom]Ninjawithagun[/nom]Too little, too late - AMD is done. I foresee AMD giving up on the CPU market and exclusively developing graphics cards only by end of 2013. AMD had a chance to keep up with Intel starting back in the mid-2000s. But, unfortunately thanks to extremely sloppy CEO management, they are no longer competitive within the CPU market. Intel is literally outclassing and outperforming AMD CPUs in every range of the CPU families. How sad it was to see AMD release its brand new Bulldozer CPU family, only to see it outperformed by Intel's 1st generation Sandybridge CPU family! Seeing a quad-core CPU with hyperthreading beat the pulp out of a true octa-core CPU is sad indeed.[/citation]

I just recommended an AMD A4 in an HTPC build for a friend today. The system is going to be really nice with 8 GB of RAM.
 
[citation][nom]pacioli[/nom]I just recommended an AMD A4 in an HTPC build for a friend today. The system is going to be really nice with 8 GB of RAM.[/citation]

I'd recommend an A6 over an A4; it's most definitely worth the extra money. An A8 over an A6 is probably more than reasonable for an HTPC, but A6 over A4 is a huge leap in performance, to an extent that even an HTPC could benefit from it.
 
Guest

Guest
I miss the days when AMD was equalizing the market.
Next time I will spend my money on the "underdog" processor!!!!

Power to the nerds!!!
 

verbalizer

Distinguished
May 28, 2010
2,930
0
20,960
96
OK, here's the deal, being totally unbiased:
AMD is good for APUs and the server environment right now.
Desktop is over for them (except for APUs).

So, bottom line: unless you're going APU, you go Intel.
 