AMD Fusion: Brazos Gets Previewed: Part 1

Status
Not open for further replies.
Guest
Interesting to see them projecting an increased netbook market given all the high-profile remarks about tablets supposedly cannibalizing it.
 

acku

Distinguished
Sep 6, 2010
559
0
18,980
Tablets get us into a whole other portion of the debate, as we'd also need to start talking about demographics. The short version of the conversation: keep in mind that a cannibalized market doesn't necessarily mean a decrease in volume. It generally translates into slower growth and a proportionally smaller market.

Cheers,
Andrew Ku
TomsHardware.com
 

joytech22

Distinguished
Jun 4, 2008
1,687
0
19,810
AMD has always kicked, and always will kick, Intel's ass in the IGP market (excuse the rude but necessary language).

Well, at least that's what I think; this will probably get voted down a bit.
 

compton

Distinguished
Aug 30, 2010
197
0
18,680
The more the merrier. My hope is that AMD can leverage this launch into a successful campaign, then bring more competitive desktop solutions to people like me. I look at my Phenom II, then at the Intel 32nm Core i3 that can meet or beat it... and it gets worse from there. If Brazos doesn't do so hot, AMD's gonna need some serious tricks (like a miracle) to bring the heat to Intel. My Phenom II stacks up pretty well against the Core 2, but it's almost 2011. Bring the funk, AMD! Or let me know if you decide... well, not to. I'm ready for it now.
 

dEAne

Distinguished
Dec 13, 2009
2,189
0
19,860
While AMD cannot go head-to-head with Intel in the big CPU battles, it fights little ones, like David, biting at Intel's toes to keep it preoccupied until Bulldozer arrives.
 

super_tycoon

Distinguished
Apr 15, 2010
22
0
18,510
Trust me, I've tried playing DiRT 2 on an ultra-thin and you get miso soup for graphics. A kind of foggy, blurry slurry; it's not something you really want to be associated with.
 

SteelCity1981

Distinguished
Sep 16, 2010
1,129
0
19,310
If the graphics core is based on the Evergreen architecture, then why not just call it the Radeon 5250 and 5310 series? From a consumer standpoint, it's just confusing to call a Radeon GPU a 6250 or a 6310 when it's based on the Radeon 5 series, especially considering there will be no new Radeon 6 series parts lower than the 6800 series.
 
Let's see, my wishlist in a netbook or equivalent:
* Decent battery life (needed for plane trips) *possibly??
* Around a 15" screen (I like this size; 10" seems too small for me) *possibly??
* Able to play HD (whether online or off a DVD) *sounds like a good possibly??
* Able to play most recent games (even low settings is fine) *possibly??
 
Last post got clipped, but I was also hoping for:

* Decent LCD resolution on the 15" panel, at least 1080p (16:9) or 1600 x 900; either would be better than the 1366 x 768 crap they're pushing out these days. I even like my current 1280 x 800 resolution better than the 1366 x 768 stuff, IMO.
 

saturnus

Distinguished
Aug 30, 2010
212
0
18,680
Although on the whole it looks like an extremely good APU that will probably perform far beyond its immediate Intel competition, there are still some design choices that intrigue me.

Primarily, it seems odd to develop a proprietary southbridge interface (UMI) when it would have been much more cost- and power-effective to include a few more gateable PCIe controllers on die instead, thereby consolidating all non-display/non-memory off-die communication into a single standard interface.

That would enable much more cost-effective 3rd-party controller chips: standard PCIe USB chips, standard PCIe legacy I/O chips, standard PCIe SATA controllers, and so on. It would offer much more versatility for OEM design at an even better price point.

It also seems odd to have an on-die VGA DAC when that could just as easily have been relegated to a 3rd-party HDMI-to-VGA bridge controller. That would have saved some die real estate, as well as avoiding direct support for a legacy display standard that is rapidly disappearing.

 

K2N hater

Distinguished
Sep 15, 2009
617
0
18,980
If AMD manages to get solid RAM performance (the RAM bus is shared between graphics and CPU), we'll have much more powerful mini-ITX solutions, which may eventually enable entry-level server roles with fewer (if any) fans.
 

dalta centauri

Distinguished
Apr 1, 2010
885
0
19,010
The chart makes it sound like discrete graphics are going to slowly fade away, and that the era of CPU+GPU/integrated graphics will be the outcome after a dozen or two years.
I have a feeling Nvidia may suffer without branching out into the CPU market as well, as that seems to be becoming the more dominant one.
 

keer

Distinguished
Nov 9, 2010
1
0
18,510
Secret leak: AMD Fusion Brazos to be first released in Q1 2011, with a big bang on a secret product from the market leader in multimedia products (a fruitful company).
 

qwertymac93

Distinguished
Apr 27, 2008
104
37
18,710
[citation][nom]saturnus[/nom]Although on the whole it looks like an extremely good APU that probably will perform far beyond it's immediate Intel competition there's still some design choices that intrigues me.Primarily, it seems odd to develop a propritary southbridge interface (UMI) when it would have been much more cost and power effective to include a few more gatable PCIe controllers on die instead. And thereby have all non-display/non-memory related off-die communication consolidated into a single standard interface.That would enable much more cost-effective 3rd party controller chips, like standard PCIe USB chips, standard PCIe legacy i/o chips, standard PCIe SATA controllers, and so on. It would offer much more versatility for OEM design, at an even better price point.I also seems odd to have a on-die VGA DAC when that could just as easily have been relegated to be produced from a 3rd party HDMI-to-VGA bridge controller. It would have saved some die real-estate as well as not having to directly support a legacy display that is rapidly disappearing.[/citation]

UMI is based on PCIe, or at least that is how it connects to the CPU (which has 8 PCIe lanes).
 
Guest
@radiumburn

Yes, as many as possible, so they can have money for R&D.
 

AvielSe7en

Distinguished
Nov 9, 2010
5
0
18,510
Really good article.... In these tough economic times, it's great news that AMD is trying to push/influence market trends with their new processors.

Come on AMD!!!
 

poppasmurf

Distinguished
Sep 10, 2010
109
0
18,690
It is always good to see news of the underdog trying to right its ship!

Can't wait to see the actual lineup and numbers from netbook to desktop, and compare apples to oranges to see who's who!

Nice article, thank you.
 

saturnus

Distinguished
Aug 30, 2010
212
0
18,680


Isn't it physically only 4 PCIe lanes? Either way, they're still not real PCIe lanes, which would have been the most obvious choice because of the better versatility. And from a power-management PoV it's very odd, because had they been separate PCIe lanes they could have been shut down individually when not in use. As it is now, all the PCIe lanes used for the UMI interface have to remain powered even when the link isn't fully utilized. Very odd decision in my opinion.
 