AMD Fusion: Brazos Gets Previewed: Part 1

I'm really happy to see more news from AMD, but I'm disappointed that there was no mention of a Bulldozer CPU launch in January. I'll be buying a desktop then, and Intel's i5-2500 looks pretty good right now, but I wasn't wanting to go with an Intel build 🙁
 
[citation][nom]saturnus[/nom]Isn't it physically only 4 PCIe lanes? Either way, these still aren't real PCIe lanes, which would have been the most obvious choice because of the better versatility. And from a power management PoV it's very odd, because had they been separate PCIe lanes, they could have been shut down individually when not in use. As it is now, all the PCIe lanes used for the UMI interface have to remain powered even when the link isn't fully used. Very odd decision in my opinion.[/citation]

Shutting down lanes like that would likely hurt performance quite a lot, considering everything from SATA to USB runs off those lanes, and the power saved wouldn't be worth the cost of implementing that kind of gating.
The CPU has 8 PCIe lanes: 4 are used for UMI, the other 4 for external graphics.
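To put rough numbers on the trade-off being argued here, a minimal sketch, with the per-lane idle power purely a hypothetical figure (not from AMD), of how little per-lane gating would save when the UMI link is lightly loaded:

```python
# Hypothetical numbers (not from AMD): suppose each powered PCIe lane
# burns ~100 mW at idle. With UMI occupying 4 lanes as one fixed link,
# all 4 stay powered regardless of load; per-lane gating could shut
# unused ones off individually.

LANE_IDLE_MW = 100          # assumed per-lane idle power, milliwatts
UMI_LANES = 4               # 4 of the chip's 8 PCIe lanes carry UMI

def umi_idle_power(lanes_in_use):
    # Fixed link: all 4 lanes stay powered no matter the load.
    return UMI_LANES * LANE_IDLE_MW

def gated_idle_power(lanes_in_use):
    # Hypothetical per-lane gating: only active lanes stay powered.
    return lanes_in_use * LANE_IDLE_MW

# Light I/O load that fits in a single lane:
saving_mw = umi_idle_power(1) - gated_idle_power(1)
print(saving_mw)   # 300 (mW) -- small next to the platform's power budget
```

Under these assumed numbers the gating saves a few hundred milliwatts at best, which is the commenter's point about it not being worth the implementation cost.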
 
I was waiting to get a netbook and holding off on the current lineup.
I'll wait until some of these new AMD-based netbooks come out.
Looking forward to any benchmarks on these, and I'd love it if they could handle gaming with some older titles (like Postal 2, etc.).
 
AMD already has an answer: the Athlon II/Turion II Neo, which absolutely destroys Atom. Ontario/Zacate brings the same with a better GPU, longer battery life, and lower prices.

Atom is just crap, and it will be crap forever.
 
Here's my 2 cents' worth. AMD looks promising when pitted against Intel's Atoms. Looking at some of the other slides on the upcoming Fusion architecture, having 2 cores share an FPU/SSE unit is like Intel's Hyper-Threading technology, but applied only to the FPU/SSE units instead of the whole core. Sure, it's more efficient to share the units. But since more and more programs are becoming threaded for multi-core setups, I'd predict a slowdown whenever there are 2 256-bit AVX instructions to process at once, or even worse, 4 128-bit instructions. It all adds up to a slowdown overall, which means more money and time wasted. Time = money in most cases, if not all.
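The shared-FPU argument above can be sketched as back-of-envelope arithmetic. This is a simplified model, assuming each Bulldozer module (two integer "cores") shares one FPU that retires one 256-bit AVX op per cycle via its two fused 128-bit pipes, while each Sandy Bridge core can issue a 256-bit add and a 256-bit multiply per cycle on separate ports:

```python
# Simplified peak-throughput model -- real pipelines are messier.

def bulldozer_avx_per_cycle(cores):
    # Two integer cores share one FPU; one 256-bit op per module per cycle.
    modules = cores // 2
    return modules * 1

def sandy_bridge_avx_per_cycle(cores):
    # Assumed: one 256-bit add + one 256-bit multiply per core per cycle.
    return cores * 2

bd = bulldozer_avx_per_cycle(8)    # 8-core Bulldozer -> 4 modules
sb = sandy_bridge_avx_per_cycle(4) # 4-core Sandy Bridge
print(bd, sb, sb / bd)             # 4 8 2.0
```

Under those assumptions the 4-core Sandy Bridge ends up with twice the peak 256-bit AVX throughput of the 8-core Bulldozer, which lines up with the AnandTech claim quoted later in this thread.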

The only solution to this problem is for program developers to start building GPGPU application solutions and offloading the CPU as a whole; then maybe AMD will be performance-competitive once again.

If not, die a slow, painful death, AMD, and R.I.P. Long live Intel, "the chip giant."
 
I was right about the AVX slowdown with AMD's upcoming processor. According to anandtech.com: "Compared to an 8-core Bulldozer a 4-core Sandy Bridge has twice the 256-bit AVX throughput."
Which means I was right about the possible slowdown. Ha ha ha ha ha ha!!!
Here's the address to the page:
http://www.anandtech.com/show/3922/intels-sandy-bridge-architecture-exposed/3

Dang, I'm good.
 
"Or say you're running your display from the on-die graphics, and only spinning up the discrete card when a 3D application needs it. There might be power-oriented benefits there."

Nvidia's Optimus tech already does this with Intel's mobile i3 and i5 platforms. It would be great to see this technology migrate to the desktop space, though, and desktop APUs offer the opportunity to do just that.

Not sure it would make as significant a difference on desktops, though, as modern discrete graphics cards, even high-end ones, can offer very reasonable idle power usage, around 30 W. So how much less could one of these integrated GPUs consume at idle? Well, the savings are guaranteed to be less than 30 W.
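The bound argued above is just subtraction, but it's worth making explicit. A minimal sketch, where both wattages are illustrative assumptions rather than measured figures:

```python
# Rough bound on what Optimus-style GPU switching could save on a
# desktop at idle. Both numbers are assumptions for illustration.

DISCRETE_IDLE_W = 30   # assumed idle draw of a modern discrete card
IGP_IDLE_W = 3         # hypothetical on-die GPU idle draw

max_saving = DISCRETE_IDLE_W                 # upper bound: card fully off
realistic_saving = DISCRETE_IDLE_W - IGP_IDLE_W
print(max_saving, realistic_saving)          # 30 27
```

Whatever the on-die GPU actually draws, the saving can never exceed the discrete card's idle power, which is why the desktop case is less compelling than the mobile one.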
 
Very interesting... This may be the final step that convinces me to get a netbook, which is quite an achievement, considering I already have a mini-ITX i7-860 system for high performance per size, and an Android phone for ultra-portable computing (obviously, nothing too extreme there).
 
Great job, AMD. I hope OEMs and consumers will change their attitudes toward AMD, attitudes instilled in many by Intel.

I remember browsing at a brick-and-mortar store when a salesperson recommended that a family go with the Intel laptop because "AMD produces a lot of heat and can overheat the laptop." I thought about intervening but decided against it.
 