Next-gen AMD Fusion CPU + GPU Coming in 2015

Status
Not open for further replies.

N.Broekhuijsen

Distinguished
Jun 17, 2009
3,098
0
20,860
49
Beautiful stuff, but the true system builder (most of us on Tom's) will want a GPU separate from the CPU, just so we can choose exactly what we want.

I do see this becoming ideal in netbooks, small desktops, office computers, HTPCs, etc.

Love to see computer evolution!
 

joytech22

Distinguished
Jun 4, 2008
1,686
0
19,810
10
This should definitely shake up the market a bit, but by that time the major CPU manufacturers would have already done this. Hopefully AMD isn't going to make us wait five whole years (as stated) just to put a GPU onto an already outdated Phenom II CPU architecture.
 

joytech22

Distinguished
Jun 4, 2008
1,686
0
19,810
10
[citation][nom]joytech22[/nom]This should definitely shake up the market a bit, but by that time the major CPU manufacturers would have already done this. Hopefully AMD isn't going to make us wait five whole years (as stated) just to put a GPU onto an already outdated Phenom II CPU architecture.[/citation]

(Forgot how to edit my own post)
What I mean by "outdated" is that Phenom II is nothing revolutionary performance-wise on a clock-for-clock basis. AMD needs to implement its GPU core into a newer architecture with better performance per clock; this would ensure nobody has to sacrifice CPU performance for an IGP (or whatever you wish to call a GPU on a CPU).
 

worl

Distinguished
Jan 5, 2009
18
0
18,510
0
Please, AMD, don't make this another Larrabee. This is a great chance to pull ahead of Intel; don't mess it up.

Can't wait to see the results of the 2nd gen.
 

hundredislandsboy

Distinguished
Feb 9, 2009
2,503
0
20,860
38
Intel who? Go AMD!! Along with the GPU, if they can throw in the audio, LAN, and a TB SSD on the CPU die, then I'll be impressed, because my desktop won't be the noisy tower it is now.

If you had a dual-socket mobo and threw in two of these, is that considered SLI?
 

digiex

Distinguished
Aug 26, 2009
834
0
18,990
1
AMD should also convince software developers to provide lots of support; with a lack of software running on it, it will end up like the Itanium.
 

HVDynamo

Distinguished
Feb 6, 2008
283
0
18,810
16
[citation][nom]gekko668[/nom]It will be nice if AMD give the user an option to disable the integrated GPU.[/citation]

I think that rather than disabling it, they should implement the same tech that's hitting notebooks, where the hardware can switch between the integrated and dedicated GPUs depending on the workload, or even allow both to operate at the same time, giving the system more computational power depending on the workload and how tasks can be divided up.
 

rooseveltdon

Distinguished
Jan 18, 2009
364
0
18,790
4
[citation][nom]techguy911[/nom]By that time bio chips will be out making this tech obsolete.http://www.physorg.com/news192801007.html[/citation]
lol, no offense, but DNA-powered computers are at least ten years away. The ethical and moral implications of such a topic alone would cause tons of debate in Congress, plus half the nerds here (me included) would fear the potential rise of a Skynet-like computer that would want to replace us with machines. I will stick with silicon and metal, thank you lol
 

antisyzygy

Distinguished
Oct 9, 2009
245
0
18,690
4
[citation][nom]joytech22[/nom]This should definitely shake up the market a bit, but by that time the major CPU manufacturers would have already done this, hopefully AMD isn't going to make us wait 5 whole years (as said) to put a GPU onto a already outdated Phenom II CPU architecture.[/citation]

They are releasing a Fusion chip to vendors now, meaning a version of the Fusion chip will probably ship by next year. This article says that a fully integrated Fusion chip will be out in 2015. Whereas the Fusion chip coming out now is a Phenom II and a GPU slapped together, the next-gen Fusion chips will probably be designed from the ground up as one cohesive unit.
 

mikeangs2004

Distinguished
Aug 28, 2009
312
0
18,810
10
but the true system builder (most of us on toms) will want a GPU separate from the CPU, just so we can choose exactly what we want.

I do see this becoming ideal in netbooks, small desktops, office computers, HTPCs, etc.
You should read more about Fusion before you make any more comments like these. It does not mean that under Fusion the external GPU subsystem option will not be available. If more graphics acceleration is needed, one can certainly add a discrete video card alongside the APU.
 

deweycd

Distinguished
Sep 13, 2005
846
0
19,010
14
One thing about integrating the GPU with the CPU while also having a dedicated graphics card is that the fully integrated GPU can assist the CPU with calculations like anti-virus, physics, and other GPU-assisted workloads. This leaves the dedicated GPU to do its own work without having to take on these extra calculations. It may also be easier to code for a CPU with an integrated GPU than for a CPU and a separate GPU.
 

antisyzygy

Distinguished
Oct 9, 2009
245
0
18,690
4
[citation][nom]mikeangs2004[/nom]You should read more about Fusion before you make any more comments like these. It does not mean that under Fusion the external GPU subsystem option will not be available. If more graphics acceleration is needed, one can certainly add a discrete video card alongside the APU[/citation]

Not to mention, the point is that combining a GPU and CPU adds computational options beyond just graphics. You can add a graphics card and still take advantage of the GPU on a Fusion processor in other ways.
 

Teen Geek

Distinguished
Apr 8, 2010
34
0
18,530
0
[citation][nom]hundredislandsboy[/nom]Intel who? Go AMD!! Along with the GPU, if they can throw in the audio, LAN, and a TB SSD in the CPU die, then I'll be impressed because my desktop won't be the noisy tower it is now.If you had a dual socket system mobo and threw in two of these, is that considered SLI?[/citation]
No, it's considered CrossFire.
 

figgus

Distinguished
Jan 12, 2010
364
0
18,780
0
[citation][nom]xbeater[/nom]Beautiful stuff, but the true system builder (most of us on Tom's) will want a GPU separate from the CPU, just so we can choose exactly what we want. I do see this becoming ideal in netbooks, small desktops, office computers, HTPCs, etc. Love to see computer evolution![/citation]

Actually, I predict that in the future you will have generic sockets on your motherboard that you can either slot a fusion, a dedicated GPU, or a dedicated CPU into or any combination thereof. Talk about the ultimate in customization for power users!!!
 

webbwbb

Distinguished
Aug 18, 2009
221
0
18,680
0
So AMD is essentially trying to replicate the Fermi architecture, but with a stronger emphasis on the CPU than the GPU side of things. This is not necessarily a bad thing; it's just good proof that Fermi isn't nearly as bad as most fanboys make it out to be. You can downvote this as you did the other guy who made a similar observation, but it is the truth.
 

zaznet

Distinguished
May 10, 2010
387
0
18,780
0
Parallel computing is finally coming. This is where we will see serious performance gains on tasks that are repeated often or can be completed simultaneously.

I don't expect having a GPU on the CPU to prevent anyone from adding a high-performance GPU add-in card to a desktop system. For mobile systems, this may be a great way to cut down cost and power consumption.
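The data-parallel idea described above can be sketched in plain Python. This is a hypothetical illustration, not how Fusion actually works: the `brighten` and `parallel_map` names are made up, and real APU hardware would run the per-element operation on GPU shader cores rather than CPU threads. The point is just that when the same independent operation is applied to every element, the work divides cleanly across however many execution units are available.

```python
# Minimal sketch of data-parallel work: one operation applied independently
# to every element, which is exactly the workload a GPU excels at.
from concurrent.futures import ThreadPoolExecutor

def brighten(pixel: int) -> int:
    # Per-element operation: add brightness, clamped to the 8-bit range.
    return min(pixel + 40, 255)

def parallel_map(fn, data, workers=4):
    # No element depends on any other, so the map splits cleanly
    # across workers (or, on real hardware, across GPU cores).
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(fn, data))

pixels = [0, 100, 200, 250]
print(parallel_map(brighten, pixels))  # -> [40, 140, 240, 255]
```

Because each element is independent, the result is identical to a sequential loop; only the elapsed time changes as more execution units join in.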
 
