Next-gen AMD Fusion CPU + GPU Coming in 2015

Status
Not open for further replies.
[citation][nom]pei-chen[/nom]Sounds a lot like what Nvidia has been saying about combining GPUs and general computing into the same hardware.[/citation]

They don't have an x86 license and they don't have any CPU know-how, so they build things like Fermi, which adds a lot of overhead for nothing. Intel will probably buy them after a few years.
 
[citation][nom]webbwbb[/nom]So AMD is essentially trying to replicate the Fermi architecture but with a stronger emphasis on the CPU than the GPU side of things. This is not necessarily a bad thing, it's just good proof that Fermi isn't nearly as bad as most fanboys make it out to be. You can downvote this as you did the other guy who made a similar observation but it is the truth.[/citation]

AMD's GPUs already support OpenCL as well as ATI Stream and can do GPGPU work just like Fermi. Their downfall is more that Nvidia pushed CUDA very well, and obviously AMD can't support CUDA on its cards. While it's true that an AMD card would not perform GPGPU work as well as Fermi, saying they're trying to replicate it is silly considering they already have similar capabilities, and have had for a while. Nvidia is just ahead in the software game, so here's hoping OpenCL (key word: Open) takes off and CUDA dies; then it won't matter what GPU you have. Fermi just takes GPGPU to a new level (with more hardware to support it). The difference between this solution and Fermi, I believe, is that Fusion will *automatically* shunt OpenCL work, floating-point work, and whatever else runs more efficiently on the GPU straight to the GPU, whereas Fermi isn't automatic.
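As a toy illustration of that "automatic shunting" idea (purely hypothetical: `choose_device`, its categories, and its threshold are invented for illustration and are not any real Fusion or OpenCL API), a runtime could route each job based on its character:

```python
# Hypothetical sketch of automatic CPU/GPU work routing. The function name,
# workload categories, and threshold are made up for illustration only.
def choose_device(workload_kind: str, element_count: int) -> str:
    """Route large data-parallel floating-point jobs to the GPU; keep
    branchy/scalar or small jobs on the CPU, where dispatch overhead
    would outweigh any GPU speedup."""
    if workload_kind == "data_parallel_float" and element_count >= 100_000:
        return "gpu"
    return "cpu"

print(choose_device("data_parallel_float", 1_000_000))  # gpu
print(choose_device("branchy_scalar", 1_000_000))       # cpu
print(choose_device("data_parallel_float", 10))         # cpu
```

The point is only that the decision lives in the runtime, not in the programmer's hands, which is what would separate this from Fermi-style explicit offload.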
 
[citation][nom]worl[/nom]Please AMD, don't make this another Larrabee. This is a great chance to pull ahead of Intel; don't mess up. Can't wait to see the results of the 2nd gen.[/citation]

How do you pull ahead by being 5 years behind? LOL I mean, AMD's plan is probably further integration than what Intel has on their i3 line, but still. It's like announcing in 2010 that you've created a new digital camera that prints its own pictures right after you take them! (Polaroid)

What AMD/ATI needs to do is CATCH UP with Intel. AMD is doing well now simply because of pricing, not because of overall performance compared to Intel. That's why my last two CPUs have been Intel (the first Intel chips I've used).
 
[citation][nom]Zingam[/nom]2015? That's too long. Unless AMD starts throwing new tech out the door every 18 months like Intel, they'll never have a chance to survive. I bet that by 2015 Intel will throw many more surprises at them, unless Sandy Bridge is crap like the P4, and I doubt Intel will repeat that mistake any time soon.[/citation]
What new tech? A memory controller on die, borrowed from AMD? x64 instructions, borrowed from AMD? Dual-core CPUs, first seen from AMD? Or the old P3 architecture in the current line of CPUs?
 
AMD/ATI is in a unique position to pull this off. I don't think they are trying to "catch up" to Intel so much as offering a product that no one else can. Even if Intel and Nvidia merged into one company, it would take years before they were truly integrated to the degree of AMD/ATI. If you think 5 years is too long to bring this concept to market, then you give it a try. :)
 
[citation][nom]joytech22[/nom]This should definitely shake up the market a bit, but by that time the major CPU manufacturers will have already done this. Hopefully AMD isn't going to make us wait 5 whole years (as said) to put a GPU onto an already outdated Phenom II CPU architecture.[/citation]

I know the article wasn't very clear, but I'm pretty sure Llano is the Phenom II-based, 1st generation Fusion architecture that will arrive next year. The second generation Fusion processor will be the one that arrives in 2015.
 
It's about time. I can hold out 5 years for this. My PC has a lifetime warranty on most of its components anyway. I use an i7 at work and it's good and all, but for home use I'm skipping the whole i series. It just isn't necessary. But at least it isn't as bad as the Pentium D, which was basically a dinosaur when it came out.
 
I bet the world ends before that... well, at least the Christian world, as we will be gone in the rapture. But any non-believers will be left for another 7 years after that, so if you don't like God, I'm sure you will see its release. There will be so much crap going on in the world, though, that I doubt many people will care anymore.
 
[citation][nom]jerreece[/nom]How do you pull ahead by being 5 years behind? LOL I mean, AMD's plan is probably further integration than what Intel has on their i3 line, but still. It's like announcing in 2010 that you've created a new digital camera that prints its own pictures right after you take them! (Polaroid) What AMD/ATI needs to do is CATCH UP with Intel. AMD is doing well now simply because of pricing, not because of overall performance compared to Intel. That's why my last two CPUs have been Intel (the first Intel chips I've used).[/citation]

They are catching up with Bulldozer in 2011. The Fusion being discussed here is a next-gen model for 2015. The 2011 Fusion will be a Phenom II integrated with a GPU, targeted at laptops and desktops that don't need dedicated graphics. The 2015 Fusion will be a fully cohesive CPU/GPU combination engineered from the ground up (so to speak). I imagine in 2015 there will be Fusion processors for mobile, laptop, and desktop platforms, and for gamers, scientists (for access to GPGPU), and business purposes.
 
[citation][nom]rodney_ws[/nom]2015? *rolls eyes* That's like 100 years away in computer years.[/citation]

People are obviously not reading here. Fusion and Bulldozer are coming out next year, with Fusion being a Phenom II paired with an ATI GPU on one chip. It won't be fully integrated as one single unit. In 2015 there will be Fusion processors that BLUR THE LINE between CPUs and GPUs, i.e. a completely cohesive unit that has both CPU and GPU functionality.
 
This doesn't sound good for AMD. If they're talking five years in advance, then right now they probably don't have anything good to say about Llano. After all, it's only an improved process on the PII architecture, which isn't impressive at all. Core i7 and i5 (and even i3) offer amazing performance with low power usage and reasonable prices. The i5 750 beats the Phenom II 965BE in every benchmark (check it out on AnandTech) yet costs only $15 more.
 
[citation][nom]CoryInJapan[/nom]I bet the world ends before that... well, at least the Christian world, as we will be gone in the rapture. But any non-believers will be left for another 7 years after that, so if you don't like God, I'm sure you will see its release. There will be so much crap going on in the world, though, that I doubt many people will care anymore.[/citation]

You are commenting on something completely unrelated to the article at hand. Furthermore, you are deep in delusion. Congrats Troll.
 
[citation][nom]killerclick[/nom]This doesn't sound good for AMD. If they're talking five years in advance, then right now they probably don't have anything good to say about Llano. After all, it's only an improved process on the PII architecture, which isn't impressive at all. Core i7 and i5 (and even i3) offer amazing performance with low power usage and reasonable prices. The i5 750 beats the Phenom II 965BE in every benchmark (check it out on AnandTech) yet costs only $15 more.[/citation]

You are not factoring in motherboard cost and features. Nor are you factoring in AMD and ATI being one company now. The i5 is a more expensive platform overall, so of course it performs better.
 
*Sigh*

This is "next-gen" Fusion as seen from the title. The first fusion chip (Llano) is supposed to come out this year or early next, based on Phenom II and Evergreen. The second chip (Ontario) is planned for next year and will use a new CPU architecture. The 2015 deal is something completely different from what Intel has done, so saying it comes "five years too late" is just wrong.
 
[citation][nom]antisyzygy[/nom]You are not factoring in motherboard cost and features. Nor are you factoring in AMD and ATI being one company now. The i5 is a more expensive platform overall, so of course it performs better.[/citation]

I'm factoring in performance. If you want anything faster than an i3, right now you're better off with Intel. Sure, the motherboard is a bit more expensive and competing CPUs are a bit more expensive, but in total that's what, like $30-40? Even for a computer that you'll be using for a year or two, that's nothing, an extra $2-3 a month. In addition it'll draw 20-35 watts less power and require less cooling. The only AMD CPUs worth buying now are the Athlon II 435 and 440.
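For what it's worth, that $2-3 a month figure checks out. A rough sketch of the arithmetic (the dollar amounts are the poster's estimates, not measured prices):

```python
# Spread a one-time platform premium over the machine's service life.
def monthly_premium(total_premium_usd: float, months: int) -> float:
    """Cost per month of paying a one-time premium up front."""
    return total_premium_usd / months

# A $30-40 premium over one to two years of use:
print(round(monthly_premium(36, 12), 2))  # 3.0
print(round(monthly_premium(40, 24), 2))  # 1.67
```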
 
You guys are not thinking big enough here. What if it is a lot more than a GPU and a CPU combined? What if they harnessed the GPU's superior processing power in many applications and built that into the CPU as well, making it seamless and much faster: not simply removing the need for motherboard-based graphics or low-end cards, but actually vastly improving the CPU's ability in several applications.

What if they can actually get Apple on board as well, and Apple builds their next OSX around it, with these GPU capabilities in mind.

There is a whole hell of a lot of potential in this chip beyond just removing motherboard integrated graphics.
 
I agree with many here: I want AMD to come back with a powerful and competitive high-end CPU. It doesn't need to be the ultimate performer, just something to compete head to head with Intel. Why won't they announce something like that? Go AMD.
 


Don't hate, congratulate!

I recently "converted" to AMD CPUs and haven't looked back. Am about to upgrade my 955BE to Thuban hexcore for faster video editing.
 


Not me; I'm building a 2012-proof shelter complete with oxygen, rations, the works. I'm not going out like in the movie, no sir!
 
[citation][nom]xbeater[/nom]Beautiful stuff, but the true system builder (most of us on Tom's) will want a GPU separate from the CPU, just so we can choose exactly what we want. I do see this becoming ideal in netbooks, small desktops, office computers, HTPCs, etc. Love to see computer evolution![/citation]
You didn't read the full article, did you? They are not doing this to replace dedicated GPUs. The point is that many programs written for the CPU actually run better on a GPU. It's obvious to anyone that this would never replace a GPU, and AMD clearly stated their intention was not to replace a standard CPU but to augment it with a GPU for programs that run better that way. Personally, I could see this being awesome for a 3D workstation: 3ds Max and Maya both use the CPU to actually render images (the GPU only powers the viewports). I could see this CPU crushing the competition at rendering scenes in Max.
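The CPU-rendering point is easy to sketch: offline renderers split a frame into scanlines or tiles and farm them out to every available core, which is exactly the kind of parallel floating-point work a GPU-augmented CPU would eat up. A toy version (hypothetical; `shade_row` is a stand-in for real shading, not 3ds Max or Maya code, and a real renderer would use processes or native threads rather than a thread pool):

```python
from concurrent.futures import ThreadPoolExecutor

def shade_row(y, width=8):
    # Stand-in for per-pixel shading work on one scanline.
    return [(x * y) % 256 for x in range(width)]

def render(height=8, width=8):
    # Farm scanlines out across workers, the way offline renderers
    # split a frame into buckets for every CPU core.
    with ThreadPoolExecutor() as pool:
        return list(pool.map(lambda y: shade_row(y, width), range(height)))

frame = render(4, 4)
print(frame[2])  # scanline y=2: [0, 2, 4, 6]
```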
 