if it does get small enough and sips power, all we'd need then would be one of those nano-ITX boards! wow, I'm getting goosebumps, this is all very exciting! and who knows, it may very well be part of your future LCD TV! that would save me the time for experiments
What's interesting to me is the fact that Radeon is still made on SiGe while Phenom is made on SOI. I wonder if they have qualified Radeon production on SOI, which would make it easier to eventually put the GPU on-die. Also interesting is the fact that it was said AMD would make the first Fusion, but also that Fab36\38 would concentrate on the high-end projects.
And contrary to what one person says above, whether it will actually be good for gaming depends on how many Stream units it has. The Phenom side can play games at 1024, as can the 3200 (not Crysis, of course). This should be the 4200\4300 if it's RV800.
Hmmm... There is nothing wrong with Phenom at the architectural level. The bigger problem for AMD has been the production technology, an area where Intel has been far ahead for quite a long time.
With Intel's production technology, Phenom would most probably have much better clock speeds. Now with this processor we're talking about a much smaller production node, so it's not easy to tell how good this is going to be. It could be good, or not so good. I hope it's good, because it really could be a very good home theater chip! Two CPU cores and a GPU in one piece. It has the potential to be more than the sum of its components.
You also have to remember that all the components here share the same memory pool, so there is no separate memory for the GPU.
Predictions (IMHO method used ;-)
Good parts: faster communication between CPU and GPU. Power requirements (total). Small size (total).
Bad: the GPU uses normal DDR memory; whether that's a limiting factor is hard to say. It has not been done before (I suppose), so many things can go wrong. "Only" two CPU cores at this moment (not a big problem, because this isn't going to be a high-end speed monster anyway).
well, what if you paired this chip with 4 graphics cards? it's not like it's an onboard chip, it's a CPU. i think they will make it work with other cards, for both workplace and home use, so you'd only need one chip for all purposes
It's possible, but then most probably the best part of the Fusion idea is gone. A two-core Phenom is OK now (if it clocks well enough), but if you want a real gaming computer, for a while yet you'll need more cores and a discrete GPU. Nobody knows how Larrabee will change that, though...
Fusion should be a cheap, low-power solution... 4 graphics cards eat that away.
ps. If AMD can make it so that in 2D you use the Fusion GPU and in 3D some other graphics card, like in Nvidia's solution, well, maybe it would be useful even then.
I won't be holding my breath. While a die shrink adds a nice free boost in performance and a huge improvement in heat and power usage, AMD's track record against the Core 2 Duo isn't good, and a 40nm AMD Phenom won't touch the 45nm Intel Nehalem; heck, it won't even touch the current 45nm Core 2 Duo.
Good try AMD, but your technology of tomorrow isn't even as good as Intel's technology of today.
I have faith in any company that has imagination. I thought Nintendo and the console war would have crushed them, as they did not have very much money. So they thought and thought and came up with a cheap console, and now they're back in the game, ahead of all the others. Talk about funding yourself. I'm not sure if AMD can fund themselves; they have a lot of eggs in one basket. The risk is high. I hope they get a lot of funding from someone.
I doubt it will be TSMC in the future, since the Foundry is the new fab for AMD. And AMD is planning to build another new fab in New York - big meeting in NY on Monday!
This is all future stuff - too many comments are referring to the old approach, and we can only guess at the extent of the new. This is a bigger picture. Many of you seem to know only old AMD info - and a LOT has changed in the last year. And more is coming soon.
~_~ you guys are failing to realize the potential of GPU-CPU pre-rendering. Imagine the on-die GPU handling the pre-rendered data - of course you wouldn't give it the major jobs of texturing or filtering, just the simple pre-rendered data - and just how much of an improvement that could bring to the GPU hooked into the PCIe slot. You're talking almost 30 to 40% improved performance, depending on how fast that CPU-GPU can do it. AMD/ATI have a great chance to rule the gaming market with this move. Can't wait to see this month's new driver that is supposed to unlock some sleeping functions.