AMD's Fusion Cloud Could Hurt GPU Biz

If you watch the actual video of the CES speech (http://www.cesweb.org/sessions/IndustryInsiders.asp, then click "Watch the Video"), the point they seemed to be making at the end of the presentation was that some applications people have said can't be done "in the cloud" may actually have a solution in GPU computing done server-side. That opens up new customer avenues for cell phones, netbooks, and other devices without serious 3D hardware to get full 3D experiences through just a browser, which is what the EA guy talked about.
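For anyone trying to picture that model, here is a minimal sketch, assuming the simplest possible render-and-stream loop: the heavy 3D work runs on GPUs in the data center and the thin client only receives finished frames. Every function name below is a made-up placeholder, not a real AMD or EA API.

```cpp
// Placeholder skeleton of server-side rendering for a thin client
// (browser, netbook, phone). Nothing here is a real cloud-gaming API.

#include <cstdio>
#include <vector>

using Frame = std::vector<unsigned char>;

// Placeholder: render one frame of the game on a server-side GPU.
Frame render_frame_on_server_gpu(int frame_index) {
    return Frame(1920 * 1080 * 4, static_cast<unsigned char>(frame_index)); // dummy RGBA frame
}

// Placeholder: encode the frame for transport (in practice, a video codec).
Frame encode_frame(const Frame& raw) { return raw; }

// Placeholder: push the encoded frame down the subscriber's connection.
void send_to_client(int /*client_socket*/, const Frame& payload) {
    std::printf("sent %zu bytes\n", payload.size());
}

// One streaming session: the client never touches a GPU itself.
void stream_session(int client_socket, int frame_count) {
    for (int frame = 0; frame < frame_count; ++frame) {
        Frame raw     = render_frame_on_server_gpu(frame); // GPU work stays in the cloud
        Frame encoded = encode_frame(raw);                 // shrink it for the network
        send_to_client(client_socket, encoded);            // client only decodes and displays
    }
}

int main() {
    stream_session(/*client_socket=*/0, /*frame_count=*/3);
    return 0;
}
```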

I didn't see anything Dirk Meyer said at CES that indicated the GPU or CPU is going away anytime soon, though, even with the "fusion cloud" supercomputer concept in place. He did the opposite, in fact, talking up AMD's position as the only supplier of both CPUs and GPUs and how that brought about the Dragon and Yukon platforms they showed from Dell and HP.

The Lightstage segment showing how they made digital characters out of real actors for Hancock and Spider-Man 3 was pretty interesting as well. It did drag in parts and came off like a sales presentation when Dell and HP were on stage, though, so you might want to skip ahead if you watch the video.
 
Couldn't we just use GPUs as GPGPUs to help accelerate other software? I don't buy into the GPU fusing into the CPU unless it's meant to move the IGP off the motherboard and into the CPU. Even then, it concerns me that CPU prices would skyrocket, and I already find $200+ CPUs ridiculous.
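That kind of GPGPU offload already works on a plain discrete card, no die-level fusion required. Here is a minimal sketch, using CUDA purely as an illustration (AMD's equivalent route is the Stream SDK): an ordinary data-parallel job, adding two arrays, is handed to the GPU instead of the CPU.

```cpp
// Minimal GPGPU sketch: offload a data-parallel task (array addition) to the GPU.

#include <cstdio>
#include <cuda_runtime.h>

__global__ void vector_add(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];   // each GPU thread handles one element
}

int main() {
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    // Host (CPU) buffers.
    float* ha = new float[n];
    float* hb = new float[n];
    float* hc = new float[n];
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    // Device (GPU) buffers.
    float *da, *db, *dc;
    cudaMalloc(&da, bytes);
    cudaMalloc(&db, bytes);
    cudaMalloc(&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    // Hand the work to the GPU, then copy the result back to the CPU.
    vector_add<<<(n + 255) / 256, 256>>>(da, db, dc, n);
    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);

    std::printf("c[0] = %f (expect 3.0)\n", hc[0]);

    cudaFree(da); cudaFree(db); cudaFree(dc);
    delete[] ha; delete[] hb; delete[] hc;
    return 0;
}
```

The GPU stays a separate part in this model; the only thing that changes is which software is allowed to use it, so it doesn't by itself push CPU prices up.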
 
One Word - SUBSCRIPTIONS

Microsoft wants the Office model to be subscription-based. Advantages for the user: always up to date, lower initial cost, etc.

How much would you pay to have the latest graphics capabilities pre-rendered for you, so you don't have to spend $500-1,000 on one or two of the latest graphics cards only to have them be outdated after a year? All you would need is a good internet connection, and I'm sure they could use a special compression scheme that gets decompressed on the fly.
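As a rough sketch of the "compressed server-side, decompressed on the fly" part, here is the round trip with plain zlib. This is only an assumption about how such a service might work; a real subscription stream would use a proper video codec, but the shape of the pipeline is the same.

```cpp
// Rough sketch: the server compresses a pre-rendered frame before streaming
// it, and the subscriber's machine decompresses it on the fly. zlib is used
// only to illustrate the round trip; a real service would use a video codec.

#include <cstdio>
#include <vector>
#include <zlib.h>

int main() {
    // Pretend this is one pre-rendered 1080p RGBA frame from the server.
    std::vector<unsigned char> frame(1920 * 1080 * 4, 0x7F);

    // Server side: compress before it goes over the subscriber's connection.
    uLongf compressed_len = compressBound(frame.size());
    std::vector<unsigned char> compressed(compressed_len);
    if (compress(compressed.data(), &compressed_len,
                 frame.data(), frame.size()) != Z_OK) return 1;
    compressed.resize(compressed_len);

    // Client side: decompress on the fly and hand the pixels to the display.
    std::vector<unsigned char> decoded(frame.size());
    uLongf decoded_len = decoded.size();
    if (uncompress(decoded.data(), &decoded_len,
                   compressed.data(), compressed_len) != Z_OK) return 1;

    std::printf("frame: %zu bytes, on the wire: %lu bytes\n",
                frame.size(), static_cast<unsigned long>(compressed_len));
    return 0;
}
```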
 
I've seen X2 OEM chips selling for $35. The silicon inside an X2 3800+ only costs about $25, $30 tops. So in theory it should only cost $25 more to integrate a decent CPU onto a GPU. A drop in the bucket. I would be more than happy to pay that, especially if it meant my CPU could use GDDR5 as main memory!
 