I don't think it will hurt graphics cards. The application is limited, and right now you will get a better-looking game with client-side rendering. At worst, if this idea catches on, you will have companies buying hundreds of video cards every 6 months to render for the cloud.
The future course of this may be more obvious than it seems. This problem has been solved before: evolutionary history faced these exact same parameters in the development of nervous tissue. Neurons evolved first, then neural nets, and finally specialized neurons and neural nets for each problem creating evolutionary pressure. If transistors have hit a speed limit, then multiple processors are the first obvious solution. The next real step will be dedicated processors, and groups of processors, for specific problems. We already know that custom silicon can produce solutions several orders of magnitude faster than general-purpose processors like CPUs, even at the same frequency. The best-known example of this is graphics processors.

It is also clear that both Microsoft and Intel have conflicts of interest with improving the efficiency of software and hardware. Microsoft needs new hardware to justify new versions of software, and Intel needs fancier, more demanding software to drive hardware upgrades. Netbooks such as the one I am currently using scare them both silly. There is little or no incentive for either of them to really innovate, since they are already at the top of the heap. The real question is: who will be the next disruptive innovator, and how will we recognize them when they appear on the scene? Look for the first one to push heterogeneous processors designed to take advantage of problem-specific workloads. Maybe a hungry AMD, Apple, or some new startup will bring us what we are looking for.
First, by offloading GPU work to a cloud supercomputer, someone still has to pay for the supercomputer, and now for the network bandwidth as well. Essentially you would be renting a GPU remotely, and that may cost more or less than owning one.
Second, notwithstanding the mediocre net connections in some countries, even the best connections will lag behind a real GPU. Responses won't be as immediate, and quality won't be as good as with a high-end GPU.
Third, this will impact low-end GPUs the most, especially integrated solutions. But does it fully make up for Intel's horrible integrated performance?
Fourth, GPUs cost money both in hardware outlay and in power usage. If your computer is part of the cloud, what measures will be taken to prevent leeching? People might really skimp on graphics, or run nothing but 2D hardware and try to fake 3D capability. I'm talking about the majority, who might not be using AMD platforms at all. Will the system hold up or defend against such abuse?
This is nonsense.
How could you hurt the discrete GPU business with that AMD Fusion supercomputer? It's as though AMD will be providing discrete-GPU performance to all users of that service, probably a million users! And that doesn't even account for the broadband speed required for an average gaming desktop resolution, probably at least 10 Mbps.
What AMD's Fusion will do is provide a more cost- and space-efficient way to drive a display than integrated graphics. So it is probably the integrated graphics business that will lose, since it will sit between the Fusion and discrete graphics businesses, which is quite a crowd.
The last paragraph is just annoying. You don't buy a GPU for a handheld gaming device.
[citation][nom]zodiacfml[/nom]this is nonsense.how could you hurt the discrete GPU business with that amd fusion supercomputer? it's as though amd will be providing discrete gpu performance to all users of that service, probaby by a million users! that even excludes the problem in speed of broadband access required for an average gaming desktop resolution probably at least 10mbps.what amd's fusion will do is provide a more cost and space efficient way to produce a display than integrated graphics. So it is probably the integrated graphics business will lose since it will seat between fusion and discrete graphics businesses which is quite a crowd.the last paragraph is just annoying. you don't buy a GPU for a handheld gaming device.[/citation]
Not sure you really understand what AMD is going for. With the Fusion supercomputer, they are not adding "some" graphics power to discrete solutions. They want to render the entire game server-side, compress the video, and stream it to the client. Basically, the client has no client-side game install, just a terminal where you provide control feedback.
This hurts discrete, not integrated, solutions, because if Fusion is successful you will not need a super-fast 3D card to play the latest games, provided that your net connection is fast enough to stream the video feed at 720p resolution or beyond.
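As a rough sanity check on that "fast enough" requirement, here's a back-of-the-envelope conversion from stream bitrate to data volume. The bitrates are illustrative guesses for compressed game video, not figures published by AMD or OTOY:

```python
# Rough estimate of the data volume of a streamed, server-rendered game feed.
# Bitrates are illustrative assumptions, not AMD/OTOY figures.

def stream_gb_per_hour(bitrate_mbps):
    """Convert a video bitrate in Mbps to gigabytes transferred per hour."""
    bits_per_hour = bitrate_mbps * 1_000_000 * 3600
    return bits_per_hour / 8 / 1_000_000_000  # bits -> bytes -> GB

for label, mbps in [("480p-ish", 2.5), ("720p-ish", 5.0), ("1080p-ish", 10.0)]:
    print(f"{label}: {mbps} Mbps is about {stream_gb_per_hour(mbps):.2f} GB/hour")
```

Even at a modest 5 Mbps, an evening of play adds up to several gigabytes of transfer, which is why the connection question keeps coming up in this thread.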
I remember reading a while ago that PCs are reaching the limits of graphics. It's like a car: you keep making the engine more powerful, then you start needing better wheels, brakes, etc., then a whole new frame. Finally the car goes so fast it can't stay on the ground. So the basic idea of what a car is needs to be re-examined.
So, basically, the Fusion project is as much about CPU/GPU intercommunication as it is about putting both on one die. I also think Intel is investing heavily in this area.
2 cents worth..
Actually, this article itself is confusing, which is why I mentioned integrated graphics. AMD FUSION is a tech totally different from the AMD FUSION RENDER CLOUD; I just stated my opinion regarding FUSION, which is off the topic/headline.
Regarding the AMD Fusion Render Cloud in your headline, I think Wr, the guy who posted above me, explains my points well.
Also, quoting from the link in the AMD Fusion Render Cloud news:
So what's under the hood of this beast? According to the company, the AMD Fusion Render Cloud will include AMD Phenom II processors, AMD 790 chipsets, and ATI Radeon HD 4870 graphic processors.
How does AMD go against its own business when the hardware used in the supercomputer is what they sell to consumers?
I think you have to read the render cloud news again to know its applications and whom it's for.
It could be used first as a supercomputer that harnesses the floating-point capabilities of graphics cards; then, if OTOY produces the right software, it could be used as a service for consumers, such as what the author suggested: gaming on handheld devices.
[citation][nom]tuannguyen[/nom]Not sure you really understand what AMD is going for. With Fusion super computer, they are not adding "some" graphics power to discrete solutions--they want to render the entire game server side, compress the the video, and stream it to the client. Basically, the client has no client-side game install, just a terminal, where you provide control feedback.This hurts discrete, not integrated solutions. Because you will not need a super fast 3D card, if Fusion is successful, to play the latest games--provided that your net connection is fast enough to stream the video feed beyond 720p resolution./ Tuan[/citation]
I don't think it would be possible to stream modern games over our current internet connections in the manner they make it sound. Maybe if you were sitting near the server streaming to you it could provide decent results, but for most people that's simply not possible.
It's not like it's just a set amount of data that has to be transferred, like a normal download. With every single action taken within the game, data would have to be both sent and received. I simply can't see this working; perhaps in the future it's plausible, but not at present.
I think their goal is simply going one step too far. Rather than trying to stream the game like a video, they should have the game download to a certain percentage and then let the player play while it continues to download and install itself on the computer. Granted, this means the gamer would still need a decent video card, but at least it's realistic.
[citation][nom]zodiacfml[/nom]actually, this article itself is confusing which is why i mentioned integrated graphics. amd FUSION is a tech totally different from the and amd FUSION RENDER CLOUD, i just stated my opinion regarding FUSION which is off the topic/headline.Regarding amd fusion render cloud on your headline, i think Wr, the guy who posted above me explains my points well.also quoting from the link of the amd fusion render cloud news:So what's under the hood of this beast? According to the company, the AMD Fusion Render Cloud will include AMD Phenom II processors, AMD 790 chipsets, and ATI Radeon HD 4870 graphic processors. how does amd go against its own business when the hardware used on the supercomputer is what they sell to consumers?i think you have to read the render cloud news again to know its applications and to whom its for. it could be used first as a supercomputer which harness the power floating point capabilities of graphics cards then if OTOY produces the right software, could be used as a service for consumers for which the author suggested, gaming on handheld devices.[/citation]
AMD clearly says that Fusion is more than what people "might have thought."
First strategy: combine GPU+CPU, which is odd for gamers.
Second strategy: move the processing of intensive games/apps to the cloud and render them there.
Great, so I guess I won't need to buy expensive graphics cards anymore to play the latest games. Sure, you mention that the Fusion Render Cloud will use AMD GPUs, but how does that help its bottom line? AMD will sell GPUs to itself? If games are rendered server-side and only the video feed is streamed to users, it doesn't take much hardware on the client to play back video.
It's going to be weird for a lot of gamers.. it's like suddenly switching from normal car engines to electric; basically there's going to be a lot of hesitation and doubt.. also, not to mention the gigabytes of bandwidth needed per person per day just to play a few hours of gaming.. a supercomputer might be able to render them all, but then streaming them to the people? I'm thinking expensive.. very expensive.. and the proper connections have to be there too, which a very high percentage of this planet doesn't have ATM...
I'm not saying it's a bad idea; on the contrary, it seems to offer gamers a solution that'll finally let them install a game, switch every graphics option to ultra-high, and just play.. I just don't think it can be feasible within a year or two.. at least not in the way we've interpreted it from the article.. there's probably something they're not telling us..
On the other hand, the Fusion CPU/GPU sounds like a good idea.. I think with this (and proper drivers..) you won't need bulky (and pricey) crossfire/sli systems anymore.. we'll just have to wait and see!
Thanks for the link to the maximumpc.com article.
I was right in speculating that the render cloud would be used in some sort of web browser or small application to allow PC gamers to play with handheld or smartphone users.
Still, I don't see how it can hurt the GPU business there, unless graphics chips are in limited supply, which is very unlikely.
Yes, the cloud will use AMD chips, and it will help the bottom line. It's as though AMD were selling its hardware into small devices, a market that for now is exclusive to PCs. Though the profit will be spread among the several users of that service, maybe 10 or 20 gamers per card, it still helps AMD sell more.
I think your confusion comes from the perception that the cloud will allow gaming at more than 1024x768 resolution, or fast-paced games, which it can never do; no specific detail of how this will be done or implemented has been given.
I see the big potential for this in MMO titles, since they don't require fast or intricate graphics.
[citation][nom]tuannguyen[/nom]See here:http://www.maximumpc.com/article/n [...] ercomputerhttp://www.amd.com/us/fusion/Pages/index.aspxAMD clearly says that Fusion is more than what people "might have thought."First strategy: combine GPU+CPU: odd for gamersSecond strategy: take processing of intensive games/apps to the cloud and process/render themGreat, so I guess I won't need to buy expensive graphics cards anymore to play the latest games. Sure, you mention that the Fusion Render Cloud will use AMD GPUs, so how does that help its bottom line? AMD will sell GPUs to itself? If games are rendered server side and only the video feed is streamed to users, it doesn't take much to render video./ Tuan[/citation]
From what I understand, this sounds good on paper, but I see two problems:
1. Internet speed - unless the connection is fast enough, you may as well enjoy a screenshot slide show. Games are essentially real-time user-computer interactions, so any lag between inputs and outputs will really kill the experience.
2. Costs - both the cost of the service and the cost of bandwidth. I think this could turn into a pay-per-play kind of service, where you pay a certain amount of money to play for a certain amount of time. On top of that, all the ISPs where I live have ridiculously low monthly data caps; you either pay for more or get thrown back to dial-up speed when you've used it up, so this increases the cost of using the service as well.
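To put those caps in numbers, here's a quick sketch; the 60 GB cap and 5 Mbps stream bitrate are hypothetical example values, not any particular ISP's or AMD's figures:

```python
# How long could you play under a monthly data cap?
# The cap and bitrate below are hypothetical example values.

def hours_under_cap(cap_gb, bitrate_mbps):
    """Hours of continuous streaming a monthly cap allows at a given bitrate."""
    cap_bits = cap_gb * 1_000_000_000 * 8       # GB -> bits
    seconds = cap_bits / (bitrate_mbps * 1_000_000)
    return seconds / 3600

print(round(hours_under_cap(60, 5.0), 1))  # about 26.7 hours per month
```

Under an hour a day of play before the throttle kicks in, and that's assuming the stream is the only thing using the connection.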
In other words, it may be affordable for light users, but I don't see how it will work out for a heavy gamer like me.
But you never know... the world is full of surprises. One thing is for sure, though: people won't stop upgrading their GPUs to play the latest games any time soon.
I think the Fusion cloud sounds pretty interesting. I don't believe they are trying to compete against high-end graphics cards (where the big profit margins are) and gamers who demand instantaneous feedback at maximum frame rates; they are trying to enable a higher level of graphics capability for devices that don't have the hardware.
And that means anything with an MPEG-2/H.264/VC-1 video decoding core. Hypothetically, take a digital set-top box for cable or FiOS. It will never have the capability to play WoW. But if (hypothetically) the game is hosted in the cloud, rendered in the cloud, and the resultant output is just another video stream sent to the cable box, then that user will be able to enjoy a fairly good experience, once again assuming that the cable service implements a client to access the cloud and send user input to the game hosted in the cloud (or anywhere else).
And thinking some more, it doesn't necessarily have to be targeted at games as we know them. Imagine again the set-top scenario, where one of the applications is a virtual tour of, say, the Louvre. The cloud just dutifully spits out a video-stream rendering of wherever the person is walking around. All the terabytes (petabytes) of data needed to reconstruct every detail of every piece of art in the Louvre can be hosted on servers in a central location and rendered from there.
Or... scientists navigating models of proteins and molecules determining how they interact. Instead of each computer needing a beastly graphics card, the complex models can all be rendered in the cloud and a video stream sent to the client.
The Cloud idea may or may not be successful, but if it is, it will not be providing free and easy 4870X2 graphics power for hundreds of millions. In practice, you'll likely see casual consumers being able to "rent" gameplay online at resolutions most serious gamers would consider unacceptable. It would be Crysis on low/medium settings rendered in 300x200 on a cell phone. Or perhaps medium/high setting rendered in 600x400 for people that want to play games like they surf YouTube. The graphical power required would actually be fairly small, a single 4870 being powerful enough to render several games at the same time. Tackling the problem from the CPU side would be the bigger challenge, but that would mean a lot of CPU power being rented from AMD, wouldn't it?
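The claim that a single 4870 could serve several of those low-res sessions is easy to sanity-check by comparing pixel throughput. The desktop resolution and frame rates below are assumptions chosen for illustration, building on the 300x200 example above:

```python
# Compare pixel throughput of low-res "rented" streams against a single
# desktop gamer's screen. Resolutions/frame rates are illustrative assumptions.

def pixels_per_second(width, height, fps):
    """Raw pixels a renderer must produce per second for one session."""
    return width * height * fps

desktop = pixels_per_second(1680, 1050, 60)  # one serious gamer's rig
phone   = pixels_per_second(300, 200, 30)    # one cell-phone-grade session

# How many phone-grade sessions fit in one desktop's worth of pixel output?
print(desktop // phone)  # 58
```

By raw pixel count alone, dozens of casual sessions fit in the budget of one enthusiast screen, which is why the per-user GPU cost could stay small; the CPU side, as noted above, is the harder part to multiplex.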
On the side of having integrated graphics on the CPU... again, this is not about placing the highest-power GPUs directly on top of the CPU. It would cost way too much for most consumers in the first place, and there would never be adequate cooling for such a beast. The closer current-day analogy would be AMD fitting 3650-equivalent parallel processing power into their CPUs. Not ridiculously huge, yet think about all that could be done with it. Everybody knows Intel is a mediocre GPU manufacturer, yet they actually own most of the graphics market on account of ho-hum integrated graphics. Fusion would make HD-playback-level graphics power the standard for AMD CPUs, potentially a major selling point for consumers not interested in the fuss and expense of added graphics cards or inadequate integrated graphics, finally eating into a segment of the market that Intel has been allowed to dominate without any real effort.
Finally, the parallel processing that would become available on AMD's CPUs would be accessible to developers for any application. Media conversion would no longer be the architectural weakness for AMD chips that it has been for the last two years. Developers could expect to have sufficient CPU power to include more physics in their games, without having to gamble on whether enough gamers have recycled 8800GTs to run PhysX. Played correctly, this could help as a transitional technology toward outboard physics cards, because the software would finally become available enough to make it worthwhile for high-end gamers.
When you consider that Intel's architecture is presently robust enough that their i7 can almost match the parallel processing of a lot of GPUs anyway, Fusion may not just be what AMD needs to do to compete. It may be necessary for them just to survive.
+1 on what's said.
There are many security issues, and internet providers are already cutting back on bandwidth (introducing monthly speed and volume limitations).
Given that, a night of playing could easily cost you a few gigabytes of network transit.
Second, there may be applications where you will want privacy, and not for everyone to see.
The extra cost of subscription is another thing. Many users, including myself, did not play WoW because it costs a monthly subscription fee, and I am not keen on sharing my banking info with any company online.
Let alone paying $50 per year, per game, where other games just cost the price of the CD ($25) and rely on free local servers, if they support multiplayer at all.
Plus, you can always play even when you have no internet.
I also agree with the lag issue.
We do everything in our power to shave 5 ms off our LCD monitors.
They now go down to 2 and even 1 ms, yet we will play games over an internet connection with 105 ms of lag?
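A rough input-to-photon budget makes the comparison concrete. Every number below is an illustrative assumption, not a measurement of any actual service:

```python
# Rough input-to-photon latency budget: local rendering vs. cloud rendering.
# All values are illustrative assumptions, not measurements.

local_ms = {
    "input (USB/polling)": 5,
    "render one frame":    17,   # roughly one frame at 60 fps
    "display response":    2,
}

cloud_ms = {
    "input (USB/polling)": 5,
    "uplink to server":    50,
    "render + encode":     25,
    "downlink + decode":   65,
    "display response":    2,
}

print("local :", sum(local_ms.values()), "ms")
print("cloud :", sum(cloud_ms.values()), "ms")
```

Under these assumptions the cloud path is several times the local one, and shaving milliseconds off the monitor does nothing about the network legs, which dominate the total.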
The games that would work, often older games that can easily run in software-render mode (i.e., that don't need a powerful graphics card), will probably be playable.
I am not for this idea; instead, AMD would do better designing their motherboards with a plug-and-play GPU socket (just like a CPU's) right next to the CPU on the mobo, with a few hundred lanes between them.
That would remove the need for PCIe and separate graphics cards.
Just plug in a CPU, a GPU core, and some VRAM (like you plug in DDR RAM).
I think that'd be a much better solution, if CPU/GPU on one die is not possible.
I've been thinking about this all day and came up with another problem they will undoubtedly face if this becomes successful. How many millions of people will be connecting to a single server at one time, each playing a game? Even if we are talking about a supercomputer that can handle running all those games at once, it certainly wouldn't have enough bandwidth to stream smoothly to everyone. Also, the only way this would be accepted is if the video streams are very high quality, which would just add more strain.
I'm sure we've all tried to visit a website that was being bombarded and weren't able to connect to it. Another example is Steam: when they weren't fully prepared for the launches of their games, people had trouble just activating HL2. These examples aren't even near the scale of what AMD is proposing, and they still ran into a brick wall.
AMD has a bright idea, and while it's fully possible, it's not going to be achieved with what we currently have. If they try to launch this any time soon, it's going to end in massive failure and may never be resurrected when it actually could be done.
I don't think this would catch on. 1. You would almost certainly have to pay a monthly fee for the service; 2. I already experience enough lag in online games, so why would I want to bring that lag into my single-player games; 3. Who is going to fund the servers initially; 4. Hardcore gamers aren't just going to give up their blindingly fast PCs so some internet supercomputer can render a game for them.
I think the way to go is just to keep advancing graphics card technology. Intel and Nvidia have the right idea.
Yeah, I'm sure all the gamers out there want their input lag increased by a huge factor. I know guys who won't use a wireless mouse because they're concerned about it. I'm sure those same guys would love to have their game rendered from afar and delivered to them. Except they couldn't play in real time, and they'd essentially be using their computer to use another computer remotely. Great dig on how internet communications in the US are really lagging behind; it was really relevant to the rest of the article.
Regardless, someone would actually have to pay for a Fusion cloud (you know they're not giving them away for free, right?), and that person would be paying for all the graphics cards installed in it and all of that AMD technology. I wouldn't call that killing their business so much as changing it. They're obviously not competitive in the CPU industry anymore, and they're constantly on the run in the graphics industry. Or here's an idea: the server might be rented, and people would have to pay to have their games rendered for them. How exactly does this hurt their business or the GPU biz? It adds a new dimension to the industry and changes the ground of competition.
On a different note, there are better places to get tech info, and at least 3 out of every 5 articles on this site are becoming absolute crap: speculation, news that someone else reported much earlier on another site but updated with a new crappy editorial, or articles that have no point. I am seriously considering not reading anything on this site anymore, and I'm not the first person to mention that lately.