PlayStation 4 Could Get GPU Switching, Dynamic UI

And it will still be outdated even by my current PC (GTX 690, i7 @ 4.8GHz, 16GB RAM, 2560 x 1600 display, 2 x 512GB SSDs). No thanks; kids, you can have it.
 
Neat idea. With this, they could have the APU render the boot system and only spin up the GPU when you decide to load up a game. Save some power when you don't need it.
 
[citation][nom]vrumor[/nom]And then youd have a PC. Great idea. /s[/citation]


The whole point of a console is that you buy a game and it plays. It's not "you can only play this game with this much RAM or that video card upgrade"; it's plug and play. If you want a customizable system, that's called a PC.
 
Does this mean the PS4 could run the discrete GPU and the APU in CrossFire for increased performance, as well as switching back and forth between them based on load?
 
[citation][nom]chronium[/nom]And yet they offer it with current AMD APU's and GPU's so they must have been able to comensate it. It will also most likely be available for the PS4 as well.[/citation]

His idea of how ACF works is wrong. There are several modes it can run in. Two identical cards use AFR (alternate frame rendering), where each frame is sent to the next card in the loop. ACF uses a load-balancing mechanism where each frame is sent to the next available GPU. With a moderate dGPU + APU you typically end up with a 5:1 to 6:1 ratio of frames being split between them. My laptop is an A8-3550MX + 7690M (a rebranded 6770), and with ACF turned on it tends to get 6:1 most of the time.
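The difference between AFR and the load-balanced dispatch described above can be shown with a toy event-driven model: each new frame goes to whichever GPU frees up first, so a faster dGPU naturally takes proportionally more frames than the APU's iGPU. All the render times here are invented for illustration; no real GPU scheduling is involved.

```python
# Toy model of load-balanced frame dispatch: frames go to the next
# available GPU, so frame counts end up proportional to GPU speed.
import heapq
from collections import Counter

def dispatch(frames, render_times):
    """render_times: {gpu_name: ms per frame}. Returns frames-per-GPU counts."""
    # Priority queue of (time_when_free, gpu_name); all GPUs start free at t=0.
    free_at = [(0.0, name) for name in render_times]
    heapq.heapify(free_at)
    counts = Counter()
    for _ in range(frames):
        t, gpu = heapq.heappop(free_at)          # next GPU to become available
        counts[gpu] += 1
        heapq.heappush(free_at, (t + render_times[gpu], gpu))
    return counts

counts = dispatch(700, {"dGPU": 10.0, "iGPU": 60.0})  # dGPU 6x faster
print(counts["dGPU"] / counts["iGPU"])  # roughly 6:1, matching the observation
```

With two identical GPUs the same loop degenerates into a strict 1:1 alternation, which is exactly the AFR case.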

Anyhow, they're probably not going to even bother. As this is a console, developers will end up using the graphics array on the APU as a coprocessor, dispatching specific instructions to its GPU component while the main rendering is done on the dGPU.
 
[citation][nom]laststop311[/nom]Does this mean the ps4 can run the discrete gpu + the APU in crossfire for increased performance as well as switching back and forth between them based on load?[/citation]

You want confirmation of a feature based on a rumour?
 
Well, we have three possibilities here: the person quoted has never looked into the prior art (Optimus), they said something idiotic, or Sony is going ARM + desktop-class video card. While the third option would be interesting and possible, one of the first two is more plausible.
 
Hopefully it will work like this:

#1 - Very efficient GPU for video playback and the PS4 interface.

#2 - Efficient GPU used for the HUD element and STREAMING in-game. Like this:
- Efficient GPU used for HUD, thus allowing the REST of the game to be anti-aliased independent of HUD/text. Both elements are MERGED at the end.
- Some game elements could be GPU-decoded on the efficient GPU (texture decoding or whatever) and these elements MERGED with the main GPU.

POWER and the "efficient" GPU:
Power consumption is important, and that demands two GPUs working properly together (one efficient GPU for video and light loads, and the other turned OFF when not needed).

However, Sony really needs to make that efficient GPU work in conjunction with the main GPU. It might seem the advantages are small, but they are not.

*If the two GPUs are to work together, it would be advantageous for the XBOX720 to have a similar architecture. In the PC world this would work great as well (Intel CPU graphics + add-on graphics card).
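The HUD-merge idea in point #2 above is essentially a final compositing pass: the main GPU produces an anti-aliased scene layer, the efficient GPU produces a crisp HUD layer with an alpha channel, and the two are blended at the end. The sketch below illustrates the merge with plain tuples standing in for pixels; it is purely an illustration, not real GPU work.

```python
# Per-pixel "over" compositing of a HUD layer onto a scene layer.
def blend(scene_px, hud_px):
    """Standard 'over' blend: hud_px = (r, g, b, a), scene_px = (r, g, b)."""
    r, g, b, a = hud_px
    return tuple(h * a + s * (1.0 - a) for h, s in zip((r, g, b), scene_px))

scene_pixel = (0.2, 0.2, 0.2)            # from the AA'd game frame
hud_text = (1.0, 1.0, 1.0, 1.0)          # opaque white HUD text
hud_empty = (0.0, 0.0, 0.0, 0.0)         # no HUD coverage here
print(blend(scene_pixel, hud_text))      # -> (1.0, 1.0, 1.0): HUD wins
print(blend(scene_pixel, hud_empty))     # -> (0.2, 0.2, 0.2): scene shows through
```

Because the HUD is blended in only at this last step, the scene layer can be anti-aliased (or even rendered at a different resolution) without blurring the HUD text.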
 
[citation][nom]vrumor[/nom]And then youd have a PC. Great idea. /s[/citation]

To be fair, Sony makes a loss on the hardware... offering simplified hardware upgrades at a markup could add revenue.


It would be even better if Sony began to offer the PS OS (assuming, if it runs on an AMD APU, that it's x86) for a price; some PC gamers would probably buy it and install it on their own hardware so they could enjoy PS titles.
This could mean more revenue through titles as well as OS sales, fewer loss-making hardware sales, a boost for the PC hardware market, and a larger selection for PC gamers (and hopefully lower game prices due to competition).

This is unlikely but would be nice!

 
[citation][nom]lovett1991[/nom]To be fair sony make a loss on the hardware... by offering simplified hardware upgrades with a markup could add revenue people.Would be even better if sony began to offer the PS OS (assuming if it is running on an AMD APU it's x86 arch) for a price, then some PC gamers would probably buy it and install it on their own hardware so they can enjoy PS titles. This could mean more revenue through titles, as well as OS sales, less lossy hardware sales, whilst giving the PC hardware market a boost, and PC gamers a larger selection (and hopefully lower game prices due to competition). This is unlikely but would be nice![/citation]

This will never happen: a PlayStation operating system that you install on your PC and that turns your PC into a PlayStation. First, Sony needs everybody to have exactly the same hardware so the games can be optimized properly and everyone has an equivalent experience. Second, with the PS OS on a normal PC, it would be hacked before it even comes out. On a PC, hackers can completely control what's going on, intercept decryption keys, and break the DRM with ease. Sony was the only manufacturer able to keep PS3 games free of piracy for the majority of the console's life. They are not gonna give up that security, and with all they have learned from the PS3 hackings, the PS4 is gonna be the Fort Knox of video game systems. I hear there will be many hardware mechanisms built onto the mainboard to block access to sensitive areas, and even booby-trap-type things that brick your PS4 on purpose if you mess with it.
 
[citation][nom]laststop311[/nom]This will never happen. A playstation operating system that you install on your PC. That turns your PC into a playstation. First sony needs everybody to have all the exact same hardware so the games can be optimized properly and everyone has a good equivalent experience. Second, with the PS OS on a normal PC, it will be hacked before it even comes out. On a PC hackers can completely control whats going on and intercept decryption keys and break the drm with ease. Sony was the only manufacture able to keep ps3 games off piracy for the majority of it's life. They are not gonna give up that awesome security and with all they have learned from the ps3 hackings the PS4 is gonna be the fort knox of video game systems. I hear there will be many hardware mechanisms built onto the mainboard to stop from accessing sensitive areas and even booby trap type things that brick your ps4 on purpose if you mess with it.[/citation]

Whilst I agree with you (I did say it was unlikely), with regards to security you could say the same about any PC game. So they'd have to implement a Steam equivalent where your games are tied to your account and cannot be played otherwise (although EA's Origin sucks, and I personally had many difficulties playing BF3). I would also mention that because it is a separate OS, it wouldn't be similar to cracking the familiar x86 Windows.

With regards to optimisation, all games would be based around the base console model, as with the Xbox (the newer Xbox 360s are more powerful, but all games are designed with the original 360 in mind). With upgraded hardware or a personal rig, people could enjoy the greater detail that PC gamers are used to. Especially with AMD now having a generic driver, different gfx cards would not be too much of an issue (excluding xfire etc.). And obviously they could provide a list of supported hardware to cover discrepancies.

Again I say wishful thinking, but it isn't the most ludicrous idea.
 
***UPDATE***
I think I just realized I may have mistaken GPU switching for upgradability, but I will keep my initial post in case anyone wants to read my not-so-crazy idea for keeping consoles current.

I wrote a blog post about how console makers should add GPU upgradability and how it could be done. Nice to see an idea like this is seriously on the table.

My blog basically talks about how you would need to keep it simple and how the upgrade could work similar to the hard drive upgrade bay on the PS3.
Here is what I suggested:
1. Each GPU would have a number, the lowest being the slowest. The initial consoles would come with GPU 1 installed. After about 1 or 2 years, GPU 2 could be released. People could upgrade if they wanted to, but all games made for the system would detect which GPU was installed and automatically adjust graphical settings (manual settings/adjustments would still not be allowed). If you have GPU number 5, you might get the best graphics that game has to offer. If you still have the original one, that's fine; you can still play the game, you just may not get all the detail that the number 5 GPU would get. (This could also allow for 1080p [or higher] resolutions in games that would otherwise be 720p.)
2. Each GPU would need to be almost completely enclosed in plastic, to protect it from electrostatic discharge and to keep the circuitry from being exposed. Since the general public would be handling these video cards, this would be a good idea.
3. Each GPU would need plenty of cooling capacity, with a heatsink, vents, and a fan or two.
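Point 1 above amounts to a per-tier preset table the game looks up at boot. A minimal sketch, with invented tier numbers and preset values (a real console would report its tier through firmware, not a dictionary):

```python
# Toy model of numbered-GPU auto-detection: the game ships per-tier
# presets and picks the best one at or below the installed GPU's tier.
SETTINGS_BY_TIER = {
    1: {"resolution": (1280, 720),  "texture_detail": "medium", "aa": 0},
    2: {"resolution": (1600, 900),  "texture_detail": "high",   "aa": 2},
    5: {"resolution": (1920, 1080), "texture_detail": "ultra",  "aa": 4},
}

def settings_for(installed_tier):
    # Fall back to the highest preset at or below the installed tier, so a
    # game authored before "GPU 5" existed still runs sensibly on it.
    usable = [t for t in SETTINGS_BY_TIER if t <= installed_tier]
    return SETTINGS_BY_TIER[max(usable)]

print(settings_for(3))   # a tier-3 GPU gets the tier-2 preset
print(settings_for(5))   # full 1080p preset
```

The fallback rule is what keeps "no manual adjustments" workable: every game always resolves to exactly one preset, with no user-visible settings screen.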

Imagine...
You get Half-Life 3 with revolutionary new graphics. It looks amazing on your PC, but your console version doesn't look that great... until... GPU 3 is released!!!

It would be a good selling point for new consoles aside from just a bigger hard drive and an included game or two.
 
[citation][nom]dark_lord69[/nom]I wrote a blog post about how console makers should add GPU upgradability and how it could be done. Nice to see an idea like this is seriously on the table.My blog basically talks about how you would need to keep it simple and how the upgrade could work similar to the hard drive upgrade bay on the PS3.Here is what I suggested:1. Each GPU would have a number the lowest is the slowest. The initial consoles would come with GPU 1 installed. After about 1 or 2 years GPU 2 could be released. People could upgrade if they wanted to but all games made for the system would detect which GPU was installed and automatically adjust graphical settings (Manual settings/adjustments would still not be allowed). If you have GPU number 5 then you might get the best graphics that game has to offer. If you still have the original one that's fine, you can still play the game you may just not get all the same detail that the number 5 GPU would get. (This could also allow for 1080p [or higher] resolutions in games that would otherwise be 720p.)2. Each GPU would need to almost completely enclosed in plastic to protect it from electro static discharge issues and to keep the circuitry from being exposed. Since the general public would be handling these video cards this would be a good idea.3. Each GPU would need plenty of cooling capacity with heatsync, vents and a fan or two.Imagine...You get Half-Life 3 with revoutionary new graphics. It looks amazing on your PC but your console version doesn't look that great... until.. GPU 3 is released!!!It would be a good selling point for new consoles aside from just a bigger hard drive and an included game or two.[/citation]

I totally think this is a brilliant idea (I said something similar earlier), and as you said, if simplified it could be made "consumer proof"! Someone else mentioned on an older article that they should think about implementing a technology such as Thunderbolt (I appreciate Thunderbolt has limitations).

The advantage is that the console could have multiple "Thunderbolt-ish" ports, so you could actually buy two GPU upgrades and run them on separate ports. The console could then run these in an "SLI" fashion.

But the overall theory is that all games should be built with the base model as the recommended spec, while with GPU upgrades things like AA and resolution could be bumped up, so that consoles provide the graphics experience PC users have (or maybe only a year behind, rather than the roughly 7 years behind we are now), if you're willing to shell out for the upgrade!

Sony could totally do something silly like sell TVs with a GPU slot, so that there is less clutter/cabling!
 
[citation][nom]Soda-88[/nom]So Nvidia Optimus/AMD Dynamic Switchable Graphics never happened...?!?[/citation]

It's entirely possible to have two different patents that describe two different ways of doing something similar.
 
[citation][nom]InvalidError[/nom]While it should be technically possible, it would not be practical.Since SLI/CFX work by issuing frames to the next GPU available, if one GPU is substantially faster than the other(s), frames get finished out-of-order and the IGP/slower-GPU's tardy frames simply get dropped which may make the final rendered video stuttery/choppy.Pairing an IGP with a much faster GPU simply does not work for realtime rendering.[/citation]No offense, but you're being extremely narrow-minded. AFR is NOT the only way to make two GPUs work together. Just because this is the easiest, most compatible route for PC gaming doesn't mean it is the best method for all setups. Depending on the GPUs in question, AFR can become nearly useless. On a console (a closed system, in which you only have to worry about one hardware target) you could even just have each GPU (the slower iGPU and the faster dGPU) render/crunch specific things you designate. Forget AFR. No need to worry about compatibility with existing DX/OGL titles. 😛
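The "designate work per GPU" alternative to AFR described above can be sketched as a static task assignment: on a fixed console, the developer decides once which render/compute tasks run on the slower iGPU and which on the faster dGPU. Task names and the assignment itself are invented here for illustration.

```python
# Static per-GPU task partition: each named task is dispatched to the
# queue of the GPU the developer assigned it to, instead of alternating
# whole frames between GPUs (AFR).
from collections import defaultdict

ASSIGNMENT = {
    "main_scene": "dGPU", "shadows": "dGPU", "post_fx": "dGPU",
    "hud": "iGPU", "texture_decode": "iGPU", "physics_assist": "iGPU",
}

def build_queues(tasks):
    queues = defaultdict(list)
    for task in tasks:
        queues[ASSIGNMENT[task]].append(task)   # preserve submission order
    return dict(queues)

queues = build_queues(["main_scene", "hud", "shadows", "texture_decode"])
print(queues)  # {'dGPU': ['main_scene', 'shadows'], 'iGPU': ['hud', 'texture_decode']}
```

Because the hardware target never varies, the split can be tuned once per title, which is exactly the freedom a closed console gives that a general-purpose PC driver cannot assume.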
 
[citation][nom]lovett1991[/nom]With regards to optimisation, all games would be based around the base console model as with the xbox (the newer xbox's 360's are more powerful but all games are designed with the original 360 in mind). Just with upgraded hardware, or a personal rig people can enjoy the greater detail that PC gamers are used to.[/citation]The newer Xbox 360 consoles have several advantages over older ones, but they are NOT more powerful. CPU and GPU performance have remained the same from start to finish.

One of the advantages of consoles, the optimization for a specific set of hardware, butts against this idea. In particular, any game developed prior to an upgrade's existence would not benefit from said upgrade, because it was custom tailored for the non-upgraded hardware. Anything after it would still have to run perfectly on the base model, as upgrades in the console world rarely meet with heavy success.

So while I have nothing against upgrades, they won't benefit you to the same degree they do on a PC. PC games have to work on an extremely diverse bunch of devices, and thus are less-optimized but are also scalable (and thus benefit immediately by gaining performance through upgrades). Still... some kind of one-time major upgrade released halfway through the console's lifecycle (perhaps with a new model released at the same time that has the upgrades integrated) still would be very interesting if nothing else.
 
[citation][nom]alextheblue[/nom]The newer Xbox 360 consoles have several advantages over older ones, but they are NOT more powerful. CPU and GPU performance have remained the same from start to finish.[/citation]

As far as I was aware, the 360 Slim had a die-shrunk SoC implementation (rather than separate CPU/GPU), which allowed far lower latency, less pipelining, and potential for improved clock speeds (as well as lower power consumption). It had to be purposely restrained to keep parity with the original 360s.

I definitely see a difference in performance (not in gaming graphics, but in responsiveness on the dash/video/other apps etc.) between my original 360, Elite, and Slim.

[citation][nom]alextheblue[/nom]So while I have nothing against upgrades, they won't benefit you to the same degree they do on a PC. PC games have to work on an extremely diverse bunch of devices, and thus are less-optimized but are also scalable (and thus benefit immediately by gaining performance through upgrades). Still... some kind of one-time major upgrade released halfway through the console's lifecycle (perhaps with a new model released at the same time that has the upgrades integrated) still would be very interesting if nothing else.[/citation]

Ah, OK. So things like shadows, texture quality, and resolution are [would be] relatively fixed? In which case GPU upgrades could only offer improved AA performance?

To be fair, whilst older games wouldn't benefit, as newer games are released (post GPU upgrade) devs could program a limited number of modes for each upgrade (if the upgrades are as simple as dark_lord69 said). A GPU upgrade every 2-3 years could realistically mean 3-4 different modes by the end of an 8-12 year life cycle.
 
I expect that the GPUs would have similar specs on both chips. Performance will depend upon if they employ eDRAM, and if so, on which die it is used.
 
[citation][nom]dark_lord69[/nom]***UPDATE***I think I just realized I may have mistaken GPU switching for upgradability but I will keep my initial post incase anyone wants to read my not-so-crazy idea on keeping consoles current.I wrote a blog post about how console makers should add GPU upgradability and how it could be done. Nice to see an idea like this is seriously on the table.My blog basically talks about how you would need to keep it simple and how the upgrade could work similar to the hard drive upgrade bay on the PS3.Here is what I suggested:1. Each GPU would have a number the lowest is the slowest. The initial consoles would come with GPU 1 installed. After about 1 or 2 years GPU 2 could be released. People could upgrade if they wanted to but all games made for the system would detect which GPU was installed and automatically adjust graphical settings (Manual settings/adjustments would still not be allowed). If you have GPU number 5 then you might get the best graphics that game has to offer. If you still have the original one that's fine, you can still play the game you may just not get all the same detail that the number 5 GPU would get. (This could also allow for 1080p [or higher] resolutions in games that would otherwise be 720p.)2. Each GPU would need to almost completely enclosed in plastic to protect it from electro static discharge issues and to keep the circuitry from being exposed. Since the general public would be handling these video cards this would be a good idea.3. Each GPU would need plenty of cooling capacity with heatsync, vents and a fan or two.Imagine...You get Half-Life 3 with revoutionary new graphics. It looks amazing on your PC but your console version doesn't look that great... until.. GPU 3 is released!!!It would be a good selling point for new consoles aside from just a bigger hard drive and an included game or two.[/citation]
All of this makes for more work, and increased cost, which is passed on to everyone. The money is in the software, not the hardware. Besides, there are other components that may need to be upgraded along with an upgraded GPU, such as a power supply.
 
[citation][nom]bigdog44[/nom]All of this makes for more work, and increased cost, which is passed on to everyone. The money is in the software, not the hardware. Besides, there are other components that may need to be upgraded along with an upgraded GPU, such as a power supply.[/citation]

I agree, having multiple versions causes all sorts of headaches with optimizing. It's one of the main reasons many developers stay away from the PC.

On the other hand, Valve is coming out with the STEAMBOX. It's a PC optimized to a console form factor. It will likely run the new STEAM controller interface on top of Windows 8.

If it's successful, it would be logical to release a new STEAMBOX every two years. On launch it will support all the Steam games whereas the XBOX720/PS4 will have a limited portfolio.

Very, very interesting times ahead for consoles.

Even if the STEAMBOX costs a little more, this is offset by the savings on games. At first glance, I think the benefits outweigh the cons, but we'll have to wait and see.
 
This is a terrible idea and will cause developers much stress, having to tweak for lower-end cards, the default cards, and then high-end cards. I don't think this will happen this generation.
 