PlayStation 4 Could Get GPU Switching, Dynamic UI

Someone please kill this dumb thread.

Looks fine to me, let's keep it that way please.

I agree that I'm not sure this is needed. Idle power on modern GPUs is already very low, 10-15W even on the biggest cards. How much lower do you want? If they could bring down the power used while doing something light like watching a movie, that would be nice; I was in a thread the other day where a card (I don't remember which one, I think a 7950) used around 50W while playing a Blu-ray. But as was also mentioned, this is a console and not a battery-powered device, so what's the harm? Other than higher electric bills.
 

Given the option of saving $1/month on power and $1/month on cooling by using a more efficient resource at no extra cost, I would pick saving $2/month and possibly less fan noise over NOT saving $2/month and having an extra fan humming along for no reason.
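To put very rough numbers on that $2/month figure, here is a back-of-the-envelope sketch; every value in it is an illustrative assumption, not a measured PS4 number:

```python
# Back-of-the-envelope savings from a lower-power mode.
# All numbers below are illustrative assumptions, not Sony/AMD figures.
WATTS_SAVED = 30     # assumed idle/light-load reduction in watts
HOURS_PER_DAY = 16   # assumed time the console sits in that state
RATE_PER_KWH = 0.15  # assumed electricity price in $/kWh

kwh_per_month = WATTS_SAVED / 1000 * HOURS_PER_DAY * 30
print(f"~${kwh_per_month * RATE_PER_KWH:.2f}/month saved")  # ~$2.16
```

With gentler assumptions (fewer hours, cheaper power) the figure drops well under a dollar, which is why this only matters in aggregate.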

But now that we know the PS4 actually uses a custom APU, we know there won't be any GPU switching, so idle/light-load power will depend on how much of the CPU/IGP the APU can power down when those blocks are not required.
 
That's what I meant, Invalid. If this were a battery-powered device like a phone or tablet, ANY power savings would be good. But this plugs into a wall. It won't die on you in the middle of gaming unless your power goes out for some reason. Yes, lower power bills are good, but it's not like it NEEDS GPU switching.
 

Since there is no second GPU to switch to/from in the PS4, whether or not it "needs" switching is moot.

If shaving watts at every economically viable opportunity did not matter, AMD would not be improving their power and clock gating to reduce idle power by 5-10 watts with almost every GPU generation.

As "unnecessary" as shaving watts on plugged-in devices may be, being a little greener each year still look good in AMD, Intel, Sony, etc.'s annual reports.

600 million devices each using 1W less is still 600MW, enough to reduce worldwide power needs by the equivalent of one nuclear power plant. It may not be as glamorous as slashing 30W at once by replacing a CCFL-backlit LCD with an LED one, but every watt still adds up in the grand scheme of things.
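A quick sanity check on that aggregate figure (the 600 million device count is the post's hypothetical, not a real install base):

```python
# Sanity check: 1W saved across 600 million always-on devices.
devices = 600_000_000
watts_each = 1
total_mw = devices * watts_each / 1_000_000
print(total_mw, "MW")                          # 600.0 MW of continuous draw
print(total_mw * 24 * 365 / 1000, "GWh/year")  # ~5256 GWh per year
```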
 

From the OFFICIAL specs released at the end of the PS4 unveiling event earlier this year, which we can safely presume are final (Sony needs its AMD wafers soon if it wants to assemble, test and refine the design before shipping finished units by the end of the year), we already know for sure that the PS4 is based on a custom AMD APU with a discrete-class IGP coupled with 8GB of 256-bit GDDR5 shared RAM.

Do note that this "GPU switching" rumor article dates back to last year, before the official specs were released.

I seriously doubt Sony or AMD would want to waste resources on a dual-IGP design. It is much cheaper to power-gate individual GCN compute units and enable only as many as the graphics workload requires.
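As a purely hypothetical sketch of what workload-based gating could look like (none of these names or thresholds come from AMD or Sony documentation; the 18-CU count is just the commonly cited PS4 GPU spec):

```python
# Hypothetical compute-unit gating logic; illustrative only,
# not AMD's actual power-management scheme.
TOTAL_CUS = 18  # commonly cited CU count for the PS4's GPU

def cus_needed(gpu_load: float, active_cus: int) -> int:
    """Pick how many CUs to leave powered for the observed load."""
    # Estimate how many of the currently active CUs the load occupies...
    target = max(1, round(gpu_load * active_cus))
    # ...and keep one spare so brief spikes don't stall a frame.
    return min(TOTAL_CUS, target + 1)

print(cus_needed(0.30, 18))  # 30% load on 18 CUs -> gate down to 6
print(cus_needed(1.00, 6))   # load saturates -> step back up to 7
```

The point is that fine-grained gating gets most of the benefit of a second low-power GPU without paying for extra silicon.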
 
People who say it would be nice to have two GPUs to save on the electric bill make me wonder: how much are you actually paying each month? I suppose an extra low-performance GPU/CPU would add something like $100 to the price, and in my country that money would pay for the electricity of my whole apartment for five years (roughly $1.60 per month). If I only count the energy the console itself uses, it would probably break down before it burned through $100 of electricity.
 

In most countries, power is far more expensive than that: $0.06-$0.15 per kWh. Unless your landlord is paying for heating/cooling/lighting or you have a tiny apartment with almost nothing plugged in, that makes it pretty difficult to get away with less than $30/month.
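For what it's worth, here is the break-even math on that hypothetical $100 second GPU, using assumed (not measured) numbers; even at those rates it takes years to pay for itself:

```python
# Rough break-even check for the "$100 extra GPU" scenario above.
# Every number here is an assumption for illustration.
extra_cost = 100.0  # assumed added cost of a second low-power GPU
watts_saved = 30    # assumed savings while in the low-power state
hours_per_day = 8
rate = 0.12         # $/kWh, mid-range of the figure quoted above

monthly_saving = watts_saved / 1000 * hours_per_day * 30 * rate
years = extra_cost / monthly_saving / 12
print(f"${monthly_saving:.2f}/month -> {years:.1f} years to break even")
# $0.86/month -> 9.6 years to break even
```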
 
Ummm... I've had an HP DV6-6135dx since day one, sometime in June of 2011: an A8-3500M APU plus an AMD Radeon HD 6750M, which can work as dual graphics and/or switchable graphics, automatically. Who is writing these damn articles?
"Without user interaction": that's exactly what it does. My only interaction is setting which GPU an application uses (the 6620G in the APU, or the dedicated 6750M) in Catalyst Control Center, and that's only if it doesn't recognize the application automatically, which 99% of the time it does on its own.
Nvidia Optimus? Oh please, that came to light in 2010. AMD dates this stuff back to 2008.
Again, who is writing these damn articles?
 