News: PlayStation 5 Pro Enhanced requirements allegedly leaked — upscaled 4K at constant 60 FPS with ray tracing is the new target

If the game is CPU-limited at 30 fps, I don't see a 10% boost to CPU clock speed being able to increase it to 60 fps, even if the GPU takes some of the workload off. Unless there is some kind of frame gen; then you'd be able to go from 30 to 45 and then use frame gen to achieve 60+, potentially.
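To put rough numbers on that, here is a minimal Python sketch; the frame times, the 10% clock scaling, and the 2x frame-gen factor are assumptions for illustration, not measurements of any particular game:

```python
# Back-of-the-envelope frame-time math for a CPU-limited game.
# All numbers are illustrative assumptions, not measurements.

cpu_frame_time_ms = 1000 / 30          # CPU-limited at 30 fps -> ~33.3 ms per frame
clock_boost = 1.10                     # hypothetical ~10% higher CPU clock

boosted_fps = 1000 / (cpu_frame_time_ms / clock_boost)
print(f"After a 10% clock boost: ~{boosted_fps:.0f} fps")   # ~33 fps, still far from 60

# Interpolation-style frame generation roughly doubles the presented rate,
# but only from whatever base frame rate the CPU can actually sustain.
base_fps = 45                          # e.g. if other optimisations lift the game to ~45 fps
presented_fps = base_fps * 2           # ~90 fps presented; input latency still tracks ~45 fps
print(f"Frame gen on a {base_fps} fps base: ~{presented_fps} fps presented")
```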
 
A PS5 Pro with RT enabled is still a more energy-efficient way to play RT-enabled games than a PC with an Nvidia RTX 4090.
Well said. The Japanese mindset is to be extremely efficient in achieving the desired level of visuals, unlike Nvidia, whose approach is to use brute force, which resulted in a graphics card (the 4090) that consumes a ridiculous amount of energy (450W), twice what the whole PS5 system consumes (225W).
If this trend continues, consoles will have a serious advantage over PCs, especially in terms of efficiency and value per dollar, and the coming PS6 would be the default choice for gamers without a second thought.
 
Well said. The Japanese mindset is to be extremely efficient in achieving the desired level of visuals, unlike Nvidia, whose approach is to use brute force, which resulted in a graphics card (the 4090) that consumes a ridiculous amount of energy (450W), twice what the whole PS5 system consumes (225W).
If this trend continues, consoles will have a serious advantage over PCs, especially in terms of efficiency and value per dollar, and the coming PS6 would be the default choice for gamers without a second thought.
You need to learn what efficiency means before you criticize products for not being efficient. The 4090 is the 2nd most efficient GPU ever released for PCs. Using more power does not automatically make something inefficient. Also, the 4090 doesn't consume 450W during typical gaming; max TDP does not equal typical power usage. The 4090 uses more power, but it is so much faster than everything else that it beats them all except the 4080 in efficiency. Even the PS5 Pro isn't going to be in the same ballpark as a 4090 in performance.

[Attached chart: energy-efficiency.png]
 
Well said. The Japanese mindset is to be extremely efficient in achieving the desired level of visuals, unlike Nvidia, whose approach is to use brute force, which resulted in a graphics card (the 4090) that consumes a ridiculous amount of energy (450W), twice what the whole PS5 system consumes (225W).
If this trend continues, consoles will have a serious advantage over PCs, especially in terms of efficiency and value per dollar, and the coming PS6 would be the default choice for gamers without a second thought.
If the article screenshot is anything to go by, NVIDIA doesn't need to fear Sony's RT implementation at all.
 
Cyberpunk 2077 (even in 1080p60) with path tracing and all bells and whistles enabled wants a word with you.

Last time I played it on my RTX 4090 it was pulling ~350W and that was before Phantom Liberty DLC.
The math I learned in grade school taught me 450 ≠ 350. Fully path traced AAA gaming does not qualify as a typical gaming scenario. Why would you even bring that up in an article about a console? The PS6 is not going to be able to run 2077 fully path traced. PS5 is running AMD hardware. WithOUT frame gen, the 4090 is over 4 times faster than the fastest AMD GPU.

[Attached chart: uYzCuMbiQJjQvKwazFDA8Z.png]
 
Fully path traced AAA gaming does not qualify as a typical gaming scenario.
Then they'd better not brag about RT at all.
Why would you even bring that up in an article about a console?
I was just clarifying that yes, the RTX 4090 uses a lot of power when gaming. Your definition of gaming is apparently different from mine. Who would buy an expensive RTX 4090 and then use it to play games that don't tap its power?
The PS6 is not going to be able to run 2077 fully path traced.
Of course it won't; even calling what's in that screenshot RT is silly.
 
I don't think 4K60 is going to be doable without upscaling or frame gen, especially if they want RT turned on, and that's going to lead to image degradation and, if they go with frame gen, a loss of responsiveness due to increased latency. Nvidia has Reflex to offset some of the added latency, but AFAIK AMD has no equivalent technology, and AMD is what is found in consoles.
 
I don't think 4K60 is going to be doable without upscaling or frame gen, especially if they want RT turned on, and that's going to lead to image degradation and, if they go with frame gen, a loss of responsiveness due to increased latency. Nvidia has Reflex to offset some of the added latency, but AFAIK AMD has no equivalent technology, and AMD is what is found in consoles.
Don't worry, if the past couple of months has taught us anything, it's that they will just throw AI at it and the AI will make it possible.

/s
 
Well said. The Japanese mindset is to be extremely efficient in achieving the desired level of visuals.
The bottom line is not whether my dad can whup your dad, because that's what the comments have devolved into. The point is that while Sony's product may not be as powerful as an Nvidia 4090 or have the CPU power of a Ryzen 7800X3D or an Intel 14900KF, you can achieve visuals good enough that the gamer doesn't feel like they are missing out on anything significant.

When I purchased a gaming rig with an i9-10850K CPU and an RTX 3080, I dialed up the details in MSFS 2020. I thought I was in gaming heaven.

Then Microsoft released it on the Xbox Series X/S. I have played MSFS 2020 on both in 4K and, tbh, if I didn't know which was which, I would not be able to tell a noticeable difference. That's what Sony is trying to do with the PS5 Pro: upgrade the visuals so they look good enough that the gamer feels happy and content without having to shell out a few grand on a new high-end PC.

None of the games I play on the Series X and PS5 make me wish I was playing them on the PC. They all look good enough.
 
You need to learn what efficiency means before you criticize products for not being efficient. The 4090 is the 2nd most efficient GPU ever released for PCs. Using more power does not automatically make something inefficient. Also, the 4090 doesn't consume 450W during typical gaming; max TDP does not equal typical power usage. The 4090 uses more power, but it is so much faster than everything else that it beats them all except the 4080 in efficiency. Even the PS5 Pro isn't going to be in the same ballpark as a 4090 in performance.

[Attached chart: energy-efficiency.png]
To be fair, it is not an apples-to-apples comparison here. Consoles are clearly more efficient because they need the least amount of total system power to run games. There are always compromises made when it comes to visual quality on a console. The only question is how bad those compromises are.

On the other hand, PCs are more general-purpose machines, with a bloated OS and other inefficiencies to deal with. But an RTX 4090 will probably deliver the best image quality and frame rates here. The RTX 4090 is rated at 450W, with some cards allowing power limits up to 600W. While that sounds inefficient, if one were to limit the graphics quality to match the console settings (often with dynamic resolution and upscaling) and cap the frame rate at 60 FPS, the RTX 4090 should draw quite a lot less than 450W. But for an RTX 4090 owner there is little reason to do that, since they probably bought it to run games at max settings.

So ultimately, it depends on each individual's priorities. If one can afford it and prefers very high FPS and image quality, then the RTX 4090 is more suitable. For me, I am fine with running games on the PS5. Some games I may still run on my PC, especially if it is easier to play them with keyboard and mouse.
 
To be fair, it is not an apples-to-apples comparison here. Consoles are clearly more efficient because they need the least amount of total system power to run games.
It doesn't appear to matter how many times this is stated. The definition of most efficient is not the one that uses the least power. It could be the one that uses the least power, but it almost never is. The lowest-powered gaming GPU at stock settings is never the most efficient gaming GPU. The PS5's GPU is slower than the slowest GPU in the chart I posted above. 220W to get less than 4 FPS is absolutely putrid efficiency, and it gets obliterated by a 4090 getting 70 fps at 450W. Twice the power to get over 15x the performance.
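Putting that comparison into a quick performance-per-watt calculation (Python; the fps and wattage figures are the ones quoted in this thread for the path-traced scenario, not independent measurements):

```python
# Performance per watt using the figures quoted above for the path-traced scenario.
# These numbers come from the thread itself, not from new benchmarks.

ps5_class_fps, ps5_class_watts = 4, 220      # "less than 4 FPS" at ~220W
rtx_4090_fps, rtx_4090_watts = 70, 450       # ~70 fps at the 450W board power

ps5_eff = ps5_class_fps / ps5_class_watts    # ~0.018 fps per watt
rtx_eff = rtx_4090_fps / rtx_4090_watts      # ~0.156 fps per watt

print(f"PS5-class GPU: {ps5_eff:.3f} fps/W")
print(f"RTX 4090:      {rtx_eff:.3f} fps/W")
print(f"Power ratio:   {rtx_4090_watts / ps5_class_watts:.1f}x")   # ~2x the power
print(f"Perf ratio:    {rtx_4090_fps / ps5_class_fps:.1f}x")       # ~17x the performance
```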