PlayStation 5 vs. Xbox Series X: Next-Gen Console Face Off

The PS4 and Xbox One really weren't that close in raw specs. They shared the same GPU architecture, but the Xbox One used a 12 CU / 768 core GPU paired with a 'special' 32MB ESRAM buffer and 8GB of DDR3, with 1310 GFLOPS of compute. The PS4 had an 18 CU / 1152 core GPU with GDDR5 instead of DDR3, and 1843 GFLOPS. So the PS4 GPU was 41% faster, and the memory bandwidth was 2.59 times higher whenever you couldn't fit what you needed into the 32MB buffer (which you couldn't).

The PS4 Pro and Xbox One X actually swapped places. The Xbox One X has 12GB of GDDR5 with 326 GB/s of shared bandwidth, 40 CUs / 2560 GPU cores, and 6001 GFLOPS of computational power. (Again, same GPU architecture, so GFLOPS is actually a very reasonable comparison point.) The PS4 Pro has 36 CUs / 2304 cores with 8GB of GDDR5 (plus an extra 1GB of DDR3 for ... I don't recall), and 4198 GFLOPS. So, the PS4 is faster than the Xbox One, but the Xbox One X is faster than the PS4 Pro.
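For reference, every one of those GFLOPS figures comes from the same formula for AMD's GCN GPUs: shader cores x 2 FLOPs per clock x clock speed. A quick sketch, using the widely reported clock speeds for each console:

    # Peak FP32 throughput for a GCN GPU: cores x 2 FLOPs per clock x clock (GHz).
    consoles = {
        "Xbox One":   (768,  0.853),   # 12 CUs
        "PS4":        (1152, 0.800),   # 18 CUs
        "PS4 Pro":    (2304, 0.911),   # 36 CUs
        "Xbox One X": (2560, 1.172),   # 40 CUs
    }
    for name, (cores, clock_ghz) in consoles.items():
        print(f"{name:11s} {cores * 2 * clock_ghz:7.1f} GFLOPS")
    # Xbox One ~1310, PS4 ~1843, PS4 Pro ~4198, Xbox One X ~6001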

I know the Xbox One and PS4 were different specs, but it isn't as drastic this time; there's no DDR3-plus-ESRAM setup on one side versus GDDR5 on the other. Overall, the spec difference didn't make much difference until true next-gen titles came out. The real difference was Microsoft's always-on DRM, not owning your games, etc.; that set them back and they never recovered.
 
On the consoles, the NVMe controller can write directly to video memory without intervention from the CPU, so latency will be very low. And it sounds like both consoles will have hardware for on-the-fly texture decompression. In theory, a game can draw upon 100GB (or even 0.5TB) of assets at any time. That's not something that can be done on a high-end gaming rig; no sane developer is going to ship a 100GB RAM requirement.
It will hopefully be quite some time before we're looking at 500GB of assets for a game! I think 100GB is already obscene and often unnecessary, especially when the difference between 2K and 4K textures is frequently minuscule.
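To put rough numbers on that (using the raw, pre-compression throughput figures both companies have quoted, roughly 5.5 GB/s for the PS5 and 2.4 GB/s for the Series X), here's how long pulling a 100GB asset pool straight off each SSD would take:

    # Time to read a 100GB asset pool at each console's quoted raw SSD throughput.
    # Hardware decompression effectively raises both figures; treat these as a floor.
    asset_pool_gb = 100
    raw_gb_per_s = {"PS5": 5.5, "Xbox Series X": 2.4}
    for name, rate in raw_gb_per_s.items():
        print(f"{name}: ~{asset_pool_gb / rate:.0f} seconds for {asset_pool_gb}GB")
    # PS5 ~18 s, Series X ~42 s. The point isn't to read it all at once, but that a
    # game can stream just the slice it needs for the next second or two of play.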
 
How the hell is the memory a "tie"??? The Series X has a MUCH wider memory bus (320-bit vs. 256-bit on the PS5) that lets it feed its larger GPU with significantly more memory bandwidth, and it doesn't have to share that bandwidth with the CPU/audio/etc., which get their own prioritized addresses and traces to the other 6GB. So please explain to me how a wider bus pushing much more bandwidth to the GPU is not a Microsoft/Xbox win again???

And if you're going to round up the PS5's TFLOP number to 10.3 (from 10.28), then you should do the same for Series X & list it as 12.2 (from 12.16). Only fair.
So, I've changed the memory. Initially, as I was writing this, I wasn't aware that Sony had actually divulged the 256-bit interface. I think it's a unified memory space, but yeah ... 320-bit GDDR6 on the Xbox is pretty nuts!

As for the TFLOPS, the Xbox is 52 CUs, or 3,328 GPU cores, running at 1.825 GHz. That works out to 12,147.2 GFLOPS, which is why it rounds down to 12.1 TFLOPS.
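For anyone checking the math, both the bandwidth and the TFLOPS figures fall out of simple formulas; a quick sketch assuming 14 Gbps GDDR6 on both consoles (the commonly cited speed) and the same cores x 2 x clock rule as the older machines:

    # GDDR6 bandwidth: bus width in bits x 14 Gbps per pin / 8 bits per byte.
    def bandwidth_gbs(bus_bits, gbps_per_pin=14):
        return bus_bits * gbps_per_pin / 8

    print(bandwidth_gbs(320))  # Series X, 10GB "GPU optimal" pool: 560 GB/s
    print(bandwidth_gbs(192))  # Series X, remaining 6GB pool:      336 GB/s
    print(bandwidth_gbs(256))  # PS5, all 16GB:                     448 GB/s

    # Same FLOPS formula as last gen: cores x 2 FLOPs per clock x clock (GHz).
    print(3328 * 2 * 1.825)    # Series X: 12147.2 GFLOPS -> 12.1 TFLOPS
    print(2304 * 2 * 2.23)     # PS5 at max boost: 10275.84 GFLOPS -> 10.3 TFLOPS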
 
Mar 21, 2020
A lot of people are overlooking the fact that file I/O takes MUCH longer than floating-point ops. This seems to be a huge misconception between the two systems. After writing a simple test on my computer with 1,000,000 floating-point ops (using *, as it's the slowest op you'll commonly see; few optimized codebases use /) and reading a file (a newline-separated file of ints) of 1,000,000 ints, the floating-point ops came to 34ms and the file I/O came to 555ms. If Sony's I/O speed is roughly twice the XSX's, i.e. roughly half the time, the PS5 could perform many more FLOPs in the time the XSX is still loading the file. I should also mention that the file I tested wasn't even in .obj format, which would take much longer to parse (only a little longer if you thread it). Meaning that, for the same scene, the PS5 could already have finished rendering by the time the XSX has finished loading the file. With rough numbers that favor the XSX, the PS5 can load the file and perform 18M FLOPs in the time the XSX can load the file and do 4M FLOPs. There is a point past which the XSX can load a scene faster, but it'd take around 50 light sources with physics already accounted for (which wouldn't be more than 1,000,000 ops).
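Something in the spirit of the test being described (a hypothetical reconstruction; the original code wasn't shared, and the absolute numbers will vary with language, disk, and OS caching):

    import time

    N = 1_000_000

    # Write a newline-separated file of N ints to read back (hypothetical file name).
    with open("ints.txt", "w") as f:
        f.write("\n".join(str(i) for i in range(N)))

    # Time N floating-point multiplications.
    start = time.perf_counter()
    x = 1.0
    for _ in range(N):
        x = x * 1.0000001
    fp_ms = (time.perf_counter() - start) * 1000

    # Time reading the file back and parsing each line to an int.
    start = time.perf_counter()
    with open("ints.txt") as f:
        values = [int(line) for line in f]
    io_ms = (time.perf_counter() - start) * 1000

    print(f"float ops: {fp_ms:.0f} ms, file read + parse: {io_ms:.0f} ms")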

Another thing people miss is that both consoles will have close-to-instantaneous seek times, meaning game sizes should stay about the same, because (unlike with HDDs) the developer won't have to put hundreds (literally) of copies of textures and models on the drive.

There are several considerations here, however. For one, 10GB of the XSX's memory has 112 GB/s more throughput. Since storage first has to be loaded into RAM, that much faster pool, where the visuals live, will more than make up for the difference in I/O.


It's also a major assumption that the PS5 can achieve its theoretical speeds. It has been stated (and some devs I know have confirmed) that normal compute power sits at around 9 TFLOPS for the PS5. It can boost, but it has to make sacrifices when it does so. The XSX is steady on pretty much all of its measurables.

One thing MS stressed was that their storage solution was designed to stay cool enough that I/O bandwidth doesn't throttle downward. It is my understanding that the storage in the PS5 has high theoretical throughput, but that even with a heatsink on the unit, performance degrades greatly over time, down to levels equal to or lower than what the XSX has.

This is really going to be the issue and, I've been told, may be why we'll see the PS5 delayed by at least six months: they have severe heat issues with the SSD, and that causes data-throughput issues. Engineering on the Xbox has been done for a while, and they aimed to deliver steady, predictable performance from all components. I've heard the XSX has been able to sustain its I/O speeds for days in their testing. I've also heard that they haven't shown the PS5 off because they don't know how to cool it, and that their original form factor did not deliver the performance.

They only put together their presentation for damage control, and they didn't tell you, as MS did, that their system could sustain its performance numbers over time.

The devs I work with have told me for a few months that they feel Sony really dropped the ball and that the PS5 would be delayed. Now Sony can pin a delay on the coronavirus crisis, but that will be hard if MS can deliver the XSX at the end of the year.

We'll have to see when they reveal the design and allow outside testing and benchmarking. MS has already done that, just under NDAs. From what I hear, Sony has not, and many of their partners are quietly concerned, though they won't say that publicly.
 

Thretosix

You forgot a BIG asterisk ... it is proprietary storage!

Remember how many people disliked Sony's handhelds and their proprietary storage?

They were costly as F.


Got some Gen 4 storage sitting around? Too bad, you can't use it, because proprietary :|
This should have been a tie for now. Once 3rd parties start making the proprietary storage for the Series X, even those prices will come down, no different than the storage for the PS5. The same goes for the now-overpriced M.2 drives that might work with the PS5; like the article says, PS5 compatibility could have heatsink and other size or heating issues. That will get easier going forward with advances in technology, but it doesn't come cheap either. There is no clear winner here. The difference will be closer to a second or two at best between them. All gamers win with this feature regardless of the console they choose.
 
Mar 23, 2020
What will be important is how long the Xbox GPU spends in boost. Sony has said that the cooling is sufficient that the PS5 should be at max clock speed, or very close to it, all the time. If the Xbox spends more time toward its base clock speed, then it will actually perform less well than the PS5.
 
Mar 27, 2020
Well, now you have them mixed up: the Xbox Series X does not boost and can run at the given frequencies all the time, so CPU at 3.8 GHz (SMT off) and GPU at 1825 MHz at the same time. The PS5 can either max out the GPU or max out the CPU, but not both at once; e.g., if the CPU is at 3.5 GHz, the GPU will hit at most 2 GHz (based on Cerny saying it would be about a 10% boost). So on the Xbox the developer has all the power available whenever they want it; on the PS5 they need to balance it out, and average performance will be even less than the maximum numbers they showed for it.
 
Mar 23, 2020
Then why does everywhere list the higher figure of the Xbox GPU as a boost clock if it is not? Can you link your source showing that the Xbox GPU is able to run at maximum frequency all the time?

Cerny did not mention developers having to juggle the CPU and GPU speeds in his recent presentation. He mentioned that a 10% drop in power only equates to a ~2% reduction in clock if the system gets too hot, but that it's expected to run at or close to the maximum frequency all the time. Sure, if it's possible for the developer to lower the CPU speed, it will give the GPU more room to manoeuvre within the heat budget, but none of that was mentioned.

This is the video I'm using as a source:
View: https://www.youtube.com/watch?v=ph8LyNIT9sg

Cerny starts talking about the size and frequency of the PS5 GPU at 31:40, and specifically mentions scaling down to control heat at 37:30.
 
Just my take: be skeptical of anything published by either manufacturer when it comes to boost clocks. Cerny did mention at one point that some "worst-case" workloads might drop clocks more, but that they're not going to be common. We don't actually know this, though, and he's basically doing PR for the console and wouldn't say anything bad. Based on how SmartShift works on PCs, don't be surprised if there are many cases where clock speeds and performance drop up to 10%. I expect the Xbox to also clock down from its maximum sometimes, but without actual hardware in hand to test, it's impossible to state for sure what typical clocks will be.
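Purely as a toy illustration of the shared-power-budget idea (this is not Sony's actual algorithm; the wattages, the total budget, and the cubic power-versus-frequency assumption are all made up for the example):

    # Toy model: CPU and GPU share one fixed power budget, and dynamic power is
    # assumed to scale roughly with frequency cubed (f * V^2, voltage tracking f).
    CPU_MAX_GHZ, GPU_MAX_GHZ = 3.5, 2.23
    CPU_MAX_W, GPU_MAX_W = 60.0, 140.0   # made-up per-block maximums
    TOTAL_BUDGET_W = 180.0               # less than 60 + 140, so both can't max out

    def power(max_w, freq, max_freq):
        return max_w * (freq / max_freq) ** 3

    def gpu_clock_given_cpu(cpu_ghz):
        # Whatever power the CPU isn't using is available to the GPU.
        leftover = TOTAL_BUDGET_W - power(CPU_MAX_W, cpu_ghz, CPU_MAX_GHZ)
        ratio = min(1.0, max(0.0, leftover / GPU_MAX_W)) ** (1 / 3)
        return GPU_MAX_GHZ * ratio

    for cpu in (3.5, 3.2, 3.0):
        print(f"CPU at {cpu} GHz -> GPU tops out around {gpu_clock_given_cpu(cpu):.2f} GHz")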
 

Thretosix


3rd parties will make drives for the proprietary storage, bringing down the costs; that happens every generation for devices like this. You seriously can't compare it to the support for Sony's handhelds, which were all failures. Curious when they will make affordable solutions small enough, with no heatsinks, for the PS5. This is a draw at best.
 
Mar 27, 2020

No, the PS5 doesn't look at thermals, but rather at activity in the cores against a set power budget. Cerny mentions at 37:00 that they can shift power from the CPU to the GPU when the CPU has less to do. This way they ensure there will never be any thermal throttling on the PS5, as that would be unpredictable for the developer.

As for the Xbox Series X, I'll use this video from Digital Foundry, who were invited by Microsoft and got hands-on time with the Xbox:
View: https://youtu.be/qcY4nRHapmE

At 2:48 he mentions that the CPU clocks are locked, and at 3:57 the GPU. So Microsoft has a higher power budget and a thermal solution to handle it. They made it very clear that they wanted locked clocks to avoid any thermal limitations from boosting, and that boosting either the CPU or GPU when the other one isn't doing much makes it harder for developers to utilize the maximum potential.
 
Deleted member 14196
That right there should be a no-brainer for people to buy the Xbox.
 
Mar 23, 2020

Thanks for linking that Digital Foundry video. Why plenty of other places incorrectly list it as a boost clock with a base of ~1.4 GHz, I don't know. It looks like the PS5 will lag behind in graphics performance this generation. Having double the SSD performance might lead to some interesting differences in games; a bit like PS1 vs. N64, where the PlayStation's CD drive meant it had massive games with varied textures, while N64 games pushed the advantage of near-instantaneous cartridge loading.
 
Xbox won't have The Last of Us, and that's really all there is to it! The Xbox library can be played on PC anyway. Like someone said before, a PlayStation, a PC, and a Switch are really the absolute most you'll ever need for current-gen gaming.
 

Thretosix

For the first time in years, I am tempted to return to console gaming. They will finally support high frame rates and variable refresh rates. The addition of an SSD is great as well!

Still, one burning question: where applicable, will these new consoles support keyboard and mouse control? If so, it's a no-brainer for me. On paper, the XSX GPU looks comparable to an RTX 2080, which runs anywhere from $950 to $1,100 CAD. (I'm still using a GTX 1070.)

If the XSX can consistently push 4K at a minimum of 60 FPS and offer keyboard and mouse support, count me in for this generation. Even 1440p at 120 FPS would be great.

Thoughts from anyone on the keyboard and mouse support? Perhaps I already missed this info...?
The Xbox One already has keyboard and mouse support. It didn't launch that way, so which games support it varies. I would think the Series X will carry it forward. Xbox has been pretty good with peripherals as well. Games like Flight Simulator, coming to both Xbox and PC, will use flight sticks and keyboards for sure.