News Report: New, Cheaper 1440p Next-Gen Xbox in the Works

bit_user

Polypheme
Ambassador
It strikes me as odd that they'd make it less powerful than the Xbox One X, in any respect. And from the sound of it, the GPU performance is indeed worse.

IMO, the smart thing to do would just be to deliver a cost-reduced version of the One X, maybe with a few updates (like newer CPU cores & improved GPU features). That way, developers could simply target the One X and know that anything which performs well on it @ 1440p would also work well on Lockhart.
 

atomicWAR

Glorious
Ambassador
It strikes me as odd that they'd make it less powerful than the Xbox One X, in any respect. And from the sound of it, the GPU performance is indeed worse.

IMO, the smart thing to do would just be to deliver a cost-reduced version of the One X, maybe with a few updates (like newer CPU cores & improved GPU features). That way, developers could simply target the One X and know that anything which performs well on it @ 1440p would also work well on Lockhart.

We were just talking about this on another forum. Someone was claiming they thought the IPC increase for RDNA 2 would be around 50-60 percent. If true, then a 4 TF RDNA 2 chip would perform like a 6 TF RDNA 1 chip, or stronger if the increase is over 50%. Personally, I think an IPC increase in the range of 35-40% is more likely, at least based on what AMD is claiming for RDNA 2 and what those claims usually translate to in actual frame rates. That would put the XBSS just behind the XB1X. With the console targeting a lower resolution than the XB1X, it would still be a solid upgrade from last-gen graphics, particularly if they go for 1080p. Regardless of how close those guesses are, I am very interested to see how the XBSS plays out.
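For anyone who wants to sanity-check that guess, here's the rough math. It treats the claimed per-clock gains as if they translate one-to-one into game performance, which is exactly the simplification being argued about, and the gain figures are just the rumored numbers from this thread, nothing AMD has confirmed:

```python
# Back-of-the-napkin scaling: "effective" TF = raw TF x (1 + per-clock gain).
def effective_tflops(raw_tflops, per_clock_gain):
    return raw_tflops * (1 + per_clock_gain)

for gain in (0.35, 0.40, 0.50, 0.60):
    print(f"4 TF with +{gain:.0%} per-clock -> ~{effective_tflops(4.0, gain):.1f} TF equivalent")
# +50-60% lands around 6.0-6.4 TF equivalent; +35-40% lands around 5.4-5.6 TF,
# i.e. just behind a 6 TF One X by this (very rough) measure.
```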
 
It strikes me as odd that they'd make it less powerful than the Xbox One X, in any respect. And from the sound of it, the GPU performance is indeed worse.
Teraflops are not necessarily directly comparable between different architectures though, and they measure compute performance, not gaming graphics performance. An RX 580 is roughly a 6 TFLOP card, and its GCN architecture is similar to what an Xbox One X uses. The newer RX 5500 XT is typically a little faster, yet it's only about a 5 TFLOP card as far as compute performance is concerned, due to its newer RDNA architecture offering more graphics performance relative to compute performance. It's possible we could see things shift a bit further in that direction with RDNA 2. The updated architecture will supposedly improve efficiency further, and part of that might involve trading away some more compute performance for a given level of graphics performance. New features like variable rate shading could also allow for some additional performance in games that support them. And this is all assuming that the rumored "about 4 teraflops" number is even accurate.
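For reference, those teraflop numbers come straight from shader count x clock x 2 (one fused multiply-add per shader per clock). A quick check using approximate boost-clock specs:

```python
# Approximate peak-FP32 math for the cards mentioned above.
def tflops(shaders, clock_ghz):
    return shaders * clock_ghz * 2 / 1000  # 2 FLOPs per shader per clock (FMA)

print(f"RX 580:     {tflops(2304, 1.340):.1f} TFLOPS (GCN)")
print(f"RX 5500 XT: {tflops(1408, 1.845):.1f} TFLOPS (RDNA)")
print(f"Xbox One X: {tflops(2560, 1.172):.1f} TFLOPS (GCN)")
# Despite the lower TFLOPS figure, the RDNA card tends to match or beat the
# RX 580 in actual games -- which is the point about compute throughput not
# mapping directly onto graphics performance.
```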
 

Chung Leong

Reputable
Dec 6, 2019
It strikes me as odd that they'd make it less powerful than the Xbox One X, in any respect. And from the sound of it, the GPU performance is indeed worse.

The next-gen GPU supports variable rate shading. Games can hit the frame-rate target with less raw computational power, and the scaling should be largely automatic thanks to VRS: on the cheaper console, peripheral parts of the scene will simply be shaded at half or quarter resolution more of the time.
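To illustrate the coarse idea (this is purely a hypothetical sketch, not how any actual engine or the VRS API does it): a shading rate is picked per screen tile, for example based on how far the tile sits from the center of attention. The rate strings mirror the usual VRS coarse-shading modes; the thresholds are made up.

```python
def shading_rate_for_tile(tile_cx, tile_cy, screen_w, screen_h):
    # Normalized distance of the tile centre from the middle of the screen.
    dx = (tile_cx - screen_w / 2) / (screen_w / 2)
    dy = (tile_cy - screen_h / 2) / (screen_h / 2)
    dist = (dx * dx + dy * dy) ** 0.5  # 0 at screen centre, ~1.4 in the corners
    if dist < 0.5:
        return "1x1"  # full shading rate where the player is looking
    elif dist < 0.9:
        return "2x1"  # half as many pixel-shader invocations
    else:
        return "2x2"  # quarter rate at the periphery

print(shading_rate_for_tile(1280, 720, 2560, 1440))  # centre tile -> "1x1"
print(shading_rate_for_tile(100, 100, 2560, 1440))   # corner tile -> "2x2"
```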
 
Apr 1, 2020
1,438
1,089
7,060
Uh, why? If the PS5 reveal was anything to go by, the flagships are going to be struggling to even hit 4K30, but cut the power by 66% and the resolution to just 2560x1440, and expect to get 60 fps? I don't see it, not without games shipping with an alternate set of textures and detail settings to lower the load level...
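Just to put numbers on that skepticism (raw pixel counts only, ignoring per-pixel cost differences):

```python
# Pixels per second pushed at each target.
px_per_sec_4k30    = 3840 * 2160 * 30   # ~249 million
px_per_sec_1440p60 = 2560 * 1440 * 60   # ~221 million

print(px_per_sec_1440p60 / px_per_sec_4k30)  # ~0.89
# 1440p60 is almost the same pixel throughput as 4K30, so hitting it on roughly
# a third of the compute would have to come from cheaper pixels (lower settings,
# VRS, etc.), not from the resolution drop alone.
```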
 

alextheblue

Distinguished
not without games shipping with an alternate set of textures and detail settings to lower the load level...
Yes. PC games do it all the time. The devs will dial things in to run well out of the box.

I think the goal here is to use newer architectures (Zen, RDNA) to achieve similar performance to the One X at a lower production cost. Of course, these cost savings may be offset if they're using an SSD, but IMO that would just mean a better overall experience at a similar price point.
 

Jim90

Distinguished
Teraflops are not necessarily directly comparable between different architectures though, and they measure compute performance, not gaming graphics performance. An RX 580 is roughly a 6 TFLOP card, and its GCN architecture is similar to what an Xbox One X uses. The newer RX 5500 XT is typically a little faster, yet it's only about a 5 TFLOP card as far as compute performance is concerned, due to its newer RDNA architecture offering more graphics performance relative to compute performance. It's possible we could see things shift a bit further in that direction with RDNA 2. The updated architecture will supposedly improve efficiency further, and part of that might involve trading away some more compute performance for a given level of graphics performance. New features like variable rate shading could also allow for some additional performance in games that support them. And this is all assuming that the rumored "about 4 teraflops" number is even accurate.

Exactly!!
As a reminder to others getting confused (or deliberately spreading lies) by implying Teraflop and 'overall' performance are directly linked, please watch Mark Cerny's PS5 "Road to PS5" video...
View: https://www.youtube.com/watch?v=ph8LyNIT9sg
 
  • Like
Reactions: alextheblue

bit_user

Polypheme
Ambassador
As a reminder to others getting confused (or deliberately spreading lies) by implying Teraflop and 'overall' performance are directly linked,
I don't think anyone in this thread is deliberately spreading lies, but the spec differences seemed large enough that I didn't expect RDNA's efficiency gains would necessarily cover them. However, I probably didn't account for the additional gains of RDNA 2.

In particular, I think the decrease in RAM size is notable. From the sound of it, total physical RAM decreased from 12 GB (One X) to 8 GB (Lockhart). Though my original post didn't mention it, that was one of the factors motivating it.

please watch Mark Cerny's PS5 "Road to PS5" video...
Thanks for posting, and I have a lot of respect for Mr. Cerny, but 52:44 (+ ads) is a lot to ask of someone.
 
  • Like
Reactions: alextheblue

Fleet33

Honorable
Nov 13, 2015
Teraflops are not necessarily directly comparable between different architectures though, and they measure compute performance, not gaming graphics performance. An RX 580 is roughly a 6 TFLOP card, and its GCN architecture is similar to what an Xbox One X uses. The newer RX 5500 XT is typically a little faster, yet it's only about a 5 TFLOP card as far as compute performance is concerned, due to its newer RDNA architecture offering more graphics performance relative to compute performance. It's possible we could see things shift a bit further in that direction with RDNA 2. The updated architecture will supposedly improve efficiency further, and part of that might involve trading away some more compute performance for a given level of graphics performance. New features like variable rate shading could also allow for some additional performance in games that support them. And this is all assuming that the rumored "about 4 teraflops" number is even accurate.


Exactly! There are too many simplistic comparisons and misunderstandings spreading like wildfire, and then you have console gamers posting false information across the internet. That Jaguar/GCN APU was weak, mostly in the CPU department.

A console aiming for 1440p (with possibly some heavy graphical settings turned down) would be a hugely smart play by Microsoft. Besides, it's going to look great on a 1080p television anyway...
As someone stated already, this has always been done on PC: turn down some demanding settings (if needed) that in some cases aren't that noticeable.

Even the "checkerboard" rendering used last generation looked great in 4K. Who's to say this Lockhart won't deliver 4K gaming? It's a nice compromise for the money you'll be saving! Now if you want true native 4K, then go for the Series X.
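For anyone unfamiliar, the core trick of checkerboarding is only shading half the pixels each frame in an alternating checker pattern and filling in the rest from the previous frame. A stripped-down sketch of the idea (ignoring the motion-vector and blending tricks real implementations use):

```python
import numpy as np

def checkerboard_merge(rendered_half, previous_frame, frame_index):
    """Combine this frame's rendered checker cells with the prior full frame.
    'rendered_half' is a full-size buffer where only half the pixels (this
    frame's checker cells) are valid."""
    h, w = previous_frame.shape[:2]
    yy, xx = np.mgrid[0:h, 0:w]
    rendered_this_frame = ((xx + yy + frame_index) % 2) == 0  # alternates each frame
    out = previous_frame.copy()
    out[rendered_this_frame] = rendered_half[rendered_this_frame]
    return out
```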

The problem would be the general public (and PS fanboys) who don't understand the tech and will just spread more false information. I haven't purchased a console in over 15 years, but that Series X is tempting.
I need to build a new PC (as the current one is going on 10 years), and that Series X could satisfy me for 3 years while I wait for new PC part advancements and improvements in the coming years.
 

bit_user

Polypheme
Ambassador
A console aiming for 1440p (with possibly some heavy graphical settings turned down) would be a hugely smart play by Microsoft. Besides, it's going to look great on a 1080p television anyway...
How many people do console gaming on a PC monitor, though? That's one thing that really jumped out at me about 1440p. I suspect game developers are going to do all of their testing at 4k and 1080p, with maybe a token run or two at 1440p and 720p to make sure they're not completely broken.

I need to build a new PC (as the current one is going on 10 years), and that Series X could satisfy me for 3 years while I wait for new PC part advancements and improvements in the coming years.
Why do you need to wait an additional 3 years to upgrade a PC that's already 10 years old? You can find plenty of benchmarks comparing multiple generations, and it seems well worthwhile to upgrade even from Sandy Bridge already (though I suspect you're on Nehalem - Sandy Bridge will be 10 years old in 2021).

That said, if you're only considering an Intel CPU, I'd have to agree that Comet Lake's high power-consumption & corresponding cooling requirements make it rather unappealing. Still, you probably have only about 15 months until Intel launches its first 10 nm desktop CPU.
 

alextheblue

Distinguished
How many people do console gaming on a PC monitor, though? That's one thing that really jumped out at me about 1440p. I suspect game developers are going to do all of their testing at 4k and 1080p, with maybe a token run or two at 1440p and 720p to make sure they're not completely broken.
I would bet many devs will offer options (like how some games let you favor fidelity vs framerate) and/or automatically tweak things depending on output resolution. If you're hooked up to a 1440p or 4K panel, increasingly-popular dynamic resolution may offer benefits vs a 1080p panel. I don't want to overstate the potential benefits, but it may not be like times of yore where a console game targeted a single resolution and set of quality settings and simply scaled output resolution. But again it will vary substantially by game and developer... the main reason to use a PC-centric panel is response time. I've considered using a low-latency PC display combined with an OSSC for my classic consoles, but the cost always makes me reconsider. Maybe some day!
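For what it's worth, the dynamic resolution part is conceptually just a feedback loop on frame time. A rough sketch (the numbers and names are made up, not from any particular engine):

```python
# If recent frames blow the budget, render fewer pixels; if there's headroom,
# creep back up toward native. The output is then upscaled to the display.
def update_render_scale(scale, frame_time_ms, target_ms=16.7,
                        min_scale=0.6, max_scale=1.0, step=0.05):
    if frame_time_ms > target_ms * 1.05:     # over budget -> drop resolution
        return max(min_scale, scale - step)
    if frame_time_ms < target_ms * 0.85:     # comfortable headroom -> raise it
        return min(max_scale, scale + step)
    return scale

# e.g. at a 1440p output, a scale of 0.8 means rendering 2048x1152 internally.
scale = 1.0
for frame_time in (18.2, 17.9, 16.1, 15.3, 14.0):
    scale = update_render_scale(scale, frame_time)
    print(round(scale, 2))
```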
In particular, I think the decrease in RAM size is notable. From the sound of it, total physical RAM decreased from 12 GB (One X) to 8 GB (Lockhart). Though my original post didn't mention it, that was one of the factors motivating it.
Well, if they're using an SSD like the Series X, I don't feel the reduced RAM will be a substantial problem. Especially with developers getting used to heavily leaning on the SSD on both Series X and PS5. So yeah, I still feel they'll be in the One X ballpark in terms of visuals, while enabling a better overall experience out of a smaller APU. The power and thermal benefits could be quite substantial too, so I expect this to be a simpler, smaller package. My initial reaction matched others, I was like why not just dieshrink the One X chip? But really I think this is a better use of their resources, especially if it was being developed in tandem with the larger APU going in Series X. RDNA (2?) and Zen (2?) will be wildly more efficient than cat cores and GCN, and I bet the higher IPC (regardless of final core count) will chafe devs less when scaling down from the top model.
 

Fleet33

Honorable
Nov 13, 2015
How many people do console gaming on a PC monitor, though? That's one thing that really jumped out at me about 1440p. I suspect game developers are going to do all of their testing at 4k and 1080p, with maybe a token run or two at 1440p and 720p to make sure they're not completely broken.


Why do you need to wait an additional 3 years to upgrade a PC that's already 10 years old? You can find plenty of benchmarks comparing multiple generations, and it seems well worthwhile to upgrade even from Sandy Bridge already (though I suspect you're on Nehalem - Sandy Bridge will be 10 years old in 2021).

That said, if you're only considering an Intel CPU, I'd have to agree that Comet Lake's high power-consumption & corresponding cooling requirements make it rather unappealing. Still, you probably have only about 15 months until Intel launches its first 10 nm desktop CPU.

The 1440p resolution isn't the focus; it's that it could be capable of higher res than 1080p, so it can still look good on a 4K television. Remember that Unreal tech demo running at 1440p on the PS4? Everyone was impressed, but it wasn't even 4K. Forget PC monitors; I don't think many gamers use a console with a PC monitor. But a more affordable console for the masses could be a big deal in countries with unfair price markups, where people pay nearly double or more. Think Australia and plenty of other countries in Africa and the Middle East that pay more per console unit than the United States or European countries.


Yes, I'm waiting to build a new PC with better parts than are currently out. Improvements in ray tracing with better 4K FPS without shelling out nearly $1,000 for NVIDIA RTX cards. Some refined Navi cards in AMD's next lineup should do the trick. Faster SSDs with more TB for under $200. A newer ultrawide 3440x1440 144 Hz monitor with OLED (or MicroLED? please?).

I'm in no rush at all; I'm just very picky about what I want, lol. I could also be looking to rent out my house to move closer to my job. I'll find an apartment and have the GF move in so we can both enjoy some games on the television. It depends on how big the apartment is, and whether I purchase another house where I can set up my PC.

I don't need a new PC, I want one. I'll still continue to use my current computer, but a 4K ray-tracing system for around $400-500 is a damn good deal. AFAIK, first-party Xbox games can be purchased once and played on PC using the same account. PlayStation doesn't offer this, nor does PlayStation offer support for older games.
I'd sell the Series X after 3 years or so and continue my PC gaming lifestyle ;)
 
Last edited:
  • Like
Reactions: alextheblue

bit_user

Polypheme
Ambassador
I've considered using a low-latency PC display combined with an OSSC for my classic consoles,
IMO, the best scenario (though not very practical) would be to hack the console and replace its video signal generator with a digital output. I'm not sure how hard that would be, on different consoles. I think it'd be easiest on those with a conventional framebuffer + discrete RAMDAC.

if they're using an SSD like the Series X, I don't feel the reduced RAM will be a substantial problem.
Depending on how the RAM is used, it could be problematic to fall back on virtual memory. You're not going to get away with swapping out stuff that's needed from one frame to the next.

Basically, what this means is that games are going to have to be designed for 8 GB of physical RAM, except for the extra textures and model resolution needed for 4K. All of the baseline data structures, AI, maps, OS, etc. (everything that's needed at any given point in time) will have to fit in that 8 GB of physical RAM. So, the only thing they can do with the extra 8 GB of RAM in the high-end version is higher-res textures and models. I think that's pretty disappointing, actually.
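To put that in hypothetical terms (these numbers are invented purely to illustrate the constraint, not leaked specs):

```python
# Everything that must be resident every frame has to fit in the small
# console's 8 GB; only the streamable asset pools can grow on a 16 GB machine.
resident_every_frame_gb = {
    "OS / system reserve":             2.0,
    "game code, sim, AI, world state": 2.5,
    "render targets & buffers":        1.5,
}
asset_pool_small_gb = 2.0    # textures/models sized for 1440p
asset_pool_large_gb = 10.0   # the same game with 4K-grade assets

print(sum(resident_every_frame_gb.values()) + asset_pool_small_gb)  # 8.0
print(sum(resident_every_frame_gb.values()) + asset_pool_large_gb)  # 16.0
```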

My initial reaction matched others, I was like why not just dieshrink the One X chip?
A big problem with this approach is memory. The "One X" had a 384-bit GDDR5 setup. That adds cost in the memory and motherboard. So, I get why they wanted to use a cheaper 256-bit interface, but a sad consequence is RAM size. I wonder if they really couldn't have contracted some 12 gigabit GDDR6 dies.
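The arithmetic behind that, for anyone curious (treating each GDDR chip as sitting on its own 32-bit channel, which is the usual layout):

```python
# Capacity follows from bus width: chips = bus_width / 32 bits per chip,
# capacity = chips x die density.
def capacity_gb(bus_width_bits, die_density_gbit):
    chips = bus_width_bits // 32
    return chips * die_density_gbit / 8  # 8 Gbit = 1 GB

print(capacity_gb(384, 8))    # One X: 12 chips of 8 Gb GDDR5 -> 12.0 GB
print(capacity_gb(256, 8))    # a 256-bit bus with 8 Gb dies  -> 8.0 GB
print(capacity_gb(256, 12))   # the hypothetical 12 Gb GDDR6 dies -> 12.0 GB
```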
 
  • Like
Reactions: alextheblue

bit_user

Polypheme
Ambassador
Yes, I'm waiting to build a new PC with better parts than are currently out. Improvements in ray tracing with better 4K FPS without shelling out nearly $1,000 for NVIDIA RTX cards. Some refined Navi cards in AMD's next lineup should do the trick. Faster SSDs with more TB for under $200. A newer ultrawide 3440x1440 144 Hz monitor with OLED (or MicroLED? please?).
That's all stuff you can add incrementally, if you just choose to upgrade your CPU, motherboard, and memory today.

As for the monitor, you'll probably die of old age before OLED becomes a practical and affordable option. I've been waiting for like 15 years, and it's still not here!

I don't need a new PC, I want one.
Yeah, a Sandy Bridge i7 is still meeting my needs. I can't really justify upgrading it this year.

PlayStation doesn't offer this, nor does PlayStation offer support for older games.
Didn't they say the PS5 would support all PS4 games?
 
  • Like
Reactions: alextheblue

alextheblue

Distinguished
IMO, the best scenario (though not very practical) would be to hack the console and replace its video signal generator with a digital output. I'm not sure how hard that would be, on different consoles.
I'm thinking mainly of 8 to 32 bit era consoles, in which case you might just take the raw RGB signal and bypass the video encoder chip. There are sometimes mods that do just that, resulting in an improved video output (to various degrees), but it's still analog and needs further processing for/by a modern display. Even those aren't generally worth it IMO - well, unless you had a particular model with a dodgy video output chip and/or shoddy board layout to start with, as they often used various components and board revisions over their lifetime. In many cases the stock internals are good enough, and/or the system just needs fresh capacitors.

For most users, the bigger issues stem from stock output types, what inputs a modern TV has, and how modern TVs process those signals. It's better to prioritize those larger issues before worrying about internals, IMHO. After that, there may not be enough room for improvement to even concern yourself with (again, certain exceptions aside). There are various plug-and-play options, but if you have multiple consoles, I think the OSSC is the best overall. At least when coupled with good quality, well shielded SCART cables + a really good SCART switch. Consoles -> Switch -> OSSC -> HDMI. If you have a single old console in mind, there are other good options, depending on your display. Generic "X analog input to HDMI" adapters tend to be garbage plus they add extra lag, but there are more focused solutions.
 
  • Like
Reactions: bit_user

alextheblue

Distinguished
Depending on how the RAM is used, it could be problematic to fall back on virtual memory. You're not going to get away with swapping out stuff that's needed from one frame to the next.
Ah, I should have been more clear. I was referring to games built from the ground up with the SSD in mind. For example, they develop a title which utilizes the SSD for streaming for both the Series X and Series S (scaled down for S as appropriate), and then also build versions for older Xbox models that can't rapidly stream and decode compressed assets. There's no way they'll use the SSD as virtual memory and attempt to straight up run One X games on it. They will have to run original Xbox One binaries and assets for older titles, for backwards compat.

So yeah, the system's graphics should match the One X, but not for existing code. Devs will obviously be able to update their games (if they choose) to offer an enhanced version (just as some original Xbox One games were updated to be "One X Enhanced"). But even updated versions of existing titles probably won't quite reach One X levels of fidelity, given the extensive work required to get the best results. Any improvements are welcome, but I don't think you'll see the same gains as in games that are built with streaming from the SSD in mind from the start.
 
Last edited:
  • Like
Reactions: bit_user

bit_user

Polypheme
Ambassador
I'm thinking mainly of 8 to 32 bit era consoles,
By the time of the 32-bit era, I think consoles were framebuffer-based. So, it'd be cool if you could just swap out the RAMDAC for an HDMI output section. Of course, I'm assuming they had a separate RAMDAC. Probably by the next generation, it was already integrated into the GPU.

Anyway, I bought a first-gen PS3, specifically because it had full PS1 and PS2 backwards compatibility + HDMI out. It even had a few output enhancements you could enable.

In many cases the stock internals are good enough, and/or the system just needs fresh capacitors.
Do the capacitors fail only with use, or is sitting on the shelf for long enough all that's required for them to degrade?