[SOLVED] Next-gen consoles and their impact on system requirements

Apr 8, 2020
Hi, I'm going to be building a new mid-range PC very soon. However, I can't wait for the Ryzen 4000 and Nvidia 30xx series; I have to do it within a few weeks.

I was wondering: how are the consoles launching later this year going to affect PC system requirements? Besides work, I also heavily use my PC for gaming, and I'm worried that I'm going to waste 900 euros by building something that's going to be totally obsolete in 2 years.

I need opinions on this topic: how will the next gen impact the low-end and mid-range parts of the market, since that's where most people are?

I'm not asking just for myself, but for every fellow gamer in the same boat as me. We all need opinions, so please, write away! Thank you!
 
A lot of people are observing that the next-gen consoles are using 8-core processors as the basis. That's leading them to speculate that games coded for the consoles will also need an 8-core processor when ported to PC, e.g. Red Dead Redemption 2.

I don't know if that's a valid assumption, as consoles could easily need the cores for other concurrent tasks they'll be expected to do that PCs won't. For instance: you may not care to stream with your PC, so 6 cores is quite enough, but if a console is designed to stream concurrently at gaming resolution, it will be capable of it whether or not you use it.

Also, I can't help but believe console games share every game's problem of depending on one thread more heavily due to temporal dependencies, which ultimately makes that thread the bottleneck. But I'm sure a lot of opinions will surface on this... and they'll be interesting to read.
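Just to illustrate that last point with made-up numbers (a toy sketch in Python, not data from any real game): a frame can only finish when its slowest thread for that frame finishes, so spreading the lighter work across more cores doesn't help once one heavy main thread dominates.

```python
# Toy illustration with made-up per-frame workloads (milliseconds).
# A frame is only done when its slowest thread for that frame is done,
# so the heaviest thread sets the FPS cap no matter how many cores you add.

def frame_time_ms(thread_work_ms):
    return max(thread_work_ms)

four_threads  = [12.0, 5.0, 4.0, 3.0]                       # heavy main thread + workers
eight_threads = [12.0, 3.0, 3.0, 2.0, 2.0, 2.0, 1.5, 1.5]   # same main thread, work spread wider

for label, work in [("4 threads", four_threads), ("8 threads", eight_threads)]:
    t = frame_time_ms(work)
    print(f"{label}: {t:.1f} ms per frame -> ~{1000.0 / t:.0f} FPS cap")

# Both cases land at ~83 FPS: the 12 ms main thread is still the limit.
```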
 
Apr 8, 2020
A lot of people are observing that the next-gen consoles are using 8-core processors as the basis. That's leading them to speculate that games coded for the consoles will also need an 8-core processor when ported to PC, e.g. Red Dead Redemption 2.

I don't know if that's a valid assumption, as consoles could easily need the cores for other concurrent tasks they'll be expected to do that PCs won't. For instance: you may not care to stream with your PC, so 6 cores is quite enough, but if a console is designed to stream concurrently at gaming resolution, it will be capable of it whether or not you use it.

Also, I can't help but believe console games share every game's problem of depending on one thread more heavily due to temporal dependencies, which ultimately makes that thread the bottleneck. But I'm sure a lot of opinions will surface on this... and they'll be interesting to read.
That makes sense, but I guess at least 2 of those 8 threads are going to be used for something other than games (OS, streaming as you said, etc.). They are also probably going to run at much lower clocks than their PC counterparts.
 
....They are also probably going to run at much lower clocks than their PC counterparts.

Not saying they are overclocked... but from what I've read (never owned a console myself), they can still be toasty-hot systems. Probably as much because they're densely packed, but a loss of cooling (fan failure, or sitting in a cabinet with no ventilation) can lead to very early catastrophic failure.
 

IDProG

Distinguished
Lots of people seem to get the wrong idea about this.

Here, let me explain.

There is really no such thing as "this game is optimized for 8 cores" or "this game is optimized for 4 cores". It's all just a hyper-simplified explanation for non-tech-savvy people so that they can understand, and we don't have to explain lots of things they wouldn't understand anyway.

However, since you asked this question, I'm going to assume you're tech-savvy enough to understand the actual explanation.

So, here is the true explanation as to why system requirements are as they are.

First, I will tell you how the CPU and GPU work together when gaming.

The CPU processes the game's instructions; the GPU renders the frames based on what the CPU has processed.

The more frames the CPU can prepare per second, the more FPS it can feed to the GPU.
The same goes for the GPU: the more frames it can render per second, the more FPS it can put on screen.

If the CPU can prepare more frames than the GPU can render, the CPU will not use its full power and will instead throttle back to match the GPU's maximum rendering capability. That is what's called a GPU bottleneck.

If the GPU can render more frames than the CPU can prepare, the GPU will not use its full power and will instead wait on the CPU's maximum processing capability. That is what's called a CPU bottleneck.
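To put the same idea in toy code form (a rough sketch with made-up numbers, not a benchmark of any real system): the FPS you actually see is capped by whichever side is slower.

```python
# Minimal sketch of the bottleneck idea: displayed FPS is the minimum of
# what the CPU can prepare and what the GPU can render (made-up numbers).

def effective_fps(cpu_fps, gpu_fps):
    fps = min(cpu_fps, gpu_fps)
    limiter = "CPU bottleneck" if cpu_fps < gpu_fps else "GPU bottleneck"
    return fps, limiter

print(effective_fps(cpu_fps=140, gpu_fps=90))   # (90, 'GPU bottleneck')
print(effective_fps(cpu_fps=70, gpu_fps=120))   # (70, 'CPU bottleneck')
```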

An easy way to spot a bottleneck is to watch CPU and GPU usage while gaming. If GPU usage is at 100% and CPU usage is below 90%, it's a GPU bottleneck. If it's the other way around, it's a CPU bottleneck.
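If you'd rather log this than eyeball an overlay, here is a rough sketch of that check. It assumes an NVIDIA card and the third-party psutil and GPUtil Python packages (pip install psutil gputil); Task Manager or MSI Afterburner will show you the same thing.

```python
# Sample overall CPU and GPU usage once a second while a game is running.
# Assumes an NVIDIA GPU plus the psutil and GPUtil packages (both third-party).
import psutil
import GPUtil

for _ in range(10):                            # ~10 seconds of samples
    cpu = psutil.cpu_percent(interval=1)       # average CPU usage over the last second, in %
    gpus = GPUtil.getGPUs()
    gpu = gpus[0].load * 100 if gpus else 0.0  # first GPU's load, in %
    print(f"CPU {cpu:5.1f}%  GPU {gpu:5.1f}%")

# Sustained GPU ~100% with CPU well below that -> GPU bottleneck (the healthy case).
# CPU pegged while the GPU sits below ~95%     -> CPU bottleneck.
# Note: the overall CPU percentage can hide a single maxed-out core, so check
# per-core usage too if the numbers look ambiguous.
```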

It is preferable to have a GPU bottleneck rather than a CPU bottleneck.

This is because if your CPU is not running at 100%, there is still headroom for it to do other things, like multitasking, playing music, and, most importantly, streaming.

Now, why does this "optimized for X cores" argument exist?

The answer is quite simple, actually: GPU power increases over time.

The more frames a GPU can render, the more processing power the CPU must have to keep GPU usage at 100% so that a CPU bottleneck does not happen.

Back when the PS4 and Xbox One were announced, GPUs were not very powerful. Even the most powerful consumer GPU at the time, the GTX 780 Ti, is now only slightly more powerful than a GTX 1050 Ti. That's why quad-core CPUs like the i7-4790K were enough to keep a 780 Ti at 100% usage, and buying a higher-core-count CPU would have been a waste.

Nowadays, GPUs have gone through an insane power increase. The current top-tier GPU, the RTX 2080 Ti, is around 3 times as powerful as the GTX 780 Ti, and it needs roughly 3 times the CPU throughput to be kept at 100% usage. That's why quad-cores like the 7700K are not enough for a 2080 Ti at 1080p.

Now, of course, all of that applies IF YOU PLAY AT 1080p.

If you face a CPU bottleneck with your PC at 1080p, there is a way to mitigate the problem: increase the resolution. The higher the resolution, the more work the GPU has to do to render each frame, so it can deliver fewer frames per second.

In simpler words, if you increase the resolution, the GPU works harder while the load on your CPU stays roughly the same.
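As a back-of-the-envelope illustration (assuming, very roughly, that GPU frame time scales with pixel count, which real games only approximate, and using made-up CPU/GPU numbers):

```python
# Hypothetical rig: the CPU can prepare ~100 FPS, the GPU can render ~160 FPS at 1080p.
# Scaling GPU throughput by pixel count shows how raising the resolution
# moves you from a CPU bottleneck to a GPU bottleneck.

resolutions = {"1080p": 1920 * 1080, "1440p": 2560 * 1440, "4K": 3840 * 2160}
cpu_fps = 100
gpu_fps_at_1080p = 160

for name, pixels in resolutions.items():
    gpu_fps = gpu_fps_at_1080p * resolutions["1080p"] / pixels
    fps = min(cpu_fps, gpu_fps)
    limiter = "CPU" if cpu_fps < gpu_fps else "GPU"
    print(f"{name}: GPU could render ~{gpu_fps:.0f} FPS, you see ~{fps:.0f} FPS ({limiter}-bound)")

# 1080p: GPU ~160, you see ~100 (CPU-bound)
# 1440p: GPU ~90,  you see ~90  (GPU-bound: the CPU bottleneck is gone)
# 4K:    GPU ~40,  you see ~40  (GPU-bound)
```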
 
Apr 8, 2020
Not saying they are overclocked... but from what I've read (never owned a console myself), they can still be toasty-hot systems. Probably as much because they're densely packed, but a loss of cooling (fan failure, or sitting in a cabinet with no ventilation) can lead to very early catastrophic failure.
Yeah, they are indeed dense, and with such a dense system I don't think it stays cool even at lower clocks. For example, my TV router thingy (sorry, English is not my native language) is painfully slow just navigating its menus, but it gets so hot I think you could fry an egg on it, because it is small and relies only on passive cooling.
 
Apr 8, 2020
Lots of people seem to get the wrong idea about this.

Here, let me explain.

There is really no such thing as "this game is optimized for 8 cores" or "this game is optimized for 4 cores". It's all just a hyper-simplified explanation for non-tech-savvy people so that they can understand, and we don't have to explain lots of things they wouldn't understand anyway.
...
That's a great explanation, and I really appreciate the time it took to write it. However, my main question is: will we see drastic increases in system requirements once the next gen becomes the standard?
 

Phaaze88

Titan
Ambassador
That's a great explanation, and I really appreciate the time it took to write it. However, my main question is: will we see drastic increases in system requirements once the next gen becomes the standard?
1) It may not seem like it, but the console market is larger than the DIY PC one.
2) Progression of DIY PC hardware far outpaces that of consoles.
3) Unfortunately, DIY PC hardware is held back by the slow pace of software development, which at times tends to track the consoles.

I'd say no. If you've got something decent on the PC end, then you are ahead of the consoles and waiting for them to catch up.
 
Solution
Nowadays, GPUs have gone through an insane power increase. The current top-tier GPU, the RTX 2080 Ti, is around 3 times as powerful as the GTX 780 Ti, and it needs roughly 3 times the CPU throughput to be kept at 100% usage. That's why quad-cores like the 7700K are not enough for a 2080 Ti at 1080p.

Now, of course, all of that applies IF YOU PLAY AT 1080p.
View: https://www.youtube.com/watch?v=B0QmnBWoHao

That's a great explanation, and I really appreciate the time it took to write it. However, my main question is: will we see drastic increases in system requirements once the next gen becomes the standard?
The current consoles already use 8 cores, 8 crappy cores, but still. Before they came out, everybody was all doom and gloom: dual and quad cores are dead already, 8 cores is going to be the minimum. Now we are in 2020, and except for one or two games, everything still plays fine even on dual cores.
 
Apr 24, 2020
Hi, I'm going to be building a new mid-range PC very soon. However, I can't wait for the Ryzen 4000 and Nvidia 30xx series; I have to do it within a few weeks.

I was wondering: how are the consoles launching later this year going to affect PC system requirements? Besides work, I also heavily use my PC for gaming, and I'm worried that I'm going to waste 900 euros by building something that's going to be totally obsolete in 2 years.

I need opinions on this topic: how will the next gen impact the low-end and mid-range parts of the market, since that's where most people are?

I'm not asking just for myself, but for every fellow gamer in the same boat as me. We all need opinions, so please, write away! Thank you!
My friend bought a $1000 PC back in 2014 and he is still using it with no problems, but he plays most games on low due to his GTX 660 Ti. His other components are still rock solid; he will just upgrade his GPU. So if you don't mind lowering settings, you can enjoy your PC for 6-8 years, who knows, maybe more.
 

IDProG

Distinguished
Yeah, because comparing a hyperthreading CPU to a non-hyperthreading CPU is definitely a great idea.

You know, I also found this video

So yeah, comparing a non-hyperthreading CPU to a hyperthreading CPU is NOT quite a valid comparison.

However, if we compare two hyperthreading CPUs together, that's when we actually see a real difference.
 

IDProG

Distinguished
So you need a 2080 Ti, you need to run at 1080p, and you need a badly coded game to make the point that a 4c/8t isn't enough anymore...
and it still runs smoothly and at high FPS.
The only thing that prevents me from actually showing you that these graphics cards (GTX 1080 Ti, RTX 2070 Super, RTX 2080, RTX 2080 Super, RTX 2080 Ti) are bottlenecked by the 7700K is the fact that games are not optimized for high frame rates.

Most games are optimized around consoles, which run games at 60 FPS maximum. Usually, when you surpass a certain frame rate (it varies by game, but it's usually around 60-100 FPS), the GPU stops sitting at 100% usage and the frame rates become very unstable. Worst case, Detroit: Become Human straight up refuses to run at more than 60 FPS. Even in games that are optimized for higher-than-normal frame rates, like CS:GO, once you surpass 240 FPS you experience the exact same thing.

Well, your defense of the bottleneck is that "it runs at smooth FPS"?

Look, I'm sorry if you have a 7700K, but that's the fact: the 5 aforementioned graphics cards are bottlenecked by the 7700K at 1080p. And FYI, the future will be absolutely insane.

Turing was actually said to be like "Kepler trying to run DX12". It was an experimental generation, used by Nvidia to showcase ray tracing and by developers to make ray tracing games. It was never expected to age well. So comparing against a Turing card is not really a great idea.

The real deal would be Ampere or RDNA 2.0 later this year.

We're talking about a 3060 that reaches 2080 Ti performance, at least in ray tracing.

If the 3060, a mid-range card, is bottlenecked by the 7700K at 1080p, then I guess I'm not wrong to say that quad-cores are on the brink of obsolescence.
 
the fact that games are not optimized for high frame rates.

Most games are optimized around consoles, which run games at 60 FPS maximum. Usually, when you surpass a certain frame rate (it varies by game, but it's usually around 60-100 FPS), the GPU stops sitting at 100% usage and the frame rates become very unstable.
So what you are saying is that the 7700K is even "less worse", because games are made for 60 FPS and the frame rates you are showing us are just fine.
 
Read about "Strawman Argument Fallacy".
Is that what you are trying to fail at?!
Hey, nobody is going to say that the 7700K isn't going to underutilise a 2080 Ti, let alone an even newer future GPU, but your first video did not show that, and the others just show that the 7700K is doing just fine. Even though it doesn't push the 2080 Ti to 100%, it runs all the games without any issue.
There is no reason to make a scene any time someone mentions a 7700K in 2020.
 

IDProG

Distinguished
Is that what you are trying to fail at?!
Well, it seems to me that that's it.

I'm going to leave this discussion before it gets too off-topic and I trigger unnecessary infractions or bans.

But before that, I will tell you some facts.
  1. A bottleneck IS a bottleneck. It doesn't matter if the game runs at 10, 100, or 1000 FPS. If the CPU is at its limit and the GPU isn't at 100% usage, it's a CPU bottleneck.
  2. 5 graphics cards are currently bottlenecked by the 7700K at 1080p.
  3. That number will only grow later on.

Don't worry, though. Quad cores are not dead, yet.

Though the once high-end 7700K has been reduced to a low-end CPU. The 3300X beats it in every way, and it's a $120 CPU versus $300 (yes, that's a used price). I feel sorry for everyone who bought one.

Peace out.
 
View: https://www.youtube.com/watch?v=B0QmnBWoHao


The current consoles already use 8 cores, 8 crappy cores, but still. Before they came out, everybody was all doom and gloom: dual and quad cores are dead already, 8 cores is going to be the minimum. Now we are in 2020, and except for one or two games, everything still plays fine even on dual cores.
So The Division 2, COD: MW, Wildlands: Breakpoint, Battlefield 5, AC: Odyssey, Jedi: Fallen Order... will show no frame rate increase going from a dual-core to an 8-core CPU?