[SOLVED] New GPU: Will It Underperform?

Dec 14, 2020
Hi, I currently have:
Processor: i7-8700K CPU @ 3.70GHz
Motherboard: ASUSTeK COMPUTER INC. TUF Z370-PLUS GAMING (LGA1151)
RAM: 32GB DDR4 1069MHz
GPU: currently a 1080 Ti, but I'm looking at the MSI GeForce RTX 3080 10 GB GAMING X TRIO video card
Do you know how much this GPU will be held back, as in a "bottleneck"?
What CPU would I need to get the best out of the 3080? An i9-9900K?
 

Karadjgne

Titan
Ambassador
CPU is fps. GPU is fps on screen. The CPU isn't reliant on resolution; it gives the pre-rendered frame to the GPU, and then the GPU has to take that, apply resolution and details, and shove it out the door.

The higher the resolution, the harder the GPU has to work, but the CPU's job doesn't change.

So a CPU like the 8700K will kick out a decently high fps, and then it's up to the GPU to finish coloring at 4K, which is 4x as many pixels as 1080p, or at 1440p, which is about 1.8x as many. At 4K you'll usually find that the GPU is the limiting factor, even with a 3090.
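For anyone who wants the actual numbers behind that, here's a quick sketch of the pixel math in Python (assuming the usual 1920x1080, 2560x1440, and 3840x2160 resolutions):

```python
# Rough pixel math behind the "4x / 1.8x" figures above.
# Resolutions assumed: 1080p = 1920x1080, 1440p = 2560x1440, 4K (UHD) = 3840x2160.

resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = 1920 * 1080  # pixels per frame at 1080p

for name, (width, height) in resolutions.items():
    pixels = width * height
    print(f"{name}: {pixels:,} pixels per frame ({pixels / base:.2f}x the 1080p workload)")

# 1080p: 2,073,600 pixels per frame (1.00x the 1080p workload)
# 1440p: 3,686,400 pixels per frame (1.78x the 1080p workload)
# 4K: 8,294,400 pixels per frame (4.00x the 1080p workload)
```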
 
Solution
The CPU isn't reliant on resolution; it gives the pre-rendered frame to the GPU, and then the GPU has to take that, apply resolution and details, and shove it out the door.

This is most certainly not what happens between a CPU and a GPU when gaming. Ugh. TBH, a GPU is much more complex than that.

For the un/initiated, this is a better description of what occurs: How Graphics Cards Work | HowStuffWorks
 
What about a 3090? Would both of them get close to full usage at 4K and 1440p?

A 3090 might push it a little. With that said, you're talking about a small enough percentage.

At 1080p, is an i9 going to get more FPS with a strong GPU over an 8700K? Yes. But nothing crazy.

I'd agree with @hotaru.hino in that a 3090 is not worth the increase in price over the 3080. The 3080 is the sweet spot.

Whilst I hardly agree with anything @Karadjgne said in their post, I would agree that above 1440p the GPU is the limiting factor.
 
Dec 14, 2020
A 3090 might push it a little. With that said, you're talking about a small enough percentage.

At 1080p, is an i9 going to get more FPS with a strong GPU over an 8700K? Yes. But nothing crazy.

I'd agree with @hotaru.hino in that a 3090 is not worth the increase in price over the 3080. The 3080 is the sweet spot.

Whilst I hardly agree with anything @Karadjgne said in their post, I would agree that above 1440p the GPU is the limiting factor.
So would you say I should be alright with an i7-8700K and a 3080, then, if I'm just playing at 1440p?
 

Karadjgne

Titan
Ambassador
The CPU, working in conjunction with software applications, sends information about the image to the graphics card. The graphics card decides how to use the pixels on the screen to create the image.
From your link

The CPU isn't reliant on resolution; it gives the pre-rendered frame to the GPU, and then the GPU has to take that, apply resolution and details, and shove it out the door.
Kinda looks exactly the same.

The CPU takes the game code and pre-renders a frame: it links objects, positions, AI, all the who-what-where stuff. It then ships that pre-rendered frame to the GPU, which takes that info and creates a wireframe first, then applies the colors, shadows, details, etc. according to the user detail settings, then finish-renders all that according to the output resolution. After all that's said and done, it pushes the frame down the cable to the monitor.

Which is what I said before, and what the link you supplied also said. Resolution has nothing to do with the CPU. It's all GPU rendered. The pre-rendered frame from the CPU has no resolution applied or rendered yet.
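To make that hand-off concrete, here's a deliberately simplified sketch in Python. Every function, dictionary key, and number in it is made up for illustration (real engines and graphics APIs like DirectX or Vulkan are far more involved); the only point it shows is that the CPU-side step never touches the output resolution, while the GPU-side step scales with it.

```python
# A deliberately simplified model of the per-frame hand-off described above.
# Every name here is invented for illustration; real engines and graphics APIs
# (DirectX, Vulkan, etc.) are far more involved than this.

WIDTH, HEIGHT = 2560, 1440   # output resolution: only the GPU side cares about this


def cpu_prepare_frame(game_state):
    """CPU side: game logic, AI, physics, and packaging 'data about the image'.
    Nothing in here depends on the output resolution."""
    game_state["player_x"] += 1                      # stand-in for simulation/AI work
    draw_calls = [{"object": "player",
                   "x": game_state["player_x"],
                   "color": (200, 150, 50)}]
    return draw_calls                                # no pixels yet, just data


def gpu_render_frame(draw_calls, width, height):
    """GPU side: turn that data into an actual image.
    The work scales with the pixel count, which is why resolution matters here."""
    framebuffer = [[(0, 0, 0)] * width for _ in range(height)]
    for call in draw_calls:
        x = call["x"] % width
        for y in range(height):                      # crude stand-in for rasterisation
            framebuffer[y][x] = call["color"]
    return framebuffer                               # this is what goes down the cable


state = {"player_x": 0}
frame_data = cpu_prepare_frame(state)                # CPU: data about the image
frame = gpu_render_frame(frame_data, WIDTH, HEIGHT)  # GPU: actual pixels
print(f"Rendered a {len(frame[0])}x{len(frame)} frame from {len(frame_data)} draw call(s)")
```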
 

Karadjgne

Titan
Ambassador
What about a 3090? Would both of them get close to full usage at 4K and 1440p?
You do NOT want a GPU to ever reach even close to full usage. Like many, you equate USAGE with the amount in % that the GPU is USED. It is NOT. USAGE is the amount in % of resources the GPU needs to USE in order to render the maximum number of frames it can in 1 second.

Picture this: hammer a bunch of nails as fast as you can into drywall, without missing the nail and hitting the wall. That's going to take all your skill, and use every single muscle in your hand, wrist, and arm to accomplish. What it will not use is all the strength possible in your hand, wrist, and arm, because it's not needed; it's just drywall and a small hammer, not a two-handed sledgehammer. USAGE is the amount of muscle power you apply to the hammering, NOT the number of muscles you apply, which is 100%.

If the GPU even comes close to 100% USAGE, it has nothing left to give: no more possible fps, no more possible VRAM use, nothing. So when your dude is running through town and a wall explodes and all those particles go in every direction, the GPU has to deal with all that, and fps bottoms out, because it was already using the maximum resources it could, so it must slow the fps down in order to make room for all the new instructions per frame.

Ideally, you want usage to be 60-70% at most, just so there IS room for changes, like explosions, that can have a very large sudden impact on fps. A giant fireball in 4K is a massive amount of pixel work, as is a field of grass.
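To put that headroom idea into toy numbers (every figure below is invented, purely to illustrate the spike problem):

```python
# Toy numbers for the headroom idea above: if a normal frame already takes
# nearly the whole frame budget to render, a sudden burst of extra GPU work
# (an explosion, a particle storm) pushes the frame over budget and fps dips.
# Every number here is invented.

TARGET_FPS = 60
frame_budget_ms = 1000 / TARGET_FPS      # ~16.7 ms per frame at 60 fps

scenarios = {
    "GPU already near its limit": 16.0,  # ms of GPU work for a normal frame
    "GPU with plenty of headroom": 10.0,
}

spike_ms = 6.0                           # extra GPU work when the wall explodes

for name, normal_ms in scenarios.items():
    spiky_ms = normal_ms + spike_ms
    verdict = "blows past" if spiky_ms > frame_budget_ms else "still fits in"
    print(f"{name}: {1000 / normal_ms:.0f} fps normally, "
          f"{1000 / spiky_ms:.0f} fps during the spike ({verdict} the 60 fps budget)")
```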
 
The CPU, working in conjunction with software applications, sends information about the image to the graphics card. The graphics card decides how to use the pixels on the screen to create the image.
From your link

The CPU isn't reliant on resolution; it gives the pre-rendered frame to the GPU, and then the GPU has to take that, apply resolution and details, and shove it out the door.
Kinda looks exactly the same.

The CPU takes the game code and pre-renders a frame: it links objects, positions, AI, all the who-what-where stuff. It then ships that pre-rendered frame to the GPU, which takes that info and creates a wireframe first, then applies the colors, shadows, details, etc. according to the user detail settings, then finish-renders all that according to the output resolution. After all that's said and done, it pushes the frame down the cable to the monitor.

Which is what I said before, and what the link you supplied also said. Resolution has nothing to do with the CPU. It's all GPU rendered. The pre-rendered frame from the CPU has no resolution applied or rendered yet.

Are you for real? Those statements are not the same in any way:

'The CPU, working in conjunction with software applications, sends information about the image to the graphics card' - Note, it does not send pre-rendered frames to the GPU; it sends information (aka data) about the image to the GPU.

It then ships that pre-rendered frame to the GPU

Again, this understanding is not true. The CPU doesn't send pre-rendered frames anywhere! It sends data. Only data. Nothing else. The GPU then uses that data to create images. The CPU can sometimes be a bottleneck because it can't send data fast enough to the GPU for it to process that data into images on screen. Hence "bottleneck". When the CPU is fast enough to feed the GPU the data it requires, then you can push more FPS from the GPU.

As for this:

You do NOT want a GPU to ever reach even close to full usage. Like many, you equate USAGE with the amount in % that the GPU is USED. It is NOT. USAGE is the amount in % of resources the GPU needs to USE in order to render the maximum number of frames it can in 1 second.

Picture this: hammer a bunch of nails as fast as you can into drywall, without missing the nail and hitting the wall. That's going to take all your skill, and use every single muscle in your hand, wrist, and arm to accomplish. What it will not use is all the strength possible in your hand, wrist, and arm, because it's not needed; it's just drywall and a small hammer, not a two-handed sledgehammer. USAGE is the amount of muscle power you apply to the hammering, NOT the number of muscles you apply, which is 100%.

If the GPU even comes close to 100% USAGE, it has nothing left to give: no more possible fps, no more possible VRAM use, nothing. So when your dude is running through town and a wall explodes and all those particles go in every direction, the GPU has to deal with all that, and fps bottoms out, because it was already using the maximum resources it could, so it must slow the fps down in order to make room for all the new instructions per frame.

Ideally, you want usage to be 60-70% at most, just so there IS room for changes, like explosions, that can have a very large sudden impact on fps. A giant fireball in 4K is a massive amount of pixel work, as is a field of grass.

I'm genuinely not sure what to say about that post. It's misleading at best, and factually incorrect in many ways.

This is all off topic anyway. My apologies to other posters for letting that happen.
 

Karadjgne

Titan
Ambassador
What do you think a pre-rendered frame is? It's the frame outline data. It's the length, angle, and position of vertices, it's the object file links, it's the AI, it's everything the GPU requires to form the picture. It's the complete frame in data form. Pre-rendered.
The CPU can sometimes be a bottleneck because it can't send data fast enough to the GPU for it to process that data into images
Nope, can't happen. A bottleneck is a component that slows down the flow of data, be it physical or virtual. A CPU cannot be a bottleneck, as it is the source of the data. If it's slow, it's slow and the GPU gets off easy, but in no way can a CPU slow down GPU output. The GPU deals with whatever it gets from the CPU. The GPU can, and often is, the bottleneck, if the CPU can ship more frames than the GPU can output to the monitor. This happens more at higher resolutions and higher detail settings.

If a CPU can send 300 frames' worth of data in 1 second, and the GPU can only render 200 of them, that's a bottleneck. If a CPU can send 200 frames' worth of data in 1 second, and the GPU can not only render all 200 but has resources to render more, that's not a CPU bottleneck; that's just a GPU capable of more in a different game or with different settings or resolution.
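Here are those two examples written out as a quick sketch. The frame rates are made up, and real frame pacing is messier than a simple min(), but it shows which side ends up setting the on-screen fps:

```python
# The 300-vs-200 and 200-vs-300 examples above, written out. On-screen fps is
# capped by whichever side completes fewer frames per second; the numbers are
# made up, and real frame pacing is messier than a simple min().

def onscreen_fps(cpu_fps, gpu_fps):
    """Whichever side can finish fewer frames per second sets the pace."""
    return min(cpu_fps, gpu_fps)

cases = [
    ("CPU sends 300 frames/s, GPU renders 200/s", 300, 200),
    ("CPU sends 200 frames/s, GPU could do 300/s", 200, 300),
]

for label, cpu_fps, gpu_fps in cases:
    limiter = "GPU is the bottleneck" if gpu_fps < cpu_fps else "GPU still has headroom"
    print(f"{label} -> {onscreen_fps(cpu_fps, gpu_fps)} fps on screen ({limiter})")
```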