[SOLVED] (new system) RTX 3070 underperforming / stuttering

Sep 15, 2022
Hello!
I built a PC around 5 months ago and I want it to last me a while; however, I've honestly been slightly disappointed with its performance.
I've been thinking of returning my 3070 and replacing it with another 3070, but I first want to make sure that the card itself isn't the issue.

Specs:
Gigabyte GeForce RTX 3070 EAGLE OC LHR
AM4 Ryzen 5 5600
Kingston FURY Beast RGB DDR4 3200MHz 16GB (dual)
Kingston NV1 M.2 NVMe SSD 2TB
Corsair RM850x V3
TUF GAMING B550-PRO
2 monitors, both 1920 x 1080; my main monitor for gaming is 75 Hz while the other is 60 Hz.

Elden Ring at max settings performs well, with slight dips (usually from 60 to 59; rarely down to 54, but not lower).
No Man's Sky: temps 65-69°C, but fps dips from 110 to 60 or even 40. Capping it at my 75 Hz refresh rate doesn't help; it drops the same way, especially if I play for a while.
R6 cooks my GPU. Temps go to 75°C, averaging 70-72°C, and it sometimes has fps dips, but that's uncommon and not a big issue.
GTA on max settings: temps around 65°C. Fps would stay a consistent 180 if not for the drops; it dips down to around 25 so often it gets hard to play. The Afterburner overlay at the top shows my GPU load is usually 98-99% when the fps dips. I don't know if that info is valuable, but it's something.
Cyberpunk ran terribly at times: 60 fps dipping to the 40s and low 20s often, at almost max shading quality and medium ray tracing. Temps weren't that high from what I remember, around 65°C.


Heaven benchmark score: 4985. FPS: 197.9, Min FPS: 9.5, Max FPS: 405.3.


GPU-Z shows I run at x16, so that isn't a problem.
Could it be temps? Is the card itself the issue? Sorry for the slight ramble or for providing unneeded info; I'm no computer techie, but something feels off about how it performs.
Also, a side note: back when I built this PC, it overheated once, which caused a shutdown and then consistent shutdowns that I traced back to a faulty PSU. I replaced it with the current one listed above and everything has worked fine since, but could that first overheat have caused damage?

Thanks in advance!
 

Karadjgne

Titan
Ambassador
Gpu is eye-candy. That's its only job. Cpu is FPS.
Base clock 3.5 GHz, turbo 3.4 GHz (avg). Taken from that UBM.

Fix the cpu issues and fps will improve. 38% background usage with 'absolutely everything' closed is a lot of wasted potential and will make results inaccurate (lower than normal), but the gpu is doing absolutely fine. It only reports a lower percentile because you're up against people overclocking their gpus; those are the lines above where you sit now.

Having turbo run lower than base speed, when it should be hitting 4.2-4.6 GHz, is what's killing your fps.
 
Sep 15, 2022
Gpu is eye-candy. That's its only job. Cpu is FPS.
Base clock 3.5 GHz, turbo 3.4 GHz (avg). Taken from that UBM.

Fix the cpu issues and fps will improve. 38% background usage with 'absolutely everything' closed is a lot of wasted potential and will make results inaccurate (lower than normal), but the gpu is doing absolutely fine. It only reports a lower percentile because you're up against people overclocking their gpus; those are the lines above where you sit now.

Having turbo run lower than base speed, when it should be hitting 4.2-4.6 GHz, is what's killing your fps.
That's interesting, I honestly thought both played an equal part, meaning the gpu could be underperforming for some reason!
This cpu is completely new though, along with the rig, and I know it should bottleneck my gpu a little, but is the difference really this big? Not to mention the "cpu activity" issue that I can't pinpoint now haha
 
Sep 15, 2022
Considering your background usage was kind of high during the UB run, I would run something like Malwarebytes (free version), just to make sure you have no nasties in there, and see if it picks anything up.
Done! Nothing of note, nor any red flags.
I did the benchmark again, and I keep getting different results for everything except the GPU every time:

Here's the first one: https://www.userbenchmark.com/UserRun/55899305

The second one I did after closing literally everything possible: https://www.userbenchmark.com/UserRun/55899364

It seemed to perform even worse when I closed everything, but usage did go down, I guess.
I'm at a loss haha
 
Sep 15, 2022
Strange, that first test did boost to 4.4 GHz but the second one didn't? Have you downloaded the latest chipset driver from the AMD site?
New results!
https://www.userbenchmark.com/UserRun/55901856
It did boost to 4.4 GHz again!

Heaven benchmark scored 5152 on the same test (extreme)!

I did see that they released a new driver recently; maybe that was the issue, but I'm unsure, so I'll test around a bit more.
 

Karadjgne

Titan
Ambassador
That's interesting, I honestly thought both played an equal part, meaning the gpu could be underperforming for some reason!
This cpu is completely new though, along with the rig, and I know it should bottleneck my gpu a little, but is the difference really this big? Not to mention the "cpu activity" issue that I can't pinpoint now haha
Game code by itself does nothing. It's mostly just jpg or other picture file types with some instructions mixed in. The cpu takes all those files and has to organize them into a frame packet. It has to assign dimensions, point of view, lighting, shadows, gradients, collisions, AI, all of that and more to every single object, no matter how big, small or varied. It also has to do calculations for movement, vector analysis, PhysX and all that on every object in comparison to the prior frame, so a moving bullet looks like it moved, but faster than the dude walking.
All of that makes up a frame packet. The number of times the cpu can compile all that in a second and ship it off to the gpu is your max FPS.

The gpu takes that incoming packet, makes a wireframe model, adds colors as specified, renders a picture according to the resolution and puts it on the screen. The number of times the gpu can do that in a second is the FPS you see.

The cpu is responsible for making fps; the gpu can only limit fps. So if the cpu sends 100fps to the gpu, at 1080p it might be able to put all 100fps on screen regardless of detail levels, and that's all you get. No matter if it's a 3060 or a 4090, you get 100fps. Cpu limited.

On the flip side, if the resolution was 4k, the 3060 would struggle: you only get 60fps at ultra, but lowering detail levels lowers object detailing and makes less work for the gpu, so fps might go up to 80 at medium or 100 at low. The 4090 is a stronger, faster card and can do more than the 3060, so even at high you get 100fps and at ultra you get 80fps. Gpu limited.

But at no point can the gpu exceed what the cpu sends (except in cases of direct rendering by the gpu).
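
As a rough illustration of that, here's a toy Python sketch; the numbers are made up for the example above, not taken from any real benchmark:

# Toy model of the idea above: the fps you see is capped by whichever side
# is slower, the cpu preparing frame packets or the gpu rendering them.
def displayed_fps(cpu_fps: float, gpu_fps: float) -> float:
    return min(cpu_fps, gpu_fps)

print(displayed_fps(cpu_fps=100, gpu_fps=240))  # 100 -> cpu limited (1080p case)
print(displayed_fps(cpu_fps=100, gpu_fps=60))   # 60  -> gpu limited (4k ultra case)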

If the cpu is starting out 38% used, then you add the game usage on top of that, and somewhere in there will be a loss of performance, a loss of fps; and if cpu clocks aren't hitting boosted speeds, it's just going to take much longer to create each individual frame. 100fps = 1 frame every 0.01 seconds; if the frame takes 0.0105 seconds that's 95fps, and 0.015 seconds is 67fps. It doesn't take much of a time differential to drastically reduce fps: 5/1000ths of a second dumps your fps by a third.
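
Written out as a quick sketch (same example frame times as above, nothing measured):

# Frame time to fps: fps = 1 / seconds_per_frame.
for frame_time in (0.010, 0.0105, 0.015):
    print(f"{frame_time * 1000:.1f} ms per frame -> {1 / frame_time:.0f} fps")
# 10.0 ms per frame -> 100 fps
# 10.5 ms per frame -> 95 fps
# 15.0 ms per frame -> 67 fps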

So fix the cpu. If the cpu can put out max fps as it should, then any issues are on the gpu somehow; but if the cpu is limited to start with, any gpu issues at all are moot and you end up chasing your tail in circles trying to fix something that may or may not be an issue to start with.
 
Solution
Sep 15, 2022
Game code by itself does nothing. It's mostly just jpg or other picture file types with some instructions mixed in. The cpu takes all those files and has to organize them into a frame packet. It has to assign dimensions, point of view, lighting, shadows, gradients, collisions, AI, all of that and more to every single object, no matter how big, small or varied. It also has to do calculations for movement, vector analysis, PhysX and all that on every object in comparison to the prior frame, so a moving bullet looks like it moved, but faster than the dude walking.
All of that makes up a frame packet. The number of times the cpu can compile all that in a second and ship it off to the gpu is your max FPS.

The gpu takes that incoming packet, makes a wireframe model, adds colors as specified, renders a picture according to the resolution and puts it on the screen. The number of times the gpu can do that in a second is the FPS you see.

The cpu is responsible for making fps; the gpu can only limit fps. So if the cpu sends 100fps to the gpu, at 1080p it might be able to put all 100fps on screen regardless of detail levels, and that's all you get. No matter if it's a 3060 or a 4090, you get 100fps. Cpu limited.

On the flip side, if the resolution was 4k, the 3060 would struggle: you only get 60fps at ultra, but lowering detail levels lowers object detailing and makes less work for the gpu, so fps might go up to 80 at medium or 100 at low. The 4090 is a stronger, faster card and can do more than the 3060, so even at high you get 100fps and at ultra you get 80fps. Gpu limited.

But at no point can the gpu exceed what the cpu sends (except in cases of direct rendering by the gpu).

If the cpu is starting out 38% used, then you add the game usage on top of that, and somewhere in there will be a loss of performance, a loss of fps; and if cpu clocks aren't hitting boosted speeds, it's just going to take much longer to create each individual frame. 100fps = 1 frame every 0.01 seconds; if the frame takes 0.0105 seconds that's 95fps, and 0.015 seconds is 67fps. It doesn't take much of a time differential to drastically reduce fps: 5/1000ths of a second dumps your fps by a third.

So fix the cpu. If the cpu can put out max fps as it should, then any issues are on the gpu somehow; but if the cpu is limited to start with, any gpu issues at all are moot and you end up chasing your tail in circles trying to fix something that may or may not be an issue to start with.
That is very interesting, and I really appreciate you simplifying it since I honestly had no clue how they worked together haha, guess I learned something new today! Basically, this issue most likely narrows down to my CPU, not the GPU.

I don't want to sound like a broken record, just to put my mind at ease: do you have an answer as to why it always says my GPU load is 100% while the CPU still has a lot of headroom that it doesn't put to work? I get that the issue probably isn't the GPU. From the very beginning I suspected the CPU was weaker and therefore holding the GPU back, but because Task Manager and HWiNFO showed my GPU at max during use, I've been doubting what the issue could be, and ended up asking on forums haha
Sorry for the stupid question, but it just seems odd to me. Could the programs have gotten it wrong? Am I looking in the wrong places?

Thanks in advance, highly appreciate your answer :,)
 
Sep 15, 2022
The last two runs you made: in the first one your drive says 1.5TB and in the second it says 1TB. What have you downloaded that's 500GB?
Haha yeah, that can't be right since I haven't downloaded that much; the biggest download I did between the two was 80 GB, and that was for Doom Eternal.

Also, after testing, the speed keeps varying. In 3DMark my cpu went back to the usual 3.4 GHz. Every test I do yields different results lmao
 

Karadjgne

Titan
Ambassador
Start with the basics. Clean out the pc. Use Windows or CCleaner to get rid of temp files etc., as the cpu is accessing those whether they're used or not, which takes time and resources. Check the startup and see what's actually running; you can also check the Services tab and Task Manager. Do a malware scan and an antivirus scan; those are not the same thing. Make sure your pc is streamlined and running smooth.

If you're using GeForce Experience, check its settings and make sure you aren't running 4k DSR; that's almost a default setting with a strong enough gpu when 'optimized'. DLSS should be on, ray tracing off. Try to get the pc running smooth first, then run tests like Cinebench and MSI Kombustor or Fire Strike or Time Spy, and see where your turbo settings are landing you, whether the cpu and gpu are boosting as they should.
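
If you want a simple way to watch clocks and load while one of those benchmarks runs, here is a minimal sketch assuming Python with the psutil package installed (purely illustrative; on some Windows systems cpu_freq() reports the base clock rather than the live boost clock, so treat HWiNFO or Task Manager as the authoritative numbers):

# Log overall cpu load and the reported clock once a second while a
# benchmark (Cinebench, Time Spy, etc.) runs in another window. Ctrl+C to stop.
import psutil

while True:
    load = psutil.cpu_percent(interval=1)  # utilisation over the last second
    freq = psutil.cpu_freq()               # may be None on some platforms
    clock = f"{freq.current:.0f} MHz" if freq else "n/a"
    print(f"load {load:5.1f}% | clock {clock}")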