[SOLVED] Unacceptable performance from 1070ti

Sep 30, 2018
I recently "upgraded" my GTX 970 to a Zotac 1070 ti mini and I've seen a performance decrease in most/all cases.

Setup is as follows:
1070ti (zotac mini)
i5-7600k (factory clock speed)
16gb RAM
1920x1080 Asus VG278Q

I usually play Fortnite and I noticed right off the bat that I gained almost nothing from the upgrade. I did a clean uninstall of all graphics drivers using DDU, restarted, then reinstalled GeForce Experience and the latest 411-series GeForce drivers.

In AC: Origins, I ran a benchmark on ultra, AA on low, and I'm averaging just over 35FPS. During the benchmark, it was showing GPU usage at around 50%? I'm not sure how to monitor this or how to change it. I don't have benchmarks to back this up, but I was typically getting 45-55FPS in AC: Origins with my GTX 970 with the same settings.

The 1070 Ti typically benches like 75% better than the GTX 970 so I'm convinced something isn't right here.

Any help is greatly appreciated

*edit - Turned on all the monitoring in MSI Afterburner during the AC: Origins benchmark and my CPU is at 100% on all cores, but my GPU never exceeds 51%. Not sure what this means, but I have a hard time believing an i5-7600K is bottlenecking the GPU so badly that it can't run at 100%.
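(Side note for anyone who wants to log this from a script instead of the Afterburner overlay: here's a minimal sketch. It assumes an NVIDIA card with nvidia-smi on the PATH and the third-party psutil package installed; Afterburner's own logging works just as well.)

```python
# Minimal utilization logger - roughly what the Afterburner overlay shows, but scriptable.
# Assumes an NVIDIA card with nvidia-smi on the PATH and `pip install psutil`.
import subprocess

import psutil

def gpu_utilization_percent() -> int:
    """Ask nvidia-smi for the current GPU utilization as a percentage."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu", "--format=csv,noheader,nounits"]
    )
    return int(out.decode().strip().splitlines()[0])

if __name__ == "__main__":
    while True:
        cpu = psutil.cpu_percent(interval=2)  # blocks ~2 s, averaged across all cores
        gpu = gpu_utilization_percent()
        print(f"CPU {cpu:5.1f}%   GPU {gpu:3d}%")
```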
 
I am assuming you have GeForce Experience enabled? Nvidia automatically turns on DSR and will render your game at 4K even if your display is only 1080p. This greatly diminishes performance and offers almost no benefit to the user.

You can change the resolution in game and in GeForce Experience. I personally would just uninstall GeForce Experience as it doesn't confer any meaningful benefits.
 


No, I don't have it enabled. I had to reinstall GeForce Exp and I didn't even scan for games. I just used it to install the latest drivers.
 
You need to monitor your CPU temps and clocks. Based on the GPU usage, I'd guess that your CPU is bottlenecking, and that could have changed if the CPU HSF got knocked loose during the GPU swap. Another possibility is that your GPU fans aren't working or are blocked by something in the case. Check the GPU temps as well as the CPU temps.
 


CPU usage was at 100% the whole time; temps on the CPU never exceeded 50C.

The CPU doesn't use the factory HSF; I have a Corsair liquid cooler, and I cleaned the fans and radiator when I swapped the GPU the other day.
 


Uninstalled GeForce Experience, reset defaults in the Nvidia Control Panel, and I'm downloading the 3DMark software... It's downloading ridiculously slowly, so I'll report back once it's done.
 


Post the result link when it's done.
 


UserBenchmark is crap at best.

That said depending on the game the CPU can bottleneck the GPU at 1080P.

So yes it could perform worse if it's bottlenecking.

The 970 and 1060 (6GB) perform pretty much the same give or take a small percentage.


 


It's a 1070Ti, not a 1060.

And OK, sure, I could cite other benchmarks of the 1070 Ti rather than relying on just one site. There's a lot of evidence to suggest I'm not getting the performance I should be.

And it's not as if site A says I should get 100 FPS and I'm getting 95. Rather, site A benched 115FPS average and 82 minimum and I'm getting 60 average and 30 minimum. My CPU is an i5 7600k. It's like a year and a half old and not exactly entry level, I don't think(?) I thought it was a pretty decent CPU. I don't really get how that would bottleneck at 1080p, especially not that bad.

Just tried Just Cause 3, I was getting 60-70FPS. Here's a bench showing 113 average. Before you say it, I realize they are using an i7 8600k, but is that seriously responsible for the additional 50FPS? Seems doubtful, but I don't know, I'm asking.

https://babeltechreviews.com/the-gtx-1070-ti-performance-review-35-games-benchmarked/5/
 


The GPU score is right on; the average is 19,835.

You are running a budget MB with slow memory and an i5; that will hold you back in games at 1080P.

Don't expect miracles.


It would have been better to get a Z270 MB and DDR4-3000 memory so everything would run at full speed, even with the 7600K.

However, 1080P is harder on the CPU than running at 1440P or 4K.

Here's how my old i7-7700K and GTX 1080 compare to your system in Fire Strike.


https://www.3dmark.com/compare/fs/16534279/fs/15602654
 


You've got me a little confused... So hypothetically, you're telling me that a 1080 Ti wouldn't improve my performance at 1080p? Then what's the biggest bottleneck? If you were going to upgrade something, where would you target? I don't really understand how 1080p is harder on the CPU when I'm getting low FPS... I could see that being an issue if I were cranking out 120+.
 


Nope, in your system your CPU would be a massive bottleneck for a GTX 1080 Ti.

The problem is the CPU can't keep up with the high frame rates the GPU wants to push at 1080P.

But then there is that budget MB and slow memory too.

The system is out of balance for top performance; some mistakes were made when you picked out the parts.

At the higher resolutions like 1440P and 4K the balance shifts to the GPU taking more of the load.

At 1080P the CPU takes a lot of the load.

You bought that GTX 1070Ti, but you don't have the system power to keep up with it at 1080P gaming.
 
Your graphics score in 3DMark is fine.
Your CPU is not up to feeding the 1070 Ti at high fps.
You need an i5-8400 or higher, plus 3000+ MHz DDR4 RAM.
And in the end, the 1070 Ti is not the huge beast ahead of the 970 that you imagine.
 



See, on my last system I had an i5-2500K for like 5 years and just kept upgrading the GPU every year, and got good performance increases every time.

My previous GPU was a 970 so it worked fine with the rest of the gear.
 


I still don't really understand how higher resolution would allow more utilization of the GPU; it seems like the CPU bottleneck would exist regardless. But for the sake of argument, what if I were pushing the resolution of a 21:9 monitor? Would that somehow allow for higher GPU usage?

I've had my eye on an Acer Predator 25__x1080 monitor
 
Sounds to me like your CPU is throttling or stuck in a low clock-speed state even though your temps are fine - this can happen sometimes.

Run the Intel Extreme Tuning Utility (Intel XTU):
https://downloadcenter.intel.com/download/24075/Intel-Extreme-Tuning-Utility-Intel-XTU-

Check the clock speed and the throttling when running the game.
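
(If you'd rather log the clocks from a script while the game runs, here's a rough sketch using the third-party psutil package; the reported frequency on Windows can be coarse, so XTU or HWiNFO will be more accurate.)

```python
# Rough CPU clock / load logger to spot a stuck-low or throttling clock state.
# Assumes `pip install psutil`; reported frequencies are whatever the OS exposes.
import psutil

while True:
    load = psutil.cpu_percent(interval=2)  # blocks ~2 s, averaged across all cores
    freq = psutil.cpu_freq()               # current/min/max in MHz
    print(f"load {load:5.1f}%   clock {freq.current:6.0f} MHz (max {freq.max:.0f} MHz)")
```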
 
Check your in-game resolution setting and make sure it's 1920 x 1080. Nvidia turns on DSR by default and tries to run certain games at a higher resolution than needed.
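
(For a quick scripted sanity check of what Windows reports for the primary display - note this is only the desktop resolution; DSR's higher render resolutions show up inside the game's own video settings - something like this works:)

```python
# Query the primary display's desktop resolution via the Win32 API (Windows only).
import ctypes

user32 = ctypes.windll.user32
width = user32.GetSystemMetrics(0)   # SM_CXSCREEN
height = user32.GetSystemMetrics(1)  # SM_CYSCREEN
print(f"Primary display: {width} x {height}")
```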
 
"I still don't really understand how higher resolution would allow more utilization of the GPU, seems the CPU bottleneck would exist..."

GPU is responsible for processing all the pixels that your see on you screen. The higher the pixel count, the harder the GPU has to work.
CPU "relatively" does not care if you game at 760p, 1080p or 4k (again, relatively). It's approximately the same work for the CPU.
So if you GPU can push 135fps in a game and your CPU cannot keep up, your CPU will be at 100% load while your GPU is not fully utilized, because your CPU can't process more than say 87fps.
Upping the resolution will make your GPU work to around full load at 70s to 80s fps. That will be close to your CPU capability.
 
Solution
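
(To make the accepted answer's reasoning concrete, here's a toy sketch of the bottleneck idea. All the fps numbers are hypothetical, echoing the 135/87 example above; they are not measurements from this system.)

```python
# Toy bottleneck model: the frame rate you actually get is capped by the slower side.
# All numbers are hypothetical illustrations, not benchmarks.

CPU_FPS_CAP = 87  # frames the CPU can prepare per second (roughly resolution-independent)

# Hypothetical GPU caps: higher resolutions mean more pixels, so fewer frames per second.
GPU_FPS_CAP = {"1080p": 135, "1440p": 95, "4K": 45}

for resolution, gpu_cap in GPU_FPS_CAP.items():
    fps = min(CPU_FPS_CAP, gpu_cap)                   # delivered frame rate
    gpu_busy = min(100, round(100 * fps / gpu_cap))   # very rough GPU utilization estimate
    print(f"{resolution}: ~{fps} fps, GPU busy around {gpu_busy}%")
```

At 1080p the CPU cap binds and the GPU sits well under 100%; at 4K the GPU becomes the limiter, which is the shift described in the answer above.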