[SOLVED] Unacceptable performance from 1070ti

Sep 30, 2018
I recently "upgraded" my GTX 970 to a Zotac 1070 ti mini and I've seen a performance decrease in most/all cases.

Setup is as follows:
1070 Ti (Zotac Mini)
i5-7600K (factory clock speed)
16GB RAM
1920x1080 Asus VG278Q

I usually play Fortnite, and I noticed right off the bat that I gained almost nothing from the upgrade. I did a clean uninstall of all graphics drivers using DDU, restarted, then reinstalled GeForce Experience and the latest 411-series GeForce drivers.

In AC: Origins, I ran the benchmark on Ultra with AA on Low, and I'm averaging just over 35 FPS. During the benchmark, it was showing GPU usage at around 50%? I'm not sure how to monitor this or how to change it. I don't have benchmarks to back this up, but I was typically getting 45-55 FPS in AC: Origins on my GTX 970 with the same settings.

The 1070 Ti typically benchmarks around 75% better than the GTX 970, so I'm convinced something isn't right here.

Any help is greatly appreciated

*edit - Turned on all the monitoring from MSI Afterburner during the AC: Origins benchmark, and my CPU is at 100% on all cores while my GPU never exceeds 51%. Not sure what this means, but I have a hard time believing an i5-7600K is bottlenecking the GPU so badly as to not let it run at 100%.
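
For anyone else wondering how to watch this outside of MSI Afterburner: here's a rough Python sketch that polls nvidia-smi, the command-line tool that ships with the NVIDIA driver. The query fields below are standard nvidia-smi options; the loop itself is just an illustration, not a polished tool.

```python
import subprocess
import time

# Print GPU utilization once per second via nvidia-smi (ships with the
# NVIDIA driver; assumes it is on your PATH).
while True:
    result = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=utilization.gpu,utilization.memory,temperature.gpu",
         "--format=csv,noheader"],
        capture_output=True, text=True,
    )
    print(result.stdout.strip())  # e.g. "51 %, 30 %, 65"
    time.sleep(1)
```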
 
Solution
"I still don't really understand how higher resolution would allow more utilization of the GPU, seems the CPU bottleneck would exist..."

The GPU is responsible for processing all the pixels you see on your screen. The higher the pixel count, the harder the GPU has to work.
The CPU "relatively" does not care whether you game at 720p, 1080p or 4K (again, relatively); it's approximately the same work for the CPU.
So if your GPU can push 135 FPS in a game and your CPU cannot keep up, your CPU will sit at 100% load while your GPU is not fully utilized, because your CPU can't process more than, say, 87 FPS.
Upping the resolution will push your GPU to around full load somewhere in the 70s-80s FPS range, which is close to your CPU's capability.
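
To put rough numbers on that, here's an illustrative sketch borrowing the hypothetical 135 and 87 FPS figures from above (made-up numbers, not measurements):

```python
# Frame rate is capped by whichever side is slower; GPU utilization
# roughly tracks how much of the GPU's capacity that cap leaves used.
cpu_fps_cap = 87                      # hypothetical: frames/s the CPU can prepare
gpu_fps_by_res = {"1080p": 135,       # hypothetical: frames/s the GPU could render
                  "1440p": 75}

for res, gpu_cap in gpu_fps_by_res.items():
    fps = min(cpu_fps_cap, gpu_cap)   # the slower side sets the frame rate
    gpu_util = fps / gpu_cap          # GPU idles whenever it waits on the CPU
    print(f"{res}: ~{fps} fps, GPU ~{gpu_util:.0%} busy")

# 1080p: ~87 fps, GPU ~64% busy  -> CPU-bound, GPU underutilized
# 1440p: ~75 fps, GPU ~100% busy -> GPU-bound
```

The exact numbers vary by game, but the pattern holds: at 1080p the CPU cap binds, and at higher resolutions the GPU cap binds.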


I'm not trolling. Not everyone with ideas different from yours is a troll, believe it or not.

Medium looks fine; textures are still 2K at that level of mipmapping. Post-processing in that game is garbage. Who uses MSAA or MFAA anymore? SMAA and FXAA are still the best out there, and those on Intel chipsets can use CMAA, which is even better.

Skyrim still looks great with 1024 and 512 textures, and Oblivion with forced 16x AF looks even better, albeit a bit...old.

Like, even Low with Ultra textures and 16x AF looks better than you think, probably because you've never used it.

Like I said, if you need all settings as high as possible to enjoy a game, you need to rethink how you're living your life. Such an elitist attitude is harmful to society.

Also, don't post quote images. That's just lazy and improper forum etiquette.
 


I'm very quick to point out that you don't need to max out settings, but get real. Unless you are playing competitively, very few people are going to lower all their settings to reach 144 FPS.

To me, you are the one with an elitist attitude. The OP is trying to play at quality settings, you are acting like low settings are the way we are supposed to enjoy the game. I'm sorry if that offends you.

And let's be honest here. FXAA, for many, looks worse than no AA. As for which AA is best, that varies greatly from game to game and person to person.
 


You're misinterpreting what I'm saying. You said OP wanted 144 Hz. I said he needs to lower settings. Then YOU got all upset because you act like it's black and white, either highest of high or lowest of low. Which it's not.

OP can still get the Ultra treatment, as 90% of a game's visuals are models and textures and the rest is just poorly done post-processing.

FXAA looks leagues better than no AA; you may need to get your glasses adjusted, or use a less poorly made monitor.
 


Going back to reread things, I don't see where the OP said he wanted 144 FPS. I just thought he should be getting more FPS than he is. You just responded to someone who said a 1070 Ti wasn't meant for 1440p/4K at 100+ FPS, and you called him a liar.

Anyway, I don't like FXAA because it causes a lot of blurring; I'd rather have a sharper image in most cases. That blurring makes FXAA just about my least favorite type of AA. If you like it, I won't hold it against you, but it has nothing to do with glasses. If I had better vision, it would likely look even worse to me. There are plenty of other options I'd rather use.
 


Man, you'd really hate TAA/TXAA



Such as? Even 2x MSAA has a worse performance impact than FXAA. SMAA is a legit choice here, but very few games support it without forcing it in a hacky way.
 
Try swapping out your PSU for a known-good one; running FurMark is a good way to check whether your PSU is working properly. Another possibility is that you're more CPU-bound at 1080p. Try running at 1440p: if it runs as well as your old card did at 1080p, that also tells you the card is working right.
 


Do people ever read the rest of the thread before commenting? We've already solved OP's problem: his CPU is the bottleneck.
 


Well, I put it in layman's terms. I don't think the OP really gets what you are saying.
 


"Well, I put it in layman's terms. I don't think the OP really gets what you are saying."

No, you didn't. You told them to get a new PSU because their current one might be producing dirty power, which was not the case.

Their problem is they have a weak CPU. That has nothing to do with the PSU.
 


I said FXAA, not TXAA, not that I'm all that impressed by TXAA. What I use is game-dependent. SMAA is generally a much better form of post-process AA. I usually test what's available, including DSR or downsampling options (those aren't in-game options). Just because FXAA doesn't have a large performance impact doesn't mean the effect is a good one. Visually, I find it makes most games look worse; it's most akin to defocusing an image so you can't notice the imperfections.
 


Yeah, I know. I was saying that temporal anti-aliasing has an order of magnitude more blur than FXAA.



No argument from me there.



Tell that to LITERALLY everyone who uses this retarded thing called ENB.