[SOLVED] Is my card failing? Was it used for mining? Am I doing something wrong?

RyanBohan

Hey all, I need some help here because I feel like I'm either doing something wrong or I got ripped off a few years ago buying this GTX 1070...

So I have a Founders Edition GTX 1070 I got off eBay right at the peak of mining (I needed a new card at the time and couldn't afford brand new).
I feel like it's not performing the way it should; I don't get very good FPS in any game at all, really...
I just recently upgraded to a Ryzen 7 2700X with G.SKILL F4 DDR4-3200 C16 2x8GB, and I also upgraded my MB to a B450 Tomahawk Max.
I use an LG 34UC79G 34" (21:9) curved 144 Hz 1 ms FreeSync IPS monitor too, if that means anything.

I am at a loss here. I have friends with the same GPU as me and they get 100+ FPS in Destiny 2 at MAX SETTINGS. I hover around 50-60, and if stuff goes crazy I get 30-40, and I feel that's ridiculous...

I'm not sure if y'all need any more info, but I am here all night. I need help, I need answers :'(
 
Did you reinstall Windows when you changed to the B450 and 2700X?

If not, you will need to start there. Although Windows has gotten better at handling it over the years, a motherboard change can still cause unforeseen problems that are very hard to diagnose.
I have not, I didn't know I had to, so I guess I can do that, but I was having this FPS issue long before changing anything out 🙁 I went from an AMD R9 290X to the GTX 1070, so I had a huge gain, but even now people with the same 1070 get better FPS.
 
Start with a reinstall of Windows; if you are still having frame rate issues after that, report back.

Also keep in mind that although Ryzen are good CPUs, the 1st and 2nd gen have weaker single-core performance than Intel's.

Also check via MSI Afterburner that there aren't any thermal issues with your card. It might be throttling, as blower-style cards are notoriously bad for thermals.
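If you want to log temps and clocks outside of Afterburner, here's a rough Python sketch that just polls nvidia-smi (it installs with the NVIDIA driver; Founders cards start pulling clocks back around 83 °C if I remember right):

import subprocess, time

# Log temperature, core clock, and GPU load once a second for a minute.
# Run this while the game is running; if the clock falls as the temp
# climbs into the low 80s, the card is thermal throttling.
for _ in range(60):
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=temperature.gpu,clocks.sm,utilization.gpu",
         "--format=csv,noheader"],
        capture_output=True, text=True)
    print(out.stdout.strip())  # e.g. "83, 1607 MHz, 98 %"
    time.sleep(1)

If temps stay reasonable and the clock holds steady, the problem is somewhere else.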
 
Same thing. I just did a clean install, downloaded the newest drivers, booted up Destiny 2 on max settings, and I'm still only getting like 40-60 🙁
 
Are your friends running the exact same resolution as you? Your monitor pushes 33% more pixels (workload) than a standard 1080p monitor. If you're comparing against someone on a standard 1080p monitor, there will be a significant difference in FPS.
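The back-of-envelope maths, in case it helps (your 34UC79G should be 2560x1080 native):

# Pixel count per frame: 2560x1080 ultrawide vs standard 1080p
uw = 2560 * 1080    # 2,764,800 pixels
fhd = 1920 * 1080   # 2,073,600 pixels
print(uw / fhd)     # ~1.33 -> the GPU renders ~33% more pixels per frame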

No, they're not using ultrawide, but they're getting 140+ FPS on max settings in Destiny and I get like 40-60 depending on the area. That sounds like a pretty drastic difference just for using an ultrawide.
 
You are right, something else is going on here.
To summarise:
  • Fresh Windows installation
  • New GPU drivers
  • Good thermals
  • Fast, dual channel RAM

So, what is the make and model of your PSU?
Do you have the latest BIOS and chipset drivers?
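If you're not sure which BIOS you're on, you can read the version from Windows and compare it against the latest one on MSI's B450 Tomahawk Max support page. A quick sketch (wmic ships with Windows 10):

import subprocess

# Print the motherboard's current BIOS version string.
out = subprocess.run(["wmic", "bios", "get", "smbiosbiosversion"],
                     capture_output=True, text=True)
print(out.stdout.strip())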
 

HMW - 800W ATX Power Supply
Model: XG-H800

Not gonna lie, I don't know much about updating the BIOS or chipset drivers. I just recently replaced my older MB with the B450 and assumed it came ready to go.
 
OK, that's too much of a gap, but at best you're only going to achieve about 2/3 of the performance of 1080p.
It doesn't actually work like that.
The CPU determines how many FPS a system is capable of sending to the monitor, while the GPU determines at what quality those frames are displayed. Higher resolutions put relatively less strain on the CPU and a lot more on the GPU, so the outcome is different from just saying "33% more pixels, therefore 33% fewer FPS". You can confirm this by watching benchmarks of games at different resolutions: FPS doesn't drop to a quarter going from 1080p to 4K, even though 4K has four times the pixels.
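A toy model of the idea, with completely made-up numbers just for illustration:

# FPS is capped by whichever side is slower: the CPU's frame-prep rate
# or the GPU's throughput spread over the frame's pixel count.
def fps(cpu_cap_fps, gpu_pixels_per_sec, pixels_per_frame):
    gpu_fps = gpu_pixels_per_sec / pixels_per_frame
    return min(cpu_cap_fps, gpu_fps)

print(fps(120, 300e6, 1920 * 1080))  # 1080p: GPU could do ~145, CPU caps it at 120
print(fps(120, 300e6, 2560 * 1080))  # ultrawide: GPU manages ~108, now GPU-bound

Here 33% more pixels only cost ~10% FPS, because the 1080p case was CPU-bound to begin with.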
 
I knew all that, but we're discussing a scenario where he should be GPU-limited and running the same settings as a similar PC, with the only difference being resolution. While 33% is a finger-in-the-air guess, it does sit about right with benchmarks of other games at those resolutions.
 
Anyone have any other ideas?
Just by having that HMW unit in your system, you're taking a risk. It may well be the root of your problems, but even if it isn't, it may have already damaged multiple components.

I knew all that, but we're discussing a scenario where he should be GPU-limited and running the same settings as a similar PC, with the only difference being resolution. While 33% is a finger-in-the-air guess, it does sit about right with benchmarks of other games at those resolutions.
He never said that he has friends with similar system. He said:
I am at a loss here. I have friends with the same GPU as me and they get 100+ FPS in Destiny 2 at MAX SETTINGS. I hover around 50-60, and if stuff goes crazy I get 30-40, and I feel that's ridiculous...

which is very vague; they might as well have a 9900K paired with that 1070, or an i3-2120 with 4GB of DDR3, etc... He didn't even say what resolution they're playing at.
 
