As the title describes.
The reason I'm looking into this is that the higher the temps go, the lower the clock speed gets. It does this incrementally and quite aggressively.
On top of that, it seems to recognize when the load on the GPU is extreme, assume the temps will spike, and preemptively force the voltage down toward 1021 mV, which causes the clock speed to drop under 1900 MHz. I don't like that.
The example I'd give is running the Unigine Superposition benchmark in 1080p Extreme mode. In fact, the only time I can get it to not do this during the benchmark is when I use a preset below 1080p Medium; going to Medium or higher makes the card do this every time.
So what I want to know is whether there's an AIO solution for the Strix 1070 that is fairly plug and play and competent enough to keep temps low under load, so the GPU doesn't auto-drop my voltage and clock speeds just to "keep it cool." Otherwise, I'll obviously have to look into a custom loop with an EK block. Even when running a regular game at moderately high settings, as the temps climb from the 50s into the 60s the card bounces the voltage back and forth between 1093 mV and 1081 mV, and each one-increment voltage drop immediately costs me 13 MHz. On top of that it will drop the clock in further 13 MHz steps on its own, supposedly to "keep it from getting too hot" and to follow some kind of stupid curve profile that's baked in. Which means that if I want the speeds I know this card can hit, I need to keep it cold, and the built-in air cooling just isn't good enough for that.
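For anyone trying to reason about what the card is doing: the behavior above looks like temperature-based "bin" stepping, where each voltage bin lost costs roughly 13 MHz. Here's a minimal sketch of that idea. This is not NVIDIA's actual GPU Boost algorithm, and the threshold and degrees-per-bin values are made-up assumptions for illustration only; just the 13 MHz step size matches what I see.

```python
# Illustrative model of GPU Boost-style clock stepping (NOT NVIDIA's
# actual algorithm). Assumption: above a throttle-start temperature,
# the card sheds one voltage/frequency bin per few degrees, and each
# bin costs about 13 MHz of core clock.

BIN_STEP_MHZ = 13  # observed size of one boost bin on Pascal cards

def boosted_clock(base_clock_mhz: float, temp_c: float,
                  throttle_start_c: float = 50.0,
                  degrees_per_bin: float = 5.0) -> float:
    """Effective clock after temperature-based bin drops.

    `throttle_start_c` and `degrees_per_bin` are hypothetical values
    chosen for illustration, not measured from the card.
    """
    if temp_c <= throttle_start_c:
        return base_clock_mhz
    bins_lost = int((temp_c - throttle_start_c) // degrees_per_bin) + 1
    return base_clock_mhz - bins_lost * BIN_STEP_MHZ

# A card boosting to 2012 MHz sheds 13 MHz bins as temps climb:
for t in (45, 52, 58, 64, 70):
    print(f"{t}C -> {boosted_clock(2012, t):.0f} MHz")
```

The point of the model is just that cooling pays off in discrete chunks: every few degrees you shave off under load buys back a whole 13 MHz bin, which is why a better cooler (AIO or full block) translates directly into sustained clocks.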