RX570 G1 (Temps oh well)

dsr07mm

Distinguished
So I've got my first AMD card after many years, and now that it's winter and the heating is on, it's hell in my place.

Is it okay to run the GPU (RX570) at 65C in some games with Enhanced Sync and a 60fps cap, while in others, like Fortnite when my wife is playing, temps go up to 72-75C at a locked 140fps?

1080p gaming though.

My 4770K goes from 55C to 65C max, usually around 58-60C under load. I had lower temps when I had a Hyper 212 EVO with two fans on it; now it's a Cooler Master T4 cooler with one fan, if I remember correctly.

I have a huge intake fan in front, 2 exhausts at the top, 1 exhaust at the back, and 1 intake at the bottom near the PSU.

I mean, I could open the side of the case, but I don't like dust getting directly into the PC since it's in the bedroom.

If those temps are fine I guess I'm okay.
 
Solution
Those are great temperatures, nothing to worry about. For reference, my GTX 1070 sits comfortably at 80-82C when gaming.
Also the CPU temps are respectable.
To be clear, temps relate to fan speed and noise, not to how much your PC heats up the room it's in; room heating comes down to total wattage. My Raspberry Pi runs at 70C, but it would take 20+ of them to make any noticeable change to the temperature in my office.

On temps - most GPUs have a target temperature somewhere between 65C and 75C, and the GPU adjusts fan speeds to hit that target. For AMD GPUs, if you go to AMD Settings -> WattMan Tab (Global) and scroll to the bottom of the tab, you'll see a "Target Temperature" slider that you can adjust. This will affect fan speeds. More on the WattMan tab below...
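Purely to make the fan behavior concrete, here's a conceptual sketch. This is not WattMan's actual control algorithm (that lives inside the driver); the target, gain, and starting duty are made-up numbers:

```python
# Conceptual sketch of a thermal-target fan controller -- not WattMan's
# real algorithm, just the general idea behind a "Target Temperature".
TARGET_C = 70.0  # hypothetical target temperature

def adjust_fan(duty_pct: float, gpu_temp_c: float, gain: float = 0.5) -> float:
    """Nudge the fan duty cycle up when above target, down when below."""
    error = gpu_temp_c - TARGET_C
    return min(100.0, max(0.0, duty_pct + gain * error))

# Example: at 75C and 33% fan, the duty creeps up until temps settle at target.
duty = 33.0
for temp in (75.0, 73.0, 71.0, 70.0):
    duty = adjust_fan(duty, temp)
    print(f"{temp:.0f}C -> fan {duty:.1f}%")
```

Note how small the fan speed changes are when you're already near the target; that's why a slightly higher target often buys you near-silence.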

Wattage = Heat. If you want to reduce heat, you have to reduce power. AMD provides a number of options:

(1) Under global settings, there are toggles for "Power Saver", "Frame Rate Target Control", and "Chill".
FRTC limits the GPU frequency so that delivered frame rates don't exceed a given value. You can use this very effectively on 60Hz monitors so the GPU isn't cranking away at full speed delivering 120fps while the monitor only shows you every other frame (see the frame-cap sketch at the end of this list item).
Chill works similarly to FRTC for the upper-bound frame rate setting, but Chill will also throttle the GPU back to the lower-bound frame rate in scenes with little or no motion on screen, which saves additional power. I'd only recommend this for variable refresh rate monitors.
Not necessarily a power saver, but I'd also recommend enabling "AMD Adaptive Sync" in the global settings.
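For intuition only: FRTC's real mechanism is throttling GPU clocks inside the driver, not sleeping on the CPU, but the delivered-frame-rate effect is the same as a classic frame limiter. A minimal sleep-based sketch of the idea:

```python
import time

# Minimal sleep-based frame limiter. FRTC itself manages GPU clocks in
# the driver, but the end result is the same: never deliver frames
# faster than the cap allows.
TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS

def render_frame():
    pass  # stand-in for the actual rendering work

for _ in range(3):  # a few frames for demonstration
    start = time.perf_counter()
    render_frame()
    elapsed = time.perf_counter() - start
    if elapsed < FRAME_BUDGET:
        # Idle off the rest of the frame budget instead of racing ahead
        # to frames a 60Hz monitor will never display.
        time.sleep(FRAME_BUDGET - elapsed)
    print(f"frame time: {time.perf_counter() - start:.4f}s")
```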

(2) AMD's WattMan utility is built into the driver software, so unlike with Nvidia cards, there's no need to download third-party software such as MSI Afterburner.
**I will say that I've had consistent trouble getting WattMan to apply voltages properly after a cold boot, but simply restarting your machine will fix the problem (i.e., power up, restart, then you're fine).**
Within WattMan you can select "Auto" or "Manual" for GPU frequency and voltage. If you want to save power across the board, select "Manual" for both and do an undervolt. This shows you the frequency and voltage at each performance "State" for your graphics card. From my experience, as well as others' I've seen online, it seems AMD took the stable voltage for each frequency and then added 50mV. It's therefore pretty safe to say you can subtract 50mV from each State on the curve without experiencing any issues. I recommend graphing the frequency/voltage curve in Excel or similar so you can visualize the curve you're making (see the plotting sketch below).
VRAM voltage acts as the lower limit on how much voltage the core can get under load. I've had good results with a 900-910mV VRAM setting, but you can test lower if you want. Once you've got everything set the way you want it, there's a box in the upper right corner to "Save Profile" so you can later "Load Profile" without redoing everything.
Experience says the GloFo 14nm process "kinks" around 940mV; that is, the slope of the freq/voltage curve increases above that point. For the clock speeds on an RX570, I'd imagine you can set your 1244MHz State to 945mV.
As with CPU OCing, if you go to manual frequency or voltage on your GPU, it's a good idea to stress test your settings to ensure stability. FurMark is a good one to try. Run the test for a few hours at each frequency/voltage setting.
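If you'd rather not use Excel, a few lines of Python do the same job. The State frequencies and stock voltages below are made-up placeholders; read your card's actual State table out of WattMan and substitute your own numbers:

```python
# Quick visualization of a freq/voltage curve and the "-50mV" rule of
# thumb. All numbers below are placeholders, not real RX570 values.
import matplotlib.pyplot as plt

freq_mhz = [300, 588, 952, 1041, 1106, 1168, 1209, 1244]   # example States
stock_mv = [750, 800, 900, 950, 1000, 1050, 1075, 1100]    # placeholder voltages
undervolt_mv = [v - 50 for v in stock_mv]  # subtract 50mV from each State

plt.plot(stock_mv, freq_mhz, marker="o", label="stock")
plt.plot(undervolt_mv, freq_mhz, marker="o", label="undervolted (-50mV)")
plt.xlabel("Core voltage (mV)")
plt.ylabel("Core frequency (MHz)")
plt.title("Freq/voltage curve (placeholder numbers)")
plt.legend()
plt.show()
```

Plotting it makes the "kink" easy to spot: the curve visibly steepens where each extra MHz starts costing disproportionate voltage.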

By undervolting my personal RX480, I've reduced its power draw by 25% without changing the core frequency. Many others have reported similar results.
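For a rough sense of why a modest voltage drop saves that much: dynamic power scales roughly with the square of voltage at a fixed clock. A back-of-the-envelope check (the voltage numbers here are hypothetical, not my card's exact readings):

```python
# Rough dynamic-power model: P ~ C * V^2 * f. At a fixed frequency,
# power scales with the square of the voltage ratio.
v_stock = 1.075      # volts -- hypothetical stock voltage at the top State
v_undervolt = 0.940  # volts -- the undervolted setting

ratio = (v_undervolt / v_stock) ** 2
print(f"Power at the same clock: ~{ratio:.0%} of stock")  # ~76%, i.e. ~24% less
```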
 

dsr07mm

Distinguished
^ Thank you.

My problem with this RX570 4GB Gigabyte Mi 1.1 GPU is that it's undervolted by a lot, I believe. By default the core clock in games is only 950-1100MHz, but by setting the Power Limit to +50% I get a locked boost clock at 1244MHz. I also overclocked the memory to 8000MHz and that works fine, but I can't overclock anything above that. If I overclock to, let's say, 1350MHz, the core clock still drops back toward 1244MHz a lot and the performance gain is not what it should be.

So I assume this GPU is a "mining" version from Gigabyte, with "Mi 1.1" in the name, and the voltage is really low. Basically I'm using my GPU at 1244MHz/8000MHz. I just tested the fan options in WattMan: I left fan speed on Auto, but after setting the Target Temp to 68C (which I prefer, and which I was only getting before by opening the case), the fan speed automatically increases from 33% to just 36% at 65-68C and the temp never goes above that. That's great, I guess.

So I solved my "issue" anyway. I just hate needing a +50% power limit just to get the default boost clocks. But the performance is okay for the $200 I paid; if I factor in selling my MSI GTX 770 Lightning for $110, I paid only 90 bucks for a new RX570 with a 2-year warranty.
 
It would be helpful to know what voltage you're seeing at the "default" 1100MHz and what voltage is being applied at +50% Power Limit to achieve 1244MHz. I like to use GPU-Z to see this info.

There are a couple of scenarios that can prevent a GPU from maintaining a stable clock speed when you're using "Auto" settings.
First (and most probable) is wattage. If the voltage at a given frequency is too high, the GPU can exceed its watt/TDP limit (say the manufacturer set the "safe" limit at 120W) and throttle itself to stay under the limit and prevent damage. One way to combat this is to raise the Power Limit to allow more watts to flow (something overclockers often end up having to do). The other is to lower the voltage (while still maintaining stability), which in turn lowers the power draw. A toy model of this first scenario follows below.
Second, if the voltage is set too low for a given frequency and you have frequency set to Auto (% in WattMan), the GPU will lower the core frequency to maintain stability. This can manifest as either a sawtooth frequency graph or simply a lower-than-expected frequency.
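Here's that toy model of power-limit throttling. The constant, the State table, and the 110W stock cap are all invented for illustration; they are not real RX570 numbers:

```python
# Toy model of power-limit throttling. Dynamic power ~ K * V^2 * f.
# K, the State table, and the power caps are invented for illustration.
K = 0.10

states = [  # (frequency MHz, voltage V), highest State first
    (1244, 1.000),
    (1188, 0.975),
    (1143, 0.950),
    (1100, 0.925),
]

def est_power_w(freq_mhz: float, volts: float) -> float:
    return K * volts ** 2 * freq_mhz

def pick_state(power_limit_w: float):
    # The card runs the highest State whose estimated draw fits the cap.
    for f, v in states:
        if est_power_w(f, v) <= power_limit_w:
            return f, v
    return states[-1]

for limit in (110.0, 110.0 * 1.5):  # hypothetical stock cap vs. +50% Power Limit
    f, v = pick_state(limit)
    print(f"{limit:5.0f} W cap -> {f} MHz @ {v:.3f} V (~{est_power_w(f, v):.0f} W)")
```

With these made-up numbers, the stock cap forces the card down to 1143MHz while the +50% limit lets it hold 1244MHz, which mirrors what you're describing. Lowering the voltage per State achieves the same thing from the other direction.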

From my experience with WattMan, if you've got frequency and/or voltage set to Auto, you're riding the freq/voltage curve the manufacturer set in the BIOS. Again, this is likely about 50mV too high at each frequency State. However, as I mentioned before, if you set them both to Manual, you can override the States to your own design.

If it's any help, my RX480 does 1235MHz at 940mV, which equates to an average of 84.1W "GPU only Power Draw" as measured by GPU-Z during a FurMark run. At that power draw, my fans are barely audible and the temp in my office doesn't get uncomfortable. As another point of reference along my curve, I set 1145MHz @ 900mV (900mV being the same as my VRAM voltage).
Of course, it's entirely possible that you got a raw deal in the "chip lottery", but I'd highly encourage you to fiddle around with manual voltage settings before throwing in the towel.
 

dsr07mm

Distinguished
I remember that my TDP was lower than it should be; lowering the voltage didn't leave any room whatsoever for higher clock speeds, so the only way was the +50% Power Limit. Now it works at 1244MHz, which is ~7-10fps more than what some other people would be playing at if they didn't notice what the core clocks are by default. That's a really bad move from Gigabyte, to be honest.

I assume the only way to get higher clocks now is to mess with the BIOS and find one that allows a higher TDP than what the GPU achieves by default. I even tried increasing the voltage, but no change whatsoever; the TDP always stays under the expected numbers, or rather the "safe" numbers for cards like the RX580.
 
Again, when you don't provide a voltage number, it's difficult to help. I'm still getting the feeling that you're working on "Auto" settings.

TBH, if you received a card that's not hitting its advertised frequency (1244MHz) at the default (0%) Power Limit, then you're absolutely eligible for an RMA. If a manufacturer guarantees a frequency out of the box, the card should be able to hit it.

 
