[SOLVED] What should RX 580 power consumption be in watts?

Hello, I have a PowerColor 2-fan RX 580. When I'm playing an intensive game, power usage is usually around 141 W; the most I've seen is 146 W, even in the most stressful scenarios. Is this normal and a good wattage, or should it be higher? The motherboard is an MSI B450M Gaming Plus.
 
Solution
OK, I ran Unigine Valley at 4K resolution with the graphics settings maxed out for 20 minutes, and my max power consumption was 147 W; my temperature didn't exceed 72 C.
Highest fan speed was 3718 RPM. The core clock does have some drops, going as low as around 1067 MHz, while memory held a constant 2000 MHz; the clocks seem to dip when the scene changes. Max core is 1350 MHz, max memory 2000 MHz. The core drops here and there, but only temporarily and only by about 10 MHz, and that isn't caused by scene changes.
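For anyone who wants to log the same readings themselves, here is a minimal sketch of how clocks, temperature and power could be polled, assuming a Linux system with the amdgpu driver (the sysfs paths and the card0 index are assumptions; on Windows, GPU-Z or HWiNFO sensor logging gives the same numbers):

```python
# Minimal polling sketch for an AMD GPU on Linux (amdgpu driver).
# The sysfs paths and the "card0" index are assumptions; adjust them if
# your GPU enumerates differently. Stop with Ctrl+C.
import glob
import time

DEV = "/sys/class/drm/card0/device"
HWMON = glob.glob(DEV + "/hwmon/hwmon*")[0]  # hwmon node exposed by amdgpu

def read(path):
    with open(path) as f:
        return f.read().strip()

def current_clock(path):
    # pp_dpm_sclk / pp_dpm_mclk list the DPM states; the active one is marked '*'
    for line in read(path).splitlines():
        if "*" in line:
            return line.split()[1]  # e.g. "1350Mhz"
    return "?"

while True:
    sclk = current_clock(DEV + "/pp_dpm_sclk")      # core clock
    mclk = current_clock(DEV + "/pp_dpm_mclk")      # memory clock
    temp_c = int(read(HWMON + "/temp1_input")) / 1000            # millidegrees C -> C
    power_w = int(read(HWMON + "/power1_average")) / 1_000_000   # microwatts -> W
    print(f"core {sclk}  mem {mclk}  {temp_c:.0f}C  {power_w:.0f}W")
    time.sleep(1)
```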
Well, your GPU should have a BIOS switch; it seems like you're running it in silent mode right now.
That's probably a bit low unless you're manually undervolting, but within reason. It's probably best if you tell us what the core frequency is.
Well, I run it at stock voltage and power limit; it isn't underclocked, and I usually run it at 1350 MHz core, sometimes 1450 MHz. But I have been getting quite unsatisfactory performance, and I just feel as if my card should be doing better.
 
I have also been noticing pretty rare artifacts that sometimes happen on the web, like a white bar flashing for a second; another time, some days ago, there was a purplish-manila bar that disappeared quickly. My card is under a year old and was bought new.
 
Graphics card power has virtually nothing to do with your motherboard. What exact power supply do you have? How exactly is your graphics card under-performing?
Well, it's just weird. When I play Minecraft my performance sucks, even though my render distance is only around 20-22 chunks and my settings are heavily lowered. I have 16 GB of 3066 MHz RAM and a Ryzen 5 CPU. I honestly feel like my performance was just as good with my old 1050 Ti.
 
The motherboard only supplies up to 75 W to a dGPU through the PCIe slot. That's all. The rest is handled by the PSU.

You need to monitor GPU frequency during gaming. THAT is what GPU performance is based on.

My RX 480 runs 1400 MHz at 1150 mV and draws about 130-140 W in the process (IIRC, as reported by GPU-Z). Or I can crank the voltage and get a much higher wattage at the same 1400 MHz, and performance is identical.

Minecraft doesn't require an RX 580 anyway, so there's no reason it should be much better than a GTX 1050 Ti. You're probably getting multiple hundreds of FPS on your (presumably) 60 Hz monitor, correct? Minecraft is so graphically simple that you're almost always CPU-limited. A simple test here is to monitor CPU vs GPU usage (%), as in the sketch below. Most likely your CPU is at 100% and your GPU isn't anywhere close.
TL;DR: play more graphically demanding games.
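A minimal sketch of that CPU-vs-GPU check, assuming a Linux system with the amdgpu driver and the third-party psutil package; on Windows, the Task Manager performance tab or GPU-Z shows the same utilization figures:

```python
# Quick CPU-vs-GPU load check to run while the game is up.
# Assumes Linux with the amdgpu driver and the psutil package installed
# (pip install psutil). Stop with Ctrl+C.
import psutil

GPU_BUSY = "/sys/class/drm/card0/device/gpu_busy_percent"  # amdgpu GPU utilization

while True:
    per_core = psutil.cpu_percent(interval=1, percpu=True)  # % per logical core over 1s
    with open(GPU_BUSY) as f:
        gpu = int(f.read())
    avg = sum(per_core) / len(per_core)
    # A single core pegged near 100% can bottleneck a game even if the average is low.
    print(f"CPU avg {avg:3.0f}%  busiest core {max(per_core):3.0f}%  GPU {gpu:3d}%")
```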
 
Well, I was getting around 44-76 FPS, unlocked, with stutters/lag spikes that would freeze for roughly 0.5 seconds, and that wasn't even at max settings.
 
