[SOLVED] Worst RX 5700 in the world?

Status
Not open for further replies.

Tenu8395

I bought the Gigabyte RX 5700 Gaming OC variant with a single 8-pin connector, and I feel like something isn't right about its power consumption and voltage behavior. It draws around 220-230 W at 860 mV actual voltage (per HWiNFO) while staying at 1800 MHz. I also had to modify the SPPT (Soft PowerPlay Table) to allow a 50% power limit increase, and 1800 MHz @ 860 mV is the best result I've gotten with a reasonable power draw.
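For context, here's the arithmetic behind that SPPT change as a minimal sketch. The ~170 W base ASIC power limit is my own inference from the 204 W maximum at the stock +20% cap (mentioned below), not an official Gigabyte figure:

```python
# Rough sketch of the power-limit math. The ~170 W base ASIC power
# limit is inferred from the 204 W max at the stock +20% cap; it is
# not an official Gigabyte spec.
BASE_ASIC_POWER_W = 204 / 1.20  # ~170 W implied base limit

stock_cap = BASE_ASIC_POWER_W * 1.20   # stock +20% slider -> 204 W
modded_cap = BASE_ASIC_POWER_W * 1.50  # +50% via modified SPPT -> 255 W

print(f"Stock cap:  {stock_cap:.0f} W")
print(f"Modded cap: {modded_cap:.0f} W  (covers the observed 220-230 W)")
```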

Reaching the stock 1750 MHz target clock is impossible within the stock +20% limit (204 W max); at the same 840-860 mV with the 204 W cap, it hovers around 1650 MHz under intense load. Given all that, I don't understand how people are running 5700s flashed with the XT BIOS at 1 V+ and XT clocks of 1900-2000 MHz, because with how ridiculous my card's power draw is at 850 mV, it would probably suck up 300 W+ to run 2 GHz and/or 1 V+ (see the rough estimate below). Did I get really unlucky, or is this normal behavior for Gigabyte's 5700 variant? FYI, it doesn't even remotely overheat: hotspot and VRAM sit at 80 C at a sustained 230 W. I also can't really complain about anything else with this card; it has none of the common RDNA1 problems, just the VRAM idle clock bug, and that's manageable.
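Here's that rough estimate: a back-of-the-envelope sketch assuming GPU core power scales roughly as frequency times voltage squared. It ignores leakage and non-core board power, so treat it as a ballpark only:

```python
# Rough dynamic-power scaling estimate: P ~ f * V^2.
# Baseline is the behavior observed in HWiNFO; the scaling law is a
# simplification that ignores leakage and non-core board power.
p1, f1, v1 = 230.0, 1800.0, 0.860   # observed: watts, MHz, volts
f2, v2 = 2000.0, 1.000              # hypothetical XT-BIOS-style target

p2 = p1 * (f2 / f1) * (v2 / v1) ** 2
print(f"Estimated draw at {f2:.0f} MHz / {v2:.2f} V: ~{p2:.0f} W")
# -> roughly 345 W, consistent with the 300 W+ worry
```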

The setup is a little unconventional, but it should be fine; it ran an overclocked Vega 56 with no problems (Machinist X99 motherboard, Xeon E5-1660 v4, 16 GB DDR4, MSI A650BN PSU).
 

Aeacus

The vanilla RX 5700 is rated for 180 W, with a base clock of 1465 MHz and a boost clock of 1725 MHz. Your Gigabyte version is already overclocked by Gigabyte, to a 1565 MHz base and 1750 MHz boost. Any OC increases power consumption, and since you've OC'd the already-OC'd GPU even further, to a 1800 MHz boost, seeing ~230 W out of it at full tilt is normal.

Given all that, I don't understand how people are running 5700s flashed with the XT BIOS at 1 V+ and XT clocks of 1900-2000 MHz, because with how ridiculous my card's power draw is at 850 mV, it would probably suck up 300 W+ to run 2 GHz and/or 1 V+.
As with CPUs, GPU chips are part of the silicon lottery, and some chips OC better than others.
As for power draw, who talks about that when they can boast that they got their GPU to boost to 1900/2000 MHz?

E.g. if you're running a big-block 426 HEMI V8 (7.0 L), you talk about the 590 BHP it produces; you're not going to talk about its fuel consumption.
 

Tenu8395

So far it's turned out that the VBIOS was the problem. I found an unverified Gigabyte 5700 VBIOS on TechPowerUp that was almost a year newer, took the risk, and flashed it to my card. On the previous VBIOS the card drew around 200 W at 1850 MHz / 870 mV in Cyberpunk; now it's around 120-140 W, like most 5700s, and it works perfectly.
It also fixed the extremely unstable clocks and voltages on Linux, so the new Windows driver wasn't at fault either. My bet is that Gigabyte changed something about the sensors and/or power management between revisions of their 5700s, since my card is rev 1.1 and the only verified VBIOS on TechPowerUp was built just after the 5700's release date, almost certainly for rev 1.0.
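For anyone who wants to confirm which VBIOS is actually live after a flash, here's a minimal sketch for Linux, assuming the amdgpu driver and that the GPU is card0 (on Windows, GPU-Z shows the same version string):

```python
from pathlib import Path

# The amdgpu driver exposes the flashed VBIOS version via sysfs.
# "card0" is an assumption; adjust the index on multi-GPU systems.
vbios = Path("/sys/class/drm/card0/device/vbios_version")

if vbios.exists():
    print("VBIOS version:", vbios.read_text().strip())
else:
    print("vbios_version not found; is the amdgpu driver loaded?")
```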
 