News AMD Explains Why 110-Degree Operating Temps Are 'in Spec' for RX 5700

"We noted in our review of the AMD Radeon RX 5700 and Radeon RX 5700 XT that junction temperatures peaked above 100 degrees Celsius under load. (That's 230 degrees Fahrenheit for those of us forced to live with the clearly inferior temperature scale.)"

Um...since when does 100 degrees C = 230 degrees F? Anyone who has gotten past the eighth grade (in the USA education system) should know that 100C = 212F. So, then, the guy writing this piece is a real bona fide tech? As well as a writer? Starting to look a little sketchy to me. My-oh-my, how the education system has taken a nose-dive under the watchful eyes of liberal politicians. Oh, BTW, 230 degrees C = 110 degrees F...just so you know! LOL Don't take it to heart - we all blunder from time-to-time. ^_^
Don't throw that many stones. What education system did you go through that didn't teach you what "above" means?!?!
" junction temperatures peaked above 100 degrees Celsius under load."
 
As for the wall, if AMD is pushing these out with little to no OCing headroom, it's because they have a wall with the uArch. Much like Ryzen 3000, where the CPUs are pretty much performing as well as they can and can't clock much higher, if at all, than their stock configurations.
A big difference here is that for a graphics card, per-core performance does not matter much when adding more cores is practical. Game graphics performance isn't limited by the performance of a single core, since graphics processing tasks tend to divide up evenly across numerous cores, with cards typically offering thousands of them. As it is, the 5700 XT is a relatively small GPU at just 251 square millimeters, about 12% smaller than the chip used in the GTX 1660 and 1660 Ti. The chips used in Nvidia's competing RTX cards are around twice as large, or three times as large in the case of the 2080 Ti. So AMD likely has a lot of room to build upward, whether that's with a larger monolithic chip, or multiple smaller chiplets. And lower clocks could further improve efficiency, making a card with significantly more cores very possible to fit within a 300 watt TDP. Considering an RTX 2080 Ti doesn't offer more than 50% more performance than a 5700 XT, AMD could likely release a card with a similar TDP that outperforms it by a decent margin, with a chip no larger than what appears in a 2060 or 2070.
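For what it's worth, the die-area arithmetic above can be sanity-checked with a quick script. The 251 mm² figure is from the post; the 284 mm² and 754 mm² values for the 1660's and 2080 Ti's chips are assumed figures chosen to be consistent with the "about 12% smaller" and "three times as large" claims:

```python
# Approximate die areas in mm^2. 251 is from the post; the Nvidia
# figures are assumptions consistent with the post's stated ratios.
navi10_mm2 = 251   # RX 5700 XT
tu116_mm2 = 284    # GTX 1660 / 1660 Ti
tu102_mm2 = 754    # RTX 2080 Ti

pct_smaller = (1 - navi10_mm2 / tu116_mm2) * 100
ratio_2080ti = tu102_mm2 / navi10_mm2

print(f"5700 XT die is {pct_smaller:.0f}% smaller than the 1660's chip")
print(f"2080 Ti die is {ratio_2080ti:.1f}x the size of the 5700 XT's")
```

Running it gives roughly 12% and 3.0x, matching the claims in the post.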

Their power is slightly above, and their temps, even with the better cooler, are above the stock RTX cooler's.
The reference RTX coolers are not the blower-style coolers traditionally found on reference cards, but rather dual-fan open-air vapor chamber designs. And the RTX reference cards spin their fans at somewhat higher speeds than some of these 5700 XT partner cards, so there may be a tradeoff in terms of acoustics to achieve slightly lower temperatures, so that's not really an apples-to-apples comparison. I do think it would be good to see AMD either switch to a more modern reference cooler design, or make sure partner cards are available a bit sooner for future card releases though.
 

jimmysmitty

Polypheme
Moderator
"Oh, BTW, 230 degrees C = 110 degrees F...just so you know!"
Do you mean 230F = 110C?
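The conversion at issue is easy to verify; a minimal sketch of the standard formula:

```python
def c_to_f(celsius):
    """Celsius to Fahrenheit: F = C * 9/5 + 32."""
    return celsius * 9 / 5 + 32

print(c_to_f(100))  # 212.0 -- boiling point, not 230
print(c_to_f(110))  # 230.0 -- the junction temp the article meant
```

So the article's 230F figure corresponds to 110C, the peak junction temperature in the headline, not 100C.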
 

jimmysmitty

Polypheme
Moderator
A big difference here is that for a graphics card, per-core performance does not matter much when adding more cores is practical. Game graphics performance isn't limited by the performance of a single core, since graphics processing tasks tend to divide up evenly across numerous cores, with cards typically offering thousands of them. As it is, the 5700 XT is a relatively small GPU at just 251 square millimeters, about 12% smaller than the chip used in the GTX 1660 and 1660 Ti. The chips used in Nvidia's competing RTX cards are around twice as large, or three times as large in the case of the 2080 Ti. So AMD likely has a lot of room to build upward, whether that's with a larger monolithic chip, or multiple smaller chiplets. And lower clocks could further improve efficiency, making a card with significantly more cores very possible to fit within a 300 watt TDP. Considering an RTX 2080 Ti doesn't offer more than 50% more performance than a 5700 XT, AMD could likely release a card with a similar TDP that outperforms it by a decent margin, with a chip no larger than what appears in a 2060 or 2070.


The reference RTX coolers are not the blower-style coolers traditionally found on reference cards, but rather dual-fan open-air vapor chamber designs. And the RTX reference cards spin their fans at somewhat higher speeds than some of these 5700 XT partner cards, so there may be a tradeoff in terms of acoustics to achieve slightly lower temperatures, so that's not really an apples-to-apples comparison. I do think it would be good to see AMD either switch to a more modern reference cooler design, or make sure partner cards are available a bit sooner for future card releases though.
Except if this still has roots in GCN, then 4096 is the max shaders they can throw at it, and in the past that hasn't dented nVidia's performance crown. That's why I say they need to get away from any GCN-based uArch.

And I would expect AiB cards to be quieter. But the fact still remains that even with AiB coolers it still runs warmer and uses slightly more power than the reference nVidia card. However, I would prefer a like-for-like comparison, say an Asus Strix RX 5700 XT vs an Asus Strix RTX 2060 Super, mainly so they both have the same cooler to work with.
 
Except if this still has roots in GCN, then 4096 is the max shaders they can throw at it, and in the past that hasn't dented nVidia's performance crown. That's why I say they need to get away from any GCN-based uArch.
Why do you think this is the case? Specifically, that it's got its roots in GCN and has a 4096 shader limit.

Using some bits from GCN is a completely different thing than the entire architecture being GCN.

Did Nvidia throw away EVERYTHING from a previous gen card before moving on? Maxwell? Pascal? Turing?
 

jimmysmitty

Polypheme
Moderator
Why do you think this is the case? Specifically, that it's got its roots in GCN and has a 4096 shader limit.

Using some bits from GCN is a completely different thing than the entire architecture being GCN.

Did Nvidia throw away EVERYTHING from a previous gen card before moving on? Maxwell? Pascal? Turing?
I'm not saying everything, but there is still a lot, and I have yet to see anything that shows Navi has moved past the limitations of GCN.

I guess we shall see in the near future, but until then AMD will play second fiddle to nVidia. And unless Intel really puts their all into it, I don't think nVidia will be challenged enough to bring pricing back down to where it was with Pascal.
 
However, I would prefer a like-for-like comparison, say an Asus Strix RX 5700 XT vs an Asus Strix RTX 2060 Super, mainly so they both have the same cooler to work with.
That still wouldn't be a perfect comparison, though, since the 5700 XT has generally been shown to be faster than even the 2070 in most of today's games. Power readings can also depend on which game or games are being run, and on how the two cards' performance compares in those games. I don't believe Tom's Hardware even includes frame rate numbers for the game they use for power testing in their reviews, 2013's Metro: Last Light, which seems like an oversight to me.
 
