Benchmarking AMD Radeon Chill: Pumping The Brakes On Wasted Power

For the people stating that the GTX 1060 draws less power than the RX 480: do you live under a rock? Have you taken the time to notice that there are batches of RX 480 cards built on an improved manufacturing process that use less power than ever before, and that even run cooler than a GTX 1060?

Such proof can be found on YouTube: Jayztwocents benchmarks an XFX RX 480, and in Jockerproduction's Dishonored 2 benchmark his new RX 480 runs 10°C cooler than the GTX 1060. AdoredTV made a video explaining what is happening with the new cards and why they are so efficient.

But sadly, the "tech review giants" don't even care about this.
 


That's terrible arguing. Those aren't neutral/unbiased posters; those are trolls. You've concluded, because 4-5 trolls posted in this thread, that AMD has a terrible fanbase and that you won't be buying from AMD because of that. It's quite intriguing how sensitive people can be. This is probably the epitome of generalizing and/or prejudice.

Every well-sized community comes with various personalities. If you want to know what both Nvidia and AMD users are really like, visit the sites below:

https://www.reddit.com/r/Amd/
https://www.reddit.com/r/nvidia/
http://www.overclock.net/

And absolutely don't stick with that horrendous mentality, because it isn't an accurate representation of either side.

The trolls/uneducated always scream the loudest, no matter which community.
 
The presence of Chill, and what it does, is just one more little factor that may affect buying decisions. If you're on the fence about a purchase (and a lot of people are, based on the number of "which one?" forum threads), Chill might be the nudge that makes the difference. I don't see its presence or absence being any kind of dealbreaker, and I don't think this article presented it that way; it's just one more factor.
This article made no outlandish claims, it merely presented some data, and described how it was obtained. Take it for what it is, not for what you wish it to be.
 

Jayztwocents' video more so proves that it's possible to aggressively bin Polaris 10 GPUs and get golden chips. Not so much that the average Polaris GPU power characteristics are improving.

And it doesn't really make sense to compare an arbitrary RX 480 and GTX 1060 and say that one GPU runs cooler than the other in general, as that depends as much or more on the specific model/cooler as it does on the GPU.

Don't think I've seen the AdoredTV video yet though.
 
Moderators have rules too. For example, RedJaron, as an active thread participant, pretty much is not allowed to also moderate the thread. If something has been deleted, it was by someone else.
One of the "Don't"s in the Rules is: "Be rude or impolite. Civility is essential on Tom's, and remember that behind each user is a real person. Personal attacks (ad hominem) and insults are not allowed."
This rule, though vague, should be interpreted in a manner that encourages "getting along." There are no "safe spaces" here, and differing opinions are encouraged, but don't let a difference of opinion lead to personal attacks, irrelevancy, or other disruptive posting.
 
The biggest potential of watts saved is not the power bill, it's having watts available when you need them. The power budget determines whether your card boosts up or clocks down.

If you are running your card 100% flat out when you don't need to, it's going to clock down to maintain that 100%. If instead you limited the card to 80% when you don't need it, then when it does need it, it will boost up and cut down that long frame, because it had room to use extra power when it needed it.

The most important graph here is the last frame time graph, 'Witcher - running'. Look right of center, where the blue line is significantly below the black one. This is what I'm talking about above: it chilled on the left half of the graph, so it had extra room for a bit on the right half. After that extra power budget was used up, the blue line tracked back up to the black one again. But during that time, you got smoother performance.

As long as Chill doesn't screw you the rest of the time, it should help when you need it most. At least until the power budget that was saved is used up; then it stops helping again.
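To make the power-budget idea concrete, here is a rough, illustrative Python sketch of how a boost step might trade unused headroom for clock speed. The `simple_boost_step` function, the thresholds, and the clock numbers are all hypothetical; real GPU firmware is far more sophisticated and also factors in temperature and voltage.

```python
# Illustrative model only: how unused power budget can translate into boost headroom.
# The thresholds and step sizes are made-up numbers, not AMD's actual algorithm.

POWER_BUDGET_W = 150      # hypothetical board power limit
CLOCK_STEP_MHZ = 25       # hypothetical clock step size
MIN_CLOCK_MHZ = 1120
MAX_CLOCK_MHZ = 1340

def simple_boost_step(current_clock_mhz: float, current_power_w: float) -> float:
    """Raise the clock while there is power headroom, lower it when over budget."""
    if current_power_w < POWER_BUDGET_W * 0.9:
        # Headroom available (e.g. because Chill held the frame rate down earlier),
        # so the card can boost when a demanding frame arrives.
        return min(current_clock_mhz + CLOCK_STEP_MHZ, MAX_CLOCK_MHZ)
    if current_power_w > POWER_BUDGET_W:
        # Over budget: clock down to stay inside the limit.
        return max(current_clock_mhz - CLOCK_STEP_MHZ, MIN_CLOCK_MHZ)
    return current_clock_mhz
```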
 


True. But for me, the most important thing would be the hard test: evaluating Chill together with FreeSync. I have seen reports that, for example in WoW, when Chill drops to 40 fps you can see some animations become less fluid. If that is so, it could theoretically be easily fixed by FreeSync kicking in at about that fps limit. So I wonder if FreeSync adjusts perfectly and makes it smooth whatever you do, running or 'chill'ing.
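As a rough illustration of why the FreeSync window matters here, the check below assumes a hypothetical monitor with a 40-75 Hz variable refresh range; the function name and the numbers are assumptions for the sake of the example, not from the article.

```python
# Hypothetical example: does a Chill FPS range stay inside the FreeSync window?
FREESYNC_MIN_HZ = 40   # assumed lower bound of the monitor's variable refresh range
FREESYNC_MAX_HZ = 75   # assumed upper bound

def chill_stays_in_freesync(chill_min_fps: int, chill_max_fps: int) -> bool:
    """True if the whole Chill FPS range falls inside the variable refresh window."""
    return chill_min_fps >= FREESYNC_MIN_HZ and chill_max_fps <= FREESYNC_MAX_HZ

# A 40 FPS Chill floor sits right at the edge of this assumed 40-75 Hz range,
# which is why a Chill drop to 40 fps may or may not stay within FreeSync's reach.
print(chill_stays_in_freesync(40, 75))  # True
```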
 


Especially if the arguments are plain rubbish.
 
Um, kinda but not really. Boost clock is determined by heat, which is affected by how much power the chip is drawing, but saying it this way is a little backward.

Not a given. It completely depends on your GPU's cooling. A properly cooled card can run at 100% clock rate without throttling itself, even when overclocked.

Possibly. If it stays a few degrees cooler some of the time, then yes, it's more likely to boost the clock on more occasions before then slowing back down to that 80%. But again, this only applies to a card under thermal constraints in the first place.

You might want to check the scale of that graph. That portion is a difference of 2 ms, or less than 5 fps. At framerates around 30 fps, that's enough to push something over the edge into playable territory, but the game was already between 45 fps and 50 fps. And don't forget that frame time variance is a bigger factor in perceived smoothness. This is also a manually run test, so even though you can keep it fairly consistent, the margin of error also grows; a +/- 2 fps difference between the two wouldn't surprise me. More testing would be needed to confirm whether Chill can consistently give this slight occasional boost.
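For reference, the frame-time-to-FPS conversion behind that "2 ms is less than 5 fps" point works out as in this quick sketch; the 45 fps starting point is simply taken from the figures quoted above.

```python
# Quick arithmetic: what a 2 ms frame-time improvement means at ~45 FPS.
base_fps = 45.0
base_frame_time_ms = 1000.0 / base_fps             # ~22.2 ms per frame
improved_frame_time_ms = base_frame_time_ms - 2.0  # ~20.2 ms per frame
improved_fps = 1000.0 / improved_frame_time_ms     # ~49.5 FPS

print(f"{base_frame_time_ms:.1f} ms -> {improved_frame_time_ms:.1f} ms "
      f"is {improved_fps - base_fps:.1f} FPS more")  # roughly +4.5 FPS
```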
 
I'm trying to use OCAT but I don't see the FPS counter in the game; also, if you try to use it with games like Overwatch or BF4, games from those platforms do not work.
 


(I clipped the quote down to the above excerpt to include just the relevant parts, as a TL;DR.)

The basic premise of your argument is wrong. You're assuming there's no benefit from Chill in those "175 minutes." The "spinning and jumping" test is designed to test what players are usually doing in a game! They're in a small section of ground killing creatures, reading text, maps, inventory, etc.

The basic premise of your argument is your own invention. There is no mention of any presence of other entities in the scene. The subsequent math is wrong.

 