Benchmarking AMD Radeon Chill: Pumping The Brakes On Wasted Power


torka

Distinguished
This is a wonderful advancement. One that makes me feel a lot better about having a GPU that would be considered overkill and a waste of power.
 
A lot of articles of late have been missing legends, but I made the perhaps obvious assumption that on the Power Consumption graph, black is Chill off and red is Chill on.

Isn't this behavior normal? I notice when playing on my son's box (twin 970s, 1440p) and walking away from the game for a minute that, when I come back, the temp and power graphs on the LCD panel have taken a dive.

I like the idea, as this could help solve many of the heat problems such as in the post I just answered. So while I love the concept, I'd love to know: what is the impact?

I think the true test here would be to set up a few gamers and pick a quest:

a) Have 3 run through the quest with no Chill.

b) Have 3 gamers run through the quest with Chill.

During the test, have each PC plugged into a Kill-A-Watt meter and record the start and finish kWh.

If I sit down for a 3-4 hour W3 session, I'm only stopping for bathroom and snack breaks, and I'm not so old yet that, having taken a bio break before I start, I'm going to need another one :)

a) Say I do need a bio break, or say FedEx arrives with a package; that's a 60-90 second AFK.

b) No one really spends any time (I think) jumping and spinning.

I would expect that at the end of the tests, the results won't be much different.

Average of 189 W over 175 minutes of playing time, plus 5 minutes of AFK or staring at vistas at 183 W (no Chill) = (189 x 175 + 183 x 5) / 180 = 188.83 W average draw

Average of 188 W over 175 minutes of playing time, plus 5 minutes of AFK or staring at vistas at 143 W (Chill) = (188 x 175 + 143 x 5) / 180 = 187.88 W
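For anyone who wants to sanity-check the weighted-average arithmetic, here's a minimal Python sketch of it. The session splits and wattages are the illustrative figures above; note that, as written, the Chill formula actually works out to roughly 186.75 W rather than 187.88 W, so treat the figures as rough:

```python
# Weighted-average GPU draw over a 180-minute session, using the
# illustrative (minutes, watts) splits from the scenario above.
def avg_draw(segments):
    total_minutes = sum(m for m, _ in segments)
    return sum(m * w for m, w in segments) / total_minutes

no_chill = avg_draw([(175, 189), (5, 183)])  # ~188.83 W
chill = avg_draw([(175, 188), (5, 143)])     # ~186.75 W as written
print(f"no Chill: {no_chill:.2f} W, Chill: {chill:.2f} W")
```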

Assuming the usual 100 hours of game playing time ....

1.05 watts x 100 hours x 60 minutes x $0.10 avg cost / (1,000 watts per kW x 0.87 eff.) = $0.72 in power cost savings.

Looking at the performance impact of the 3 conditions:

a) Performance-wise, there's a 33% hit on performance for a 22% drop in energy consumption when standing still, and to that I say "who cares"; ya want fps for smooth movement, and if ya ain't movin' ....

b) There's a 28% drop in performance for jumping and spinning, for a 19% drop in energy consumption, which again is of no concern to me but may be bothersome for folks who like to pan around and admire vistas.

c) In actual normal play, while traveling or fighting, it has no impact.

Not trying to minimize what AMD has done here; it's certainly commendable. But during normal active play over all 3 conditions, we see a 0.5% decrease in power over a 3-hour session for a 0.6% decrease in avg fps (while running), when you really need it. OTOH, the ROI on the technology is negative ... the impact on performance is far greater, percentage-wise, than what ya get back in power saved.
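To put the "negative ROI" point in concrete terms, here's a quick sketch using the standing-still percentages quoted above (illustrative numbers only):

```python
# Frames-per-watt ratio when Chill kicks in while standing still,
# using the rough percentages quoted above.
fps_ratio = 1 - 0.33    # 33% fps hit
power_ratio = 1 - 0.22  # 22% energy savings
print(fps_ratio / power_ratio)  # ~0.86, i.e. ~14% fewer frames per watt
```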
 

rwinches

Distinguished
So those of us that use the built-in sliders for their intended purpose, to match our hardware to the game being played, are not "real gamers", especially since we purchased AMD GPUs that have "architectural disadvantages" and should have gone with nVidia.

Got it.
 

neblogai

Distinguished
It is funny how biased the Tom's reviewer is again. Instead of finding positives in this new tech, its possible uses (for example, to save battery life on laptops + WoW), and best-case tests, almost every paragraph in the review has rants on how AMD is bad and not as good as nVidia :)
 
I have to say that's the 1st time I ever saw anyone accuse Tom's of an anti-AMD bias .... just look at the "Best GFX card" articles over the last few years. You don't think it odd that a site that focuses on gaming would mention the fact that there's a huge difference in OC headroom? When two cards are in a certain performance/price niche and one OCs 25% and the other 6%, that shouldn't affect a potential buying decision?

This article read to me like he was trying real hard to say something good about AMD without making it subject to crushing and truthful responses.

Battery life? During gaming? Unless gaming is defined as starting the game and staring at the scenery w/o moving, what's the gain here? If you actually play, with the 0.5% power savings in a typical session, your 90-minute **gaming battery life** just became 90.5 minutes. What exactly do you expect the author to rave about?

If we are going to talk about bias ... and ignoring the fact that tests weren't done on a mobile GPU, or whether the mobile GPU wasn't already doing this: you will consume 143 watts with your 480 .... but the guy across the aisle on the train with his 1060 laptop is consuming 120 watts ... and he gets to actually **play** the game. He can really play the game 20% longer than you could "stand still and stare at the scenery" with the 480.

While what AMD has done with this so far is commendable, the greater significance is that perhaps they can expand the functionality beyond AB's power sliders as time goes on. As of now, it has to be said, its power-saving impact is limited to staring at the screen and ***not playing the game***.

Now if you are playing the game on a lappie and walk away from the screen to hit the bathroom or whatever ... are you going to leave the lappie unattended? ... aren't you going to close the lid, which will result in real power savings when the lappie sleeps while you're gone?

I love the fact that AMD has started looking for ways to reduce power consumption, and I love that the 4xx series has cut power usage compared with the 3xx ... but while complimenting them on their improvements, the gap between the two is still very wide .... and it has widened with the last generation.

In a story between rival football teams where the losing team lost 42-21 instead of last year's 42-10, the author will likely compliment the losing team on their improved performance, but for the author not to note "who won the game" would be the author "not doing his job".

I read every article with the mindset "will this change anything" with regard to what I select or what I put in when I build for others. This doesn't change anything. Like most of today's news, especially in the political arena, the headline is misleading. The headline could have just as easily read:

AMD's new Chill Utility saves power as long as you don't Play the Game

It's accurate and more descriptive of what it does ... "putting the brakes on" doesn't really tell me how well it did. Reading it: they applied the brakes, the braking distance was better ... but let's be realistic, they still hit the deer.

My take on the article was positive in that it clearly shows AMD is concerned about this issue. They are working on it. OK, so it won't have much of an impact yet, but they did what they could with the resources they have. It's a start; as it matures, I expect we'll see user-selectable settings whereby you can tell it not to exceed a certain power draw .... or "I don't need 95 fps .... cut power as needed to keep me at 55-60 fps."

Now I'm curious (question for you, Igor) whether this didn't arise out of the 480 fix, where the reference 480s were exceeding their power limits. Of course, as with that fix, when you cut power, you cut performance. Right now it's a poor trade-off, with performance hits far exceeding the power savings on a percentage basis. Hopefully over time it will get better.

Right now, when I do a 480 build, the PSU needs to be 100 watts bigger than for a 1060 build. A user with a marginal PSU would then be in a position to buy a 480 now and set a safe power limit until such time as he could afford a PSU upgrade. (I recommend the 1060 / 1070 / 1080 in higher-end builds; below the $280 niche, it's all AMD.) So yes, the technology has several possible upsides.

Technology isn't yet in the place where most of everything else is ... hope it never gets there in my lifetime. Today:

-Everybody gets a trophy
-No dodgeball, 'cause the 1st player out might suffer a hit to his self-esteem
-Moms are inserting themselves into negotiations for their offspring's 1st job after college, even attending job interviews
-Editorial content is dictated by a "stick to the positives" mandate from the advertising department.

If I could talk to nVidia / AMD on this topic, this is what I'd say:

nVidia: You have a distinct advantage in power efficiency. It's real; don't get lazy, and keep at it.

AMD: OK, this new technology won't have a real impact for most of us, but it will for those that like to "stop and smell the roses". We appreciate your attention to this issue, and this technology appears to have the potential for greater effects over time, so I will be watching for further developments. You are not going to "get a trophy", but we must commend you for the attention and effort. Keep going in this direction, as well as in GPU / PCB design, to further narrow the gap.
 
What the . . . ? Are you people complaining Igor is bashing AMD actually serious? Did you bother to read the whole thing?


Okay, back up your claims. Show me actual examples from the article that say, much less imply, any of this. It explains what Chill is meant to do, a little about how it's supposed to do it (though it can't explain much, since AMD hasn't revealed all the details), explains the testing methodology, and gives the results. The commentary on the results says the technology pretty much works as advertised, shows a few niggles where it will be limited, and overall says AMD did a good job on it. The only time NVidia is even mentioned is when talking about Ansel and saying NVidia's tech has the same limitation as Chill, in that it only works on a limited number of games. That's not a negative; it's a fact about both methods.

This crap about blasting others with completely false and unfounded accusations may work on so-called network news, but it doesn't fly with people doing actual critical thinking. Either put up or shut up.
 

rwinches

Distinguished
Here are links that provide proper reviews and info about the full scope of the largest SW release yet for Radeon.

https://www.techpowerup.com/reviews/AMD/Radeon_Crimson_ReLive_Drivers/5.html

http://www.pcworld.com/article/3147292/software-games/feature-stuffed-radeon-software-crimson-relive-debuts-amds-rivals-to-shadowplay-fraps.html

http://arstechnica.com/gadgets/2016/12/radeon-crimson-relive-driver-details-download/

http://www.digitaltrends.com/computing/amd-crimson-relive-edition-driver-suite-released/

http://www.guru3d.com/articles-pages/amd-radeon-crimson-relive-edition-driver-overview,5.html
 

neblogai

Distinguished


OK, I can answer that:
1) Maybe you do not know: AMD released a huge feature and driver package today. Tom's has not even reported it, only made a small test of one of the features for this article. Here is how much was really released:
http://videocardz.com/64496/amd-preparing-crimson-relive-driver-update
2) The article is full of negativity, like: "Ah, but there are restrictions", "Sounds cool, right? Unfortunately".
3) The game chosen for testing is Witcher 3, one of the most demanding in the list of supported games. Most of them can run at 100-200 fps on an RX 480, and will run even faster on upcoming, more powerful GPUs. So certainly the energy savings on Witcher 3 will not be as great as in a game which drops from 150 fps to 40 fps.
4) "What do you do as a manufacturer when you are constantly being criticized for higher power consumption": is there really anyone criticizing AMD at this point for having a GPU that draws 30-40 W more than the competition? On desktop, no one cares, as a $10 higher electricity bill can be more than compensated for by $100 cheaper adaptive sync. And on laptops, Polaris GPUs are very energy efficient if run at lower clocks: https://www.youtube.com/watch?v=-5dJD-fLfk8
This "Chill" feature can help laptops a lot on top of that.

Also, with the latest driver updates, reviewers are already finding that even the reference RX 480 has caught up with the GTX 1060 in DX11, and leads by more in DX12: http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/73945-gtx-1060-vs-rx-480-updated-review-23.html
With all the new AMD features that are available today, same or better performance, and much cheaper adaptive sync, AMD is simply a better buy, and nVidia is behind. The thing is, here at Tom's, nVidia being somewhat more energy efficient makes it light-years better :)
 

Bartendalot

Distinguished
The commenters in this thread as well as other threads like it on other sites are why I won't be buying AMD again.

The fans of that brand sound like petulant children who have chosen this battle, with everything going on in the world, to champion: putting down an author, a website, and the users of the other multi-billion-dollar company's products. This vocal minority is not one I would like to be a part of.

I will continue to spend my money on products I like and get good use out of and visit sites I enjoy.
 
What's wrong with thoroughly wording a response? Do you have a problem reading more than a few paragraphs? Judging from your response to the article, I can only conclude you do. If you want to attack someone else rather than their argument itself, prepare to have the same done to you.

I must have missed the part where you actually explain why you think it's lazily written, rather than lay baseless accusations.

Again, why is this a problem? This is a preview and announcement, not an in-depth experiment. If you would bother looking around, you'd notice that all the other previews about Chill and the new Crimson suite were posted today as well. This signifies some kind of NDA expiration. A single, demanding game is fairly representative of the majority of the supported list. TechPowerUp also formally used only one game (Far Cry Primal), and they only gave you power readings, not frame times as Igor did.

Just stop. The final paragraph says "if a game is well-suited to the feature's strengths, then there are clear savings to be had by enabling Chill. This feature does help supported cards run cooler, and thus quieter. There's not much more we could ask for except a longer whitelist or, better still, an option to enable Chill at our own risk for every game, even the ones not yet officially supported. For the time being, AMD's software team is to be commended for the innovation it continues piling onto the hardware."

Please explain to me how you can possibly interpret that as FAILURE.

No more than you do when saying Igor is calling Chill a failure.

And this proves what? Nothing. First, how do you expect Jack to control who does and does not pick his post as a "Best Answer"?

Second, the BA category is based on the tags assigned to a thread by the forum software at creation. Very few users bother adding to or editing the tags automatically assigned. So using that alone to "prove" a bias is incredibly bad research.

Third, going by the current market saturation, NVidia GPUs far outsell AMD right now. Meaning there are far more question threads about NVidia GPUs than AMD's. Thus, a lot more NVidia BAs will be awarded than AMD, even if you include AMD CPUs since they are far outsold by Intel as well.

Fourth, if you're going to use BAs to assess participation and tech knowledge, Jack's GPU BAs are more than twice as many as all of yours combined across all categories.

In short, using BAs to prove anything other than simple forum participation is pointless.

Then you need to apply this to yourself, in particular your rampant AMD bias you're displaying.

Don't pretend like you did much more than spew out some unsubstantiated claims just because Igor didn't lavish praise on AMD. If you've never worked as a reviewer within an NDA envelope and don't know how those deadlines work, along with all the other articles you're working on, you don't know what you're talking about. Rushed? Perhaps since I don't know the schedule Igor was given. But again, show your evidence that proves this was poorly or lazily written, or that his methodology or conclusions are flawed.

If you're complaining that Tom's only looked at Chill and nothing else, perhaps you ought to look better on the site. Perhaps you should bother reading the other write-up that came out today.

As for a "proper" review, apparently you missed the fact that nearly all of those lacked any sort of in-depth look at how Chill actually performs. Nearly all of them focused on repeating AMD's claims of Chill's 31% savings when playing WoW, but little else. Only Guru3D bothered to put up any chart on WoW itself, and it doesn't explain its methodology. For all we know, they simply repasted numbers given to them by AMD.

TechPowerUp posted some graphs about power draw while playing Far Cry Primal, but again the methodology isn't fully explained, meaning we don't know the detail settings, which scenes or maps were used, or how the game was being played. They do however mention that with Chill on in WoW, you will see stuttering when standing around because the GPU thinks you're inactive even though other players are running around the screen. Guess that means TPU is completely anti-AMD too.

Igor is the only one who bothered to test and record actual game performance and user experience, not just power draw, meaning he went far beyond what those others did. Yet he's the lazy one?


Completely false, as I already listed. The rest of the announcement is in a different article. This was a more in-depth look at one feature. So in actuality, Tom's actually ran TWO articles about AMD when many others only ran one. According to your logic, that means they must have a pro-AMD bias.


THAT'S what you consider negativity? Wow, get some critical analysis skills, please. Yes, there are restrictions. It's only for DX9 and DX11. That's a fact, not AMD bashing. And that "unfortunately" was also applied to NVidia's Ansel. Wanna try again?

To be an impartial reviewer, you report on the good and the bad. To not do so is the definition of bias.

Um, yeah. There's this idea called "stress testing". Basically, if you want to see how well something works, you push it as far as it can go, not stay just in its comfort zone. You don't market a heavy duty truck by saying that it has no problem hauling a couch on a trailer. Why is this a problem?

Complete BS, or at least completely unquantified BS. What resolution? What detail settings? Most of the Chill-enabled games are tested at that Hardware Canucks link you left. Wanna guess how many broke 100 fps?

Um, file that in the "Well, DUH!" folder. What's this have to do with anything?

Please tell me which game is going to drop like that. The only games that are going to reach the 150fps range are things like CS:GO, StarCraft, and Overwatch, maybe WoW. In order for the framerate to drop, you've got to be standing still. Wanna take a guess how often you're standing still in those games?

The bigger savings will be in games like Witcher, Tomb Raider, and other slower-paced adventure games, where you can afford to just look around a bit without getting sniped from the other side of the map. Hence why Witcher 3 was actually a good selection for testing this.

It's called reputation. It's not something that goes away easily. How long has it been since AMD held a power consumption advantage over NVidia? People have complained about AMD's power draw since the R9 200 vs GTX 700 days. Polaris improved a lot, but it's still far behind Maxwell.

First, yes, people do care on desktop. Power draw equals heat, and anyone building mATX or ITX systems will understandably favor NVidia over AMD at this point. Second, don't make the mistake of applying USA's electricity costs to the rest of the world.

Again, the games that can see the best power drop with this tech are the ones that are least likely to utilize it. Also, how many people game on laptops while on battery compared to plugged into the wall?

Yeah, this is what many of us have theorized for a while. What does this have to do with Chill?

You must have read that Canucks article as selectively as this one. Did you miss the portion at the end that said while the 480 may have an advantage over the 1060 in the future, it's still hobbled by an inflated price, and the better buy is whichever card is currently on sale?
 
Wrong. VSync just makes frames render in time with the monitor's refresh rate; it doesn't limit how fast the GPU renders them. With VSync on, if the GPU pushes out frames faster than they can be displayed, the excess frames are simply dropped, so the GPU's work is wasted.
 

ammaross

Distinguished


(I clipped down to the above excerpt to include the relevant parts, and as a TL;DR.)

The basic premise of your argument is wrong. You're assuming there's no benefit from Chill in those "175 minutes." The "spinning and jumping" test is designed to test what players are usually doing in a game! They're in a small section of ground killing creatures, reading text, maps, inventory, etc. The traveling bit does happen, for the 10 seconds it takes to get to the next 10-20 second fight or other point of interest. Realistically, Chill will be in effect for more than 50% of gameplay (yes, a guesstimate, but way more accurate than the "none" you suppose as you isolate Chill only to AFK time). So, a correction to your math:

Average of 189 W over 175 minutes of playing time, plus 5 minutes of AFK or staring at vistas at 183 W (no Chill) = (189 x 175 + 183 x 5) / 180 = 188.83 W average draw

Average of 188 W over 88 minutes of "running," plus 87 minutes of "Chill is working" hack'n'slash time at 148 W, plus 5 minutes of AFK or staring at vistas at 143 W (full Chill) = (188 x 88 + 148 x 87 + 143 x 5) / 180 = 167.42 W

Assuming the usual 100 hour game playing time ....
21.41 watts x 100 hours x 60 minutes x $0.10 avg cost / (1000 watts per kw x 0.87 eff) = $14.76 in power cost savings

Let me repeat that: $14.76 in power cost savings over 100 hours. Easily that PER MONTH for gamers.
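The three-segment average itself checks out; here's a quick self-contained sketch of that weighted average (same illustrative minutes/watts figures as above):

```python
# Weighted average for the 88/87/5-minute split above, as (minutes, watts) pairs.
segments = [(88, 188), (87, 148), (5, 143)]
avg = sum(m * w for m, w in segments) / sum(m for m, _ in segments)
print(f"{avg:.2f} W")  # ~167.42 W, matching the figure quoted above
```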
 
@Jasonelmore: actually, VSync doesn't do that. Most cards fill up their buffers as fast as they can, implementing double or triple buffering, and only the latest filled buffer is used for display. This technology actually throttles the card to match a given frame rate; some games have tried to hit a given frame rate on their own, which incidentally reduced GPU load, but this is the first time a driver actually balances a target frame rate for a given level of activity against the card's power state.
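To make the distinction concrete, here's a toy sketch of an activity-aware frame cap. The 0.5-second threshold and the 30/60 fps targets are made-up numbers, and AMD hasn't published Chill's actual heuristics, so treat this purely as an illustration of the idea, not as how Chill itself works:

```python
import time

FPS_ACTIVE = 60  # hypothetical cap while the player is giving input
FPS_IDLE = 30    # hypothetical cap once input goes quiet

def target_frame_time(seconds_since_input):
    # Pick a frame budget based on how recently the player did anything.
    fps = FPS_ACTIVE if seconds_since_input < 0.5 else FPS_IDLE
    return 1.0 / fps

def render_loop(seconds_since_input, render_frame):
    # seconds_since_input and render_frame are caller-supplied callables.
    while True:
        start = time.perf_counter()
        render_frame()
        budget = target_frame_time(seconds_since_input())
        elapsed = time.perf_counter() - start
        if elapsed < budget:
            time.sleep(budget - elapsed)  # let the GPU idle instead of re-rendering
```

The point is that the pacing stops extra frames from being rendered at all, rather than discarding finished frames the way an unconstrained GPU behind VSync does.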
 

USAFRet

Titan
Moderator


hmmmm......
http://www.rapidtables.com/calc/electric/energy-cost-calculator.htm
[screenshot: energy cost calculator result]


I could be wrong, of course...
 

neblogai

Distinguished


That is a wrong calculation. 100 h x 21 W = 2.1 kWh; 2.1 kWh x $0.10 avg cost = $0.21 per month. The difference in watts has to be much greater to really matter in electricity costs. But greatly reduced energy use will help a lot for laptops: to run longer on battery, or to fit into a lower power limit/thermal design while still providing good performance.
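In code form, the straight kWh arithmetic looks like this (using the same $0.10/kWh rate and 0.87 PSU efficiency assumed earlier in the thread):

```python
# Electricity cost of a wattage difference: watts x hours -> kWh.
# No stray x60 factor; watts x hours already yields watt-hours.
def savings_usd(delta_watts, hours, rate_per_kwh=0.10, psu_eff=0.87):
    kwh = delta_watts * hours / 1000.0
    return kwh * rate_per_kwh / psu_eff

print(f"${savings_usd(21, 100):.2f}")  # ~$0.24 over 100 hours, not $14.76
```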
 

g-unit1111

Titan
Moderator


Oh you have no idea. :lol:
 

TJ Hooker

Titan
Ambassador

Yes, (AMD's) fanboys are annoying, but acting like only the 'other side' has them is in itself kind of fanboy-ish...
Also, saying you won't buy AMD because some of AMD's customers are obnoxious doesn't make any sense.
 
That must be it. All the facts and supporting arguments I've listed are nothing more than lies and blind devotion. You caught me.

No detail? Let's see: he describes what Chill is, the idea behind it, how it works (at least as much as AMD is letting be known at this point), lists the restrictions under which it can run (specific game titles and rendering modes), and then tests how it actually performs in real-world examples, along with the recorded data and observations. Unless you want him to post AMD's source code, and/or test every single game on the list with every single supported GPU, you'll have to specify what specific detail it is you're looking for that isn't reported here.

Again, oh sagacious one, can you present any specifics about the flaws in the methodology here? Igor took a supported game and GPU, ran them under three specific scenarios, recorded the results, presented them, then commented on those observations. He even went a step farther and showed not just power draw, but frame time variance. That's pretty much the definition of the scientific method ( pose a hypothesis, run a test, record results, analyze if results support hypothesis ). The only thing left to do will be to retest the findings to see if they are consistent and then explore other variables, which I'm fairly certain is underway right now.

Again, why do you say that? What specific markers show this was rushed or poor quality?

I'll let you in on a secret: the moderation staff's prescribed duty is to monitor the forums to enforce the rules and encourage good discussion. Forum regulars usually try to uphold those values as well. You and others firing accusations out of your hindquarters, not to mention making demonstrably false statements, is not conducive to good discussion. So yes, our attention is drawn here.

What negativity? That Chill only works with DX9 and DX11? That AMD GPUs have a ( deserved ) reputation for being power hungry? That Chill's benefits are realized only in certain scenarios? That Igor said AMD's engineers have done good work? Those are all statements of fact. I realize many people these days think it's insulting, shaming, and bullying to state facts if those facts don't conform to a person's utopic world view, but the snowflakes need to get over themselves if they want to survive in the real world.

I repeat myself, but I'll ask again: give specifics in how this article could be improved.

Please specify what you think is missing, both here and the other articles you referenced.

Yeah, and why is this a problem? Let's break this down.

"Benchmarking AMD Radeon Chill" - Ok, so they're saying they're going to benchmark Chill, and Igor did indeed test it. No problems there that I see.

"Pumping The Brakes On Wasted Power" - Well, brakes are used to slow down. So AMD is trying to slow down on wasting power? Well, a GPU uses more power the faster and harder it's working. And when a GPU is still going full tilt to render a scene that isn't changing much and doesn't benefit from a higher refresh rate, and/or rendered frames are being discarded, then yes, that could be described as wasting power. So yeah, this part looks good too.

"AMD's new drivers introduce a feature called Radeon Chill" - Yep, this is indeed a new feature of the drivers that just got released. Still not seeing the problems with the title.

"which promises to reduce power consumption situationally." - And this is what Chill claims to do. Igor even gives a short run down on how it does it. So we're still ship-shape.

"We test its impact" - Igor tests the effect on framerates, power draw, and frame time variance in multiple situations. So yeah, its impact was tested. He even used the proper possessive form of "its" so it's not a grammar error.

So it looks like you're wrong again. There's nothing misleading about the title.

If it's so obvious, then it should be incredibly easy for you to identify where it falls short. Instead you've been asked multiple times to cite evidence and proof to support your claims, and you've failed to do so. Instead you double-down on your previous unsupported assertions. So, even though you can't prove your point, the rest of us are supposedly the blind followers. Right . . .
 

falchard

Distinguished
I think there is another underlying benefit that is not quite touched on: operating temperature. The hotter your card runs, the louder the fan will be. Since AMD cards run hotter, they run louder. Even if it's just a little bit, every watt not used helps with fan noise.
 