iknowhowtofixit :
That is quite the long winded response.
What's wrong with thoroughly wording a response? Do you have a problem reading more than a few paragraphs? Judging from your response to the article, I can only conclude you do. If you want to attack someone rather than their argument, prepare to have the same done to you.
iknowhowtofixit :
I must of missed where you addressed the lazily written article
I must have missed the part where you actually explain why you think it's lazily written, rather than just leveling baseless accusations.
iknowhowtofixit :
and the fact that only one game was tested
Again, why is this a problem? This is a preview and announcement, not an in-depth experiment. If you bothered to look around, you'd notice that all the other previews of Chill and the new Crimson suite were also posted today, which signals some kind of NDA expiration. A single, demanding game is fairly representative of the majority of the supported list. TechPowerUp also formally tested only one game (Far Cry Primal), and they only gave you power readings, not frame times as Igor did.
iknowhowtofixit :
before proclaiming FAILURE.
Just stop. The final paragraph says "
if a game is well-suited to the feature's strengths, then there are clear savings to be had by enabling Chill. This feature does help supported cards run cooler, and thus quieter. There's not much more we could ask for except a longer whitelist or, better still, an option to enable Chill at our own risk for every game, even the ones not yet officially supported. For the time being, AMD's software team is to be commended for the innovation it continues piling onto the hardware."
Please explain to me how you can possibly interpret that as FAILURE.
No more than you do when saying Igor is calling Chill a failure.
iknowhowtofixit :
A quick look at the awards on your profile tells us the whole story. You have more best answers for Nvidia than you do for Intel and AMD combined.
And this proves what? Nothing. First, how do you expect Jack to control who does and does not pick his post as a "Best Answer"?
Second, the BA category is based on the tags assigned to a thread by the forum software at creation. Very few users bother adding to or editing the automatically assigned tags. So using that alone to "prove" a bias is incredibly bad research.
Third, going by the current market saturation, NVidia GPUs far outsell AMD right now. Meaning there are far more question threads about NVidia GPUs than AMD's. Thus, a lot more NVidia BAs will be awarded than AMD, even if you include AMD CPUs since they are far outsold by Intel as well.
Fourth, if you're going to use BAs to assess participation and tech knowledge, Jack's GPU BAs alone number more than twice all of yours combined across all categories.
In short, using BAs to prove anything other than simple forum participation is pointless.
iknowhowtofixit :
You aren't impartial. This is normal. Most people aren't impartial (especially those who are active on tech forums).
Then you need to apply this to yourself, in particular the rampant AMD bias you're displaying.
iknowhowtofixit :
This article was rushed and it shows. Don't pretend like it is anything more than that.
Don't pretend like you did much more than spew out some unsubstantiated claims just because Igor didn't lavish praise on AMD. If you've never worked as a reviewer under an NDA envelope, you don't know how those deadlines work or how they compete with all the other articles a reviewer is juggling, so you don't know what you're talking about. Rushed? Perhaps, since I don't know the schedule Igor was given. But again, show the evidence that proves this was poorly or lazily written, or that his methodology or conclusions are flawed.
rwinches :
Here are links that provide proper review and info about the full largest SW release for Radeon.
If you're complaining that Tom's only looked at Chill and nothing else, perhaps you ought to look around the site a bit more. Perhaps you should bother reading the other write-up that came out today.
As for a "proper" review, apparently you missed the fact that nearly all of those links lacked any sort of in-depth look at how Chill actually performs. Nearly all of them focused on repeating AMD's claim of 31% savings when playing WoW with Chill, but little else. Only Guru3D bothered to put up any chart on WoW itself, and it doesn't explain its methodology. For all we know, they simply reposted numbers AMD gave them.
TechPowerUp posted some graphs of power draw while playing Far Cry Primal, but again the methodology isn't fully explained, meaning we don't know the detail settings, which scenes or maps were used, or how the game was being played. They do, however, mention that with Chill on in WoW, you will see stuttering when standing around, because the GPU thinks you're inactive even though other players are running around on screen. Guess that means TPU is completely anti-AMD too.
Igor is the only one who bothered to test and record actual game performance and user experience, not just power draw, meaning he went far beyond what those others did. Yet he's the lazy one?
neblogai :
Ok, I can answer to that:
1) Maybe you do not know- AMD released a huge feature and driver package today. Toms has not even reported it- only made a small test on one of the features for this article. Here is how much was really released:
Completely false, as I already pointed out. The rest of the announcement is covered in a different article; this was a more in-depth look at one feature. So Tom's actually ran TWO articles about AMD when many others ran only one. By your logic, that must mean they have a pro-AMD bias.
neblogai :
2) Article is full of negativity, like : "Ah, but there are restrictions", "Sounds cool, right? Unfortunately"
THAT'S what you consider negativity? Wow, get some critical analysis skills, please. Yes, there are restrictions. It's only for DX9 and DX11. That's a fact, not AMD bashing. And that "unfortunately" was also applied to NVidia's Ansel. Wanna try again?
To be an impartial reviewer, you report on both the good and the bad. Not doing so is the definition of bias.
neblogai :
3) Game chosen for testing is Witcher 3- one of the most demanding in the list of supported games.
Um, yeah. There's this idea called "stress testing". Basically, if you want to see how well something works, you push it as far as it can go rather than keeping it in its comfort zone. You don't market a heavy-duty truck by saying it has no problem hauling a couch on a trailer. Why is this a problem?
neblogai :
Most of them can run 100-200fps on RX480
Complete BS, or at least completely unquantified BS. At what resolution? What detail settings? Most of the Chill-enabled games are tested at that Hardware Canucks link you left. Wanna guess how many broke 100fps?
neblogai :
and will run even faster on upcoming more powerful GPUs.
Um, file that in the "Well, DUH!" folder. What's this have to do with anything?
neblogai :
So certainly energy savings on witcher3 will not be as great as on a game which drops 150fps to 40fps.
Please tell me which game is going to drop like that. The only games that will reach the 150fps range are things like CS:GO, StarCraft, and Overwatch, maybe WoW. And for the framerate to drop, you've got to be standing still. Wanna take a guess how often you're standing still in those games?
The bigger savings will come in games like Witcher and Tomb Raider, and other slower-paced adventure games where you can afford to just look around a bit without getting sniped from the other side of the map. Hence why Witcher 3 was actually a good selection for this test.
neblogai :
4) "What do you do as a manufacturer when you are constantly being criticized for higher power consumption" - is there really anyone criticizing AMD at this point for having a GPU that runs 30-40W more than competition?
It's called reputation, and it's not something that goes away easily. How long has it been since AMD held a power-consumption advantage over NVidia? People have complained about AMD's power draw since the R9 200 series went up against the GTX 700 series. Polaris improved a lot, but it's still far behind Maxwell.
neblogai :
On desktop- no one cares, as $10 higher electricity bill can be more than compensated by $100 dollar cheaper adaptive sync.
First, yes, people do care on the desktop. Power draw equals heat, and anyone building mATX or ITX systems will understandably favor NVidia over AMD at this point. Second, don't make the mistake of applying the USA's electricity costs to the rest of the world.
Again, the games that can see the biggest power drop with this tech are the ones least likely to utilize it. Also, how many people game on laptops on battery compared to plugged into the wall?
neblogai :
Also, with latest driver updates, reviewers are already finding that even RX480 reference has caught up with GTX1060 in DX11, and leads more in DX12:
Yeah, this is what many of us have theorized for a while. What does this have to do with Chill?
neblogai :
With all the new AMD features that are available today, same or better performance, and much cheaper adaptive sync- AMD is simply a better buy, and nVidia is behind. The thing is- here at Toms nVidia is somewhat more energy efficient- so light-years better
You must have read that Canucks article as selectively as this one. Did you miss the portion at the end saying that while the 480 may have an advantage over the 1060 in the future, it's still hobbled by an inflated price, and the better buy is whichever card is currently on sale?