AMD Radeon RX 480 8GB Review

^ https://www.youtube.com/watch?v=rhjC_8ai7QA
In that video you can see a practical example of where power draw over the PCIe slot becomes a problem.
RX 480 is a step forward for AMD. It's just too far behind Nvidia.
The 970 has the same performance and currently the same price, with a lower TDP and better overclocking, even though it's a previous-gen card built on 28nm versus the 480's 14nm :p
Leaked specs for the 1060 suggest that it's a better card.

It's not about negativity; it's making fun of an over-hyped product that does not meet expectations (and IMHO does not meet today's standards in terms of performance and power efficiency).
I admit that it's almost like making fun of a cripple, but at least I'm mocking the fanboys, as I feel genuinely sorry for AMD and their inability to be competitive with both Intel and Nvidia.
 
Guest
Behind Nvidia how? It is faster than the 960, which is in its tier. The 970 is technically not in the same tier, and is outdated. Why are we suddenly mocking a card for only competing with a card above it from the previous gen? I'm not a fanboy at all; I buy what makes sense for the money at the time. I run an AMD GPU because I like their GPUs better. I recommend both to others depending on what they want. Fanboys are the ones mocking a new GPU for shredding last-gen same-tier GPUs.

This thing wasn't overhyped; it performs much like everyone thought it would. I never saw this kind of negativity toward the GTX 960, which failed hard at jumping past the 760, yet the reviews of that card were great. The GTX 970 was a $300+ card on release, not the same tier, and is only very cheap right now because Nvidia does not have a new card that competes in that price range. This is not a fail, but a success on AMD's part. Making the opposition lower prices is always a success.

Anyways, with that I'm out. This forum has really begun to suck.
 


A GTX 1070 with 60% more performance at the same power draw is a success ;)
The 970 is priced as it is to clear the shelves for the 1060 and 1050.
 

PedroCE

Commendable
Hi guys, back when the 750 Ti was released you reviewed it; it doesn't use any power connector and still draws up to 141W from the PCIe slot alone. So, do you know of any 750 Ti user who broke their motherboard? And why didn't you point that out as a bad thing back then?
 


The 750 Ti is a 60W part ;)
 

PedroCE

Commendable

In Tom's Hardware's test it drew up to 141W and stayed above 75W almost the whole time; they said that was because the 750 Ti is 'too efficient'. Actually, the lowest power consumption they recorded in that test was 64W.
http://i.imgur.com/hpsWAZx.jpg
 

Math Geek

Titan
Ambassador
you're gonna have to link the full review and not a random graph from it.

you are obviously not reading it right, more than likely. the issue with the 480 is that it MAINTAINS the high power draw from the pcie slot. that means the AVERAGE draw is over 85w, way over the pcie spec. i'm willing to bet the 141w the graph shows is random power spikes, which is not the same thing and is acceptable. every card does that. the thick green line on the graph you linked, sitting around 65-70w, is the AVERAGE power draw for that card, which is well within pcie specs. i bet if you actually READ the review it will say pretty much what i just said. or link it and i'll read it for you and point out where it says that. i went to college and everything so i can read lots of words now. HOOKED ON PHONICS WORKED FOR ME!! :D

you're looking for excuses and bias when it is not there.
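The distinction drawn above between momentary spikes and a sustained average can be sketched in a few lines of Python. This is purely illustrative: the sample values and the averaging approach are made up, not Tom's Hardware's actual measurement method or real card data.

```python
# Hypothetical illustration of average draw vs. momentary spikes.
# Sample values below are invented; they are not real RX 480 or 750 Ti data.

PCIE_SLOT_LIMIT_W = 75.0  # nominal PCIe spec for slot power delivery

def average_power(samples_w):
    """Mean power over the whole capture window."""
    return sum(samples_w) / len(samples_w)

def sustained_overdraw(samples_w, limit_w=PCIE_SLOT_LIMIT_W):
    """True if the *average* exceeds the limit (the real concern),
    not merely individual short-lived spikes."""
    return average_power(samples_w) > limit_w

# A card that spikes to 141 W occasionally but averages ~74 W stays in spec:
spiky_but_ok = [64, 66, 141, 63, 65, 67, 62, 66]
# A card sitting above the limit nearly all the time does not:
sustained_high = [84, 86, 88, 85, 87, 86, 84, 85]

print(average_power(spiky_but_ok), sustained_overdraw(spiky_but_ok))
print(average_power(sustained_high), sustained_overdraw(sustained_high))
```

The point of the sketch: a single 141W sample barely moves the average, while a trace that hugs 85W the whole time is over the limit even though it never spikes nearly as high.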
 

PedroCE

Commendable


There you have it: they didn't give a dang about it drawing over 75W http://www.tomshardware.com/reviews/geforce-gtx-750-ti-review,3750-20.html
 

PedroCE

Commendable
Those aren't random peaks when almost the whole graph is above the 75W line. That 64W average is only because it runs at 12W while idle, and even in-game it may idle at times.
 

Math Geek

Titan
Ambassador
as i said, those spikes are perfectly normal and EVERY card does it. but clearly you just want there to be some conspiracy involved so i'm gonna move on to something actually important. do yourself a favor and look up more of the reviews for cards from both companies and you will see these spikes are very normal for EVERY CARD ON THE MARKET. or don't and just straighten up your tin foil hat and pout that everyone is being so mean to AMD.

either way, i've wasted enough time on your tantrum. best wishes to you.
 


85W is not way over the spec. It's 10W over the spec. Generally it shouldn't cause issues.

The problem is with overclocking, because then it will go way over spec.
 

Math Geek

Titan
Ambassador
ok my bad. seems there is no problem with the card at stock and the whole world of tech sites showing it is an issue is wrong.

my apologies for believing the experts. i'll go ahead and start believing the fanboys instead and just stop reading anything written by people who know. thanks for clearing that up.

seems we are done with anything intelligent in this thread as it's just trolls and random nonsense now so i'm out. enjoy the whatever this has turned into. good day.
 

TJ Hooker

Titan
Ambassador

A quote from that article:

That just doesn't make sense to me. The entire point of viewing a graph of power over time is to observe instantaneous power. But they then proceed to act as though instantaneous power spikes aren't important, and arbitrarily filter some out.
Also, I don't know what kind of switching VRM topology graphics cards use, but I'm not sure why it would cause large spikes in instantaneous power when a phase turns on.
 


The problem is that the GTX 970 is a nearly 2 year old card at this point, built on the 28nm fabrication process that's been used in graphics cards for the last 4 years. A GPU built on a 14nm process, which is two full process generations newer, should obviously be more efficient. The gaming efficiency is fairly good compared to AMD's last generation of cards at this performance level, but those weren't exactly all that efficient either, and it doesn't seem like they have done anything to improve efficiency outside of the process shrink. The idle power consumption appears to have actually gotten worse.

The RX 480 should only be competing against the GTX 970 until the GTX 1060 comes out, which sources indicate may be as soon as this month. Leaks also seem to indicate that the 1060 will be as fast or faster than the RX 480, with a notably lower TDP. Of course, that card will probably cost more as well, and the one thing AMD's card has going for it is its price.

Those complaining about power use are mostly just disappointed that AMD still hasn't broken their trend of being well behind the competition in efficiency in recent years. This goes for both their GPUs and CPUs. A decade ago, AMD's Athlon 64s were ahead of Intel's processors not only in price for performance, but also in efficiency and features. Now, they mostly just offer lower-cost, less efficient alternatives in the low to mid range. Likewise, Radeon graphics cards often had better efficiency than Nvidia's, but they seem to be headed in the same direction as the rest of AMD.
 

I think you're confused about total power draw versus power draw through the PCIe slot. Total power draw over 75 watts is no problem. Sustained power draw from the PCIe slot that exceeds the 75-watt specification has the potential to damage the motherboard, as this guy found out the hard way.
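A minimal sketch of that distinction, assuming the nominal 75 W limits for the x16 slot and the 6-pin connector (illustrative only, not a real compliance test):

```python
# Illustrative only: each power source has its own nominal limit,
# so what matters is the split between sources, not just the total.
SLOT_LIMIT_W = 75.0      # PCIe x16 slot, nominal spec limit
SIX_PIN_LIMIT_W = 75.0   # 6-pin auxiliary connector, nominal

def within_spec(slot_w, six_pin_w):
    """True only if each source individually stays under its own limit."""
    return slot_w <= SLOT_LIMIT_W and six_pin_w <= SIX_PIN_LIMIT_W

# Same 150 W total, very different outcomes:
print(within_spec(75, 75))  # balanced split: in spec
print(within_spec(86, 64))  # slot overloaded: out of spec
```

That is why a card can be fine at 150 W total yet still be a problem: the total never exceeds the combined budget, but the slot's share does.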


https://bitcointalk.org/index.php?topic=1433925.msg15438155#msg15438155
 


I guess the guy is lucky that the system didn't catch fire.

 

InvalidError

Titan
Moderator

No luck needed there: the use of either non-flammable materials (such as solid metal) or materials laced with flame retardants (PCBs, connectors, wire insulation, etc.) is required in equipment that meets safety requirements.
 


The test setup uses an i7-6700K and 16GB of DDR4... what CPU do you have?
 

truegenius

Distinguished
BANNED
hey Chris Angelini, do you still have the RX 480 from the review, or did you buy one? if so, can you conduct a power consumption test with the fan manually set to 100%?
i think it will change the power consumption figures dramatically (almost linearly with temps)
 

TJ Hooker

Titan
Ambassador

What makes you think that? Do you think power consumption will drop significantly with lower temps?
 

bit_user

Polypheme
Ambassador
One of the few things I know about electronics is that the electrical resistance of most materials increases with temperature. So, you'd expect lower temps to result in higher power consumption. Is that what you're thinking, TJ?
 

TJ Hooker

Titan
Ambassador
I'm guessing you meant to say lower temps result in lower power consumption? Yeah, that's what I was thinking.

I just didn't imagine it would be that significant. But I'm no expert, maybe @truegenius is right.
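For what it's worth, the mechanism hinted at in this exchange is leakage: leakage power in CMOS chips rises steeply with temperature, while dynamic (switching) power is mostly temperature-independent. A toy model with made-up constants (not measured RX 480 values) shows the shape of the effect:

```python
# Toy model: why a cooler chip can draw measurably less power.
# All constants below are invented for illustration, not measured values.

def leakage_power_w(temp_c, p_leak_25c=10.0, doubling_interval_c=30.0):
    """Leakage at temp_c, assuming it doubles every `doubling_interval_c`
    degrees above a 25 degC baseline (a common rule-of-thumb shape)."""
    return p_leak_25c * 2 ** ((temp_c - 25.0) / doubling_interval_c)

def total_power_w(temp_c, p_dynamic=120.0):
    """Dynamic switching power is treated as temperature-independent."""
    return p_dynamic + leakage_power_w(temp_c)

# Cooling from ~85 degC to ~55 degC halves leakage in this model:
print(total_power_w(85))  # 120 W dynamic + 40 W leakage
print(total_power_w(55))  # 120 W dynamic + 20 W leakage
```

Over a realistic temperature range the change is real but bounded, which fits TJ Hooker's intuition that the effect exists without necessarily being dramatic.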
 