Best Graphics Cards For The Money: October 2014 (Archive)

I have no desire to get in a pissing match, but if you're asking for a fisking, I'm happy to oblige. If you feel so inclined to actually discuss this, by all means do so, and I look forward to further discourse. Otherwise, the below will be the last I write to you on this.

And which "point" would that be? That you can't imagine why people would be concerned about power consumption on a $200 GPU? I fully agreed that your imagination and insight are weak if you can't think of any case where this would matter. At least two people already "refuted" you with one specific case, so I didn't feel the need to add to it. However you only seem to have moved from "there's no reason for this," to "there's only one reason for this," suggesting you're still not reevaluating your stance, even in the face of contrary evidence.

You then attempted to make some point about a "consumer" card, but since you won't (or can't) define what you mean by "consumer," any point you wish to make is ethereal at best.

What kind of weasel excuse is this? You're wrong, but you're not wrong? Treat it as a pyrrhic victory if it makes you feel better. Yes, I can back it up. I spent over $200 on my last GPU. When the Radeon 300 series releases, I'll consider spending over $200 again. I still care about heat, power draw, and acoustics, and no I don't have an ITX machine. That's one example right there, which does refute your claim. I can provide more references for this should you require. If you want to go by your own moronic rules of, "If you can't prove me wrong, then I'm right," keep in mind the same will apply to your opposition if you can't prove they're wrong. In such a case you can neither say you're right nor wrong since neither can be proven. Why not graciously say, "Oh, good point, I hadn't thought of that," instead of "Well I'm not totally wrong!"? Doing the former smooths the whole thing over and people forget about it almost immediately. Defiantly reaffirming your flawed point just draws more attention to the error.

Meaning what exactly? I saw the replies. For all you know there were dozens of people that were going to make the same or similar argument, but then saw someone else had already said it and moved on. Are you saying your points and principles are only strong as long as few people argue against them? Being right or wrong has nothing to do with the amount of people who agree or disagree with you.

Honestly, no, I didn't and still don't understand what you mean, because you either can't or won't explain yourself. No two people share the exact same outlook or classification on something, which is something you seem to fail to grasp. You figure everyone shares your world view, which they don't. My good buddy Onus and I disagree on what we consider the minimum GPU performance for our computers. He's fine with something at the GTX 750 level, while I prefer something more robust, like an R9 270X. Having that understanding gives us a common barometer when we discuss performance.

You threw out the word "consumer." Now, contrary to what you may believe, a consumer is someone who buys and consumes goods and services. That's it. So in actuality, I don't care if you're using integrated Intel graphics or triple-SLI 980s, you're still using consumer graphics because you paid for a commodity. The dichotomy most often drawn against "consumer" is "professional": someone who gets paid for a service, or goods used primarily by a professional to do the work they get paid for. That is why I threw that out there. But I was pretty sure you weren't talking about actual professional-level cards, which again made me wonder what exactly you meant by it. Hence I offered some other terms to gauge exactly what you meant. And as of now, I'm still wondering.

Until you actually explain what you mean by "consumer card" I have no idea what you're talking about. The best I can guess is you're talking about entry-level graphics. In that case I heartily agree. However I hope you realize the vast shades of grey in ranking performance levels of the available GPUs out there. Where do you draw the line between "consumer" and whatever term you use to describe "non-consumer and non-professional"?

If they're buying a commodity or service, yes they are. I myself am a consumer, and I do pay that much for GPUs. Or are you saying this is the dividing line between consumer and non-consumer to you? So a card that is initially released at $210 is not consumer, but if it goes on sale or the MSRP drops it suddenly becomes consumer, even though the actual performance and specs didn't change?

Are you trying to say that consumer level refers strictly to people who buy inexpensive pre-built computers? If so, then that would imply that consumers simply don't buy discrete GPUs at all since the vast majority of the people buying Dells and HPs don't upgrade the GPU.

I'm well aware they're two very different things. As I've already said, I brought up the term "professional" since that is the contrast most often drawn with "consumer," and I was attempting to understand exactly what you were talking about.

Now you're changing your scale and moving the goal posts. I purposefully left my car scale low because we were talking about single GPUs. Using your scale would equate a 750 Ti to a Mustang GT500, and I'm relatively certain no one brags about a 750. Gamers running two or three GPUs is what I would equate to Porsche, Ferrari, and Lambo because those are the exotic GPU setups that relatively few people actually run. But that's semantics. The point is, pick a consistent and usable scale. You're mentioning everyone from those buying $300 Best Buy specials to Bugatti Veyron owners.


It's pretty simple here. What the blazes do you mean by "consumer GPU"? For that matter, what term do you use to describe "non-consumer and non-professional"? What are your criteria? You can rant and rave at me all day, but the only thing to blame here is your poor vocabulary and vernacular and refusal to answer some simple, genuine questions.
 
I have to agree with Red Jaron, although we're starting to drift from a discussion on bang/buck, which is the subject of the source article.
I am another that considers ( =/= "is ruled by") power use. I like smaller systems, in which heat can be an issue, and I am philosophically opposed to "wasting" (as opposed to "using"). If two cards each meet my performance requirements, I'm next going to listen for noise. The quieter card probably produces less heat, and likely uses less power as well. If neither is "too loud," I'm going to pick the one that uses the least amount of power. The difference may only be pennies per day, but why waste even that?
 
RedJaron,

I agree with a number of your points, but you may be beating animalosity a little too hard on the "professional" vs "consumer" aspect. For me, "professional" graphics = those cards which the mfg calls "workstation" class and whose main purpose is for GPU-assisted professional applications like SolidWorks, AutoCAD, Maya, etc. For AMD & Nvidia, it's their FirePro/Quadro cards respectively. I don't classify those as "consumer" boards because the primary usage of such is different from one whose primary use is gaming. But I don't worry too much if another person says it should be called something different, either. :)

I think if animalosity had said "budget-gamer" instead of "consumer" graphics, his point would have been made in regards to the original intent of his comment.

... probably could have left out any car analogy ... I like Coke instead of Pepsi ... I thought it was white and gold ... yada yada yada ...
 


I second the thoughts on fan noise! I wish more graphics card mfgs would concentrate more on how to quiet down the fans, which are undoubtedly the loudest part of my whole system.
 
I'd be pretty upset if I'd bought two GTX 970s to game at 4K with decent settings, after all the guff about the 3.5 GB + 512 MB (super slow) memory. As it is, I game at 1080p 120 Hz and my single GTX 970 does just fine. Still, I feel a little let down and reckon this will affect the second-hand resale value. Well, darn! Bring on DirectX 12; it might dig Nvidia out of that hole they dug.
 
@RedJaron

You said you weren't going to get into a pissing match, which is exactly what you have done. I've shared my opinions, you've shared yours. Does that make either of us right? Rhetorical question. You know what they say about opinions....

That said, in the "consumer" sense of economics, sure, a person who purchases a specific product is called a consumer. Perhaps I could have worded this a bit differently. I would not call a GTX 960, or in general any $200 PC component, "general consumer" hardware bought by a "general consumer." We're well past that, and I believe you are taking everything I have written extremely out of context. How about we use "entry level" in place of "consumer level"? Would that make you happier? Perhaps "budget oriented," though by your logic, as opposed to mine, you would simply be obstinate and tell me something along the lines of, "Well, person A has budget X, so you can't call it that either," implying that a budget is relative and simply does not apply. An individual spending more than $200 on a specific component is well past the point of "budget level," because they have based their decision on specific needs well beyond basic desktop applications.

My car analogy still works. I'm sorry you don't like it. We can differ on which auto manufacturer we believe more closely resembles which specific GPU, but you don't have to squash it simply because you want to patronize me more and somehow boost your own ego.

You can explain to me all you want how you believe power consumption matters once we're at this level of discrete graphics. Those who have bought similar (or higher-end) GPUs have obviously put thought into what kind of power supply they might need to run all their components. Discrete graphics cards these days specifically require dedicated 6- or 8-pin 12 V power simply because the bus isn't enough. Are you really trying to argue that "consumers" are that worried about their power bill at the end of the month because they chose a GTX 960 (or higher)? I'm throwing the BS flag. Heat I can see being the ONLY source of concern in smaller form-factor builds, but you know as well as I do that Nvidia has had pretty phenomenal reference cooling on just about all their products since the Kepler (GK104) architecture. Most discrete cards these days are designed to exhaust the heat out of the rear of the heatsink, since nearly all of them are double-slot designs with a larger heatsink to accomplish such cooling. I'm still not buying the power concern, and frankly, there's nothing more you're going to respond with that will convince me otherwise. I simply do not care if you know a guy who knows a guy that instantly proves me wrong. You're going to have to do better than that....
 
I think both Nvidia and AMD are wondering long and hard right now. The process node shrink from 28 nm to 14 nm is the last one to give great results; the next one is 14 nm to 11 nm, and then THE END. What will they do to make us buy new GPUs?
 
Higher power usage means more heat which means higher operating temperatures of the whole rig and/or more fan noise for cooling.
Someone who needs a particular performance level must choose a card that provides that performance, dealing with power by selecting an appropriate PSU and appropriate cooling. At a given performance level (having or lacking a desired feature like PhysX counts as "performance"), the only rational basis for choosing a card that needs more power over one that uses less is a significantly lower price, including the cost of additional cooling the hotter card may require. This will be true whether you are talking about "consumer," "prosumer", "professional," or "other buzzword" cards. Consumers tend not to look at TCO (unfortunately, imho), but they will care about heat and noise.
 
Does the GTX 580 really belong in the same tier as the GTX 670 and the GTX 960? I tried looking for a direct comparison with relatively recent drivers, and cross-referencing these two links was the closest to an answer that I found:
GTX 580 compared to GTX 760: 87% of the 760
http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_760/27.html
GTX 760 compared to GTX 960: 83/94 ≈ 88% of the GTX 960
http://www.techpowerup.com/reviews/ASUS/GTX_960_STRIX_OC/29.html

That puts it at only about 77% of a GTX 960, i.e. the GTX 960 is roughly 30% faster. That's a pretty big difference for two cards in the same tier, especially since the 580 seems to be a bit slower than a card in the tier below it. One could argue that the comparison between those links and Tom's testing methods is not valid, but Tom's own 2015 1080p aggregate charts also show a huge performance difference, granted I have no idea how old the GTX 580 tests in them are:
http://www.tomshardware.com/charts/2015-vga-charts/20-Index-1080p,3683.html
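
If it helps, here is the chained-ratio arithmetic spelled out in a few lines of Python. This is only a rough sketch using the TechPowerUp summary percentages quoted above as inputs; it is not a fresh benchmark, and driver age or game selection could easily shift the result.

# Chained-ratio estimate from the TechPowerUp summary figures quoted above.
# Plain arithmetic, not a benchmark.
gtx580_vs_760 = 0.87        # GTX 580 at ~87% of a GTX 760 (GTX 760 review)
gtx760_vs_960 = 83 / 94     # GTX 760 at ~88% of a GTX 960 (GTX 960 STRIX review)

gtx580_vs_960 = gtx580_vs_760 * gtx760_vs_960
print(f"GTX 580 is roughly {gtx580_vs_960:.0%} of a GTX 960")    # ~77%
print(f"GTX 960 is roughly {1 / gtx580_vs_960 - 1:.0%} faster")  # ~30%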
 
Oh, I so agree with this. It often becomes a 3-point sliding scale, almost zero sum, between 3D performance, power consumption/heat, and noise. Everybody and every scenario will have their own specific priority order and requirements.



A little harsh? Maybe I was. I was asking sincere questions because what he said made no sense to me and got lambasted for my troubles. I'm not the type to back down.

That's the way I use them too, and I don't mind if someone else uses different terminology for themselves. However I need to know what that terminology is if they want to properly discuss it. Otherwise the whole thing is pointless.

Possibly, but I didn't want to assume anything on that, which was why I asked.



Now was that so hard to answer? And my happiness is not based on proving myself right or others wrong. However this does help me understand what you mean.

Let's leave the ad hominems out of this.

You're approaching this from the level of a tech-savvy person building a brand new system. In such a case, you're right. However, for people who are trying to upgrade a Best Buy special or an older computer, power consideration becomes critical. Does their existing PSU have the wattage for the card? Does it even have the power cables to connect it? Did they get stuck with an el cheapo PSU from before that can't handle what it was advertised for? Say you've got little Jimmy, who's been given permission to upgrade his parents' old machine that he can't play new games on anymore. In situations like this the 960 becomes a good value because you can get great 1080p performance without needing a particularly strong PSU to do it.

I'm going to guess you live in the USA. Travel to some other countries where the power bill is considerably higher. Those "pennies a day" mentioned by Onus can quickly add up to a notable amount, especially if you have the machine under load a good portion of the day. Will that alone be a dissuading factor in the purchase? Maybe, maybe not. When on a tight monthly budget, you try to save where you can, so it would at least be a consideration.

Yes, NVidia's reference coolers have been quite good the last two generations. Now, have you ever had to use an ITX or SFX PSU? Their wattage and PCIe cable count are much lower than full ATX models.

Designed to, yes. Compare how many cards are available right now with a reference cooler and not an aftermarket solution, particularly on the Radeon side of things.

I'm not trying to convince you of anything other than there are other perspectives than yours that are just as valid. You've had first-hand testimony from me and at least one other person saying power consumption matters to us. And you say you don't care what we say. By dismissing it, you're effectively calling us liars. Thank you, and have a nice day.
 
@RedJaron

I will concede on some of my ad hominems. I will concede on my accusations as well. Where I have to push back regarding other countries and their (potential) concern for their utility usage each month is with your own argument. Although you and Onus bring up a valid point about pennies adding up each month, I have to consider this scenario. This perhaps may be a bit narrow thinking, but if (insert pronoun here) purchases a GTX 960 in, say, the EU, and is concerned about their utility usage each month, then perhaps discrete graphics such as the 960 may not be suitable. For one, the cost of the hardware I can only assume is certainly more expensive; since I know the American dollar to euro ratio averages about 1.6:1, a consumer in Europe is effectively purchasing the same GTX 960 for roughly $320, yes? My point is that if a consumer is worried about their monthly power bill, then perhaps purchasing frivolous things such as a discrete GPU might not be the best investment.

I realize I kind of have tunnel vision on this, but perhaps I'm not too far-fetched here to believe that a GPU's required power usage is (to me, I admit) the least of a consumer's worries when making their purchase decisions.
 
To further clarify the last paragraph in the above reply: I meant that power usage may be the last of a consumer's worries in their purchase decision when they are in the market for discrete graphics such as the GTX 960. Chances are, if someone is in the market for GPUs such as these (or even more enthusiast-level, for that matter), they tend to know what they are getting into. Although this may not hold in "Little Jimmy's" case, chances are that said parents or other consumers have already done their homework to understand what is required to utilize this kind of hardware. I shall hope so, anyway...
 
So according to this article, the "Discrete: GTX 690," which sits in 2nd position, is better than, say, the GTX 780? So the model numbers don't really mean anything?
 
Animalosity, I definitely agree with your point that if a few cents a day is a budget-buster, maybe you have more important uses for your money than a high-end graphics card.
Although I am opposed to waste on general principles, my concern is heat. I had a motherboard fail in use a while back (a VRM shorted and fried), and the only possible cause I could come up with in that stock system was the heat an HD 7870 was adding to its mini-ITX case. More heat comes from more power, so that's another reason to keep it down. Another 15 W-25 W doesn't sound like much, but I don't know anyone who makes a habit of grabbing even 15 W soldering irons while they are in use. Add fans to get rid of the heat, and you've added more noise (and a few more watts to run them).
 


Sometimes fanboys just cannot help themselves. I would probably want the GTX 960 over the R9 280, personally. @ 1080p, it is a bit faster than the 280, and uses far less power doing so. Same goes with the 970 vs the 290x. The 290x is a great card, but it also doubles as a space heater. With my interest starting to lean towards mini-itx setups, power consumption/heat is a factor in the choices I make.
 
Correct, and perhaps I didn't make myself clear. I meant that even if it is in the budget, you still may want to keep it as low as is reasonable. Definitely, if your budget is so tight you can't spare an extra few bucks on power, you've got bigger worries than gaming.

Allow me to clarify. I'm not talking about the 960's power consumption in a vacuum, but as it compares to its competition, which right now is the 280. The same thing can be said about the 970 vs the 290X. Performance is close between them, but the Radeons are a little slower, so it makes sense that they're a little cheaper. However, the Radeons also require significantly more power (the 280 draws twice as much as the 960 in a gaming loop, and the 290X is 80 W higher than the 970). So, how much extra a month does it cost to run AMD over NVidia in this sense? Supposing you went with the 280 that was $10 cheaper than the 960, how many months does it take for the Radeon to eat up that $10? After that point the 280 becomes more expensive than the 960. This is the total cost of ownership, or TCO, that Onus mentioned. Now, if electricity is cheap, that tipping point may not be reached for as long as you own the card. And even if it is, some people still may not care.

So if we're talking about performance / money value, I think the TCO does need to be considered. It's not always a flat scale because each card will have strengths and weaknesses for certain needs. I think Onus summed it up well: assuming two cards meet your gaming and heat needs (and assuming no external needs like specific GPGPU requirements, CUDA compatibility, etc.), why wouldn't you get the one that draws less power?
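
To put rough numbers on that TCO point, here is a back-of-the-envelope sketch in Python. It is purely illustrative: the wattage gap, the hours of gaming per day, and the electricity price are assumptions picked for the example, not measured figures.

# How long does a $10-cheaper but hungrier card take to burn through its price
# advantage in electricity? All inputs below are assumed example values.
extra_watts = 80          # e.g. roughly the 290X-vs-970 gaming-load gap mentioned above
hours_per_day = 3         # assumed gaming time per day
price_per_kwh = 0.15      # assumed electricity price in USD per kWh; varies widely by country

extra_kwh_per_month = extra_watts / 1000 * hours_per_day * 30
extra_cost_per_month = extra_kwh_per_month * price_per_kwh   # ~$1.08/month with these numbers
months_to_erase_discount = 10 / extra_cost_per_month         # ~9 months with these numbers

print(f"Extra cost: ${extra_cost_per_month:.2f}/month; "
      f"a $10 discount is gone in about {months_to_erase_discount:.0f} months")

Cheaper power or fewer hours of gaming stretch that payback period out; pricier power, as in much of Europe, shortens it.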
 
Is the 64-bit GT 730 really more powerful than the 128-bit version? It's placed in a higher tier here, and that doesn't seem correct.
 
I'm chomping at the bit to upgrade my old card, but waiting to see where prices go after the R9 300 series is released. That goes for both Nvidia and AMD. Buying right now doesn't make much sense if you're able to wait 2+ months and see how things play out.

I bet I'll be glad I waited. Prices for R9 290X cards have already seen steady drops in just the past week. Not much change in GTX 970s yet, but I can wait.
 
I just bought the Asus Radeon R7 260X two days ago. This card is totally unstable: I have gotten, one after another, black screens, Blue Screens of Death, and freezes of my PC.

I can play for hours without problems, but each time I stop for 5 minutes I get either a black screen, a blue screen, or a freeze. I hadn't had any of those for years.

Updating the driver and the firmware to the latest version does not solve the problem. I don't recommend this card.
 

Had an email offer from Newegg on a 280X for $190. It's getting more and more tempting, but I'm still holding out for the R9 300s.
 