Best Graphics Cards For The Money: October 2014


Fulgurant

Distinguished
Nov 29, 2012
585
2
19,065


Right, and three hours of gaming per day is a lot - certainly much more than I play. But if there's only a marginal difference in up-front cost, it's at least worth noting whether one card will cost you extra money over the 2-3 years you're likely to keep it.

Your numbers are useful; that's why I complimented your thread. They are not huge numbers, but in comparison with the sticker-price differences under discussion, they are potentially significant.
 


Not offended at all .... just correcting misinformation. We have been building machines for in-house use, as well as for others, for well over 20 years. Overclocking GFX cards is far more prevalent than overclocking CPUs, as the process for a CPU is far more complicated.

1. Non-reference cards come outta the factory overclocked, and these are, by a HUGE margin, the far more popular cards (just look at Newegg user reviews) .... so most home-built boxes already have overclocked GPUs from the get-go.

2. You are not going to damage an Nvidia GFX card by overclocking with Afterburner. Nvidia's built-in protections (both legal and physical) make that virtually impossible, so there is no reason to fear doing so. To the contrary, you can certainly damage a system by applying too much voltage to a CPU; with a GPU, the slider only goes so far, and that's just not enough to break anything on 7xx- and 9xx-series GPUs. The last GPU series to suffer this fate was the reference 570s .... the EVGA 570 SC was one of the cards with a high incidence of such failures because, unlike other non-reference card vendors, EVGA used the stock VRM on its "non-reference" SC card.

3. We have never been asked to OC a GFX card, as users who ask us to build for them are quite comfy opening Afterburner and copying the settings from Overclock3D or other review sites. Again, it is impossible to apply too much voltage, so all that's involved is going up and down with the clocks / memory till it's stable.

4. Generally, it's a pretty safe bet that if a user is asking me to build their box, they are not going to undertake CPU overclocking. I'll take the CPU to, say, 4.5 GHz (assuming a capable cooler), show them how to do it, and give them written instructions on how / what to adjust and what levels not to exceed for a safe overclock.

5. Yes, OC may not be of interest to everyone, but not addressing the issue makes the article useless to everyone it does interest. In addition, someone who isn't willing to OC today might be interested in doing it tomorrow. For example, you see the 600 watt and 850 watt models of the same PSU line on Newegg for the exact same price? Which one do you buy? No, you don't need 850 watts for your single-card build, and you may not be planning to SLI / CF, but it's foolish not to take the 850 watter: a) it will do SLI / CF if ya change ya mind, b) the 850 watter will be more efficient (PSUs hit peak efficiency at 50% load - see the sketch below), c) its cooling system will keep it much cooler, giving longer life, and d) it will likely have better components.
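To put numbers on that efficiency point, here's a minimal Python sketch; the 400 W system draw is a made-up figure for illustration, and real efficiency curves vary by model, so check a review of your exact unit.

```python
# Rough sketch of the PSU-sizing argument above. The 400 W system draw
# is hypothetical; the ~50% "sweet spot" is the general rule of thumb
# cited in the post, not a property of any specific PSU.

def load_pct(system_draw_w: float, psu_rating_w: float) -> float:
    """Percentage of the PSU's rated capacity a given system draw uses."""
    return 100 * system_draw_w / psu_rating_w

for rating in (600, 850):
    print(f"{rating} W PSU runs at {load_pct(400, rating):.0f}% load")

# 600 W PSU runs at 67% load
# 850 W PSU runs at 47% load   <- much nearer the ~50% peak-efficiency point
```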

Practices and customs are oft a result of local social attitudes, and your group of friends will no doubt influence one another, but THG purports to be an "authority on tech" and an "enthusiast-oriented site". Not providing information that "enthusiasts" are interested in is not serving your audience.

Look at the posts from peeps asking on these forums about GFX card performance and you will most often see images from, or links to, TechPowerUp, which lets the reader base his decision on **all of the information**. It ranks the cards in order of out-of-the-box performance but also provides the performance of every card tested when overclocked, so the reader gets to make an informed decision.

I am always going to look for and rely on sites / reviews that provide the most information, rather than rewording a press release and running some benchmarks. For example, if manufacturer A gets x% better performance than manufacturer B, can we, based upon that review, decide whether that "win" is just the luck of the draw in the silicon lottery? Or did they get down into the nitty gritty and look at the PCB, the chokes used, the size of the heat pipes, the fan-control features, the power-delivery system, the VRM phases and cooling, and the memory cooling, so that we can see why it performed better?

Now, if you don't think that the overclocking ability of a card is significant in a buyer's decision, can you provide an explanation for the wide discrepancy in sales between the 970 and comparably priced / comparably performing cards from AMD?

Among DX11 cards ....

- The 970 accounts for 4.20% of the cards hitting Steam servers.

- All R9 200x-series cards combined account for 1.24% of the cards hitting Steam servers.
- All R7 200x-series cards combined account for 0.59% of the cards hitting Steam servers.

* Steam cannot differentiate between 300-series and 200-series cards, so they are lumped together.

So the 970, all by itself, over a 12-month period has outsold all of the 200/300-series cards that came out in the last 24 months by a factor of 2.3 to 1. Given that we agree the 390/290 is a comparable performer and comparably priced, there must be some reason why one is chosen so much more often than the other. When the top card on one side overclocks 30% and the top card on the other overclocks 5%, a 25% performance difference for the same money seems like a pretty strong "deciding factor".
 


It's not the primary consideration, but it is all about price / performance, no? It is, after all, "for the money".

So when you include the extra cost of the larger PSU (say 100 watts on a CF build) and the cost of electricity over 3-4 years, this may not be a primary consideration, but it should be a secondary one.

Let's say **for the moment** that two cards cost the same and perform the same, but the FuryX needs a 150-watt-larger PSU. The EVGA G2 850 is $130; the 1000 is $50 more. So the actual cost of the two cards is not the same, because right up front you are paying $25 more for each of the two cards by way of the increased PSU needs, and perhaps even an extra case fan or two to help push out the extra heat.

980 Ti x 2 = $650 x 2 + $130 = $1,430
FuryX x 2 = $650 x 2 + $180 = $1,480

As for electricity costs, it depends where you live, but for me that's close to a $200 bill over 3 years. If you're in Europe, it could be twice that in some countries.

150 watts x 30 hours per week x 52.14 weeks per year x $0.24 per kWh / (1,000 watts per kW x 90% PSU efficiency) = $62.57 per year, after PSU efficiency losses.
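Here's the same arithmetic as a quick Python sketch, so you can plug in your own local rate and hours; the 150 W delta, 30 hrs/week, $0.24/kWh, and 90% PSU efficiency are the assumptions from this post, not universal figures.

```python
# Yearly cost of a card drawing `extra_watts` more than the alternative,
# corrected for PSU efficiency (power is billed at the wall, not at the card).

def annual_power_cost(extra_watts, hours_per_week, rate_per_kwh, psu_eff=0.90):
    kwh_per_year = extra_watts * hours_per_week * 52.14 / (1000 * psu_eff)
    return kwh_per_year * rate_per_kwh

per_card = annual_power_cost(150, 30, 0.24)
print(f"${per_card:.2f} per year, ${3 * per_card:.0f} over 3 years")
# -> $62.57 per year, $188 over 3 years (close to the ~$200 figure above)
```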

So, no, I wouldn't expect an enthusiast to choose a lower-performing card over a higher-performing one, but we don't have that in any situation that enthusiasts are interested in. We have two cards costing the same and performing relatively the same (within 10% outta the box ... 36% overclocked). But the slower choice w/ SLI / CF costs me an extra $240 over 3 years at only 4-1/4 hours of usage per day. Even with a single card, $120 is nothing to be ignored.

There are also the differences when overclocked. Data sources:

http://www.techpowerup.com/reviews/AMD/R9_Fury_X/34.html
http://www.techpowerup.com/reviews/MSI/R9_390X_Gaming/33.html
http://www.techpowerup.com/reviews/Powercolor/R9_390_PCS_Plus/33.html

http://www.techpowerup.com/reviews/Gigabyte/GTX_980_Ti_G1_Gaming/33.html
http://www.techpowerup.com/reviews/MSI/GTX_980_Gaming/28.html
http://www.techpowerup.com/reviews/MSI/GTX_970_Gaming/30.html

The following compares the performance of the reference cards at 1440p.... and lists:

Card
1440p score (TPU Reference Card Ranking)
OC % (% OC over reference card obtained w/ non-reference cards)
OC Score (TPU ranking adjusted for OC)
Cost (to buy)
Bang 4 Buck (OC score x 1000 / Cost)

Top Tier - The 980 Ti overclocked is 36% faster than the FuryX overclocked w/ a 36% better bang for the buck.
Card / 1440p score / OC % / OC Score / Cost / Bang 4 Buck
Fury X / 100% / 105.05% / 105.05% / $650 / 1.62
980 Ti / 109% / 131.38% / 143.21% / $650 / 2.20

2nd Tier - The 980 overclocked is 19% faster than the 390X overclocked, for a 10% increase in price, and a 9% better bang for the buck.
390x / 98% / 107.12% / 104.98% / $410 / 2.56
980 / 102% / 122.71% / 125.16% / $450 / 2.78

3rd Tier - The 970 overclocked is 7% faster than the 390, with a 7% better bang for the buck.
390 / 99% / 108.21% / 107.12% / $320 / 3.35
970 / 98% / 117.11% / 114.76% / $320 / 3.59

Value Comparison - The 970 overclocked is 2% faster than the 390X overclocked while being $90 cheaper.
390x / 98% / 107.12% / 104.98% / $410 / 2.56
970 / 91% / 117.11% / 106.57% / $320 / 3.33

I'll do an explanation here for the above. The 390x is 8% faster (98 / 91) outta the box in TPU's comparison summary. The 390X OCs about 7% more, compared to the 970's 17% (see the throttling issue below). Once overclocked, the 970 is about 2% faster than the 390x, which costs $90 more. The bang for the buck is (3.33 / 2.56), an advantage of 30%.

Extra Peek - There's nothing close to the Fury in price on the other side, but the 980 is 6% faster overclocked while being $80 cheaper .... a 25% advantage in bang for the buck.
Fury / 98% / 107.75% / 105.59% / $530 / 1.99
980 / 91% / 122.71% / 111.66% / $450 / 2.48
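If you want to check or extend these tables, here's a minimal Python sketch reproducing the OC Score and Bang 4 Buck arithmetic, using the tier-table scores, overclock multipliers, and prices quoted above (the Value Comparison rows use a different baseline, so only the tier rows are reproduced):

```python
# Reproduces the arithmetic of the tables above:
#   OC Score    = 1440p score x OC multiplier
#   Bang 4 Buck = OC Score x 1000 / price
# Scores and multipliers are the TPU-derived figures quoted in the post.

cards = {
    # name: (1440p score vs reference, OC multiplier, price in USD)
    "Fury X": (1.00, 1.0505, 650),
    "980 Ti": (1.09, 1.3138, 650),
    "390X":   (0.98, 1.0712, 410),
    "980":    (1.02, 1.2271, 450),
    "390":    (0.99, 1.0821, 320),
    "970":    (0.98, 1.1711, 320),
}

for name, (score, oc_mult, price) in cards.items():
    oc_score = score * oc_mult
    bang = oc_score * 1000 / price
    print(f"{name:6s}  OC score {oc_score:7.2%}  bang 4 buck {bang:.2f}")
```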

 

cub_fanatic

Honorable
Nov 21, 2012
1,005
1
11,960
When it comes to power consumption, it doesn't really matter to me, since I live in an area served by a nuclear power plant and the rate is less than $0.06 per kWh during "off-peak" hours and less than $0.05 during "on-peak" hours. Back when litecoins were worth something, I was using an HD 7950 Boost to mine 24/7 for about 2 months, and the power bill barely changed vs. before I started mining.

I now have an R9 290. If we are talking just normal gaming, which I do about an hour or two per day (sometimes more if I get into a game), and consider that the 290 is not running anywhere near max load since I am playing at 1080p with the frame-rate limiter set to 60, the money I could have saved if I had bought a 780 or 970 instead is insignificant. I'm sure if I lived in an area served by a coal-burning plant and paid $0.25 per kWh, the difference would be, like someone else calculated, about $20-25 per year. Well, not even that much; PC gaming isn't my day job, so it is probably more like $10-15 per year. But for nuclear power plant customers, power consumption is almost meaningless.

Also, like another person pointed out, the only AMD cards that really suck at power consumption are the high-end ones like the 280x, 7970, and Hawaii cards. People who bought them new were paying $350, $400, $500+ for them. These are high-end cards, and the people buying them aren't on a budget; $20 per year isn't going to break their bank account. All the high-end consumer/prosumer segment cares about is performance. These people are not playing at 1080p like me; they are playing at 1440p on triple monitors, or 4K. When they look at benchmarks and see the 390 ripping the 970 a new one at those resolutions, they are not even going to hesitate to get a couple of 390s over a couple of 970s.

One person mentioned that the 970 is like 4% or something of Steam rigs. Yeah, and they are probably all single cards paired with i5s driving 1080p monitors. I totally agree: if you are playing at 1080p and want a single GPU that can get you maxed out, or close to it, in pretty much every game, a 970 is a great choice. But for the prosumer driving 3x 1440p monitors, the best bang-for-the-buck card is the 8GB R9 390 or 390x. There aren't many of those people, which is why these cards don't dominate Steam. Also, there is no amount of overclocking you can do to the 970 to get it to make up the lead the 390 has over it at multi-monitor and 1440p-and-up resolutions. It is sort of like the old car saying "there is no replacement for displacement", where displacement is video RAM.
 

danbfree

Distinguished
Jun 26, 2008
73
0
18,630
I'm manufacturer-agnostic, but Newegg has been having awesome sales on factory-OC'd EVGA GTX 950s and 960s lately, at around the $140 and $160 price points; I got the second-highest-clocked 950 for $140 plus a $10 rebate. Considering these support HDMI 2.0 and are so efficient, they are a great way to go, and you're ready for 4K@60Hz desktop and video playback with full hardware decode too.
 

MSI_GTX_650Ti

Honorable
Nov 14, 2013
9
0
10,510
Tom's Hardware is really going downhill! You listed the 970's specs wrong! It still has a 256-bit bus and 224 GB/s memory bandwidth with 3.5 GB, regardless of the slower 512 MB. You are confused by the math on the actual bus width and bandwidth!
 

CptBarbossa

Honorable
Jan 10, 2014
401
0
10,860


It would appear all of your information comes from one source: TechPowerUp.

Take JayzTwoCents. He comes to a different conclusion with the overclocks as regards the R9 390 and the GTX 970: the R9 390 still wins out.
 
Nvidia is outselling AMD 8 to 2, in some cases 9 to 1, so for all intents and purposes Nvidia has won the most important spec ... what sells best. It's ludicrous; it's not even close, and I blame AMD's driver history, going back to ATI, which was known for horrible drivers.
 


I wouldn't call $25 per year "barely anything", especially if you plan on keeping your card for at least 3 years.

Where did this "$25 per year" amount come from? I keep seeing it repeated here, despite it being a seemingly arbitrary value based on some undefined usage scenario. Even when comparing two cards with a 100-watt difference in power consumption under load for a few hours per day, most people in the US would only be looking at a cost difference of not much over $10 per year.

Of course, the exact value is going to vary a lot due to a huge number of other factors, and the difference in power draw between most current-generation cards with similar capabilities is going to be much less than 100 watts. If you leave your computer turned on most of the day for other purposes, idle power draw could also make a notable difference.

I do think that power consumption is good to keep in mind, though. In addition to electricity costs, there's the heat output: less heat allows for quieter cooling solutions, both for the GPU and for the rest of the system.

I find the power consumption of AMD's recent cards to be a bit disappointing. For a number of years, Radeon cards used less power than their GeForce equivalents, but it seems the situation is different now, at least for the time being. You can only re-release the same cards at higher clock rates so many times. They apparently have plans for their next cards to use a smaller fabrication process with significantly better efficiency, though, so perhaps things will look better next year.
 

Firered2015

Reputable
Nov 9, 2015
5
0
4,510
I checked the EA website and the Amazon description, and it seems that the GT 730 can play Battlefield 3. I just want to know what the FPS would be on a mid-range computer with 2 GB of RAM.
 

cletus_slackjawd

Distinguished
Dec 26, 2006
347
0
18,790
Why so quick to write off the 290x? They are still easily available at retail for under US$300.00 after rebate or promo code.

The 290x is better than a 390 when overclocked (by 2%); stock, the 390 wins (by 1%).

The 290x has no disadvantage vs. the 390 if you are running a single-card config; the extra RAM does nothing unless you are considering CrossFire and 4K.

The 390 has been consistently and considerably higher priced, although at the moment prices are going up and the two cards are closer in price.

 

cub_fanatic

Honorable
Nov 21, 2012
1,005
1
11,960


Almost nobody in the 48 contiguous states pays anywhere near 20¢ per kWh. The average in the lower 48 states is 12.8¢.

If you are comparing the electricity use of a 390 vs. a 970 in normal gaming, the difference is about 68 watts on average. If you have an 80+ PSU, that is 85 W drawn at the wall. So, say you are gaming 3 hrs. per day every single day for a year: 3 hrs. x 85 W x 7 days x 52 weeks = ~93 kWh. Multiply that by the national average rate of 12¢ and you get $11.16 per year. For me, since I live in an area served by a nuclear power plant, it is only multiplied by 5¢, so $4.65 per year. Wait, I have an 80+ Gold PSU, so it is more like 84 kWh based on 87% efficiency; that means I am only saving $4.20 per year.

But in reality, for someone like me, it is even significantly less than that. I haven't played a PC game since beating Mad Max, like 3 weeks ago. My PC gaming time is measured in minutes per day, not hours, and is probably in the single digits this year. That makes it negligible for me (if I owned a 970 and wanted to know how much I was saving vs. a 390). Not everyone plays WoW as their day job.

The cheapest 390 on Newegg is $280 after a rebate, and the cheapest 970 is $290, also after a mail-in rebate. Even if you go by the national average of 12¢ and add 3 years' worth of power use at 3 hrs. per day to the final price, the cheapest 390 now costs ~$20 more than the cheapest 970, assuming the taxes and shipping are the same as well. That figure of $25 per year is exaggerated unless you are looking at maximum-load numbers, not normal gaming numbers. Normal gaming will never push a GPU to its max every minute you play; only something like mining, stress testing, or folding will do that. Under normal gaming use, even with an entry-level 80+ vanilla PSU and a 20¢ per kWh rate, the difference is $18.60 per year.
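Here's that 3-year total-cost-of-ownership comparison as a small Python sketch, using the assumptions from this post (68 W average gaming delta, 80+ PSU at 80% efficiency, 3 hrs/day, the 12¢ national average, and the $280/$290 Newegg prices); it lands within a few dollars of the ~$20 figure above.

```python
# 3-year cost-of-ownership gap between a 390 and a 970, per the numbers
# in this post. Power is billed at the wall, so divide by PSU efficiency.

def three_year_power_cost(delta_watts, psu_eff, hours_per_day, rate_per_kwh):
    wall_watts = delta_watts / psu_eff            # extra draw at the wall
    kwh = wall_watts * hours_per_day * 365 * 3 / 1000
    return kwh * rate_per_kwh

extra = three_year_power_cost(68, 0.80, 3, 0.12)  # ~$34 over 3 years
print(f"390: ${280 + extra:.0f} with power vs 970: $290 -> "
      f"390 costs ~${280 + extra - 290:.0f} more")
```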
 

ErikVinoya

Honorable
May 11, 2014
202
0
10,710


It's in the graph, same tier as the 970.
 

nayrnayr1

Reputable
Aug 8, 2015
333
0
4,860


Fast DDR4 RAM might be helping it, but I say yes, it is misplaced.
 

dfg555

Distinguished
Jan 9, 2013
167
0
18,680
Wouldn't the GTX 690 and HD 7990 be on the same level as the R9 290 and 290X? Or is this hierarchy chart comparing the graphics cards based on raw stats, not gaming performance?
 

cub_fanatic

Honorable
Nov 21, 2012
1,005
1
11,960


In almost every game tested, even in 4K, the 7990 beats an "Uber" 290x in this comparison - http://www.anandtech.com/bench/product/1074?vs=1059 . This isn't surprising, really, since it is essentially 280x CrossFire vs. a single 290x.

As for the 690 vs. the 290x, they trade blows in the games tested, but the 690 does look like the better card at 1080p and 1440p, while the 290x is the better 4K card in most games - http://www.anandtech.com/bench/product/1184?vs=1059
 

dfg555

Distinguished
Jan 9, 2013
167
0
18,680


It seems like Company of Heroes' engine doesn't like dual GPU setups...
 

XaveT

Distinguished
Jul 15, 2013
205
6
18,765
So can we get some of the AMD integrated GPUs back on the chart? The latest one I see there is the HD 7660D, which is over two years old. The latest Intel integrated part is the HD Graphics 530, which was released with Skylake this summer.
 

funny_creature

Reputable
Oct 5, 2014
10
0
4,510
I was really shocked to see my old HD 5850 at the respectable (for its age) 10th tier. According to the chart, there's no real point in upgrading it to an R7 260X, for example. I want to upgrade, but this got me thinking. Right now I'm successfully playing Assassin's Creed Syndicate, which I consider a minor miracle, because I also have only 4 GB of RAM (which I'll definitely double). Can someone recommend a real upgrade, or should I just look at the GPUs in the 7th tier and above?
 