Best Graphics Cards For The Money: October 2014 (Archive)

Status
Not open for further replies.

CO builder

Reputable
May 2, 2015
2
0
4,510
We need Intel HD 4600 and HD 2500 added. This will let BIYers know how much card they'll need to see improvement over the new integrated graphics.

Is a GT 720 or 730 worth adding when you have the IGP already?
 

Larry Litmanen

Reputable
Jan 22, 2015
616
0
5,010
This is how to shop for a card the RIGHT WAY.

If you want to play big triple-A games on medium settings for the next 2 years, get a GTX 750 Ti (or its AMD competitor); if you want to play any game on max for the next two years, get a GTX 960 (or its AMD competitor); and if you want to play every game on max for the next 3 years, get a GTX 970 (or its AMD competitor) or higher.

Indie games run on any card because they are not too demanding on graphics. This approach works for about 2 years, and in 2 years or so you will need to upgrade to a new GPU.
 

Tylerr

Honorable
Jan 6, 2013
165
0
10,690
@Larry Litmanen

What if I just want to be able to play any game on lower settings for the next 4-5 years? I don't really care that much about graphics as long as I'm able to play the game at a decent frame rate.

I was able to do that with my 8800 GTS 640MB until about a year or so ago, when games started taking 1GB+ of VRAM to play.
 

Gurg

Distinguished
Mar 13, 2013
515
61
19,070
My suggestion would be to lop off the old bottom two-thirds of cards, below the 5770; those are some very old cards. For the cards above that, I would suggest color-coding the groupings by maximum playable resolution and settings, i.e. 4K high/medium/low, 1440p H/M/L, 1080p H/M/L. Then someone looking at the chart would know that, for a given monitor resolution and game settings, the GPU will generally meet those requirements.
 

jesh4622

Distinguished
Jun 17, 2011
52
0
18,630
These recommendations are off. Rebates count for Nvidia's bottom line but not AMD's? If you count rebates for both brands, the 290 is going for ~$240 and the 290X is going for ~$280, while the GTX 970 starts at ~$300.
 

alextheblue

Distinguished
We need Intel HD 4600 and HD 2500 added. This will let BIYers know how much card they'll need to see improvement over the new integrated graphics.

Is a GT 720 or 730 worth adding when you have the IGP already?

Depends on your needs. For example, for an HTPC build the NVIDIA cards would be better, especially in the driver department, but also in terms of decode offloading for newer codecs.
 

maratc

Reputable
Apr 13, 2015
3
0
4,510
The "GPU Hierarchy Chart" page and the "Performance per Dollar" page bring up the same page, which is the hierarchy chart. Performance per Dollar now 404's.
 

manhalfgod

Reputable
May 3, 2015
1
0
4,510
Newegg has way better deals than Amazon; go look and see. All those prices above are worthless.

SAPPHIRE TRI-X OC 100361-2SR Radeon R9 290X 4GB 512-Bit GDDR5 = $269.99 after $20.00 MIR
SAPPHIRE 100361-4L Radeon R9 290X 4GB 512-Bit GDDR5 PCI Express 3.0 Tri-X OC (UEFI) Video Card = $279.99 after $20.00 MIR
SAPPHIRE 100362-3L Radeon R9 290 4GB 512-Bit GDDR5 CrossFireX Support Tri-X OC Version (UEFI) = $239.99 after $20.00 rebate
http://promotions.newegg.com/NEemail/May-0-2015/MayThe4th-01/index-landing.html?utm_medium=Email&utm_source=IGNEFL050115&nm_mc=EMC-IGNEFL050115&cm_mmc=EMC-IGNEFL050115-_-EMC-050115-Index-_-E0-_-PromoWord&et_cid=17730&et_rid=2583378&et_p1=

http://promotions.newegg.com/NEemail/Apr-0-2015/PowerIntoMay30/index-landing.html?utm_medium=Email&utm_source=IGNEFL043015&nm_mc=EMC-IGNEFL043015&cm_mmc=EMC-IGNEFL043015-_-EMC-043015-Index-_-E0-_-PromoWord&et_cid=17701&et_rid=2583378&et_p1=

http://promotions.newegg.com/NEemail/Apr-0-2015/Top8HallofFame28/index-landing.html?utm_medium=Email&utm_source=IGNEFL042815&nm_mc=EMC-IGNEFL042815&cm_mmc=EMC-IGNEFL042815-_-EMC-042815-Index-_-E0-_-PromoWord&et_cid=17660&et_rid=2583378&et_p1=
 
If you haven't noticed by now (and I like Tom's), this is more or less an NVidia and Intel site; AMD will always be runner-up even if they are first... I believe you call that, hmm, what's the word? Bias.
 

arielmansur

Reputable
Apr 27, 2015
16
0
4,510
When are you going to add AMD APUs? Those integrated GPUs need to be here too! And add all the missing Intel integrated GPUs as well.
 

rav_

Distinguished
Jul 24, 2011
38
1
18,530
This is all so much garbage.


DX12 changes EVERYTHING.

In fact, a SINGLE AMD A6-7400K running the API Overhead benchmark under DX12 OUTPERFORMS an Intel i7-4960 with an nVidia GTX 980 running DX11.

Using the 3DMark v1.5 API Overhead benchmark with DX12, AMD's $100 APU, the A6-7400K, produces 4.4 MILLION draw calls.

Using the same 3DMark benchmark test in DX11, a $1,700 Intel and nVidia system ONLY produces 2.2 MILLION draw calls.

This little article here is so lame that it is laughable.
 

rav_

Distinguished
Jul 24, 2011
38
1
18,530
DX12: is it the best friend that AMD has?

About a month ago Anandtech ran some extensive 3DMark v1.5 API Overhead benchmarks. They tested dGPUs as well as integrated APUs and IGPs.

http://bit.ly/1GCjLzU

Here is an interesting fact.

Using DX11 as a baseline to compare the performance delta, the following was understood.

An Intel i7-4960 and GTX 980 can produce 2.2 MILLION draw calls running DX11.

The i7-4960 has 6 cores and 12 threads.

Intel i7-4960 = $1,200

nVidia GTX 980 = $540

Total = $1,740

Of course, DX11 is the API that all benchmarks have been run with up until now.

However, when you run the 3DMark API Overhead test using DX12, something interesting happens.

AMD's A6-7400K APU can produce 4.4 million draw calls.

The AMD A6-7400K costs $90-150 depending on the outlet.

The A6-7400K has 2 cores. Hmmm... 2 cores vs 6 cores? $100 vs $1,200?

Of course, when you run the same benchmark on the A6 using the DX11 API, the draw-call count drops to 513,000. Compared against that, the $1,700 Intel/nVidia system's 2.2 million draw calls is roughly a 4x increase over a $100 CPU!!!

Seriously? $1,700 just for a 400% performance increase over a $100 APU?
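The price/performance arithmetic in that comparison can be sketched as a quick script; note that the draw-call counts and prices are the poster's claimed figures, not independently verified:

```python
# Draw-calls-per-dollar sketch of the comparison in the post above.
# All figures are the poster's claimed 3DMark API Overhead numbers and
# price estimates -- illustrative only, not independently verified.
systems = {
    "Intel i7-4960 + GTX 980 (DX11)": (1740, 2_200_000),
    "AMD A6-7400K APU (DX12)":        (100,  4_400_000),
    "AMD A6-7400K APU (DX11)":        (100,  513_000),
}

for name, (price_usd, draw_calls) in systems.items():
    # Normalize by price to get a rough value-for-money figure.
    print(f"{name}: {draw_calls / price_usd:,.0f} draw calls per dollar")
```

By that metric the $100 APU under DX12 comes out far ahead, which is the post's whole point; whether synthetic draw-call throughput translates into game performance is a separate question.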

Mantle and DX12 have changed the game.

Last year the media was comparatively benching very expensive dGPU silicon just to gain a few percentage points, for a score that NOW can be achieved with a $100 AMD APU. Not only achieved, but beaten by 100% compared to the more expensive system.

Still think DX12 will have no impact?

Intel and nVidia have been ripping off the consumer with DX11, when a much better API (Mantle, and now DX12) makes low-priced, low-performing $100 APUs OUTPERFORM the "BEST ON THE MARKET".

Now that the XBOX will be adopting DX12, the gain in performance will be far better than ANY combination of Intel CPU and nVidia GPU you can put together that is currently running DX11.

In other words...

...if you are happy and satisfied with the performance of your current DX11 $2,000 gaming system, then you should be ecstatic to achieve 2x the performance with a $400 DX12 AMD gaming system.

How does this relate to how AMD can save itself?

1. AMD needs to change the way the media benchmarks its silicon. Right now the media benches Radeon against nVidia on an Intel platform.

Well, if I am going to spend $1,500 on a new DX12 dGPU card, then I want to know EXACTLY what CPU is best for my investment.

In fact, if Tom's Hardware can't tell me as a consumer the best SYSTEM using BOTH Intel and AMD mainboards, then they should just dry up and blow away.

Two proprietary design features of GCN give AMD Radeon APUs and dGPUs a considerable performance advantage: the Asynchronous Shader Pipeline and the Asynchronous Compute Engines. Unlike Intel CPUs, AMD's APUs can execute graphics instructions on ALL CPU cores.
 

fnmunoz9969

Honorable
Aug 28, 2013
4
0
10,510
I'm sorry, but the R9 290 actually beats a GTX 970 in many benchmarks (not even counting the 290X here) and won't have any issue utilizing its full memory buffer. Also, the TDP for even a reference 970 is way higher than 145W: 2x 6-pin (2x75W = 150W) plus the 75W through PCIe = 225W the card has room to draw. Personally, I've never seen my 290 pull higher than 220W, and that uses an 8-pin (150W) + 6-pin (75W).
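The connector arithmetic above can be written out as a minimal sketch, assuming the standard PCIe power limits the post cites (75W from the slot, 75W per 6-pin, 150W per 8-pin):

```python
# Power-budget sketch for the connector math in the post above.
# Limits per the standard PCIe figures the poster uses.
PCIE_SLOT_W = 75   # power available through the PCIe x16 slot
PIN6_W = 75        # per 6-pin PEG connector
PIN8_W = 150       # per 8-pin PEG connector

def board_power_budget(six_pins: int, eight_pins: int) -> int:
    """Maximum sustained power (watts) the card has room to draw."""
    return PCIE_SLOT_W + six_pins * PIN6_W + eight_pins * PIN8_W

print(board_power_budget(2, 0))  # reference GTX 970, 2x 6-pin: 225 W
print(board_power_budget(1, 1))  # the poster's R9 290, 6-pin + 8-pin: 300 W
```

This is the budget the connectors allow, not the card's rated TDP or its actual draw, which is why the poster's 290 can sit around 220W with a 300W budget.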
 

jdw_swb

Distinguished
Feb 11, 2008
368
0
18,810
I'm sorry, but the R9 290 actually beats a GTX 970 in many benchmarks (not even counting the 290X here) and won't have any issue utilizing its full memory buffer. Also, the TDP for even a reference 970 is way higher than 145W: 2x 6-pin (2x75W = 150W) plus the 75W through PCIe = 225W the card has room to draw. Personally, I've never seen my 290 pull higher than 220W, and that uses an 8-pin (150W) + 6-pin (75W).

Where are the many benchmarks in which the 290 beats out the GTX 970?
 

jamisont1

Reputable
Feb 9, 2015
4
0
4,510
I'm sorry, but the R9 290 actually beats a GTX 970 in many benchmarks (not even counting the 290X here) and won't have any issue utilizing its full memory buffer. Also, the TDP for even a reference 970 is way higher than 145W: 2x 6-pin (2x75W = 150W) plus the 75W through PCIe = 225W the card has room to draw. Personally, I've never seen my 290 pull higher than 220W, and that uses an 8-pin (150W) + 6-pin (75W).

Where are the many benchmarks in which the 290 beats out the GTX 970?

He probably saw benchmarks on certain games where AMD cards perform better, such as Evolve.
Overall the 970 beats the 290X at FHD, they're pretty equal at QHD, and the 290X beats the 970 at UHD, but both cards simply aren't up to UHD.
The best 290X cards are probably the 290X Lightning and Vapor-X OC, and top 970 cards like the HOF/FTW+/Super Jetstream/G1/TOP/EXOC/ampex beat those easily.
 

fnmunoz9969

Honorable
Aug 28, 2013
4
0
10,510



Here's a Futuremark Fire Strike 1.1 benchmark from just now with my Core i7-4770K @ 4.5GHz and R9 290 @ 1100/1400, compared to a "high-end PC" with a GTX Titan:

http://www.3dmark.com/compare/fs/4740941/fs/594794

It's 20% faster in almost all scores.

Here's a 970; above it in the charts is a 290X with its stock 3DMark 11 score:

http://www.futuremark.com/hardware/gpu/NVIDIA+GeForce+GTX+970/review
 

fnmunoz9969

Honorable
Aug 28, 2013
4
0
10,510


Also, I meant to specify 4K benchmarks mostly; otherwise a 290 and a 970 are very close in performance. A 290X will be better.
 

jamisont1

Reputable
Feb 9, 2015
4
0
4,510



It's not hard to find benchmarks, but the results are similar to this:
http://www.tomshardware.com/reviews/sapphire-vapor-x-r9-290x-8gb,3977-6.html


At FHD the 970 beats the 290X, at QHD they're pretty equal, and at UHD the 290X beats the 970.
The benchmarks you saw are usually done with reference cards, and here's the thing:
reference 290X = 1000MHz
reference 970 = 1178MHz (boost)

top 290X cards like the Lightning = 1080MHz (+80MHz factory OC)
top 970 cards like the HOF = 1380MHz (+202MHz factory OC)

So even if the benchmarks for reference cards were equal, there's going to be a difference with the cards that you actually buy.

BTW, your OC'ed 290 scored 12231.0 GS; top 970 cards score around that without an overclock.
13000+ easily if OC'ed; I've seen up to the 14000s.

http://www.3dmark.com/3dm/6850796

970 Super Jetstream: 12496 GS, core clock 1,190 MHz, memory bus clock 1,800 MHz (not overclocked, just factory OC'ed).
You can check its specs here:
http://www.gpuzoo.com/GPU-Xenon/GeForce_GTX970_Super_JETSTREAM_D5_4GB.html

 

Nuckles_56

Admirable
This is all so much garbage.


DX12 changes EVERYTHING.

In fact, a SINGLE AMD A6-7400K running the API Overhead benchmark under DX12 OUTPERFORMS an Intel i7-4960 with an nVidia GTX 980 running DX11.

Using the 3DMark v1.5 API Overhead benchmark with DX12, AMD's $100 APU, the A6-7400K, produces 4.4 MILLION draw calls.

Using the same 3DMark benchmark test in DX11, a $1,700 Intel and nVidia system ONLY produces 2.2 MILLION draw calls.

This little article here is so lame that it is laughable.
Just going to put it out there, but draw calls are not everything; all those extra draw calls are not going to make up for the several billion fewer transistors in the AMD APU.
 

Richard Wolf VI

Reputable
Sep 2, 2014
2
0
4,510
How come you are still recommending a card that hasn't been reviewed? I'm referring to the GT 730. Even the GDDR5 version has been shown to be outperformed by the R7 250 DDR3, which costs less.
 