(Noob) Recommended graphics card for maxing out 1920x1080


theultimateeye

Distinguished
Jan 18, 2012
93
0
18,640
So this is the hardware I already have; I'm still deciding on the last components to buy:

Corsair 650D
Asus P8Z68-V Pro
i5 2500K (keeping at stock clocks unless it bottlenecks whatever card I get)
Hyper 212 EVO
Asus Blu-ray drive
8GB Corsair Vengeance 1600MHz
Crucial M4 256GB SATA III
Seasonic X-850W PSU
Video card ???
SSD ???
***Edit: Running Windows 7 Home 64-bit; not sure if it matters.

I'm somewhat of a noob and this will only be my 2nd build. I'm looking for a GPU that will max most games with full AA @ 1920x1080 on a 46" LED 120Hz TV (if that matters). Not sure if a single 6970 will cut it or if I need a 7970. I'm looking to be future-proofed for a while and to max out Diablo 3 when it comes out as well. Thanks for any input, Jeff.


 

Gordon Freeman

Distinguished
Jan 10, 2012
433
0
18,790

I too was once drawn in by the allure of a dual-GPU, single-card solution, but the fact is that two cards in CrossFire are always better and cheaper than one dual-GPU solution, unless you can get a GTX 460 2 WIN for $300 or less; then that would be a good deal.
 

theultimateeye

Distinguished
Jan 18, 2012
93
0
18,640


Very good info, I didn't know that. When I was researching, it seemed most HD cards didn't have good tessellation compared to the GTX series. Yes, I'm reading the overclock results on HardOCP and I think I've finalized my decision. All the stock benchmarks I've been seeing had me on the fence, but overclocking seems pretty easy even for a noob like myself, and the gains seem huge. Thanks, jessterman.
 
G

Guest

Guest
ATM I'm running at your resolution on a 6770 OC'd like a pro, with 6GB RAM and an Athlon II X3 425 CPU, and I max out GTA 4 with basic ENB settings, which is crazy. A 7970 in your system would be complete overkill. A single 6870 would be sufficient... and OC it to get a maximum FPS boost. When you start to worry about future games, you could just CrossFire with another 6870. Your system does not need a silly 6970, let alone a 7970... it's just complete overkill, unless you're looking for 100+ FPS in every game.
 

bloc97

Distinguished
Sep 12, 2010
1,030
0
19,460


That's what I was saying about a 6950 being more than enough to play games at high settings with 8x AA.
 

Gordon Freeman

Distinguished
Jan 10, 2012
433
0
18,790

DX11 will not become very useful until the next generation of consoles comes out. To be honest, I'd rather run most games in DX9; most games are still DX9-only, the best-looking game out currently is DX9, and DX11 just slows most games down.
 

Games coming out in the next few years will make better and better use of DX11, and GPUs will get better and better at running these new features like tessellation. We're seeing that with the HD 7000 series, and we'll see it with the GTX 600 series, too.

DX11 slows games down on current mainstream systems, but the OP is building a future-proof, semi-high-end system and games on a large 120Hz display, which lets him see framerates up to 120fps. If the OP is willing to pay, an HD 7970 will do wonders with this setup.
 
Solution

trogdor796

Distinguished
Nov 26, 2009
998
0
19,160
Just to let you guys and the OP know, his "120Hz display" isn't a true 120Hz monitor; it's a 120Hz TV. Unfortunately, no current TV is capable of accepting a 120Hz input signal. What they do is take a 60Hz input and apply extra processing, which results in major input lag.

What does this mean? Unlike a 120Hz monitor, which can effectively display up to 120 frames per second, the TV will only effectively display 60. By "effectively", I mean you won't be able to tell the difference between 60fps and anything above it, whereas on a 120Hz monitor you can actually notice when framerates go over 60; it looks smoother.
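For anyone curious about the arithmetic behind that, here's a minimal sketch in Python (the helper names are my own, purely for illustration, not any real API): a display limited to a 60Hz input refreshes every ~16.7ms, so any frames rendered beyond 60fps simply never reach the screen, while a true 120Hz monitor refreshes every ~8.3ms and can show up to 120.

```python
# Minimal sketch: why rendering above the display's accepted input rate
# gives no visible benefit. Helper names are illustrative, not a real API.

def refresh_interval_ms(hz):
    """Time between screen refreshes, in milliseconds."""
    return 1000.0 / hz

def visible_fps(render_fps, accepted_input_hz):
    """Frames you can actually see are capped by the input rate the display accepts."""
    return min(render_fps, accepted_input_hz)

print(refresh_interval_ms(60))    # ~16.7 ms per refresh
print(refresh_interval_ms(120))   # ~8.3 ms per refresh
print(visible_fps(90, 60))        # 60 -> the extra 30 frames are never shown
print(visible_fps(90, 120))       # 90 -> a true 120Hz monitor would show them all
```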

So what you want to do is set your TV to "game mode" if it has one. This is supposed to turn off all the image processing, reducing the input lag. Another thing that helps is renaming the input you are using (HDMI 2, for example) to "PC"; this also reduces input lag.

As for a graphics card, I think a 7970 would be worth it. It will definitely get you by for a few years if you want to max out games @ 1080p. A 6970/570 would be good too, but it may struggle to run games like BF3 and Crysis 2 (with the DX11 texture pack) with all settings maxed out (we're talking anti-aliasing, depth of field, tessellation). A 6970 or 570 would still run most games on high/ultra for around two years, but may struggle with the most demanding ones.
 

Ah. Thanks for the info. CURSE YOU INTERPOLATIONNNNN.
 

Gordon Freeman

Distinguished
Jan 10, 2012
433
0
18,790

We need consoles to move onto DX11; that is when developers will start to push the limits of DX11. We have seen nothing yet simply because it is new and not used very much, but the next generation of consoles will change all that. DX11 was designed to make games run better as well as look better: not just more eye candy, but using hardware more effectively and allowing developers to more easily implement high-end graphical feature sets. In other words, DX11 was meant to be more efficient, enabling higher framerates while also enabling more eye candy, with a positive impact on performance. However, as always, it will be the consoles that hold the industry back with their now seven-year-old, archaic hardware.
 

Gordon Freeman

Distinguished
Jan 10, 2012
433
0
18,790

Why the heck would a TV need 120Hz? LOL. Movies are generally shot at 24fps, TV is 24fps crap quality, and consoles run at 30fps with a few exceptions, one being MW2, which runs at 60fps even on consoles.
 

trogdor796

Distinguished
Nov 26, 2009
998
0
19,160

TVs with a refresh rate that high are good for smoothing out programming such as sports, where there's a lot of action and things move really fast. In cases like that, and during movie playback, interpolation and input lag don't matter and are okay, because you aren't inputting anything; you're simply watching the signal your TV is receiving.

BTW, 24fps for movies and TV programs does not mean "crap quality". You're just saying that because you wouldn't want to play a game at 24fps; that would be crap quality. 24 frames per second isn't good enough for a game because a game needs to be smooth enough for you to react, but it doesn't matter for movies or television; that kind of programming looks fine at 24 frames. Every movie you see in a theater is 24fps, and it still looks good.
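As a side note on why 120Hz panels suit film at all: 120 divides evenly by 24, so each film frame can be held for exactly five refreshes, while a 60Hz panel has to alternate two and three refreshes per frame (3:2 pulldown), which gives an uneven cadence. A rough sketch of that arithmetic, assuming nothing beyond plain Python (the function is just an illustration):

```python
# Sketch: how 24fps film maps onto 60Hz vs 120Hz refresh rates.
# 60/24 = 2.5  -> frames alternate between 2 and 3 refreshes (3:2 pulldown, judder)
# 120/24 = 5   -> every frame is held for exactly 5 refreshes (even cadence)

def pulldown_pattern(source_fps, panel_hz, frames=4):
    """Refreshes spent on each of the first `frames` source frames."""
    pattern, shown = [], 0
    for i in range(1, frames + 1):
        target = int(i * panel_hz / source_fps)  # refreshes elapsed after frame i
        pattern.append(target - shown)
        shown = target
    return pattern

print(pulldown_pattern(24, 60))   # [2, 3, 2, 3] -> uneven cadence
print(pulldown_pattern(24, 120))  # [5, 5, 5, 5] -> even cadence
```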

As for consoles, yes, most games run at 30 frames, which is fine and playable for most people. Before I started gaming on PC, I thought it was perfect! Now that I've seen 60 frames on a computer I can tell the difference, but I can and still do play console games at 30 frames.

The only reason MW2 can run at 60 is that it runs on an engine that was made and used before the consoles were even released. It's old and not demanding.

Let's try to stop derailing the thread now, and focus on helping the OP.
 

Gordon Freeman

Distinguished
Jan 10, 2012
433
0
18,790

Nope, I said TV is crap quality; cinema at 1080p/24fps, however, is and can be very beautiful, like Batman: The Dark Knight, etc. The cameras used to shoot TV programming run at 24fps (I think some are a bit more or less), but that never changes: if you just switch your monitor to 120Hz, the programming is still running at 24fps, not 120fps. The same goes for a game. If a game can only run at 24fps, it cannot be up-converted by the TV/monitor to 120fps just because you have a 120Hz TV/monitor; you need hardware capable of producing 120fps. In a game's case that is the GPU, and in TV's case it is the video camera, and a mighty, mighty expensive one would be needed to shoot at 120fps; actually, they may not even exist.
 
G

Guest

Guest
Once again, stop going off topic. Even posting your last thoughts and then saying "OK, let's not derail this thread now..." doesn't help, because you're obviously going to get a reply to it. And once again, the OP does NOT need a damn 7970 for his setup... Even getting a 6950 now and CrossFiring it later would guarantee mostly maxed settings in games for years to come. I know some people are drooling over the new 7000 series, but there's no need to convince somebody to spend twice as much on something they don't need. A single 6870 right now would max just about every game out there with his setup. There, I just saved the OP almost $400. There isn't really a discussion here; the OP even mentioned he does not care about anything above 30FPS (nor should most people, unless you're playing a game that needs it, or you're just completely O.C.D. about FPS =P).

As a final conclusion to this heavily spammed topic: you shouldn't have to drop more than $200 to get maxed in-game settings in just about every single game. I myself like to go a tad over the top, so I'd say buy a 6950 2GB now, and when the time comes and you feel your computer needs a little more 'oomph', grab a second 6950 2GB and CrossFire your system. (6950s are a little hard to come by if you're looking for a reference design, though.)
 

bloc97

Distinguished
Sep 12, 2010
1,030
0
19,460


OK, this is the third time I'm saying this. I said a little earlier that a 6950 WOULD BE MORE THAN ENOUGH, and he should be able to play everything on ultra at 8x AA (on only one screen; not talking about Eyefinity here).
 

theultimateeye

Distinguished
Jan 18, 2012
93
0
18,640


Sorry, I haven't had time to get on until today. Great info in this thread. I've heard the TV is basically "filling in" half of the images because it can't replicate the last image shown. Makes me wonder about a large-display monitor, but they seem uber expensive. I'll probably get flamed for this, but I've decided on a 7970. Looking at all the benchmarks at 1920x1080, it seems this card is getting around 60-80 fps average in the newest games with AA cranked all the way up. That seems good for now, but I'm not sure what games in the future will look like or what display features they'll require, or what future HDTVs/monitors will look like. Hopefully it'll be good for a couple of years. If games get more CPU-intensive, then I'll upgrade to an Ivy Bridge CPU or whatever the next great thing is. Plus there's always the option of another 7970. Again, thanks for all the help, everyone. I posted this on a few different forums and you folks were the only ones to help my noob ass out AND provide me with additional information I didn't know about!
 

theultimateeye

Distinguished
Jan 18, 2012
93
0
18,640
***Update - After doing some research, it seems the HDMI drivers for the 7970 are horrible. This is coming from many verified owners commenting on Newegg: issues with sound, blank/black screens, blurry screens. Since my HDTV doesn't have a DVI connection, perhaps I'll wait until AMD releases better drivers for this card.
 
G

Guest

Guest

You shouldn't be too worried; the 7000 series is just hatching, and it's going to take a couple of weeks for it to get situated. But I honestly do not know why you're even thinking about a 7970 for 1920x1080 =\ Please make sure you've read all of the posts above (or at least the non-spam ones). Anyway, I wish you luck with your nice large screen and GPU; hope I've helped.
 

theultimateeye

Distinguished
Jan 18, 2012
93
0
18,640


You have definitely helped. The Sapphire 7970 came yesterday and I am very happy with this card. However, I was expecting a little more performance in Battlefield 3; I'm only getting an average of around 43 fps on ultra settings at 1920x1080. It's even lower in Arkham City. I just can't understand how someone can pay over 500 bucks for a graphics card and it doesn't dominate games at 1080p (which is pretty damn standard nowadays). How would I find out how much CPU these games are using? My 2500K is still at stock speeds, as is the graphics card. Not sure if this is holding me back or not.
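On checking CPU usage: Task Manager's performance view shows per-core load while the game runs, or if you'd rather log it over time, here's a minimal Python sketch using the psutil library (this assumes you're willing to install Python and psutil; the script is just one illustrative way to do it, not an official tool):

```python
# Minimal sketch (assumes Python and the psutil package are installed):
# log overall and per-core CPU usage once per second while the game is running,
# so you can see whether any single core is pegged near 100% (a bottleneck sign).
import psutil

def log_cpu(samples=30):
    for _ in range(samples):
        per_core = psutil.cpu_percent(interval=1, percpu=True)  # blocks ~1s per sample
        total = sum(per_core) / len(per_core)
        print(f"total {total:5.1f}%  per-core {per_core}")

if __name__ == "__main__":
    log_cpu()
```

Run it in the background, play for a minute, then check the output: one core sitting near 100% while the others idle usually points to the CPU, not the GPU, as the limiter.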
 