The Radeon HD 6950 Sweet Spot: Five 1 GB Cards Rounded-Up


airborne11b

Distinguished
Jul 6, 2008
466
0
18,790
[citation][nom]geok1ng[/nom]the correct award is: NONE. You are either gaming at 1080p, and then a 6770 is way cheaper and will more than suffice, or you are gaming at 1600p or multi-monitor, and the 1 GB memory buffer is too small and you must go for the 6950 2GB for an astounding $25 more...[/citation]

If you're running 1920 x 1080 x 3 with a triple-monitor setup, most modern games only use about 800 MB - 1.1 GB of VRAM. To get over the 1.5 GB mark you'd need 3x 2560 x 1600 monitors (LOL)

I notice a lot of misinformation on VRAM usage in this thread.

Outside of extreme resolutions across 3x monitors, there are no performance gains over 1 GB atm. Big-VRAM titles running three 2560 x 1600 monitors in a TRIPLE-monitor setup (3 times the pixels of the already extremely high 2560 x 1600!) only use about 1.5 GB of memory, sometimes a little less, sometimes a little more. And that's at an extremely sick resolution (and even then, with the most EXTREME GPU setups, you're not going to get playable frame rates in newer games at max settings).

So at this point in time, 1 to 1.5 GB cards are more than enough for 99.9999% of PC gamers out there.

 

randomizer

Champion
Moderator


Gotcha, thanks. So most of these cards (other than the XFX or reference 2GB card) would be OK in my climate then, some better than others. I'm probably tossing up between the Gigabyte and the HIS at $255 and $240 respectively. The Sapphire is cheaper at $235 but a good deal warmer. $20 isn't going to make or break my bank account, I just need something cool(ish) and quiet.

Or I could just sit on my GTX 275 and keep waiting for the Next Big Thing (TM).

EDIT: Typo
 

Crashman

Polypheme
Former Staff
[citation][nom]airborne11b[/nom]If you're running 1920 x 1080 x 3 with a triple-monitor setup, most modern games only use about 800 MB - 1.1 GB of VRAM. To get over the 1.5 GB mark you'd need 3x 2560 x 1600 monitors (LOL). I notice a lot of misinformation on VRAM usage in this thread. Outside of extreme resolutions across 3x monitors, there are no performance gains over 1 GB atm. Big-VRAM titles running three 2560 x 1600 monitors in a TRIPLE-monitor setup (3 times the pixels of the already extremely high 2560 x 1600!) only use about 1.5 GB of memory, sometimes a little less, sometimes a little more. And that's at an extremely sick resolution (and even then, with the most EXTREME GPU setups, you're not going to get playable frame rates in newer games at max settings). So at this point in time, 1 to 1.5 GB cards are more than enough for 99.9999% of PC gamers out there.[/citation]The benchmarks show that you can benefit from "More than 1GB" in Metro 2033 at 2560x1600. That's about 3x 1280x1024 for 3-way pixel count. But like I've said several times, if you want quality at that resolution you're going to need multiple GPUs, so the best place for 2GB cards is in CrossFire.

There's really not much more to say. For most games you'll need more GPU before you need more graphics RAM, and one way you can get more GPU is with CrossFire.
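For anyone who wants to sanity-check the pixel math, here's a quick back-of-envelope comparison (my own rough numbers, not from the article):

[code]
# Quick pixel-count comparison of the resolutions being argued about
resolutions = {
    "2560x1600 (single 30-inch)": 2560 * 1600,
    "3x 1280x1024": 3 * 1280 * 1024,
    "3x 1920x1080 (triple 1080p)": 3 * 1920 * 1080,
    "3x 2560x1600": 3 * 2560 * 1600,
}

baseline = resolutions["2560x1600 (single 30-inch)"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} pixels ({pixels / baseline:.2f}x a single 2560x1600)")
[/code]

Triple 1280x1024 works out to 3,932,160 pixels versus 4,096,000 for a single 2560x1600, so the two are within about 4% of each other; triple 1080p is roughly 1.5x, and triple 2560x1600 is 3x.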
 

youssef 2010

Distinguished
Jan 1, 2009
1,263
0
19,360
That's a really informative roundup. However, as always, I have a couple of things to say about the recommendations in this review.

The first thing I want to point out is that the 2GB card will often show its superiority in Eyefinity setups, where 1 GB of memory will be too little. Also, I don't fancy factory-overclocked cards in general, because it's crazy to spend extra for a 6% gain in performance. The dual or triple fans do a good job of cooling the GPU, but they heat up the inside of your case in the process, while the reference designs exhaust half of their hot air out the back of the chassis.

I have personally OC'ed my Sapphire HD 4870 to 870MHz GPU/1100MHz GDDR5 and saw little to no benefit in any game. While I really appreciate the innovation of some companies' cooling systems, lower power consumption figures, and better OC ability, I'd say that the value of OC'ing your GPU manually, AND voiding your warranty, is almost ZERO.
 

youssef 2010

Distinguished
Jan 1, 2009
1,263
0
19,360
[citation][nom]whyss[/nom]so glad my CPU's up there, was getting worried that I should've got an i7[/citation]

Depends on your usage. If you're a gamer, the i7 provides a negligible lead over the i5-2500K.
 

youssef 2010

Distinguished
Jan 1, 2009
1,263
0
19,360
[citation][nom]airborne11b[/nom]Ya, when I used to use ATI, I always had to use 3rd-party software. The hardware isn't "bad", it's just borderline competitive with Nvidia. Then take into consideration that Nvidia has amazing driver/software support, superior stereoscopic 3D support, much more user-friendly 3x-monitor support, and is always around the same price. The choice of a GPU is a no-brainer imo. There will always be fanboys. Honestly, I don't care about brand loyalty. I only care about quality. And quality doesn't stop at the hardware. Quality includes drivers, software applications, technical support, etc. Nvidia is all-around quality. ATI couldn't even make a driver to save their life.[/citation]

Are we talking about Nvidia? The company which released a driver that burnt out many graphics cards? I'd gladly live with a few glitches rather than end up with a fried card. Wouldn't you?
 

geok1ng

Distinguished
Jun 25, 2008
111
0
18,690
[citation][nom]airborne11b[/nom]If you're running 1920 x 1080 x 3 with a triple-monitor setup, most modern games only use about 800 MB - 1.1 GB of VRAM. To get over the 1.5 GB mark you'd need 3x 2560 x 1600 monitors (LOL). I notice a lot of misinformation on VRAM usage in this thread. Outside of extreme resolutions across 3x monitors, there are no performance gains over 1 GB atm. Big-VRAM titles running three 2560 x 1600 monitors in a TRIPLE-monitor setup (3 times the pixels of the already extremely high 2560 x 1600!) only use about 1.5 GB of memory, sometimes a little less, sometimes a little more. And that's at an extremely sick resolution (and even then, with the most EXTREME GPU setups, you're not going to get playable frame rates in newer games at max settings). So at this point in time, 1 to 1.5 GB cards are more than enough for 99.9999% of PC gamers out there.[/citation]

Now that is a statement not backed by facts or reviews. It has been common knowledge since the 1GB 6950 launch that these cards do not offer the same level of performance as the 2GB models when gaming at 2560x1600 with higher levels of AA.
http://www.hardocp.com/article/2011/02/24/amd_radeon_hd_6950_1gb_performance_review/4

Radeon HD 6950 1GB vs. 2GB


Certainly, the first thing we wanted to know is if the reduction in memory capacity with the Radeon HD 6950 GPU would cause a reduction in performance. We found that at lower resolutions like 1920x1200 the answer is no, performance was not impacted much if at all. Since both video cards use the same GPU, the end result was the same. Playing at 4X AA at 1920x1200 yielded no performance differences in any of the games.


It was only at the highest setting of 2560x1600 with 8X MSAA that we started to see differences. The 2GB Radeon HD 6950 clearly allowed 8X MSAA in some games to be playable, and in others allowed us to use Transparency Antialiasing at 2560x1600. The 1GB Radeon HD 6950 struggled with these higher settings. Still, in some cases performance was the same as long as the AA setting was lower at 2560x1600.

I repeat so that all trolls can flame: the correct award is NONE. DO NOT BUY THE 1GB 6950. Either you are gaming at 1080p, and a cheaper card like a 6770 will suffice, or you are gaming at 2560x1600 and above, and a $25 price gap will net you the 2GB model, with better performance at those resolutions.
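To put some rough numbers behind why 8X MSAA at 2560x1600 blows past a 1GB card, here is a simplified back-of-envelope sketch. It assumes one 32-bit color target and one 32-bit depth/stencil target, and ignores textures, geometry, and the extra render targets real games keep around, so treat it as a lower bound only:

[code]
# Rough estimate of render-target memory at 2560x1600 with 8x MSAA.
# Ballpark only: drivers add padding/compression, and games allocate many
# more buffers (G-buffers, shadow maps, post-processing targets, etc.).
width, height = 2560, 1600
bytes_per_pixel = 4        # 32-bit RGBA color, 32-bit depth/stencil
msaa_samples = 8

color_msaa = width * height * bytes_per_pixel * msaa_samples
depth_msaa = width * height * bytes_per_pixel * msaa_samples
resolve    = width * height * bytes_per_pixel   # final resolved frame

total_mb = (color_msaa + depth_msaa + resolve) / 1024**2
print(f"~{total_mb:.0f} MB for a single MSAA'd color+depth target plus its resolve")
[/code]

That is over a quarter of a gigabyte before a single texture is loaded, which lines up with the HardOCP finding above that the 1GB card only falls behind at 2560x1600 with 8X MSAA.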
 

Crashman

Polypheme
Former Staff
[citation][nom]geok1ng[/nom]I repeat so that all trolls can flame: the correct award is NONE. DO NOT BUY THE 1GB 6950. Either you are gaming at 1080p, and a cheaper card like a 6770 will suffice, or you are gaming at 2560x1600 and above, and a $25 price gap will net you the 2GB model, with better performance at those resolutions.[/citation]Except that you're completely wrong. If the minimum framerates for Metro 2033 at 1080p were barely acceptable at medium details with these cards, I can guarantee that they'd be unacceptable with "something less".

And if you think the game map used for the benchmark is unrealistically tough compared to live gaming FPS, you can set higher detail levels. The 6950 will always be two steps ahead of the 6770.
 

airborne11b

Distinguished
Jul 6, 2008
466
0
18,790
[citation][nom]geok1ng[/nom]Now that is a statement not backed by facts or reviews.[/citation]

I got my facts from benchmarks, you troll lol.

I said nothing wrong. You think that because you found a game or two at 2560 x 1600 with extreme AA/AF turned on that barely goes over 1GB, you've disproved what I said? I was very clear with what I said, and you're very confused.

Most games at a single 2560 x 1600 resolution don't go over 1 GB of VRAM, and as the enthusiast GeForce cards already come with 1.5 GB stock, that's more than enough (even with AA/AF).

Not to mention that many would question the actual visual gains of cranking up AA/AF at high resolutions like 2560 x 1600 anyway.

1 GB is enough for standard 1080p, even with AA/AF.

1.5 GB is enough for triple-monitor 1080p setups, or a single 2560 x 1600.

2 GB+ would only be required for triple 2560 x 1600 setups.

End of story, troll.
 

airborne11b

Distinguished
Jul 6, 2008
466
0
18,790
[citation][nom]Crashman[/nom]The benchmarks show that you can benefit from "More than 1GB" in Metro 2033 at 2560x1600. That's about 3x 1280x1024 for 3-way pixel count. But like I've said several times, if you want quality at that resolution you're going to need multiple GPUs, so the best place for 2GB cards is in CrossFire.There's really not much more to say. For most games you'll need more GPU before you need more graphics RAM, and one way you can get more GPU is with CrossFire.[/citation]

Ya, Metro is an exception; it's a VRAM hog for sure. But for every other game, 1 GB of VRAM is enough, even at 2560 x 1600 (some games might go over a tiny bit with cranked-up AA/AF).

But I agree with you. If you're running 2560 x 1600, you should probably be running an SLI/CrossFire setup anyway just to get decent FPS. And if you've spent $1000 on a single monitor, you should want a good multi-GPU setup with plenty of VRAM on it (GTX 480/580).
 

redlebanese

Distinguished
Oct 21, 2011
1
0
18,510
Just in time Tom, I've been seriously looking at a 6950 this past week or so. Thanks!

I have a question for my fellow hardware enthusiasts: would I notice a significant bump in IQ and FPS switching from a GTX 275 to a 6950 (DX10 vs. DX11)? I mostly play F1 2011 and Rage, and hopefully Skyrim soon. (I haven't upgraded my PC since the i7-920 came out and I feel a bit rusty, so thanks for any advice!)
 

Invader MIg

Distinguished
Jun 22, 2010
9
0
18,510
What I would have liked to see is some overclocking. How many people are actually buying a Twin Frozr III and leaving it at the settings it comes with? The Twin Frozr III has already been shown to pass the 1000MHz mark easily and still maintain good temps. I'd like to know whether these other cards could do that; if not, the extra cost for the Twin Frozr is certainly justified.
 

W Craven

Distinguished
May 8, 2009
57
0
18,630
I have tested 1 GB vs. 2 GB myself in Eyefinity @ 5760 x 1080 with BC2. The 1 GB setup was the CX5850 and the 2 GB setup is one flashed 6950 2 GB; I sold off the CX5850 as the 1 GB limit was its downfall.

Also, BF3 even on a single panel at 1920 x 1080 will use more than 1 GB of VRAM, as the High and Ultra settings will show.
 

psiboy

Distinguished
Jun 8, 2007
180
1
18,695
Love it! I purchased the Gigabyte card 2 days before I read this review! Awesome! Runs swimmingly with my 1100T and 8GB Corsair kit @ 1600MHz! Thanks Thomas! :)
 

Syndicat3

Distinguished
Oct 11, 2011
155
0
18,680
Crashman's comment is false.

I have the 1 GB Gigabyte 6950 and I saved my BIOS and enabled the 6970 shaders. It was a joke and took 2 minutes.
 
Status
Not open for further replies.