Exclusive in TH Labs: Gigabyte GTX 680 OC Wind Force X3

Does anyone know why Nvidia always uses non-rounded numbers for its frequencies? I mean 1072MHz for the core; why not 1070MHz or 1075MHz? Same story with the memory at 1502MHz, and the boost at 1137MHz? Another reason to go with the 7970: none of this sticking on an extra 2MHz and turbo boost BS. Just IMHO.
 
NOISE:
I'm going to be buying a non-reference GTX680 solely to have a lower noise level. I'm not concerned with overclocking. My only other issue will be quality.

(Too bad Sapphire Tech doesn't make Nvidia cards; their Vapor-X technology works great.)
 
Cool, but why? We have no modern-day Crysis to aim for. The kiddie consoles have pretty much ruined PC gaming as we used to have it. We will not see real improvement until the new kiddie consoles release. I guess I could look at the positive side of it and say it is saving PC gamers a lot of money on upgrades and new builds, but I'm sure NewEgg doesn't like that. My Q9650/560Ti will not need replacing for years.

I guess that is true, as long as you don't want to game at anything larger than 1080p.
 
[citation][nom]john_4[/nom]Cool but why, we have no modern day Crysis to aim for. The kiddie consoles have pretty much ruined PC gaming as we use to have it. We will not see real improvement until the new Kiddie consoles release. I guess I could look at the positive side of it and say it is saving PC gamers allot of money in upgrades, new builds but I'm sure NewEgg does like that. My Q9650/560Ti will not need replaced for years.[/citation]

The 560 Ti is the slowest card that can handle 1080p in games like Metro 2033 and BF3 with the quality settings maxed out today. That will change within a year or two, and you might see yourself needing an upgrade or lowering the quality settings. Also, consoles aren't just for kids, and that Q9650 will probably bottleneck a new graphics upgrade.

[citation][nom]Maximus_Delta[/nom]Anyone knows why nVidia always use non-rounded numbers for their frequences... I mean 1072mhz for the core, why not 1070mhz or 1075mhz... memory same story at 1502mhz... and boost wtf 1137mhz?? another reason to go witth the 7970, none of this stick on an extra 2mhz and turbo boost BS. Just imho.[/citation]

If you can ask why not 1070 or 1075, you could just as well ask why not 1072? Electronics (and really, any other field of science) is full of numbers, and they tend not to conform to what we would prefer. For example, the 680 probably has the frequencies it does because they are exactly what Nvidia determined the chip should handle. The Boost numbers are probably odd because Nvidia needed to increase performance and power consumption by an exact amount, and for it to have the number of tiers that it does, those odd numbers happened to work best when they were tested in its labs.

That the 680 has odd frequency numbers is not a good reason to buy a more expensive, less power-efficient, and possibly slower 7970. I say possibly because we don't know for sure how well the 7970 and 680 overclock. Tom's tried a seemingly broken overclocking method with the 680 that implied it had no headroom, despite other sites getting large overclocks on it, and we have yet to see the 7970 overclocked without the poor early drivers that limited its GPU frequency to a mere 1125MHz.

Basically, we don't know yet if the 680 can overclock as far as the 7970. If it can't, then the two might actually be more or less equals when overclocked. For example, the 7970 has been shown to go over 1.3GHz fairly easily when not using the Catalyst drivers, and that's a more than 40% overclock. Guru3D only did a 20% overclock on the 680, but they're known to be slightly Nvidia-biased, and we don't know whether it can go farther or not.
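For reference, here is a quick back-of-the-envelope check of those percentages. This is only a rough sketch: the reference clocks (925MHz for the 7970, 1006MHz for the 680) and the ~1207MHz figure implied by a 20% overclock are my assumptions, and oc_percent is just a throwaway helper name.

    # Rough overclock-headroom arithmetic (reference clocks are assumed, not measured).
    def oc_percent(stock_mhz, oc_mhz):
        # Overclock expressed as a percentage over the stock clock.
        return (oc_mhz / stock_mhz - 1) * 100

    print(f"7970: 925 -> 1300 MHz  = {oc_percent(925, 1300):.0f}% overclock")   # ~41%
    print(f"680:  1006 -> 1207 MHz = {oc_percent(1006, 1207):.0f}% overclock")  # ~20%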
 
If the fans are quiet then I'll definitely get one of these, but I can imagine three fans above 40% might be a little noisy... well, I'll find out when I get one.
 
For those who think MSI Twin Frozr cards are better, that is an incorrect statement.
My buddy and I built almost identical systems (I chose 8GB of RAM vs. his 16GB), and he chose MSI Twin Frozr 570s in SLI whereas I used two Gigabyte WindForces in SLI. Both his cards and mine came factory overclocked. He could not play more than 30 minutes of BF3 before hard crashing and the PC rebooting (it turned out the overclock was too high and/or unstable).
On top of that, at 100% fan speed, his sounded like a freakin' vacuum cleaner. Mine sounded like a smaller CPU fan and heatsink at 100%, definitely nowhere near as loud as his.
 
[citation][nom]therogerwilco[/nom]For those who think MSI TwinFrozer cards are better, that is an incorrect statement.My buddy and I built almost identical systems (I chose 8gig ram vs his 16) and he chose MSI TwinFrozer 570's in SLI whereas I used 2 Gigabyte Windforces in SLI. Both his cards and mine came factory oc'd. He could not play more than 30 mins of bf3 before hard crashing and the pc rebooting. (ended up the oc was too high and/or unstable)On top of that, at 100% fan speeds, his sounded like a freakin vacuum cleaner. Mine, sounded like a smaller cpu fan+heatsink at 100%, definately not at all similar to the loudness his put out.[/citation]

I have 570s in SLI; mine didn't crash and kept running. However, yeah, I agree it does get loud.
 
[citation][nom]xmortisx[/nom]If the fans are quiet then definitely will get one of these, but I can imagine three fans over 40% might be a little noisy...well I'll find out when I get one.[/citation]

Three fans at 33% are generally quieter than one similar fan at 100%.
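A rough way to see why: fan noise is very sensitive to speed. The sketch below assumes the common fan-law approximation that noise changes by roughly 50*log10(speed ratio) dB, with identical sources adding 10*log10(n) dB; the constants and the helper name relative_noise_db are my assumptions, not measurements of this card.

    import math

    # Rough sketch: noise of n identical fans relative to one fan at full speed.
    # Assumes the fan-law approximation (~50*log10 of the speed ratio) plus
    # 10*log10(n) for summing identical sources. 0 dB = one fan at 100%.
    def relative_noise_db(num_fans, speed_fraction):
        per_fan = 50 * math.log10(speed_fraction)  # each fan gets much quieter when slowed
        summing = 10 * math.log10(num_fans)        # extra fans add a little back
        return per_fan + summing

    print(f"1 fan  @ 100%: {relative_noise_db(1, 1.00):+.1f} dB")  # +0.0 dB
    print(f"3 fans @  33%: {relative_noise_db(3, 0.33):+.1f} dB")  # about -19 dB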
 
No "need" for GTX680??

I can list PLENTY of games that can make full use of the GTX680 at 1920x1080:
Witcher 2, Metro 2033, Batman AC, Assassin's Creed Revelations/Brotherhood, Crysis 2 HD texture pack, Lost Planet 2, Total War Shogun 2...

and I haven't included upcoming games.
 
Meh, all these extreme-performance cards are not worth it.
It's way cheaper to get a normal card and a good aftermarket cooler and overclock it. It will perform even better, and for a lower price.
 



And don't forget 2560x1440 resolution too!
 
I only have one question.

Should I buy the EVGA GTX 680 SC or the Gigabyte GTX 680 OC WindForce 3X?
Please add a reason as well, if possible.


My opinion on this card compared to the EVGA one:

1) Both cards have similar clock frequencies, with very little difference.
2) Both cards are from top manufacturers.
3) Performance-wise, both will be nearly equivalent.

4) Maybe the price will be the real difference.
5) Maybe this one will not be as quiet as the EVGA one.

Link for the EVGA card:
http://www.evga.com/products/moreinfo.asp?pn=02G-P4-2684-KR
 
[citation][nom]Aviral[/nom]I only have one question.Should i buy EVGA GTX 680 SC or Gigabyte GTX 680 OC WindForce 3x?Add a reason also if possible.My opinion regarding the card comparing to EVGA one:-1)Both cards has similar clock freq. with a very little difference.2)Both cards are from top manufactures.3)Performance wise both will be nearly equivalent.4)May be price will have a real difference.5)May be this will be not as quite as the EVGA one.Link for EVGA card:-http://www.evga.com/products/morei [...] P4-2684-KR[/citation]

Assuming they have the same or a very similar price, I think I would prefer the EVGA because, as a company, they are easily the most devoted to satisfying their customers. If the Gigabyte is significantly cheaper, then it is worth considering.
 
[citation][nom]Aviral[/nom]Thanks your reply.I would like to just know How about going with Zotac GPU's?[/citation]

I'm waiting another month for more non-reference cards to be released, but here's my LIST of what I'm looking for in choosing the GTX680:

1) Lowest NOISE in IDLE/LOAD
2) General reliability of company
3) Higher quality capacitors and voltage regulators?
4) other: overclocking software tool, dual-BIOS etc.

So you might wish to find a review comparing idle noise. As it stands this Gigabyte card is pretty good but I'm expecting an even quieter card to be released in a month or so.
 
[citation][nom]Aviral[/nom]Thanks your reply.I would like to just know How about going with Zotac GPU's?[/citation]

I don't buy graphics cards often, and the latest few have mostly been AMD for me, so I only know a few of Nvidia's card companies. Sorry, but I'm not very familiar with Zotac's cards.

Also, not to nitpick, but it's kind of annoying to me when people refer to a graphics card as a GPU... It's a graphics card, a video card, but not a GPU. The GPU is part of the card, but not the entire thing. Calling it a GPU is like calling a full computer a CPU.

[citation][nom]photonboy[/nom]I'm waiting another month for more non-reference cards to be released, but here's my LIST of what I'm looking for in choosing the GTX680:1) Lowest NOISE in IDLE/LOAD2) General reliability of company3) Higher quality capacitors and voltage regulators?4) other: overclocking software tool, dual-BIOS etc.So you might wish to find a review comparing idle noise. As it stands this Gigabyte card is pretty good but I'm expecting an even quieter card to be released in a month or so.[/citation]

Idle noise for most video cards nowadays is almost always very quiet (excluding poor junk cards that shouldn't be bought in the first place). I think that at this point, it's really just the load noise that matters. Not that getting the lowest possible idle noise isn't nice, but idle noise is usually below 41 dBA for most cards already, and that's just not loud at all (excluding very high pitches, which still suck, but once again, only junk cards should have such a problem). However, once put under load, even some good cards sound like driving on a freeway.
 
NOISE:
I have my computer in my bedroom, which is small, so noise is crucial to me.

Keep in mind that perceived noise DOUBLES roughly every 10dBA. I saw two HD 7970s; one was 75dBA and the other was 55dBA under load.

That 20dBA difference means the 75dBA card is roughly 4x as loud.
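The arithmetic looks like this. It's just a sketch using the common rule of thumb that perceived loudness doubles per ~10 dB, and loudness_ratio is a made-up helper name.

    # Rough perceived-loudness comparison, assuming the rule of thumb that
    # perceived loudness doubles for every ~10 dB increase.
    def loudness_ratio(quiet_dba, loud_dba):
        return 2 ** ((loud_dba - quiet_dba) / 10)

    print(f"55 dBA vs 75 dBA: about {loudness_ratio(55, 75):.0f}x as loud")  # ~4x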
 