GeForce GTX 670 2 GB Review: Is It Already Time To Forget GTX 680?


darkstar845 (Distinguished, Mar 29, 2010)
I am a bit surprised by the load temperature of the GTX 690. It was one degree cooler than a GTX 680. I was expecting the GTX 690 to run 5 to 10 degrees hotter because it has two GPU dies on one PCB yet only one heatsink.
 

prothera (Honorable, May 12, 2012)
Hello!
The real benchmarks for WoW should be in LFR raids, with the different systems set up at the same fight at the same time.

Raids stress the PC more than anything else, and now with LFR it's easy to do that kind of benchmark.

We know WoW has around 10 million players, and most of them only hit a performance wall in raids.

We buy computer parts based on this kind of benchmark, but during raids the stress on computer components is different, and we might end up buying the wrong processor or video card.
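
For anyone who wants to try that, here is a minimal sketch of how the numbers could be crunched, assuming you capture per-frame render times during a raid fight with a tool like Fraps and end up with a one-column CSV of milliseconds (the filename and column layout are placeholders, not any tool's exact format):

import csv

def summarize(path):
    # One frame time in milliseconds per row.
    with open(path) as f:
        times_ms = [float(row[0]) for row in csv.reader(f) if row]
    times_ms.sort()
    avg_fps = 1000.0 / (sum(times_ms) / len(times_ms))
    # "1% low" FPS: average over the slowest 1% of frames,
    # which is closer to what a raid fight actually feels like.
    worst = times_ms[-max(1, len(times_ms) // 100):]
    low_fps = 1000.0 / (sum(worst) / len(worst))
    print("avg: %.1f fps, 1%% low: %.1f fps" % (avg_fps, low_fps))

summarize("raid_frametimes.csv")

Running the same fight on each system and comparing the 1% low numbers would show the raid performance wall much better than an average taken while questing.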
 

aviral (Honorable, Mar 11, 2012)

I am saying that the results are quite impressive at that price.

The reason: people around the world can now buy an Nvidia 600-series card, which was not possible with the two cards launched earlier, the 680 and 690.

The majority of people could not even consider the 690 due to its price, and the 680, though still priced quite high, will be opted for by many users. But with the launch of the 670 there is a real replacement for 500-series cards like the 580, and it gives its users impressive results with respect to the 680, even though the 680 is an outstanding graphics card. :)
 

maverick knight (Distinguished, Apr 17, 2011)
[citation][nom]godfather666[/nom]The list of games is very unfair to AMD cards. Techpowerup's review, which featured a much longer list of games, had the Radeon HD 7970 faster than the GTX 670 at all resolutions 1080p and higher.Nvidia has had a big lead in Dirt 3 and WoW for a very long time. This skews the results.The GTX 670 is a fantastic card, don't get me wrong. Just pointing out that it's being slightly overrated here.[/citation]

Why is it that you come to the conclusion that Nvidia is overrated here? Why can't it be that AMD is overrated elsewhere?

You and the 20 others that gave you a thumbs-up are in denial... Nvidia FTW!!!
 

mstngs351 (Distinguished, Feb 7, 2009)
[citation][nom]godfather666[/nom]The list of games is very unfair to AMD cards. Techpowerup's review, which featured a much longer list of games, had the Radeon HD 7970 faster than the GTX 670 at all resolutions 1080p and higher.Nvidia has had a big lead in Dirt 3 and WoW for a very long time. This skews the results.The GTX 670 is a fantastic card, don't get me wrong. Just pointing out that it's being slightly overrated here.[/citation]
The games chosen are a good reflection of popular and technically significant games (Arkham City would have been nice, but TechPowerUp showed the 670 was faster in it), and no list will ever be complete without going overboard. People can argue that Tom's list favors X and TechPowerUp's list favors Y all day long. In the end, it's just one stop on the consumer's path to enlightenment.
 


The differing game choices would be more of a step toward consumer enlightenment if the different testing sites had more uniform testing equipment and methods, and they all gave either the whole picture or different parts of the same picture. The way it is now, some sites can even have results that contradict each other. That isn't a particularly enlightening phenomenon for a novice who wants to learn about hardware.
 

Orlean (Distinguished, Nov 28, 2011)
I find it interesting that some of the people who say "time to declare AMD's bankruptcy," "AMD got owned," and so on overlook that AMD had their cards out roughly 4 months before Nvidia did. That's a pretty long time to saturate the market with your product compared to the competitor, not to mention the 680 is still very hard to get even a month later.

In addition, if it weren't for competition between companies, that killer $400 GTX 670 might be a lot more expensive, or might not exist at all with no competition to promote innovation. On that note, if I hadn't purchased a 7870 a month and a half ago, I would definitely have a 670 in my system.
 

oihan (Honorable, Apr 3, 2012)
[citation][nom]ojas[/nom]I'm just wondering. When the TH German team posted Benchmarking AMD's 768-Shader Pitcairn: Not For Public Consumption a few days ago, the 680 and 7970 were trading blows. This review shows that the 670 mostly outperforms the 7970.What gives? I'm not a conspiracy nut, but i sincerely think that the German benchmark suite was more fair, or at least it appeared to be. Maybe that was because those were "performance" settings and this suite puts more importance on visual quality?Just curious[/citation]
Why would anyone not want to play a game at max settings with all of the eye candy? That makes absolutely no sense. I think these benchmarks are "fair."

I, for one, am glad that they are using max settings at all of these resolutions. Keep up the good work, guys!
 
Guest
I don't want a watered-down version of the 680. So n. o. Ordering a 4 GB 680 to replace my two 3 GB 580s soon for my 2560x1600 res.

Wow, you must be really proud of yourself for sharing that with us.
 
"...Pyrrhic victory...again and again."

Isn't the point of a Pyrrhic victory that another such victory would cause defeat? That being the case, something that happens "again and again" couldn't be a Pyrrhic victory. The term may have applied, but the explanation does not, because if Nvidia pulled the paper launch again, people might not hold off on purchases of Radeons a second time.
 

opalarrow (Distinguished, Feb 6, 2012)
[citation][nom]blazorthon[/nom]They're making a lot of money. Both the PCB and the GPU are small (that PCB is tiny for a high end card, perhaps the smallest in years for such a card) and they only have 8 256MiB RAM chips, so cost of manufacturing is probably minimal. If the rumors of some of the components being cheapies are true, then it only solidifies the profit margins. Cost of manufacturing and shipping these cards is probably a mere fraction of what they are charging for the cards. AMD, on the other hand, is probably making a considerably less money with their slightly larger GPUs, 50% more RAM chips, and larger PCBs. It's probably still cheaper to make the GCN cards than the Fermi cards anyway, but these prices probably hurt AMD much more than they hurt Nvidia.[/citation]


That was a typo... I meant 670.
 
[citation][nom]_Pez_[/nom]the GTX 670 kick the amd's 7970 gaming ass. that simple period. you like it or not.[/citation]

The two are fairly parallel... The 670 wins some, the 7970 wins some. The 670's win comes purely from its lower price and power consumption. However, that does not affect how well it games, and it is not really a faster card than the 7970. Like it or not.
 

Maximus_Delta (Distinguished, Jan 21, 2008)
When a company is in serious trouble, they tend to release their best products, to pull out all the stops if you like. This seems to be the case with Nvidia. For those of you who don't know, Nvidia is sinking in the mobile space and fighting for its life. Credit where it is due: it's a stubby, awkward-looking PCB clearly made on the cheap, but the GPU itself is very decent, and a job well done. Still happy with my 2 x 7970s; I've been gaming with them for the past 4 months ;)
 

gm666 (Distinguished, Aug 2, 2011)
I'm seeing reasonable quantities of the GTX 670 in stock in the UK. Maybe the higher price point over there means they favour some markets over others when supply is short?
 
[citation][nom]gm666[/nom]I'm seeing reasonable quantities of the GTX 670 in stock in the UK. Maybe the higher price point over there means they favour some markets over others when supply is short?[/citation]

The USA has kept stock of 670s well enough. Not 680s and definitely not 690s, but we have 670s in stock at multiple sites.
 

ooostephen (Distinguished, Jul 9, 2010)
I didn't see anything about multiple monitors in extended desktop mode. It would be helpful to see (1) how many monitors a single card can drive in extended desktop mode, and (2) at what maximum resolutions.
 
Guest
What I'm going to do is keep my 560 Ti SLI setup. No reason to upgrade my video cards.
 

dreadlokz (Honorable, Mar 30, 2012)
The 670 is the winner of 2012 for sure! I'm not gonna comment on Nvidia... these supply issues are just a way to make more money, so I'm not gonna rush my purchase!
 

Tab54o (Distinguished, Jan 10, 2012)
I just bought a 7950. The reason is Newegg had only one version of the 670 in stock (Galaxy, no thanks), and I'm not buying any more reference-design boards. My ASUS GTX 570 just crapped out two days ago. I've had it about 6 months, and I've never pushed GPU voltage past 1.05 volts.

After removing the heatsink I found that a 66-cent MOSFET on the voltage regulator had blown. I'm going to attempt to fix it, but I'm not taking any more chances on Nvidia and reference designs.

I've owned a ton of both Nvidia and AMD cards, and this is the first time I've ever had a card fail.
 
[citation][nom]Tab54o[/nom]I just bought a 7950. The reason is Newegg had only one version of the 670 in stock (Galaxy, no thanks), and I'm not buying any more reference-design boards. My ASUS GTX 570 just crapped out two days ago. I've had it about 6 months, and I've never pushed GPU voltage past 1.05 volts. After removing the heatsink I found that a 66-cent MOSFET on the voltage regulator had blown. I'm going to attempt to fix it, but I'm not taking any more chances on Nvidia and reference designs. I've owned a ton of both Nvidia and AMD cards, and this is the first time I've ever had a card fail.[/citation]

Make sure that the voltage regulator has cooling if you are going to increase the voltage. If it doesn't by default, you can spend another $10 to $20 on small heat spreaders for it. That might prevent similar problems in the future.
 