GeForce GTX 670 2 GB Review: Is It Already Time To Forget GTX 680?

Guest
I feel like I will vomit any moment... Nvidia makes me sick. AMD FOR LIFE!! Go play Crysis, you rich little green immature kids. AMD will rule again very soon, and you will convert to red the way people convert to Islam. Allahu akbar!!! You dirty kafir!!! I pray for all of you.
 
[citation][nom]ooostephen[/nom]I didn't see anything about multiple monitors in extended desktop mode. It would be helpful to see how well a single card handles: 1- how many monitors in extended desktop mode, and 2- at what max resolutions.[/citation]

Aren't the GTX 600 cards supposed to have quad independent monitor support? That's what every site that talks about the (real) Kepler cards says.
 
[citation][nom]oxford373[/nom]Kepler is just a paper launch for something you can't buy until August[/citation]

Wrong. You can buy them, and fairly easily. Just sign up for the stock-alert mailing lists at Newegg and/or other such sites, and make sure you have a phone or something that notifies you when you receive an email. Supply comes in every week or so; when it does and you get that email, be there fast enough and buy one (or more, if that's what you want). There you go: you now have one or more Kepler-based graphics cards.
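You could even automate the watching. Here's a minimal sketch of an inbox poller; the IMAP host, account, password, and sender address are all placeholders you'd swap for your own mail provider and whichever retailer's notify list you joined:

```python
# Hedged sketch: poll an inbox for retailer stock-alert emails.
# HOST/USER/PASSWORD and the "newegg.com" sender filter are placeholders.
import imaplib
import time

HOST, USER, PASSWORD = "imap.example.com", "you@example.com", "app-password"

def unseen_stock_alerts():
    imap = imaplib.IMAP4_SSL(HOST)
    imap.login(USER, PASSWORD)
    imap.select("INBOX")
    # Count unread mail from the retailer's notification address.
    _, data = imap.search(None, '(UNSEEN FROM "newegg.com")')
    count = len(data[0].split())
    imap.logout()
    return count

while True:
    if unseen_stock_alerts():
        print("Stock alert received -- go buy the card before it sells out!")
        break
    time.sleep(300)  # check every five minutes
```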
 

shin0bi272

Distinguished
Nov 20, 2007
What I think everyone is missing here is that this card is half the length of the 680, has one SMX disabled, and is clocked about 100 MHz slower, yet it's just as fast as the 680! Anyone else wanna try saying that the 256-bit bus is enough in the 680? 'Cause you're wrong!
 
[citation][nom]shin0bi272[/nom]What I think everyone is missing here is that this card is half the length of the 680, has one SMX disabled, and is clocked about 100 MHz slower, yet it's just as fast as the 680! Anyone else wanna try saying that the 256-bit bus is enough in the 680? 'Cause you're wrong![/citation]

That's what I've been telling people ever since Kepler launched: GK104 has a huge memory bandwidth bottleneck. It shows even more in bandwidth-heavy game configurations, where the 670 and 680 are only marginally better than a GTX 580, and the 580 just happens to have nearly identical memory bandwidth to the GTX 670 and GTX 680.

The 670 basically performs like a 680, just with a different PCB, a lower price, and slightly lower power usage (most likely making it the most energy-efficient high-end video card on the market). With a 384-bit aggregate GDDR5 bus, Kepler would probably be much faster in games. Heck, a 448-bit or 512-bit GDDR5 bus might be necessary to really unleash GK104 properly; it could let the GTX 680 improve greatly.
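For anyone who wants to check that coincidence, here's a quick back-of-the-envelope sketch using the published reference memory clocks (bandwidth is just effective data rate times bus width):

```python
# Bandwidth (GB/s) = effective data rate (MT/s) * bus width (bits) / 8 / 1000
def mem_bandwidth_gbps(data_rate_mtps, bus_width_bits):
    return data_rate_mtps * bus_width_bits / 8 / 1000

cards = {
    "GTX 580 (384-bit @ 4008 MT/s)":    mem_bandwidth_gbps(4008, 384),  # ~192 GB/s
    "GTX 670/680 (256-bit @ 6008 MT/s)": mem_bandwidth_gbps(6008, 256),  # ~192 GB/s
    "hypothetical 384-bit GK104":        mem_bandwidth_gbps(6008, 384),  # ~288 GB/s
}
for name, bw in cards.items():
    print(f"{name}: {bw:.0f} GB/s")
```

The 580 and 680 land within a fraction of a GB/s of each other, which is exactly the point above, and a hypothetical 384-bit GK104 would have had half again as much bandwidth to feed those high clocks.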
 
GK104 was meant for the mainstream market, but since Nvidia saw that AMD didn't have anything good, they decided GK104 would be the top chip for now. If they had used GK110, it would have outperformed the 7970 by 40-50%.
 


... Completely missed the point. GK104 could have been much faster than it is had Nvidia simply given it a higher-bandwidth memory interface. With the memory already clocked high, that means either going to XDR2 or sticking with GDDR5 and widening the bus. GK104 could have dominated Tahiti in gaming had Nvidia simply given it adequate memory bandwidth, because GCN, like Fermi, is a compute architecture, not a gaming-focused FP32 architecture like Kepler. So, thanks to Nvidia skimping on memory bandwidth, AMD can actually compete at the highest end without building monstrous compute chips (recent examples being GF100, GF110, and GK110) and sacrificing power consumption to do so.

Then why did Nvidia do this? Simple: there's no point in making a single gaming GPU that is faster than GK104 already is with its 192 GB/s of VRAM bandwidth. If you want much more performance, you can get a dual-GPU configuration. Not many people would buy a single-GPU card that is faster and more expensive than the GTX 680 rather than simply buying two slightly slower GPUs in a dual-card or dual-GPU-card setup. People who want even more performance can go up to quad SLI/CF with GTX 670s/680s or 7950s/7970s. Even more than that? Well, there isn't much you could do with it, nor are there many people who could afford it; and the display configuration needed to drive it would probably already cost far more than even the graphics cards.

Basically, it's not that Nvidia skipped a GK100 gaming GPU (GK100 would have been the gaming version, not GK110) because AMD didn't have faster GPUs. Nvidia didn't need one to stay that far ahead of AMD, and there wouldn't be much money in being that much faster anyway, so Nvidia probably wouldn't have made GK100 even if AMD had made a big GCN GPU.
 

Kenshin55

Honorable
Jul 6, 2012
Why no 5760x1080 benchmarks again?! That's becoming very suspicious in every one of Chris' GTX 670/680 vs. AMD 7970 reviews, along with the selectively narrow game list; who plays Dirt 3 anymore?

The 7970 is only $40 more than the GTX 670 right now; thanks, Nvidia, for making my favorite card's price drop :D

I can now do CFX in the near future!
 

shin0bi272

Distinguished
Nov 20, 2007
I was reading a review of the new MSI OC'ed 670 and started thinking, "What's up with the Metro 2033 scores on these Kepler cards? If Metro is a PhysX game, what's the deal?" The answer is potentially twofold.

First, there's the 256-bit memory bus hamstringing the GPU, of course. But in an interview, 4A Games' Chief Technical Officer Oles Shishkovstov said the CPU handles rigid-body physics interactions while the GPU (if possible) does cloth, fog, and so on. That means that if the game detects Nvidia GPU physics, all of the aforementioned effects get pushed to the GPU even when the user turns off advanced PhysX in the options (advanced PhysX just adds some extra particles and debris when it's on, anyway). When GPU physics isn't available, all of it is done on the CPU, and the game will actually use the CPU's extra cores to do it, something they did to make it run well on the Xbox and PS3.

That's the best answer I can find as to why Metro 2033 scores so low on the Kepler cards.
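To make that claimed behavior concrete, here's a hedged sketch of the dispatch logic the interview describes; this is not 4A's actual code, and the workload names and function are purely illustrative:

```python
# Illustrative only: models the reported Metro 2033 behavior, where
# rigid-body physics always stays on the CPU, while cloth/fog/particle
# effects move to the GPU whenever GPU PhysX is present -- regardless of
# the in-game "advanced PhysX" toggle, which only adds extra debris.
RIGID_BODY = ["collisions", "ragdolls"]
EFFECTS = ["cloth", "fog", "particles"]

def plan_physics(gpu_physx_present, advanced_physx_on, cpu_cores):
    plan = {"cpu": list(RIGID_BODY), "gpu": [], "cpu_threads": 1}
    if gpu_physx_present:
        plan["gpu"] = list(EFFECTS)
        if advanced_physx_on:
            plan["gpu"].append("extra debris")
    else:
        # No GPU physics: everything falls back onto the CPU, fanned out
        # across all cores (the console-era multithreading mentioned above).
        plan["cpu"] += EFFECTS
        plan["cpu_threads"] = cpu_cores
    return plan

print(plan_physics(gpu_physx_present=True, advanced_physx_on=False, cpu_cores=4))
print(plan_physics(gpu_physx_present=False, advanced_physx_on=False, cpu_cores=4))
```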
 

mhill8

Distinguished
Jan 1, 2009
It's mid-October 2012 now, and I'm looking at a couple of EVGA cards: a GTX 670 4GB ($438 after rebate) and a GTX 680 2GB Superclocked ($430 after rebate). Both are attractive, but I wonder if you (Tom's Hardware) could tell me whether I'd get more benefit from the 4GB on the slower card with 1/8 of its shaders disabled, or from the faster card with less memory? Thanks!
 
[citation][nom]mhill8[/nom]It's mid-October 2012 now, and I'm looking at a couple of EVGA cards: a GTX 670 4GB ($438 after rebate) and a GTX 680 2GB Superclocked ($430 after rebate). Both are attractive, but I wonder if you (Tom's Hardware) could tell me whether I'd get more benefit from the 4GB on the slower card with 1/8 of its shaders disabled, or from the faster card with less memory? Thanks![/citation]

Performance between them would probably be very similar. The 4GB models usually only make much of a difference at or beyond roughly 4MP resolutions such as 2560x1440 and 2560x1600, if even then (it depends on the game and settings).
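For a feel for the numbers, here's a rough render-target estimate. It's a sketch only: the 8-bytes-per-sample and triple-buffering figures are illustrative assumptions, and real usage adds textures, geometry, and driver overhead on top.

```python
# Rough framebuffer footprint: pixels * MSAA samples * bytes per sample * buffers.
def render_targets_mb(width, height, msaa=1, buffers=3, bytes_per_sample=8):
    # Assumes 4 bytes of color + 4 bytes of depth/stencil per sample.
    return width * height * msaa * bytes_per_sample * buffers / 1024**2

for w, h in [(1920, 1080), (2560, 1600), (5760, 1080)]:
    print(f"{w}x{h} with 4x MSAA: ~{render_targets_mb(w, h, msaa=4):.0f} MB")
```

Even 5760x1080 with 4x MSAA comes out well under 1GB for the render targets themselves, so it usually takes very high resolutions plus heavy texture settings before a 2GB card runs out of room.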
 