The GeForce GTX 770 Review: Calling In A Hit On Radeon HD 7970?


The price of the GTX 680 has dropped, just not quite enough. It doesn't matter anyway; Nvidia just gets more money if people make the mistake of buying a 680 instead of a 770.

Right now, the cheapest 770 on PCPartPicker is $396, while the cheapest 680 is $404.
 


OK, the price of the 680 is indeed coming down. I just saw a triple-slot Asus at around $415 on Newegg after a $15 rebate, but the 670s are still hovering around $375, with the 770s sitting pretty at $400.
 


More price changes are coming - though sadly probably not to the $200-and-under stack... I expect the 680s to stay at $420 or so until they clear out stock (for the reason Sakkura proposed... bastards), but the 670s should eventually all drop below $350 when the 760 releases...

http://videocardz.com/43001/nvidia-geforce-gtx-760-has-1152-cuda-cores?utm_source=rss&utm_medium=rss&utm_campaign=nvidia-geforce-gtx-760-has-1152-cuda-cores
 


A 256-bit memory interface and a sizable core/memory speed bump. This should shake things up for the AMD fanboys. This GTX 760 could potentially rival an HD 7950... Ouch. If it's cheaper than $280, I am going to tear up...
 


I know - some reports are saying $250, which would blow my mind and send AMD into a frenzy. But more reports are saying $299, which seems reasonable - though it really should be about equal to the 670 with those alleged core/memory speeds.
 


Yeah, $299 ENTRY... It COULD be equal or close to the GTX 670, and if it is (a very realistic expectation)... Ouch. Very ouch for AMD...
 


I do not think it was a question of feasibility. Since all the top-tier cards used the same GK104 as the GTX 660 and up, it had to be intentional so their top tier would be worth the buy. Or at least that is my opinion...

EDIT: To think this MIGHT have been possible last generation... I am starting to think Nvidia is playing the Intel game now too, sitting on a massive potential-performance advantage it could offer at a great price...
 


Starting to think? The whole GTX 6xx lineup was a "good enough" response to AMD's cards. I couldn't believe they released the GTX 660 Ti with a 192-bit interface, even though it looks like they skirted the most glaring issues by saddling it with 2GB of VRAM instead of 1.5GB. It is still definitely hobbled compared to the 670 - I guess leaving out one more SMX wasn't enough of a drop in performance for the price point they were looking for...
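
For what it's worth, the awkwardness there is simple arithmetic: a 192-bit bus is three 64-bit controllers, so a symmetric fill would be 1.5GB, and the extra 512MB has to hang off a single controller. A quick illustrative sketch (my own back-of-the-envelope numbers, not an Nvidia spec sheet):

# Why 2GB on a 192-bit bus is asymmetric (illustrative only).
# A 192-bit GDDR5 interface = three 64-bit memory controllers.
controllers = 3
bits_per_controller = 64

symmetric_fill_gb = controllers * 0.5              # 3 x 512MB = 1.5GB, striped across the full 192-bit bus
shipped_fill_gb = 2.0                              # what the GTX 660 Ti actually ships with
leftover_gb = shipped_fill_gb - symmetric_fill_gb  # the last 0.5GB sits behind one controller

print(f"Symmetric capacity: {symmetric_fill_gb} GB over {controllers * bits_per_controller}-bit")
print(f"Remaining {leftover_gb} GB is reachable only through a single {bits_per_controller}-bit controller")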

And don't even get me started on the whole Titan release... They had GK110 waiting to compete with the HD 7970 back when it was released in DECEMBER 2011. "Ha," they said, "GK104 is plenty. Lol, AMD."
 


A lot of people are saying that about GK110, but I mostly don't agree. GK110 is HUGE and expensive, and it was in very short supply even at the Titan release, LONG after Kepler came out - but yes, GK104 was "good enough," as you say. Which is fine, since it did compete well and force a small price war, which Nvidia obviously won.

 

But the memory bandwidth limit affected their whole lineup, all the way to the GTX 680. It wasn't until the GTX Titan that Nvidia matched the 7970's memory bandwidth on Kepler.
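
To put rough numbers on that (a quick Python sketch using the commonly quoted reference memory specs, so treat the figures as approximate; board-partner clocks vary):

# bandwidth (GB/s) = bus width (bits) / 8 * effective memory data rate (Gbps per pin)
def mem_bandwidth_gbps(bus_bits, data_rate_gbps):
    return bus_bits / 8 * data_rate_gbps

cards = {
    "GTX 680 (256-bit, 6.0 Gbps)":   mem_bandwidth_gbps(256, 6.0),  # ~192 GB/s
    "HD 7970 (384-bit, 5.5 Gbps)":   mem_bandwidth_gbps(384, 5.5),  # ~264 GB/s
    "GTX Titan (384-bit, 6.0 Gbps)": mem_bandwidth_gbps(384, 6.0),  # ~288 GB/s
}
for name, bw in cards.items():
    print(f"{name}: {bw:.0f} GB/s")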
 


Above 256-bit, the difference is minimal with only 2GB of RAM. It made sense on the HD 7950/7970 to a small degree. Keep in mind Nvidia likes saving money and charging a premium, and mainstream cards did not need 3GB.
 

SnuSnu4you

The 770 is a great card; I don't know why people are complaining about a cheaper 680 with great new features. Go Nvidia!
 

mapesdhs



That depends on the intended task. It makes a *huge* difference for CUDA processing. At the moment the 600 and 700 cards repeatedly fail to beat the old 580 because of this issue. It would be a very different situation if any of them had the 512-bit or even wider bus they deserve. All of these cards could be much better than they are, but NVIDIA believes it doesn't need to bother, so it keeps releasing crippled products. Such a shame. As with Intel's CPUs, though, in theory it's an opportunity for AMD to get back into the game if they play it right.

It's a pity that CUDA is NVIDIA-only, though. No wonder Adobe is moving towards OpenCL instead; it's hard to ignore the power of AMD's offerings these days. CUDA is easier to code for at the moment, but AMD is slowly getting there.

Ian.

 

ojas


Absolutely, that's what I wanted! I mean, could we just have the min/max/avg seen in the filtered data, since that's what we actually see?
 


True - Nvidia had a hard enough time keeping up with GK104 demand for a solid three months after release; I can't imagine them supplying enough GK110s at that time. But it doesn't change the fact that they could have easily released the Titan and the 780 at least a year ago. Just like Intel keeping their 8-core Sandys and Ivys out of consumer hands because there is so little competition from AMD.
 

ojas


Well, Nvidia also had to look after that Titan supercomputer...
 

Simon Ayres

Whose idea was it to leave the 670 out of the benchmarks for the 770?
I mean, you're only testing its replacement; who would want to know what the performance improvement is?...
 
Personally, I don't even think of the 700 series as a 'next' series. It's more of a filler product so Nvidia has something new this year, kind of like the GTX 285/275 were to the GTX 280/260. This year's GPU war will only get interesting if AMD has something to compete with Nvidia head to head in performance, not just in pricing.
 

barkerd25017

FYI: the Cycles benchmark needs the tile size changed to 256x256 for GPU rendering to get more accurate render times.

I usually get around 55 seconds, but when I ran it at 32x32 it took 180 seconds!
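
If anyone wants to set that from a script rather than the UI, something like this should do it in the Blender builds of that era (a minimal sketch; the property names are from memory and have moved around in newer releases):

import bpy

scene = bpy.context.scene
scene.render.engine = 'CYCLES'

# Enable CUDA rendering (2.6x-era preferences path; newer builds relocate this setting).
bpy.context.user_preferences.system.compute_device_type = 'CUDA'
scene.cycles.device = 'GPU'

# Large tiles suit GPU rendering; small 32x32 tiles are better for CPU.
scene.render.tile_x = 256
scene.render.tile_y = 256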
 


I would agree with two exceptions:
1) Titan
2) GTX 780
 
Ah, GK110. I think Nvidia may have known early on that AMD was not going to release its next gen this year. If not, we might be seeing Nvidia release GK114, GK116, and GK117 by now to fend off such cards from AMD. Either that, or there is no more optimization to be squeezed out of Nvidia's current chips, unlike with Fermi.
 

Duckhunt

I hope they can work on reducing the power consumption of these cards; it seems like they have not done much on that front. I am also disappointed that there seem to be no accessories that help use the case as a heat sink and thus help with the TDP. We have all this metal holding the PC together, and it does nothing.
 

All they did was overclock the GTX 680, so of course power consumption increased. That's unavoidable.
 