The GeForce GTX 770 Review: Calling In A Hit On Radeon HD 7970?



Wow, look at that - you did replace them with 780s. Totally agree with you about Nvidia holding off on the 110 part until this year, though it made business sense. Sad that we can't get the best when it's available. Makes me wonder if Intel's doing the same thing.

Nvidia's driver software is so obviously superior to AMD's that it makes me sad I went for the "better buy" with my card a year and a half ago. TechReport ran a few articles at the end of last year on the GTX 660 Ti vs. the HD 7950, arguing that AMD's card should really be 1.5x faster because of the superior hardware, but instead they're dead even because of Nvidia's software efficiency.

For my part, I'm just so sick of AMD's unresponsiveness to Nvidia's Adaptive Vsync, Ambient Occlusion, FXAA (we should have in-driver SMAA and Dynamic Vsync by now), and most importantly, single-click driver updates. All we get are second-rate or buggy third-party replacements. Not to mention: PhysX! I have 7 games that support it, and I want to try it out, for God's sake!

I've been wanting to go Green for a year now, and I can't wait for my birthday.
 
Wow. I've missed a lot here, but I agree with every one of BigMack's last "N" posts. Although both companies have had driver issues from time to time, right now AMD is the one who can't seem to get it together, which is a real shame. Software has indeed improved the speed of GCN though, and I'm not sure we've seen the last of the increases.
I remain extremely critical of the crippled nature of most of nVidia's offerings, even though I understand why they did it. One of the best examples to me is at a somewhat lower point in the range; the GTX650Ti Boost is what the GTX650Ti should have been at launch, and I'm annoyed at myself for being suckered into getting the older version less than a month before the "fixed" ones were available.
An AMD card seems to be more trouble initially to get running properly, but once it is, it does well; HardOCP's IQ articles and "apples to apples" analyses clearly show this.
Although I'm still mining, I know it won't be for much longer. Still, my Gigabyte version with the WF3 cooler, overclocked, in an open case, is not obnoxiously loud.
Then there's PhysX, and that may be why, at some point, I will likely return to Team Green. It's hard to say when, but soon is likely. I'm going to wait for a couple notches down in the product stack though, i.e. the 760Ti / 760.

 

Well, okay. But you do remember how the supply situation was with the GTX 680 early on, right? With GK110 it would surely have been considerably worse. And it would just have been a poor-value GTX 680 that would not have pushed AMD to cut prices the way they did - at least not until Nvidia followed up with GK104-based cards. It would have been a big "meh" for 99% of gamers.
 


I am not sure where you are getting this impression.

It is blatantly clear that nVidia purposefully releases products that land within a specific percentage of its competition's performance. AMD has been doing the same, but financial pressures may create an incentive to provide greater value to the consumer.

I am aware that GK110 chips have been in production since at least May 2012. As you have said, due to low yields - primarily from the greater likelihood of production errors on such an immense chip - the GK110 was being stockpiled for a later release. It has become commonplace for nVidia to run two product cycles on a single architecture, so calling the GK104 a midgrade chip is a misconception, given that GK110 availability delayed its consumer launch by a full year.
I love the arguments that a GPU released a month ago is superior to a year-and-a-half-old product.
 


Yes, exactly. Just because NVIDIA may have originally planned to release GK110 as the 680, it doesn't follow that that plan was ever feasible, in retrospect. Plans change; circumstances shift; costs don't always pan out as expected. Based on what we've seen with Titan and GTX 780, it should be pretty clear that NVIDIA wasn't in a position to launch a similar product to the mainstream market more than a year ago, and certainly not at a mainstream price point.

To say that we should have gotten something far better than GTX 680 is a meaningless observation. By definition, the GTX 680 was/is a high-end card. The only notable outcome that can possibly arise from calling it anything less is to stir up needless annoyance and frustration. Why bother?
 

not sure...? you don't think nvidia internally tests their own gpus against amd's?
it's not exactly clear.... i think you're saying that gk104 is a 'flagship' of the 'first' kepler 'product cycle'...? is it because there were no cards on the gk100 gpu? gk104 would still be midrange. the only 'high end' thing about gk104 (gtx680) was its price - and that was purposefully done by nvidia. i just went back and searched a bit - not only is gk104 midrange, it was supposed to be priced against pitcairn cards. nvidia 'upgraded' its price point because it overperformed in internal testing by a big margin. gk100 got cancelled due to its own problems and gk110 took its place. the hierarchy still stands despite the price point change. 😀
 
I'm not sure how much internal testing nVidia actually did. If they had done much, they would have known about the FCAT findings early enough to make a howling big deal out of them for marketing purposes. I suspect they relied (too heavily, it would seem) on what external reviewers were reporting. Look too closely at a competitor's products and their lawyers start to drool, so I'm sure nVidia was being cautious that way.
 

I am unsure where you are getting the idea that I "completely deny the possibility that nvidia had internally tested kepler samples against retail tahiti cards"

By the same logic, the GTX 480 was a mid-ranged card and the true high end product was the GTX 580.
 
maybe nvidia figured that the fcat findings only affected multi-gpu configurations, and frame time analysis hadn't gained enough rep among reviewers back then. amd didn't admit they had driver problems either, until later. and amd took much less time to improve single-gpu problems than they did with crossfire. nvidia let the gtx690 sell for a year, stockpiled binned gk110 for titans, sent fcat to reviewers as frame time testing gained more and more momentum, then launched titan amidst reviewers taking amd to task for neglecting microstuttering and crossfire improvement (and amd's subsequent public admission). from the titan launch to the frame pacing driver release, any amd multi-gpu config would look bad against titan and gtx690, and customers would be influenced by those events. that's exactly what happened, as titan outsold the gtx690 in a short time. now gtx 780 and 770 are here to capitalize on that, as seen in multi-gpu benches. the 7990 was poor cannon fodder, the result of amd's desperation to release a dual-gpu halo product. fcat's earlier public availability would have affected gtx690 sales (exacerbated already by yields) and the then-available drivers. it's kinda (not) coincidental, as ryan shrout of pcper was preparing frame capture analysis as well....
amd and nvidia have their own sets of problems, but from a customer's p.o.v., that's what seems to have happened.
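just to make it concrete, here's a toy sketch (python, made-up numbers - not real fcat data) of what frame time analysis actually measures: average fps can look perfectly healthy while the frame-to-frame swings, i.e. the microstutter, are awful.

```python
# Toy sketch of frame-time (frame pacing) analysis.
# The frame times below are invented for illustration; real tools like FCAT
# capture them from a colored overlay in the card's video output.

frame_times_ms = [16.7, 8.4, 25.0, 16.7, 8.3, 25.1, 16.6, 8.4, 25.0, 16.8]

avg_ms = sum(frame_times_ms) / len(frame_times_ms)
avg_fps = 1000.0 / avg_ms

# 99th-percentile frame time: the "worst" frames, which average FPS hides.
worst_ms = sorted(frame_times_ms)[int(0.99 * (len(frame_times_ms) - 1))]

# Microstutter shows up as large swings between consecutive frames,
# even when the average looks smooth.
swings = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
avg_swing_ms = sum(swings) / len(swings)

print(f"average frame time: {avg_ms:.1f} ms (~{avg_fps:.0f} FPS)")
print(f"99th percentile frame time: {worst_ms:.1f} ms")
print(f"average frame-to-frame swing: {avg_swing_ms:.1f} ms")
```

the made-up numbers above average out to roughly 60 fps, yet every other frame is badly mistimed - that gap between average fps and actual frame delivery is what the whole fcat episode was about.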
 


Actually, the boasts about a 40% performance increase over the 7970 were for the GK104-based 680 back in March 2012. The GK110 announcement came in May 2012, with product release projected months out. The entire hysteria about the GTX 680 being a GK110 was a rumor based on an unlabeled GK110 chip that was leaked. It is amazing how rumors evolve into being considered fact by the masses.

http://www.techpowerup.com/162305/geforce-gtx-680-up-to-40-faster-than-radeon-hd-7970-nvidia.html
 

nvidia's claim from earlier in 2012 about not being impressed with the 7970 stayed throughout the duration... amd's own mistakes didn't help either. yet the competing gtx680's gpu (gk104) itself was still midrange. you didn't seem to be able to correlate the two (until your post below, maybe? 😛). without testing gk104 vs 79xx (tahiti), nvidia wouldn't have made the claim (their own benchmark engineering was also in play, typically), or set the gtx680 model number to gk104. i never said a midrange asic cannot be flagship. it was the price point that made the gtx680 (based on gk104) high end.

gtx480 was the highest configuration available based on gf100. you seem to have used your own 'logic' (!) to come up with that line. or maybe you typed 8 instead of 6.... 😀


40% is obviously nvidia being over the top. at least it shows that they tested 7970 against gtx680.
 

You do know what they say about assuming... Your entire argument is based on assumptions, even though there was never a confirmed GTX 680 built on the GK110. You continue to use rumors as the basis of your argument. The performance claims were exaggerated and were based on the GK104. You can call the GK104 mid-range all you want, but it was the flagship product for more than a year.



Actually the GTX 580 is based on the GF110 which is also based on the Fermi architecture and was a die size increase of the GF110, similar to the GK104 to GK110 die increase.

This was an interjection that you made in the middle of the argument. It has no real relevance to the claim that nVidia was talking about the GK110 when discussing the performance of the 7970.


Thanks for the link:
"Although, credibility of this portal is quite low it may actually be a good analysis of gpu market.
Source of this information is known for spreading unverified specifications only to increase pageviews"
 


That is the problem these days... rumors are passed off as NEWS, and the general public can't seem to distinguish between the two.
 
Many people have the same question in their minds: why did Nvidia rebrand their 680 instead of directly cutting the price? The GTX 770 actually behaves just like what it is: a heavily overclocked GTX 680 (plus some cool features/tweaks). So why didn't they just cut the price instead of rebranding? There are some logical answers, especially from a marketing standpoint. First, if they did price cuts on their GTX 680s to compete with AMD's GHz Edition, that would hurt their image. The FPS difference between a GHz Edition and a default GTX 680 is roughly 5% to 10%. Now, a price cut of 5% to 10% would bring the GTX 680's price to the GTX 670's price point. Then they would need to cut the price of the 670 and also the 660 Ti. Instead they waited a year to tweak their card and now provide it as a mid-tier one, of course at a lower price. Also, GK110 is probably too expensive to produce, and a GK110-based GTX 770 may have required more time and cost more to make. I believe Nvidia has good management and marketing departments here (from the company's point of view).
It seems Nvidia didn't want to put much pressure on the 7970 and the GHz Edition with Kepler, due to AMD's effective price/performance policy - perhaps because that would have meant less profit, or something like that. Maybe AMD surprised them by releasing the GHz Edition last year (against the vanilla 7970, Nvidia still held the king of the hill with the GTX 680), and Nvidia was caught unprepared because they didn't expect that move. Lastly, I kind of like what they did with this card. The GTX 770 is for sure an amazing part. The only thing bugging me is the 2GB models. Why couldn't they put 3GB of RAM on it? Why do we have to pay extra for the 4GB models?
 
reading those haswell reviews is a drag.... anyway,

who said anything about gtx680 being built on gk110? 😀
flagship... well, i guess it comes down to semantics. i call the gtx680 nvidia's high end card because it launched at a high end price, based on a midrange asic. the midrange part isn't an assumption or a rumor; gk104's specs give it away. check those out yourself and pay attention. :)

so... does this mean gf110...is ... wait for it..... here it is...
actually a 'die increase' of gf110?!
i think i see where you might have gone wrong. :lol: no, i do not believe you made a typo. i noticed the effort you put into the post. still, brain farts may happen. the gk104-to-gk110 die increase is not exactly an apples to apples comparison, and i see why you may have come to the flawed conclusion. 😀 .... or maybe i am misunderstanding... :ange:

i was being lazy and was replying to two of your posts in the same post. strangely, your second post (with the tpu link) was relevant, except for your later edit. nowhere in the tpu rumor is gk110 even mentioned. 😀
 


Drivers are exactly why I switched. Too many problems with AMD. Ain't Nobody Got Time for That!?!

The tens of dollars saved aren't worth the months spent waiting for AMD's next driver release to fix the microstutter issues you've had with AAA titles for the last year, only to be disappointed when that release doesn't include the anticipated fixes but introduces new issues (e.g., can't wake from sleep). To me the driver drama is not worth the cost savings. I just want to play my darn games. I can do that with Nvidia.
 


HA! I knew it wasn't just me experiencing the can't-wake-from-sleep mess. Also, it looks like AMD made MLAA less effective in the most recent round of drivers - it still blurs text to the point of being nearly unreadable, but now it's worse at removing aliasing, too! What great features.
 
^ well, to be honest, i heard good things about MLAA when it first came out (if i remember correctly, the feature arrived together with the 6k series). then i saw some games implement it in-game, removing the need to force it from CCC and making it available to nvidia users as well. why didn't AMD push game devs to use MLAA more, like nvidia did with FXAA? also, i haven't heard much about MLAA after version 2.0.
 


Bleh, that was painful! Hated it...
 


Yeah, when it first came out it blurred everything (really badly on text) a bit more than FXAA and cut your fps by about 25%. Then Nvidia released FXAA with a <5% perf hit, and AMD refined MLAA a bit. 2.0 had a <5% perf hit and blurred a little less.

But since 13.1 they have changed it again - I suppose to try and reduce blur further, but now polygon edges and alpha cutouts have more noticeable aliasing. I don't know why they didn't let anyone know about the change this time... I sure noticed - I used it on almost every game.
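For anyone wondering why these filters smear text in the first place, here's a rough toy sketch of the idea behind FXAA/MLAA-style post-processing (my own simplified illustration in Python, not AMD's or Nvidia's actual shader code): the filter looks for high luma contrast between neighboring pixels and blends across it. A jagged polygon edge and a one-pixel-wide letter stroke both look like "edges" to that test, which is why text gets blurred along with the aliasing.

```python
import numpy as np

def postprocess_aa(rgb, contrast_threshold=0.1):
    """Toy FXAA/MLAA-style pass: find high-contrast pixels by luma and
    blend them with their neighbours. Real shaders are far smarter about
    edge direction and search length; this only shows the basic idea."""
    # Rough perceptual luma (Rec. 601 weights).
    luma = rgb @ np.array([0.299, 0.587, 0.114])

    out = rgb.copy()
    h, w = luma.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            neighbours = [luma[y - 1, x], luma[y + 1, x],
                          luma[y, x - 1], luma[y, x + 1]]
            contrast = max(neighbours) - min(neighbours)
            if contrast > contrast_threshold:
                # Blend the pixel toward the average of its neighbours.
                # On a jagged polygon edge this smooths the staircase;
                # on one-pixel-wide text strokes it just smears them.
                avg = (rgb[y - 1, x] + rgb[y + 1, x] +
                       rgb[y, x - 1] + rgb[y, x + 1]) / 4.0
                out[y, x] = 0.5 * rgb[y, x] + 0.5 * avg
    return out

# Tiny example: a white text-like stroke on black gets blended (blurred).
img = np.zeros((5, 5, 3))
img[2, 1:4] = 1.0                 # a one-pixel-high "stroke"
print(postprocess_aa(img)[2, 2])  # no longer pure white
```

Real implementations search along the edge direction and use smarter blend weights, which is presumably what AMD keeps tweaking between driver releases - hence the changing blur/aliasing trade-off.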
 
Upgrading from a 570, this is a no-brainer. From anything better than that - even a 580? Not so sure. As for some of the dumb questions here: this ISN'T new-gen Nvidia - it's a reboot of the old gen. It IS faster, quieter and more power efficient than the GE (and cheaper), so it wins that one. However, as most people have had the opportunity to buy the GE for months now, and very few would swap a GE for a 770 for the fractional improvements in performance, it's a card that missed the sales bubble and sits in the clean-up spot at the end of the generation.
November 2013 for the next real Radeon (but probably 6 months later for the AMD jokers to give users decent drivers for it), and March/April 2014 for the 800 series from Nvidia. This is the best-value top-end gamer's card available right now, but how many people who haven't already invested in this generation are going to see it as an upgrade?
 


well, just like you, i use FXAA on most of my games simply for the lower performance hit it offers. it might not be as good as MSAA or some other AA techniques, but i was no AA freak to begin with. so far i'm quite satisfied with it.
 

Cheapest GTX 770 on pcpartpicker: $398.99. Cheapest 7970 GHz: $399.99. And there are non-GHz 7970s for 70 bucks less, and they provide the exact same performance as GHz edition once overclocked.
 


Me too - MLAA really doesn't even come close to FXAA in most games. It blew me away how good a job FXAA does by itself in Arkham City, so I dl'd the injector to use it in Crysis - works wonders.
 
Well, I'm happy for Nvidia that they've managed to bring a competitive card in for $50ish less than the 7970 GHz editions. Sadly, 1) I've never had much luck with Nvidia cards, while I've never had problems with ATI/AMD cards; and 2) non-GHz 7970s can be had for around $400ish (+/-), and in many cases, if not already up to GHz standards, can be pushed to similar performance as the GHz cards.
Hooray for Nvidia, but I'll stick with my Radeons, tyvm.
 