Nvidia GeForce GTX 780 Review: Titan’s Baby Brother Is Born

With regard to resolution, I agree: I'm unhappy with the limited resolution data in this review. 90% of the people out there are using 1920x1080/1200.

I have no interest in 2560x1440. For what I'd spend on a decent one, I'd much rather invest in 5760x1080. For one, I'm done with 60 Hz monitors, and there are no 2560x1440 monitors at 120 or 144 Hz.

All of the other sites I have looked at give varying resolutions:

http://uk.hardware.info/reviews/4419/nvidia-geforce-gtx-780-review-titan-light
http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_780/
http://www.anandtech.com/show/6973/nvidia-geforce-gtx-780-review/8

And if you're gonna include the "overclocked" GHz version of the 7970, it seems to me you should follow suit with an overclocked 680... In fact, why is there never any mention of how far the reviewer was able to get with their own manual overclocks? That's a serious review omission in my view.

I see Guru3D got theirs well above the 900 MHz stock boost clock, to 1100~1176 MHz (22-30%), and TechPowerUp got the base clock to 1059 MHz (22%).
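For anyone wanting to sanity-check those percentages, here's a quick sketch. It assumes the GTX 780's reference clocks (863 MHz base, 900 MHz boost); the overclocked figures are the ones quoted from Guru3D and TechPowerUp above.

```python
# Sanity check of the quoted overclock gains. Assumed reference clocks for the
# GTX 780: 863 MHz base, 900 MHz boost. OC figures are from the reviews cited above.
def oc_gain(stock_mhz: float, oc_mhz: float) -> float:
    """Percentage gain of an overclocked frequency over stock."""
    return (oc_mhz / stock_mhz - 1) * 100

print(f"Guru3D boost OC:     {oc_gain(900, 1100):.0f}% to {oc_gain(900, 1176):.0f}%")
print(f"TechPowerUp base OC: {oc_gain(863, 1059):.0f}%")
```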
 


Funny. I was just thinking the same thing about why sites will always use the 7970 GHz Edition but never go to the 680 FTW or Classified. There seems to be some unwritten rule among review sites that it's OK to use the GHz Edition when talking 7970s while using the vanilla edition for 680s.

I also share your opinion about the 120Hz+ monitors. Any time I try to go back and game on 60Hz monitors, I can't stand it at any resolution. If 120Hz+ monitors came in higher resolutions than 1080p, I'd definitely consider them. Once you start gaming on them, you just can't accept less. It's kept me from going to a 2560x1440. I actually bought one and returned it a day later to go back to my good ole 120Hz at 1080p.

Additionally, another argument for making 1080p the staple: if you go into most retailers, they're carrying 1080p for most larger monitors (23"+) and only a few 1680x1050 and lower on the smaller side. I'd have to say the market is dominated by 1080p. It's all about availability and price.

I'm just thinking 2560x1440 was used exclusively in this review due to time constraints?
 

I agree that it is a bit weird to compare the GHz Edition to the vanilla 680 only. But GHz Edition is at least consistent across all brands, because it comes from AMD. The GTX 680 factory overclocks are specific to each OEM.

As for how far the reviewer could overclock his sample, that can be misleading. Manufacturers can provide cherry-picked samples to reviewers, which means the results are not representative of what people can expect "in the wild." And overclocking has an inherent random component anyway, so even a sample that isn't cherry-picked may not be representative.
 

softplacetoland

When I read microstutter whiners' comments, I can't help but imagine people playing games on AMD cards with their eyes bleeding in some sort of masochistic ecstasy. Is gaming on AMD really that bad? For god's sake, cut the crap.
 

gsxrme

I wanted to see a 1536-core, 384-bit, 3GB card running at GTX 680 speeds of 1100-1200 MHz. That, my friend, would be a true upgrade for us normal people only gaming at 1080p.

What's the percentage of people really using multiple displays or 2560x1440? Extremely low.
 
I'm still not "seeing it" but I'm not a high-end GPU guy (except on the workstation side from time-to-time). The economy may be recovering but not to the extent of selling millions of $650 video cards.

I suspect what nVidia will ultimately try to do is land the GTX770 around $550 or so if it beats up on the Radeon. That should extend the life of the high-end 'Sixes' into Q4 until we see a price-war bloodbath.

 

gadgety

"The GeForce GTX 780 is akin to Core i7-3930K. ...Almost every bit as fast, it costs a lot less and sacrifices very little of the flagship’s feature set (FP64 performance the biggest loss)"

Well, the Titan's 6GB vs the 780's 3GB memory is what a lot of rendering people would miss if they went with the 780.
 

MyNewRig



The difference in Europe between the 7970 GHz and the GTX 780 is much, much wider: 400 euros vs. 700 euros, respectively. That makes the gap a hefty, unacceptable 300 euros for a 15% performance boost at best, across the board. As far as I'm concerned, the 7970 GHz Edition is still the crowned king of value/performance here in Europe until Nvidia gets back to its senses and sets the GTX 780 at 500 euros, VAT and all!
 

By early June, the GTX 760 Ti should effectively come in and drop a big value bomb all over the rest of the top-end 600 series. The next gen's here with a full lineup on the way; time to quit talking about last year's cards like they're still relevant.
 

The GTX 760 Ti will still use last year's architecture. So don't get too excited.
 


That's my understanding, but I'm not on top of it like you guys ... More of a 'tweaking' and 're-badging' than new arch.

Whether it's true or not, I don't know, but my understanding is the GTX770 is a tweaked GTX680 with a new BIOS.



 


shakuvendell is correct! The Titan was going to be the 680; I know this for a fact. However, AMD's 7000 series was less than what Nvidia expected, so they held it back for later and rearranged the 600 series. So truly, the Titan is a 600 series card.
 

Fulgurant



Does it really matter? Whether NVIDIA wants to call it a 600 series card, a post-600-series card, a 700-series card, or a pink pony, the naming and numbering scheme remains arbitrary and ultimately meaningless.

What sets Titan apart from everything else is its price. It's by far the most expensive single-GPU video card on the general-consumer market. Informed consumers can judge for themselves whether Titan's attributes are worth the cost; everything else is just noise.
 


Yeah, sure, it is what it is; there seems to be a market for them, and they do sell. The reason it makes a difference is that these could have been the 680s, and we could have bought them for $500. Then today's 680s would have been the 670s, and so on. Now they're twice the price, since Nvidia didn't feel the need to release them once they reviewed AMD's 7000 series. Now do you understand the point?
 

Yeah, pretty much. So it's an update comparable to what AMD did from the 7970 to the 7970 GHz Edition. It's the same silicon, so it doesn't get any cheaper for Nvidia to produce a given amount of performance, and we shouldn't expect any drastic improvement in performance per dollar.


The GTX 680 (GK104) has 3.54 billion transistors, the Radeon HD 7970 (Tahiti) has 4.31 billion transistors. The GTX Titan (GK110) has 7.08 billion transistors; it was never really meant to compete against the 7970. The GPU was meant for professional graphics cards. AMD could have made a huge GPU too, if they'd wanted to.
 

The GK110 is way too big for Nvidia to sell it at $500.
 


Yeah Nvidia felt the 7970 was no match for it so they held off using it till later. Sure there are professional versions of it however this card was going to be the 680 at one point, that is a fact.
 


Yes, I'm sure that was part of the discussion concerning the placement and pricing of the card. Of course, the 780 is selling for $650, and it's the GK110 except for some stuff turned off, right? Next time I talk to my contacts I'll ask them about that. It sure would have been nice, though, lol.
 

Maybe it was planned as a 680, but then the 680 would have cost a lot more than the 7970. Nvidia and AMD are on the same process node and so each transistor is going to cost roughly the same amount of money to produce. So when the GK110 has 65% more transistors than Tahiti, they're going to need to charge substantially more for GK110-based graphics cards than AMD does for Tahiti-based graphics cards.
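The "65% more transistors" figure can be checked against the counts quoted earlier in the thread (7.08 billion for GK110, 4.31 billion for Tahiti); a quick sketch, which comes out to roughly 64%:

```python
# Transistor counts in billions, as quoted earlier in the thread
gk110_transistors = 7.08   # GTX Titan / GTX 780 (GK110)
tahiti_transistors = 4.31  # Radeon HD 7970 (Tahiti)

extra_pct = (gk110_transistors / tahiti_transistors - 1) * 100
print(f"GK110 carries ~{extra_pct:.0f}% more transistors than Tahiti")
```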
 

Raheel Hasan

g-unit1111 said: "That is one gigantic beast of a video card. Awesome to see Titan performance at a price point that most gamers could actually afford!"

Most gamers' whole PC costs $700.
 

87ninefiveone

Chris, I know this probably won't get read because it's 150 comments down, but it would be interesting to see Tom's start using frequency plots for FPS during a benchmark rather than a simple FPS-vs-time plot. The frequency plot would put FPS on the x-axis and the number of frames rendered at a given FPS on the y-axis. That gives a nice visual indication of the data's mode(s), and of the most common frame rate range during the benchmark.
 

Why not a percentile graph of frame times?
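Both ideas are easy to prototype. A minimal sketch with NumPy, using made-up frame times purely for illustration (the bucket edges and percentile choices are arbitrary):

```python
import numpy as np

# Made-up frame times (ms) from a hypothetical benchmark run -- illustration only
frame_times_ms = np.array([16.5, 16.6, 16.7, 16.7, 16.8, 16.8,
                           16.9, 16.9, 17.0, 17.1, 33.4, 50.1])

# Frequency view: how many frames landed in each FPS bucket
fps = 1000.0 / frame_times_ms
counts, edges = np.histogram(fps, bins=[0, 30, 45, 60, 75])

# Percentile view: the frame time that a given share of frames stays under
p50, p95, p99 = np.percentile(frame_times_ms, [50, 95, 99])

print(f"frames per FPS bucket {edges.tolist()}: {counts.tolist()}")
print(f"50th/95th/99th percentile frame time: {p50:.1f}/{p95:.1f}/{p99:.1f} ms")
```

The percentile view makes the two stutter frames (33 ms and 50 ms) stand out in the tail even though the average FPS looks healthy, which is exactly the argument for it over a plain FPS-vs-time chart.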
 