Nvidia Announcements: GeForce GTX 780 Ti, G-Sync, And More

Status
Not open for further replies.

bison88

Distinguished
May 24, 2009
618
0
18,980
0
How are they even going to make this worthwhile, even if it's just $100 more than the GTX 780? Double the memory? Big whoop. How will it compete with the high-end Titan that is barely 10% faster at stock speeds and costs $1,000? Without a price drop on the GTX 780 and the 780 Ti taking its pricing place, I can't see why anyone would buy it at a higher price when a small tweak on a $650 card already beats a $1,000 one.
 

kulmnar

Distinguished
Dec 15, 2011
310
0
18,860
29
G-Sync is just another name for v-sync (sound familiar?). G-Sync is simply the equivalent of limiting a video card's frame rate to the monitor's refresh rate. This technology has been around for decades. Nvidia must be running out of press release material if it's advertising the re-invention of a technology that has been around for a while.
 

PepitoTV

Honorable
Oct 10, 2013
847
0
11,360
171


It's like this card was born dead. I don't see a reason to get it if it sits between the 780 and the Titan, when the pricing on the 780 will drop and it's well known that you can push a 780 to match its big brother.
 

Mousemonkey

Titan
Moderator


If it has been around for ages why has the module not been fitted to every monitor in existence?
 

WithoutWeakness

Honorable
Nov 7, 2012
311
0
10,810
15


You clearly didn't watch the event or follow any of the live blogs that occurred. V-sync caps your GPU's frame rate to match your monitor's refresh rate (generally 60 Hz) but does nothing to pace the rate at which frames are displayed on your screen. The output from your GPU is dynamic; some frames take more or less time to draw depending on their complexity and the load on the GPU. Your monitor's refresh rate is static and does nothing to compensate for the dynamic rate at which frames are fed from the GPU to the screen.

G-Sync is Nvidia hardware inside the monitor that talks to the GPU and displays frames on the screen at the time they are fed from the GPU, rather than relying on a static clock to determine when to draw frames. G-Sync does not match the GPU's output to the monitor's refresh rate; it does the opposite. It matches the rate at which the monitor displays frames to the speed at which the GPU can render them. This allows the frames that the GPU renders to be displayed as soon as they are ready to be fed to the monitor, eliminating stuttering and tearing from the picture altogether.
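To make the difference concrete, here's a small Python sketch (my own illustration, not Nvidia's implementation): it compares when frames actually appear on screen under a fixed 60 Hz refresh (v-sync style) versus an adaptive refresh that fires the moment each frame is ready (G-Sync style).

```python
# Illustrative sketch: frame display timing under a fixed refresh clock
# versus an adaptive one. Times are in milliseconds.

REFRESH_INTERVAL_MS = 1000 / 60  # fixed 60 Hz scanout clock (~16.7 ms)

def vsync_display_times(render_times_ms):
    """Each frame waits for the next fixed refresh tick after it finishes."""
    display_times = []
    ready = 0.0
    for render in render_times_ms:
        ready += render
        # Round up to the next refresh tick (ceiling division).
        ticks = -(-ready // REFRESH_INTERVAL_MS)
        display_times.append(ticks * REFRESH_INTERVAL_MS)
    return display_times

def adaptive_display_times(render_times_ms):
    """Each frame is scanned out the moment the GPU delivers it."""
    display_times = []
    ready = 0.0
    for render in render_times_ms:
        ready += render
        display_times.append(ready)
    return display_times

# Variable per-frame render times: some frames are heavier than others.
renders = [14.0, 22.0, 17.0, 30.0, 15.0]
print(vsync_display_times(renders))     # frames snap to ~16.7 ms ticks
print(adaptive_display_times(renders))  # frames appear exactly when ready
```

With the fixed clock, a frame that misses a tick waits a whole extra interval (the 22 ms frame displays at ~50 ms instead of 36 ms), which is the stutter v-sync introduces; the adaptive timings simply track the GPU's actual output.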
 

CaptainTom

Honorable
May 3, 2012
1,563
0
11,960
68
I feel like Nvidia's goal is to make all Titan owners regret their purchase more and more. I mean the idea was solid: A super card that acts as a flagship for several generations. But in practice it was just a price-gouging 780 that got beaten in half a year...
 

Mousemonkey

Titan
Moderator


Isn't every flagship card a price gouger that will only be superseded in a matter of time? Or is every AMD/ATi flagship card still top dog?
 

Bondfc11

Honorable
Sep 3, 2013
232
0
10,680
0


You are completely wrong about how G-Sync works. I know because I tested it months ago at Nvidia. This is much more than just limiting rates, as you claim. And if it has been around, why hasn't it been implemented? Why torment gamers with the choice between tearing and low FPS?
 

Bondfc11

Honorable
Sep 3, 2013
232
0
10,680
0
Getting these in the Overlords will be awesome. I know Scribby has been posting about this today; looking forward to an IPS 1440p 120 Hz G-Sync Tempest. That is the gamer's dream. Not sure why TN 1080p is still so widely purchased when 1440p looks so much better.
 

Shankovich

Distinguished
Feb 14, 2010
336
0
18,810
18


Not really. The HD 5970 is still solid, and buyers of the HD 7970 two years ago can play BF4 at 1080P ultra no problem.
 


Did you actually see the live presentation? Did you see how JHH explained the difference between v-sync and G-Sync?
 

WithoutWeakness

Honorable
Nov 7, 2012
311
0
10,810
15

AMD/ATi are always the best. Nvidia always makes overpriced crap. PhysX is stupid and no good games have it.

/sarcasm
 

Mousemonkey

Titan
Moderator


I didn't say they weren't still usable; I said they will be superseded, and that is exactly what should happen with the release of each new generation of architecture, no?
 

aMunster

Honorable
May 30, 2013
9
0
10,520
2
Asus already sells a 144 Hz monitor, the VG248QE. It can be purchased online for about $266. G-Sync adding $100+ to the cost of an identically named monitor is a tough sell. I hope that isn't the case.

Edit: The monitor is the same. You have a choice of buying the G-Sync module and modding it yourself (cool!) or buying it pre-modded.
 

MANOFKRYPTONAK

Distinguished
Feb 1, 2012
952
0
19,060
42
http://hexus.net/tech/news/graphics/61385-plethora-unofficial-radeon-r9-290x-benchmarks-leaked/
Hehe, let the price wars begin. So the 780 Ti is like Gandalf in The Lord of the Rings? "Look to my coming on the first light of the fifth day, at dawn look to the east." The 290X is going to kill the 780 (hopefully) on price and performance, so Nvidia releases the 780 Ti to save the day?
 

bochica

Distinguished
Oct 7, 2009
146
0
18,680
0
@Toddy, I'm in the same boat there.

@Kulmnar, you obviously didn't see the press conference or read this article. They are two different methods of smoothing frame delivery between the GPU and the monitor.
 

ubercake

Splendid
Moderator
I can't wait for Shadowplay, but it seems like similar functionality will be available on the Xbox One far sooner than it will through any Nvidia graphics solution.

They've been talking about Shadowplay for a year or so now. That's all though... Talking.
 