How close is Nvidia's "RTX future"?

Jul 2, 2018
I've never been a hardcore PC enthusiast, but I do like to keep my (mostly gaming) systems on the high end, spending more money today to delay future upgrades. My normal upgrade cycle is about five years.

With the launch of the Turing GPUs, it's pretty clear that I would be better off buying a Pascal GPU instead of Turing. My budget barely reaches the 1080 Ti, which in Turing terms would only be good for roughly a 2070. Is it worth sacrificing the raw performance of the older GPU, or should I spend more money in the hope of future-proofing? DLSS is the only intriguing thing holding me back from going Pascal, since in at least some major titles the performance gains could be substantial. The better case for me would be to get a 1080 and save some money (it currently seems enough for 1440p), but I'm afraid that getting greedy now will result in big performance hits down the line.

So ultimately it comes to this: based on how new features have been introduced in the past, is five years long enough for them to push the older technology (along with its hardware) aside? Or will the time of my next upgrade be a much more logical point to go with ray tracing and DLSS, when the technology is more mature and just becoming the norm?
 

Karadjgne

Titan
Ambassador
Kinda depends on you and your games. My daughter is quite happy playing her FIFA, Minecraft, Sims, Roblox etc. on an old GTX 660 Ti. She has absolutely no need for ray tracing or DLSS, as neither the games nor the monitor have any real support for or need of such. The games I play on my GTX 970 are pretty much the same, devoid of any need for ray tracing etc. So would a 1080 be good? Sure, absolutely. The RTX series wouldn't change my gaming, no matter how high-end my system is. To me, ray tracing and DLSS are just another gimmick; they just add a little more realism and immersion. So unless you feel the absolute need for the 'latest and greatest' hardware that's realistically only going to be applicable to a handful of games, Turing isn't much better than Pascal. The only realistic advantage of Turing is the 2080 Ti in 4K gaming being able to field higher fps on a single card than a 1080 Ti, putting more games at or above the current 60 Hz monitors and smoothing out gameplay.
 
Solution


Thanks, I had my doubts about Pascal at first, but it seems like it's here to stay anyway.

 


I think the same of DLSS, but what still keeps me away is that the game developer needs to add it first (basically leaving you to choose between great performance in some titles, or good performance everywhere at the price of the already great-performing 1080 Ti).
 


Ray tracing might be a gimmick to gamers, but the benefit is very real for game developers. As for DLSS, it is definitely not a gimmick: it is a feature that is supposed to improve your gaming performance, not hamper it the way RT does. So the addition of DLSS is very welcome.
 


I still don't have the full details, but to my understanding game developers barely have to make any changes to integrate DLSS into their games; it is all done by Nvidia. All they need to do is give Nvidia permission to access their game, and Nvidia then trains the DLSS profile themselves. In a way it might be similar to Nvidia's 3D profiles for their 3D Vision tech: many games were not built with stereoscopic 3D in mind, but Nvidia still created profiles for many of those games to be used with 3D Vision, and they did this alone, without any help from the game developers. DLSS should be the same, only this time with more developer input instead of none at all, as it was with 3D Vision.
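The training workflow described above can be sketched as a toy data pipeline. This is purely a conceptual illustration, not Nvidia's actual process; the `render`, `downsample`, and `upscale` functions here are all made up for the example, and a real DLSS profile would involve a neural network trained on thousands of high-resolution frames rather than the naive upscaler measured below.

```python
# Conceptual sketch of per-game "profile training": pair low-resolution
# frames (what the GPU would actually render) with high-resolution
# ground-truth renders of the same scene, then evaluate an upscaler
# against that dataset. All function names are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

def render(scene, res):
    """Stand-in for a game renderer: returns a res x res grayscale frame."""
    y, x = np.meshgrid(np.linspace(0, 1, res), np.linspace(0, 1, res))
    return np.sin(scene * 10 * x) * np.cos(scene * 10 * y)

def downsample(frame, factor):
    """Cheap box filter: averages factor x factor pixel blocks."""
    h, w = frame.shape
    return frame.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

# Build a training set: (low-res input, high-res target) per "scene".
dataset = []
for scene in rng.uniform(0.5, 1.5, size=8):
    hi = render(scene, 64)   # 64x64 "ground truth" frame
    lo = downsample(hi, 2)   # 32x32 frame the GPU would render
    dataset.append((lo, hi))

def upscale(lo, factor=2):
    """Naive nearest-neighbour blow-up; the trained network's job is to
    beat this baseline's error across thousands of such frames."""
    return np.kron(lo, np.ones((factor, factor)))

err = np.mean([(upscale(lo) - hi) ** 2 for lo, hi in dataset])
print(f"mean squared error of the naive upscaler: {err:.4f}")
```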
 

Karadjgne

Titan
Ambassador
My GTX 970 gets anywhere from a solid 60 fps to 300 fps on a 60 Hz, 1080p monitor, at ultra settings and even some 4K DSR when available. That's basically the same situation as the very large percentage of gamers out there on a 1060 3GB or 1060 6GB. So how is DLSS going to improve my performance? It's not. Maybe for the 4K crowd it will, going from a 1080 Ti to a 2080 Ti, but it's still just a gimmick added to a new gen of cards, something that makes them perform better.

Kinda like saying a Ferrari gets better performance than a Mustang simply because it comes with special, patented Ferrari Red paint and the Mustang doesn't. A gimmick. An RTX 2080 Ti IS better than a GTX 1080 Ti; the exact reason why doesn't matter, just like it didn't really matter when comparing a 780 Ti to a 980 Ti.
 


Since you said that, do you even know what DLSS does? And just because you don't play at 4K doesn't make the feature a gimmick. Right now the 2080 (non-Ti) is roughly 5 to 10 percent faster than the 1080 Ti, but with DLSS alone the gap can widen to 20 to 30 percent.
 

Karadjgne

Titan
Ambassador
To me, it really doesn't matter what DLSS actually does. Back in the pre-Maxwell days, it was all about CUDA cores and power. Drop in a Maxwell with fewer CUDA cores and lower power needs, and performance increased considerably, especially on the lower end. Maxwell just made (for whatever reason) much better use of CUDA and power; it could have been called DIGM for all that it mattered. Now with Turing, you've added DLSS and changed the architecture a little from Pascal while keeping roughly the same power needs, so something has increased the performance of the RTX cards over the Pascal GPUs, as is normal (we hope) with a new generation of cards. I say it's a gimmick in the sense of something for salesmen to get behind to promote sales; it's just a name attached to newer and better tech on a newer and better GPU.
 
It's not a gimmick when people can actually benefit from it. I still remember when Nvidia first integrated an FXAA option into their control panel (so you didn't need to inject FXAA manually using tools like SweetFX). It gave me anti-aliasing and improved image quality in many games without sacrificing as much performance as more traditional forms of AA such as MSAA; in fact, I saw FXAA work much better than the in-game AA in some games.
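The reason FXAA is so cheap is that it runs as a post-process on the finished frame, smoothing high-contrast luma edges, instead of rendering extra samples per pixel the way MSAA does. A minimal sketch in the spirit of that idea (this is not the real FXAA algorithm, which uses a more elaborate edge search; the function here is a simplified stand-in):

```python
# Toy post-process AA: measure each pixel's luma contrast against its
# 4 neighbours and, where contrast is high (an "edge"), blend toward
# the neighbourhood average. Cost depends only on frame size, not on
# scene complexity, which is why this class of AA is so cheap.
import numpy as np

def postprocess_aa(luma, threshold=0.2):
    """luma: 2-D array of brightness in [0, 1]; returns a smoothed copy.
    Note: np.roll wraps at the borders, so frame edges also get blended."""
    out = luma.copy()
    up    = np.roll(luma, -1, axis=0)
    down  = np.roll(luma,  1, axis=0)
    left  = np.roll(luma,  1, axis=1)
    right = np.roll(luma, -1, axis=1)
    stack = np.stack([luma, up, down, left, right])
    # Local contrast: range over the pixel and its 4 neighbours.
    contrast = stack.max(axis=0) - stack.min(axis=0)
    edge = contrast > threshold
    out[edge] = stack.mean(axis=0)[edge]
    return out

# A hard vertical edge: left half black, right half white.
img = np.zeros((8, 8))
img[:, 4:] = 1.0
smoothed = postprocess_aa(img)
print(smoothed[0, 3], smoothed[0, 4])  # edge pixels pulled toward grey
```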
 

Karadjgne

Titan
Ambassador
Right. A gimmick. That's Nvidia saying 'buy our cards and say goodbye to the SweetFX hassle; sorry, AMD'. It's not an architectural, basic design improvement in ability; it was an add-on delivered through software/drivers that improved your gameplay, but not really the functionality of the GPU in question.
 


Nah, not quite. FXAA was actually something of an Nvidia response to AMD's MLAA. Both have much less performance impact than traditional AA like MSAA because both are post-processing types of AA. And while development of FXAA kept going as time went by, the same did not happen with MLAA. The funny thing is that the guy who first developed FXAA is now working at AMD (he worked at Nvidia before).

Being part of the GPU hardware is not the important thing here. DLSS offers what TAA has been doing, at a much lower performance cost.
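A quick back-of-the-envelope calculation shows where the savings come from: if the card renders internally at 1440p and upscales to a 4K output, it shades fewer than half the pixels of a native 4K frame. The 1440p internal resolution here is an assumption for illustration, not a published DLSS spec.

```python
# Pixel-count arithmetic: why rendering internally at a lower
# resolution and upscaling to 4K saves shading work per frame.
native_4k = 3840 * 2160
internal  = 2560 * 1440   # assumed internal render resolution
saving = 1 - internal / native_4k
print(f"pixels shaded per frame: {internal:,} vs {native_4k:,}")
print(f"shading work saved: {saving:.0%}")
```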