Nvidia Adds DSR Support For Older GPUs In New Driver

Wouldn't rendering at a higher resolution and then downsampling result in visual anomalies similar to interpolation artifacts? And does this method handle transparency antialiasing as well? I remember reading about it, but I've forgotten the details already.

To be honest 16X FXAA or 2X MSAA usually does the job for me and I'm fine with them. It's always nice to hear that companies are still looking for more efficient approaches to antialiasing.

 
To be honest 16X FXAA or 2X MSAA usually does the job for me and I'm fine with them.

I don't see how you can compare FXAA and MSAA. FXAA is the reason people hate post-processing AA solutions. It looks absolutely horrible, smearing everything, especially text. If FXAA is the only option for AA, I'm much happier without any AA at all.

This is why I'm very happy with the DSR Nvidia is now offering. For many, though, the performance hit DSR entails might just be too big. 2-4x MSAA is, in my opinion, the perfect compromise between quality and performance.
 
I'm just hoping this driver fixes Vision Surround. I had to downgrade from the last driver because Vision Surround couldn't be enabled on three 1920x1080 Alienware monitors at 5760x1080. The prior driver had no issues. Nvidia tech support only said they could not replicate the problem, and that was it.
 
@dovah-chan, the visual anomalies or interpolation issues you may be thinking of are typically problems with upsampling, not downsampling. Furthermore, DSR has an adjustable gaussian scaling filter built in, so it offers superior downsample filtering vs. typical SSAA settings in games, which just use linear interpolation. Any remaining artifacts can easily be eliminated by using a direct multiple of the native resolution (i.e. using DSR to downsample from 3840x2160 to 1920x1080, a 2x2 : 1x1 ratio).
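To illustrate the integer-multiple case, here's a minimal sketch (Python/NumPy, not Nvidia's actual filter; a simple box average stands in for DSR's gaussian): with an exact 2x2 ratio, every output pixel averages the same whole number of rendered samples, so there's nothing to cause scaling artifacts.

```python
import numpy as np

def box_downsample_2x(img):
    """Downsample by exactly 2x per axis by averaging each 2x2 block
    of supersampled pixels (a box filter; DSR's gaussian filter
    additionally weights neighboring samples smoothly)."""
    h, w = img.shape[:2]
    # Split rows and columns into 2x2 blocks, then average each block.
    return img.reshape(h // 2, 2, w // 2, 2, -1).mean(axis=(1, 3))

# A 4x4 "supersampled" grayscale image downsampled to 2x2.
hi_res = np.arange(16, dtype=float).reshape(4, 4, 1)
lo_res = box_downsample_2x(hi_res)
print(lo_res[..., 0])  # each value is the exact mean of one 2x2 block
```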

If you use a non-integer multiplier, some scaling artifacts may still be present after downsampling, but combining DSR's built-in gaussian interpolation filter with in-game FXAA (or better yet, SMAA injected via SweetFX) will almost completely eliminate any issues even then.
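A quick sketch of why non-integer multipliers misbehave (illustrative Python, not driver code): with a 1.5x render scale, output pixel centers fall at fractional positions between source samples, so the filter always has to interpolate; with an exact 2x scale, every center sits symmetrically over a 2x2 block.

```python
# Source-space x coordinate of each output pixel's center for a given
# render scale. Fractional values mean no clean sample alignment.
def source_centers(scale, n=4):
    """Map the first n output pixel centers into source-image space."""
    return [(x + 0.5) * scale - 0.5 for x in range(n)]

print(source_centers(1.5))  # fractional, misaligned positions
print(source_centers(2.0))  # always exactly halfway between two samples
```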

Alpha coverage? This is a full-screen, brute-force AA technique. It effectively anti-aliases the entire image, so every pixel you see on screen, regardless of what it is, has effectively been supersampled and gaussian-filtered.

This is ONLY a good thing.
 
I've not yet updated to the most current driver, but I will say I'm interested to see how beautimus it is even on my lowly 660Ti OC 2GB Power Edition. Given what I've read and seen in the videos from Nvidia, it does look like a very promising tech. I'll get to take full advantage once I get a chance to grab a nice 970 card.
 
@dovah-chan: The fact that any form of post-processing AA is fine for you makes it clear that SSAA isn't really meant for you.

Don't take that the wrong way; it just means you can get away with much less GPU load and still be happy.
 


What I'd also like to try out is a hybrid approach: combining some of the better FXAA and SMAA implementations out there with less than 2x2 DSR; say 1.5x1.5, or even just 2560x1440 vs. native 1920x1080. It won't be as good as true 2x2+ DSR on its own, but it may provide better overall image quality than crappy post-process solutions alone for those of us with good mid-range or slightly older GPUs (GTX 670, OC'ed).

One problem I DO foresee with this DSR solution is that the HUDs in most older games (pre-2013 or so) are fixed sizes, so they'll shrink relative to your monitor as you go higher in overall resolution, instead of using a dynamically scaling, resolution-independent UI that always adheres to your native display resolution. Newer games are doing this in the wake of 1440p and 4K monitors becoming popular, but I doubt older games will be patched to fix this 🙁.
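The shrinkage is easy to work out (a hypothetical 64-pixel HUD element is assumed here for illustration): the game draws the HUD in render-target pixels, but the whole frame then gets squeezed down to the native display.

```python
# Apparent on-screen size of a fixed-pixel HUD element under DSR:
# it scales down by native_height / render_height.
def apparent_hud_size(hud_px, render_h, native_h=1080):
    """On-screen height, in display pixels, of a HUD element drawn
    hud_px tall in the render target."""
    return hud_px * native_h / render_h

for render_h in (1080, 1440, 2160):
    print(render_h, apparent_hud_size(64, render_h))
# At 4x DSR (2160p render) the element appears half its native size.
```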
 
Well, I tried it out just now with Alien: Isolation. I upped the desktop resolution to 4K and then the game's resolution to 4K... it looked good and performed well too.

GTX770 4GB Classified

Was it a big enough difference to keep it on? Not really. Keep in mind this is using a 1080p Samsung LED TV. I think you would need a 4K monitor or TV to see a big difference
 


You're either trolling, stupid or buying the wrong graphics cards.

Show me one GPU that can max out Crysis 3, Metro 2033, or Skyrim with ENB at 1440p at 60fps; there is none; even a 780 Ti or 980 can't. With my GTX 780 I get 32fps in Metro 2033, 42fps in Crysis 3 (2xAA), and 35fps in Skyrim with K ENB, all at 2560x1440. So I don't see how I'm uninformed or stupid; I'm just pointing out the fact that DSR will only burden single cards more when they are already mediocre above 1080p in demanding titles.
 
I agree, single cards will struggle. Like loki said, turn this on and you will see an even bigger performance hit. Sounds like you are just wanting to call someone a troll... I saw a 2x performance hit with DSR. I must have the wrong card... GTX 970.

But really, with 4x MSAA I'm not seeing much difference, and I'd prefer to have high fps in BF4. This might be better for some games.
 
And it works excellently! At the 4.0x DSR setting my 1080p panel renders at 3840x2160!
You need SLI to really enjoy DSR; a single 980 just won't cut it.
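For reference, the DSR factor multiplies the total pixel count, so each axis scales by the square root of the factor; a quick sketch of the math:

```python
import math

def dsr_resolution(native_w, native_h, factor):
    """Render resolution for a DSR factor: the factor multiplies
    pixel count, so each axis scales by sqrt(factor)."""
    s = math.sqrt(factor)
    return round(native_w * s), round(native_h * s)

print(dsr_resolution(1920, 1080, 4.0))   # (3840, 2160)
print(dsr_resolution(1920, 1080, 2.25))  # (2880, 1620)
```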
 


I never said anything about DSR being performance free. I was merely refuting loki's statement that 1440p is unobtainable by ALL single GPU cards.

To cite loki's example games, I've not had any trouble running Metro 2033 at 1440p/60fps with my Titan. Crysis 3 hovered around 57fps. Vanilla Skyrim has run at stupidly high framerates, and even loading up the 6GB of VRAM I have has given some impressive results around the 50fps mark. I have friends with stock and OC'ed 290Xs that similarly aren't having trouble staying above 60fps with all the eye candy turned on. Just because loki can't do it doesn't mean it cannot be done. By the same token, just because I CAN do it doesn't mean everybody can.

And yes, DSR is potentially going to kick performance in the teeth, but if you have free capacity then it shouldn't be an issue. I've had to drop back to 1080p recently while I sort out broken screens, and the Titan is just sat there twiddling its thumbs as it smashes through everything, so I have spare capacity. I'd never turn DSR on once I get my 27" screen fixed up, though.
 


I'm not sure how you have a Titan that defies benchmarks, but a regular Titan cannot max out Metro 2033 or Crysis 3 at max settings (i.e. including PhysX for 2033), let alone Skyrim with an ENB. I didn't even bother mentioning vanilla Skyrim, as that is irrelevant resource-wise. Unless you use a very light ENB you will not hit 60fps on any current card; the overhead for a regular ENB is 50-80% depending on the GPU and ENB preset. I used K ENB as the example because it is one of the more demanding ones.

GTX 780 results are not that far off from the old Titan results:
http://www.eurogamer.net/articles/digitalfoundry-nvidia-geforce-titan-review
Crysis 3 "Titan can't get anywhere near 60FPS at 2560x1440 without some degree of compromise in the quality settings."
Metro 2033 with PhysX on: 29fps

For the 780Ti: http://www.tomshardware.com/reviews/geforce-gtx-780-ti-review-benchmarks,3663-7.html
Crysis 3 with only FXAA: 49.15 fps
Or http://www.eurogamer.net/articles/digitalfoundry-nvidia-geforce-gtx-780-ti-review
Metro 2033: 41.9 fps

That is why I was saying it only makes sense for screenshots rather than gameplay from a single-card perspective. From what I understand, the performance hit from DSR is the same as actually running at that resolution; i.e. 4K downsampled (or downsized, or whatever you call it) to 1920x1080 will make your card work like you're using a 4K monitor. Maxing the most demanding games at 1440p is an SLI situation, while maxing the heavy hitters at 4K is a 3-4 way SLI situation.
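That claim is easy to quantify, assuming shading work scales roughly with pixel count (a common fill-rate approximation, not a measured benchmark):

```python
# Relative GPU shading cost of a DSR render resolution vs. a native
# 1080p display: proportional to the ratio of pixel counts.
def pixel_cost_ratio(dsr_w, dsr_h, native_w=1920, native_h=1080):
    return (dsr_w * dsr_h) / (native_w * native_h)

print(pixel_cost_ratio(3840, 2160))            # 4.0 -- 4x the pixels of 1080p
print(round(pixel_cost_ratio(2560, 1440), 2))  # 1.78 -- the 1440p-class factor
```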
 