[SOLVED] The RTX 30 series - real ray tracing performance, without cheating with frame upscaling and frame interpolation

edytibi

Distinguished
Nov 8, 2014
9
0
18,510
Hello!
I have to upgrade my old Radeon HD 5700 video card. I read a lot of GTX vs. RTX articles and they were all in favor of the RTX cards, so I had an RTX 3070 on my wishlist - until I understood that all the benchmarks showing the performance increase and the exceptional FPS values were made with DLSS enabled, and everybody was thrilled about the image quality and the high framerate. After some reading it was clear that DLSS meant: render low-resolution frames at a low framerate, upscale the frames to a big resolution, and interpolate frames for a high framerate. In my opinion this is cheating, no matter how good and fluid it looks. And from a review of the RTX 4090 here on Tom's Hardware I understood that ray tracing at 4K resolution, for example in Cyberpunk 2077, is usable only with DLSS enabled, otherwise the framerate is too low. Older RTX 30 cards also had low framerates with ray tracing enabled and without DLSS, even at 1920x1080. Did I misunderstand?
My requirements are 1920x1080 resolution at an average framerate of 60 FPS (minimum 30 FPS), as my monitor can't handle more. For 2022 games, is there any video card capable of ray tracing without upscaling and interpolation at my required resolution and framerate? Will the future RTX 4070 be capable of such a thing?

i7 9700KF at 3.6GHz
64GB DDR4 at 2666MHz
Motherboard Asus Prime B360-Plus
PSU Seasonic G12 GC-750

Later edit: Sorry, RTX 30 and 40 series.
 
Last edited:
I think you're misunderstanding a little bit.

Firstly, it's the RTX 30 series and RTX 40 series.
GTX cards no longer exist (besides the 1650 and 1660, which are being phased out).

Now here's the thing:
All new graphics cards from AMD, Nvidia, and even the new Intel cards have ray tracing, but most games do not support it, and in my humble opinion, those that do don't look that much better.

The fastest card on the planet right now is the RTX 4090.
You need to check benchmarks, but I believe it can do Cyberpunk 2077 at 4K 60 fps with medium ray tracing and no DLSS.
Edit: I checked - on the 4K Ultra preset (meaning everything is set to the maximum) with no DLSS, the 4090 gets about 40 fps. There is nothing faster than this, and won't be for at least a few months or maybe years.
With DLSS on Performance mode, you can get about 100 fps using the same preset and resolution, and it looks almost as good.

Now, there's something you need to understand about DLSS.
There are a few versions of it, with the most recent being 3.0
The basic premise of DLSS is this: render the game at a lower resolution and, using AI, upscale it - and it seriously does look 95% as good as the original.
DLSS also has quality levels that you can adjust in-game, from Ultra Performance up to Quality, and those basically change how low the internally rendered resolution is relative to your output resolution.
For example, Quality DLSS might render at 1440p and upscale to 4K, looking the same as native 4K but with 20% more FPS, while Performance might render at 1080p (and Ultra Performance at 720p) and upscale to 4K, looking about 90% (my opinion) as good as native 4K, but with up to 3 times the FPS.
"After some reading it was clear that DLSS meant: render low-resolution frames at a low framerate" - this is not true. Because the game renders at a lower resolution, you already get higher framerates.

The reason it gives you higher FPS is that the GPU is working less hard, since it's actually rendering at a lower resolution - say, 1080p - while what you see is 4K, meaning you get close to the FPS you would get at 1080p but with the sharpness of 4K.
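
To put rough numbers on that, here's a small sketch of the pixel-count arithmetic in plain Python. The per-axis scale factors are the commonly cited ones for the DLSS 2 quality modes - treat them as assumptions for illustration, not an official spec:

Code:
# Rough sketch of how much less work the GPU does at each DLSS quality mode.
# The per-axis scale factors are the commonly cited values (assumed here,
# not an official spec); the point is only the pixel-count arithmetic.
MODES = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

def render_resolution(out_w, out_h, scale):
    """Internal render resolution for a given per-axis scale factor."""
    return int(out_w * scale), int(out_h * scale)

out_w, out_h = 3840, 2160  # 4K output
for mode, scale in MODES.items():
    w, h = render_resolution(out_w, out_h, scale)
    ratio = (w * h) / (out_w * out_h)
    print(f"{mode:>17}: renders {w}x{h} ({ratio:.0%} of the pixels of native 4K)")

Performance mode at a 4K output renders roughly 1920x1080, i.e. about a quarter of the pixels, which is exactly why the framerate lands close to what you'd see at native 1080p.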

As for the frame interpolation you are talking about, this is a new feature released a few days ago along with DLSS 3.0, and it doesn't blend frames the way old TVs used to do; it uses the game's own motion data plus AI to generate an entirely new frame.
Classic interpolation basically mashes the first frame and the second frame together to create frame 1.5, thus adding more smoothness.
DLSS frame generation is not that kind of dumb blending; it builds new frames using incredibly complex calculations, and it does look good - not just "fluid", but essentially normal.
This feature is so new, in fact, that I believe only four or so games actually support it so far, and you are not likely to run into it, Cyberpunk 2077 not being one of them.
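
For contrast, here's what the "mash frame 1 and frame 2 into frame 1.5" style of interpolation looks like in code - a toy NumPy sketch of a plain 50/50 blend, which is emphatically not how DLSS frame generation works:

Code:
import numpy as np

def naive_interpolated_frame(frame_a, frame_b):
    # TV-style "frame 1.5": a plain 50/50 blend of two neighbouring frames.
    # Moving objects end up ghosted because nothing tracks their motion.
    blend = (frame_a.astype(np.float32) + frame_b.astype(np.float32)) / 2
    return blend.astype(np.uint8)

# Two tiny grayscale "frames" with a bright square that moves two pixels right.
frame_a = np.zeros((4, 8), dtype=np.uint8); frame_a[1:3, 1:3] = 255
frame_b = np.zeros((4, 8), dtype=np.uint8); frame_b[1:3, 3:5] = 255
print(naive_interpolated_frame(frame_a, frame_b))

The blend shows the square half-bright in both positions instead of somewhere in between; schemes that use motion data can instead place it where it actually would have been on the intermediate frame.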

We can't know for a fact how fast the RTX 4070 will be, and every single game is WILDLY different in how it performs with RTX.
1080p 60 Hz is now basically the minimum, and I doubt even an RTX 3070 could run most games at 1080p 60 fps with ray tracing on.
Edit: I checked a benchmark, and it seems that at the 1080p Ultra preset (everything set as high as it can go) with no DLSS, the RTX 3070 gets about 70 fps or so in Cyberpunk 2077.


I would also like to add that the RTX 30 series is obviously slower than the RTX 40 series.
 
Solution
Honestly, I'll tell you right now, save your money. Your card is basically still in the performance realm of the RX 6600, 6600 XT, or RTX 3060.

Disclaimer: I do have an RX 6700 XT, but I was using a 1080p 144 Hz display. However, I got a good deal on a 32-inch 1440p 165 Hz screen. Let me tell you…wow! After some tuning, it runs great and I can tell a clear difference between 1080p and 1440p in how sharp things are. For example, I was playing World of Warships, and it's pretty cool when you zoom in on your battleship and look at the cannons - you can see the rifling inside the barrels.

So ray tracing is cool, but honestly, I've seen it a bit on my Xbox Series S - it's neat, but give me higher FPS. If you really want an image quality upgrade, go to 1440p and to at least a 27-inch screen, or even a 32-inch one. In my opinion that upgrade was definitely noticeable and worth it. I think your 5700 can use AMD's FSR technology to do some upscaling if needed, which seems to work decently (Attila: Total War doesn't seem that well optimized imo, so I'm using FSR to upscale it from 1080p).

This is the monitor I got through Newegg's eBay store.

https://www.newegg.com/p/N82E16824012047

But imo a 3070 is a waste of money for 1080p 60 Hz.
 
In my opinion this is cheating
The whole idea of 3D graphics is based on cheating, from the very start. The very first thing a GPU does is cut the amount of processing by dropping all the objects that are not currently visible. Then every step of rendering is more and more cheating. DLSS is just yet another step in a whole chain of cheating. If you have a problem with that, you should maybe stick to 2D graphics.
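
To make that first step concrete, here's a toy sketch of view-frustum culling with bounding spheres - my own simplified illustration, not any particular engine's code (the plane representation and the names are made up for the example):

Code:
import numpy as np

def is_sphere_visible(center, radius, planes):
    # Planes are (normal, d) with inward-facing normals, so a point p is on
    # the visible side of a plane when dot(normal, p) + d >= 0. An object is
    # culled as soon as its bounding sphere lies entirely behind one plane.
    for normal, d in planes:
        if np.dot(normal, center) + d < -radius:
            return False   # completely outside this plane -> never drawn
    return True            # potentially visible -> handed to the renderer

# A cut-down "frustum" of two planes for illustration: everything must be in
# front of the camera (z <= -1) and to the right of x = -10.
planes = [(np.array([0.0, 0.0, -1.0]), -1.0),
          (np.array([1.0, 0.0,  0.0]), 10.0)]

objects = {"behind the camera": (np.array([0.0, 0.0, 5.0]), 1.0),
           "in view":           (np.array([0.0, 0.0, -20.0]), 1.0)}
for name, (center, radius) in objects.items():
    print(name, "->", "draw" if is_sphere_visible(center, radius, planes) else "cull")

Everything that fails this kind of test is simply never shaded, and nobody calls that false advertising.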
 
The whole idea of 3D graphics is based on cheating, from the very start. The very first thing a GPU does is cut the amount of processing by dropping all the objects that are not currently visible. Then every step of rendering is more and more cheating. DLSS is just yet another step in a whole chain of cheating. If you have a problem with that, you should maybe stick to 2D graphics.
Maybe I was not clear enough. If you can't perform real-time ray tracing at native 4K with a usable framerate, without upscaling from a lower resolution, it's false advertising to claim that the video card can do ray tracing at 4K (cheating the customer). Your example of dropping all the objects that are not visible is called "optimizing". And thank you for the suggestion to stick with 2D graphics - it's completely outside the topic of the question.

Thank you all for the replies. I will wait for the RTX 4070 and put ray tracing out of my mind. I'd like to see some global illumination (Mental Ray) in games though.
 
Last edited:
The whole "cheating" thing is utterly bizarre.

By this logic, multiple cores, where single cores weren't getting it done, is cheating. CPUs and GPUs automatically boosting is cheating. Using fans to hide the fact that CPUs can't cool themselves down is cheating. And yes, 3D-like graphics is cheating since they only present the illusion of three dimensions (which is why it was brought up); there's no depth there. CD Projekt Red doesn't disclose to you that you're not actually jumping into V's body after travelling through time!

And, you won't believe this, but I bought some ground beef at the store today and the sell by date is four days from now. But what they don't tell you is their false advertising: you have to also have a refrigerator! I can't believe the grocery store is cheating on their food spoilage.

The nefariousness is even personal. I have some back issues and occasionally, I have to cheat by using artificial combinations of chemicals to trick my body into thinking it doesn't hurt. I even rely frequently on a machine to "upscale" my ability to travel long distances; I didn't really walk 250 miles last week to be in a state two states over, clearly misrepresenting my walking ability to people who saw me in a different state than they normally associate with me.
 
Honestly, I'll tell you right now, save your money. Your card is basically still in the performance realm of the RX 6600, 6600 XT, or RTX 3060.

I believe he has an HD 5700, not RX 5700.
Pretty sure it's a carryover from an old PC; it's from around 2009.
 
As I said, go to 1440p; imo that will be a much bigger image quality boost for you. For other options vs. a 4070 you may also check out the 3090 or 3080 Ti.

Edit… good catch on the HD 5700. I almost forgot about those. I remember how fast they were when they first came out, and not being able to get one.
 
Maybe I was not clear enough. If you can't perform real-time ray tracing at native 4K with a usable framerate, without upscaling from a lower resolution, it's false advertising to claim that the video card can do ray tracing at 4K (cheating the customer). Your example of dropping all the objects that are not visible is called "optimizing".
A lot of the ways computers generate data for human consumption "cheat" in order to save space or computation time.
  • JPEG drops a lot of information in the higher-frequency areas of the image (a rough sketch of that idea follows below).
  • MP3 and other lossy codecs drop a lot of information that you won't hear or won't really notice without serious A-B testing.
  • Video formats since MPEG-2 do things like store only every Nth frame as-is and use things like motion vectors and bi-directional prediction to reconstruct the frames in between.
Compared to, say, PNG, FLAC, and... whatever lossless video format exists, is this considered cheating? And if we simply focus on the resolution aspect itself, even in the 3D rendering pipeline, things are reduced in resolution internally for the sake of performance (things like AO, certain shadows, reflections, etc.). If everything isn't being rendered at the "true" output resolution without upscaling, is it cheating to get there?
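
To make the JPEG point above concrete, here's a toy sketch in plain NumPy: take one smooth 8x8 block, apply the same DCT transform that JPEG is built on, throw away the high-frequency coefficients, and transform back. (This only illustrates the frequency-dropping idea; it has nothing to do with the actual JPEG file format.)

Code:
import numpy as np

N = 8
# Orthonormal DCT-II basis matrix -- the per-block transform JPEG is built on.
k, n = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
C = np.sqrt(2.0 / N) * np.cos(np.pi * (2 * n + 1) * k / (2 * N))
C[0, :] /= np.sqrt(2.0)

# A smooth 8x8 block (a gentle gradient), like most regions of a real photo.
block = np.add.outer(np.linspace(0.0, 100.0, N), np.linspace(0.0, 60.0, N)) + 80.0

coeffs = C @ block @ C.T                   # forward 2-D DCT
u, v = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
keep = (u + v) < 4                         # keep only the low-frequency corner
approx = C.T @ (coeffs * keep) @ C         # inverse DCT back to pixel values

print("coefficients kept:", int(keep.sum()), "of", N * N)
print("max pixel error  :", round(float(np.abs(block - approx).max()), 2))

Even with only 10 of the 64 coefficients kept, the smooth block comes back nearly unchanged - which is exactly the bet lossy formats make about what you will and won't notice.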

Move away from computers and take a look at other products that take advantage of human perception in order to cut the cost of something. If you can't tell the difference, what's the problem?
 
Maybe I was not clear enough. If you can't perform real-time ray tracing at native 4K with a usable framerate, without upscaling from a lower resolution, it's false advertising to claim that the video card can do ray tracing at 4K (cheating the customer). Your example of dropping all the objects that are not visible is called "optimizing". And thank you for the suggestion to stick with 2D graphics - it's completely outside the topic of the question.

Thank you all for the replies. I will wait for the RTX 4070 and put ray tracing out of my mind. I'd like to see some global illumination (Mental Ray) in games though.
DLSS is not cheating in my opinion, since it really does look just as good. And they are not cheating with their performance claims either: while they cannot run Cyberpunk 2077 Ultra at 4K with RTX, they can run F1 2022, Minecraft RTX, Shadow of the Tomb Raider, and Metro Exodus all at 4K Ultra at 60 fps.
Also, is 40 fps not "running"? Some people think 40 fps is more than enough for games like Cyberpunk.

They are not cheating in their claims; they do their best to shine a light on the good qualities of their product while sidelining the bad ones.
It's called marketing.
 
Another thing to add: ray tracing is a really old algorithm - the basic form of it is approaching 50 years old. Rasterization is really just a bunch of hacks and cheating to get image quality to look as good as ray tracing. You could argue that, yes, we've developed rasterization so well that one could make a scene comparison in which the two are indistinguishable. But I still see cases where rasterization falls apart and generates scenes that don't make sense more often than not (a common one being light leaking into places where it shouldn't, and I really get annoyed by SSR with no cube-map fallback).

I think everyone in this thread has gotten a little too upset at OP for saying DLSS is cheating; he thought it was like old-style upscaling and interpolation, so cut him some slack.
In a manner of speaking, DLSS is a form of upscaling and interpolation. OP's argument makes sense, though: can you really call it 4K rendering if you're not rendering at 4K? If you want to be pedantic, no, you can't - like how you have to be specific with some video formats, 480i vs. 480p. But if the end result is indistinguishable from doing it "correctly," does it really matter? I think we agree it doesn't.

The only thing that still needs to be said is that when showing data, the use of DLSS has to be specified. For the most part, NVIDIA and everyone else who tests the cards have been diligent about that. So I don't think NVIDIA is cheating in that sense, because they're clearly saying what they're doing to get there.
 
Please note that I'm not a native English speaker. The enthusiastic reviews misled me into believing that the RTX 30 series can actually render a game scene with ray tracing, at least at 1080p, with a usable framerate. When I found out this is possible only with upscaling (yes, AI; yes, unnoticeable), it pissed me off. Note also that Nvidia cards have inflated prices, at least in my country, way over the launch price, and I was ready to pay a lot of money for a card, justifying the investment by ray-tracing performance without upscaling tricks. Damn marketing.
To all AI-enhanced upscaling fans - would it bother you if the RTX cores could render 4K at a usable framerate and DLSS were not necessary? Would you like to buy a movie advertised as 4K 60 fps and later find out it was originally shot at 1080p 30 fps and then upscaled and interpolated? Would you not think the movie would have looked better if it had either been shot in native 4K at 60 fps or sold correctly as 1080p 30 fps, without raising your expectations too high?
Yes, my mistake that at first I overlooked the DLSS note in the charts. I quite like your digression about beef, but hey, have you heard about the right to have an opinion?
For work I did some projects in 3D Studio Max. Every upscaling algorithm I used after rendering (not AI-enhanced, only Lanczos, bilinear, bicubic) produced poor results compared to rendering at the target resolution. And my Sony X900H TV is advertised as having excellent interpolation algorithms for smooth motion - guess what, it has horrible judder/stutter with any progressive footage that is not 60 fps. More marketing crap. So no upscaling and no interpolation for me.
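
For what it's worth, those classic filters are a couple of lines with Pillow (9.1 or newer); the filenames here are just hypothetical placeholders:

Code:
# Compare the classic (non-AI) upscaling filters on a low-resolution render.
# "render_1080p.png" is a hypothetical file, not a real asset.
from PIL import Image

src = Image.open("render_1080p.png")           # e.g. a 1920x1080 render
target = (3840, 2160)                          # upscale to 4K

for name, resample in [("bilinear", Image.Resampling.BILINEAR),
                       ("bicubic",  Image.Resampling.BICUBIC),
                       ("lanczos",  Image.Resampling.LANCZOS)]:
    src.resize(target, resample=resample).save(f"upscaled_{name}.png")

None of these filters can invent detail that was never rendered; they only smooth or sharpen what is already there, which is why the result looks soft next to a native 4K render.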
 
You're entitled to your opinion. But so are we. And yes, the cheating thing is bizarre because it shows a real lack of understanding of how any of this works. You're almost never seeing raw video or music, everything is packaged, compressed, upscaled, optimized, and so on, at some level of the process.

There are a lot of reasons to dislike Nvidia, but they've been quite upfront about how ray tracing and DLSS work; in fact, they've been bragging about the AI upscaling and proudly touting how good their algorithms for it are, which is the exact opposite of "hiding" something. You've basically arbitrarily declared that something is cheating and then loudly demanded that companies conform to the arbitrary definition you've made. Again, you're free to express that opinion. And anyone else is free to call it out as absolute hogwash. Few would object if you said "I don't like upscaling," but instead you chose "cheating," which is a word with a very negative connotation and significant real-world consequences.
 
To all AI-enhanced upscaling fans - would it bother you if the RTX cores could render 4K at a usable framerate and DLSS were not necessary? Would you like to buy a movie advertised as 4K 60 fps and later find out it was originally shot at 1080p 30 fps and then upscaled and interpolated? Would you not think the movie would have looked better if it had either been shot in native 4K at 60 fps or sold correctly as 1080p 30 fps, without raising your expectations too high?
You know what matters to me?

How it looks to my eyeballs.

Irrespective of any alphabet soup of acronyms it took to get there.
 
To all AI-enhanced upscaling fans - would it bother you if the RTX cores could render 4K at a usable framerate and DLSS were not necessary?
No. Why would it, if it looks indistinguishable from 4K? So here's a test for you: can you tell which one is rendered at native resolution and which one is rendered with DLSS?

[Screenshot: side-by-side comparison of a native-resolution render and a DLSS render]


And let me ask you, as someone who wants the "raw" experience:
  • Does it bother you that lossy music doesn't give you everything the original recording picked up?
  • Does it bother you that most video doesn't encode the color information at the full output resolution? (Most video is encoded with 4:2:2 or lower chroma subsampling - see the sketch after this list.)
  • Does it bother you that most video also doesn't encode every frame, but rather only encodes the entire frame at regular intervals and reconstructs the frames in between with data that takes up less space?
  • Does it bother you that your phone camera dropped a bunch of the high-frequency portions (i.e., sudden color changes) of the image it took?
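
On the chroma subsampling point, here's a toy NumPy sketch of the 4:2:0 idea - made-up planes standing in for a decoded frame, no real codec involved:

Code:
import numpy as np

# Toy 4:2:0-style chroma subsampling: luma is kept at full resolution, while
# the two chroma planes are stored at half resolution in each direction.
H, W = 2160, 3840
rng = np.random.default_rng(0)
Y  = rng.integers(0, 256, (H, W), dtype=np.uint8)   # luma (full resolution)
Cb = rng.integers(0, 256, (H, W), dtype=np.uint8)   # chroma planes, full res
Cr = rng.integers(0, 256, (H, W), dtype=np.uint8)   # (stand-ins, not real video)

# Average each 2x2 block of chroma down to a single sample...
Cb_small = Cb.reshape(H // 2, 2, W // 2, 2).mean(axis=(1, 3)).astype(np.uint8)
Cr_small = Cr.reshape(H // 2, 2, W // 2, 2).mean(axis=(1, 3)).astype(np.uint8)
# ...and blow it back up on display (nearest-neighbour here).
Cb_restored = np.repeat(np.repeat(Cb_small, 2, axis=0), 2, axis=1)

full = 3 * H * W                                    # 4:4:4 sample count
sub  = H * W + 2 * (H // 2) * (W // 2)              # 4:2:0 sample count
print(f"samples stored: {sub / full:.0%} of the 4:4:4 amount")   # -> 50%

Half of the color samples are simply gone before compression even starts, and almost nobody ever notices.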
Would you like to buy a movie advertised as 4K 60 fps and later find out it was originally shot at 1080p 30 fps and then upscaled and interpolated? Would you not think the movie would have looked better if it had either been shot in native 4K at 60 fps or sold correctly as 1080p 30 fps, without raising your expectations too high?
No, because I don't really buy a movie for its resolution. Sure, I'd go for a 4K one if possible, but I'm not going to bemoan the lack of something higher than 480p.

Also, any film crew worth their salt wouldn't rely on the resolution for important details. That is, they're not going to shoot 4K and put something important in an area that's only clear at 4K. They might throw in an Easter egg for those who have the higher resolution, but important details are going to look good in the common formats.

Yes, my mistake that at first I overlooked the DLSS note in the charts. I quite like your digression about beef, but hey, have you heard about the right to have an opinion?
Yes, and you're free to say what your opinion is. And we're free to disagree, give our reasons why, and further question your position on the matter. If it doesn't bother you that most of the content you consume from the internet is lossy-compressed, then it doesn't make sense to me that DLSS bothers you this much when it's essentially in the same category of "optimization."
 
Last edited:
The whole idea of 3D graphics is based on cheating, from the very start. The very first thing a GPU does is cut the amount of processing by dropping all the objects that are not currently visible. Then every step of rendering is more and more cheating. DLSS is just yet another step in a whole chain of cheating. If you have a problem with that, you should maybe stick to 2D graphics.
Wait until he finds out about Rasterization
 
I can say I see both sides, but personally I also prefer to run at native resolution. There's one game or so where I have to use FSR because the game is poorly optimized - if anyone else has played Attila: Total War, you probably know what I'm talking about. Rasterization may be a similar sort of cheat, and I know upscalers can sharpen and all, but imo being able to run at the native resolution still looks better.