News: RTX 4090 Gets Just 16 FPS in Cyberpunk 2077 RT Overdrive Preview

Any ideas on a 12GB 3080 at 1440p or 1080p with DLSS 2?

If it can run at 30-40 fps that would be just fine.
So, the 4090 is getting maybe 16-24 fps at 4K native. Dropping to 1080p native would probably get 3-4 times that performance, and then Frame Generation would give you another ~50%, is my guess (it could be more). If a 4090 can get 60-70 fps at 1080p native without FG, that would probably put a 3080 12GB at roughly half that level of performance. Turn on DLSS upscaling and I'd guess 1080p will be very playable, and 1440p should still break 30 fps.
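To put rough numbers on that guesswork, here's a quick back-of-the-envelope calculation; every factor in it is one of the assumptions above, not a measurement:

```python
# Back-of-the-envelope check of the guesses above. Every factor is an
# assumption from this post, not a benchmark.
fps_4090_4k = (16, 24)     # reported 4K native range from the preview
res_scaling = (3.0, 4.0)   # guessed 4K -> 1080p native speedup
fg_uplift = 1.5            # guessed Frame Generation gain (~50%)
rtx3080_factor = 0.5       # guessed 3080 12GB at ~half a 4090

fps_1080p = (fps_4090_4k[0] * res_scaling[0], fps_4090_4k[1] * res_scaling[1])
fps_1080p_fg = (fps_1080p[0] * fg_uplift, fps_1080p[1] * fg_uplift)
fps_3080 = (fps_1080p[0] * rtx3080_factor, fps_1080p[1] * rtx3080_factor)

print(f"4090 @ 1080p native: {fps_1080p[0]:.0f}-{fps_1080p[1]:.0f} fps")       # 48-96
print(f"4090 @ 1080p + FG:   {fps_1080p_fg[0]:.0f}-{fps_1080p_fg[1]:.0f} fps") # 72-144
print(f"3080 12GB @ 1080p:   {fps_3080[0]:.0f}-{fps_3080[1]:.0f} fps")         # 24-48
```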
 

mhmarefat

Distinguished
Jun 9, 2013
Glad to see billionaire companies are hard at work shoving the "true way of gaming" down our throats. Makes me wonder what Nvidia was doing introducing RT with the 20-series GPUs when, years later, even the 4090 is biting the dust here. Why push this incomplete technology into the gaming industry? The rich are hard at work making themselves even richer.
Don't let these greedy corporations dictate how you should enjoy your games, people, just don't. They are shaping the future of PC gaming for the worse (a rich man's hobby).
 
I wonder if overdrive RT requires an Nvidia card to run because of ties to Nvidia, or if it will be available on AMD. :unsure:

Current ray tracing in Cyberpunk 2077 is actually playable on a 6800 XT, with about 50-60 fps at 1440p using RT Ultra and FSR 2 on Performance, and it doesn't look too bad to be honest. There are some grainy, blocky steam and smoke effects and some other artifacts like ghosting on physics objects, but everything else looks pretty decent, all things considered.
My understanding is that it's still just using the DXR API, though I could be wrong. I know SER is an Nvidia tech, but I don't know that it strictly requires an Nvidia card. I think it's something that gets enabled only if the tech is available on the GPU, and right now only the 40-series supports SER. So FG, SER, and OMM are new features that need Ada Lovelace, and I assume they simply won't be used / supported if you're not on an RTX 40-series or later card.
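Sketching out how that sort of gating typically works (purely illustrative Python with hypothetical names, not actual game or driver code):

```python
# Hypothetical sketch of per-GPU feature gating as described above.
# All names are illustrative; this is not CDPR's or Nvidia's code.
from dataclasses import dataclass

@dataclass
class GpuInfo:
    vendor: str            # "nvidia", "amd", "intel"
    supports_dxr: bool     # baseline DirectX Raytracing support
    is_ada_or_later: bool  # RTX 40-series (Ada Lovelace) or newer

def enabled_features(gpu: GpuInfo) -> dict:
    """Path tracing runs on the common DXR path; Ada-only extras
    (SER, OMM, Frame Generation) switch on only where supported."""
    ada_extras = gpu.vendor == "nvidia" and gpu.is_ada_or_later
    return {
        "path_traced_rt": gpu.supports_dxr,  # any DXR-capable card
        "ser": ada_extras,                   # Shader Execution Reordering
        "omm": ada_extras,                   # Opacity Micro-Maps
        "frame_generation": ada_extras,
    }

# e.g. enabled_features(GpuInfo("amd", True, False)) keeps the DXR path
# on and all three Ada-only extras off.
```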
 

Philomorph

Distinguished
Sep 11, 2010
I'm guessing it won't be playable at any settings on my RTX 2070. I currently get almost-decent frame rates with the old ray tracing half-enabled. Good thing I'll be done playing Cyberpunk before I can save up enough for a 4070 or better - it'll be a moot point I guess :)
 

bkuhl

Distinguished
Dec 31, 2007
Glad to see billionaire companies are hard at work shoving the "true way of gaming" down our throats. Makes me wonder what Nvidia was doing introducing RT with the 20-series GPUs when, years later, even the 4090 is biting the dust here. Why push this incomplete technology into the gaming industry? The rich are hard at work making themselves even richer.
Don't let these greedy corporations dictate how you should enjoy your games, people, just don't. They are shaping the future of PC gaming for the worse (a rich man's hobby).

If this had been the mentality when Quake 3 came out and ran at like 20-30 fps at 800x600, where would we be today? The companies weren't any "poorer" back then; they were still multi-billion-dollar companies.
Software and hardware companies show demos. They make us aspire to own these capabilities in future generations, and they hope developers see where they should be pushing their products over the next 2, 3, or 5 years!

Just because an RTX 4090 running on a $4,000-$5,000-or-more PC looks VERY cool doesn't mean you're not allowed to enjoy the budget gaming rig you threw together for $700-800 to play Cyberpunk. Dude, just enjoy what you have and know that in 3-4 years your budget build will likely be doing what this demo shows today...
 
Feb 21, 2023
Glad to see billionaire companies are hard at work shoving the "true way of gaming" down our throats. Makes me wonder what Nvidia was doing introducing RT with the 20-series GPUs when, years later, even the 4090 is biting the dust here. Why push this incomplete technology into the gaming industry? The rich are hard at work making themselves even richer.
Don't let these greedy corporations dictate how you should enjoy your games, people, just don't. They are shaping the future of PC gaming for the worse (a rich man's hobby).
Yeah! You're right! We should all stick to Atari 2600-level graphics! Don't let the rich man try to make your games look better!

Fight the power!!!111!1!
 

Ar558

Proper
Dec 13, 2022
Those RTX ON/OFF screenshots make me realize how little I care about raytracing. The difference is minuscule and irrelevant. Certainly not worth cutting your FPS in half.
I can hardly tell. This will be just like audiophile stuff, where some arrogant so-and-sos claim they can "see" more than everyone else, so they need a 4090 Ti/5090/6090 etc. to play in this mode. Most people play games, and if you are playing you don't have time to pixel-peep, which is the only real way to tell. When cards are fast enough to do all this and still push triple-digit FPS, I'll consider using it for the marginal difference, but until then I'll leave it to those who have too much money and not enough sense.
 

InvalidError

Titan
Moderator
Glad to see billionaire companies are hard at work shoving the "true way of gaming" down our throats.
Nvidia needs to beat RT into people's heads because it knows that if it fails to get that hype train going, most people will be perfectly fine with whatever semi-recent GPUs they currently have for the next several years.

I can hardly tell. This will be just like audiophile stuff, where some arrogant so-and-sos claim they can "see" more than everyone else, so they need a 4090 Ti/5090/6090 etc. to play in this mode. Most people play games, and if you are playing you don't have time to pixel-peep, which is the only real way to tell.
Unlike audiophoolery, though, you do have the option of "pixel peeping" to actually confirm the difference. With audio, you can hook up a spectrum analyzer, show that two signals are within the noise floor of being exactly identical using a measuring instrument 1000x more accurate than human ears, and people will still insist that one is better than the other.

That said, I agree that the vast majority of eye-candy is of little or no consequence during active gameplay where you don't have time to look for it.
 
Feb 21, 2023
I can hardly tell. This will be just like audiophile stuff, where some arrogant so-and-sos claim they can "see" more than everyone else, so they need a 4090 Ti/5090/6090 etc. to play in this mode. Most people play games, and if you are playing you don't have time to pixel-peep, which is the only real way to tell. When cards are fast enough to do all this and still push triple-digit FPS, I'll consider using it for the marginal difference, but until then I'll leave it to those who have too much money and not enough sense.
I disagree with the audiophile comparison and the last four words of your post. Truly, fully ray-traced games are the "holy grail" of photorealism. The fact that you can "hardly tell" is just a matter of how much work developers have put into faking it. Once the hardware eventually catches up, it will be affordable for everyone. Until then, there's nothing stopping you from continuing to play with lower realism settings. You just sound bitter and whiny. I remember back when the first 16:9 widescreen TVs were over $20k. They have to start somewhere or progress won't ever come.

Stop complaining about the "rich gamers" instead of thanking them for funding your future of better affordable realism in games.
 

InvalidError

Titan
Moderator
Once the hardware eventually catches up, it will be affordable for everyone.
At the rate sub-$300 GPUs have been improving since the GTX 1050 (double the performance every five years), and given how the RTX 4090 is struggling with full RT, the days of full RT for everyone are 15-20 years away, assuming "entry-level" progress doesn't slow down even further.
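Penciling that out, assuming a sub-$300 card today is roughly a quarter of a 4090 and 60 fps is the bar (both my assumptions):

```python
import math

# Rough check of the 15-20 year figure. Both inputs are assumptions:
# a sub-$300 card today at roughly a quarter of a 4090, and 60 fps as
# the target where the 4090 currently manages ~16 fps at 4K native.
entry_fraction = 0.25                          # assumed entry-level share of 4090 performance
speedup_needed = (60 / 16) / entry_fraction    # ~15x
doublings = math.log2(speedup_needed)          # ~3.9
years = doublings * 5                          # at 2x every five years
print(f"~{speedup_needed:.0f}x needed -> {doublings:.1f} doublings -> ~{years:.0f} years")
```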

Faking it will be necessary for a very long time to come.
 

DougMcC

Commendable
Sep 16, 2021
At the rate sub-$300 GPUs have been improving since the GTX 1050 (double the performance every five years), and given how the RTX 4090 is struggling with full RT, the days of full RT for everyone are 15-20 years away, assuming "entry-level" progress doesn't slow down even further.

Faking it will be necessary for a very long time to come.

That assumes RT improvement roughly tracks overall performance growth. But RT is a new feature in the early part of its growth curve. If games are heading in this direction, RT can consume much more of each new transistor budget and improve at a much faster pace. I would not be surprised to see the 5-series jump 5x.
 

vampz

Distinguished
Mar 29, 2013
Glad to see billionaire companies are hard at work shoving the "true way of gaming" down our throats. Makes me wonder what Nvidia was doing introducing RT with the 20-series GPUs when, years later, even the 4090 is biting the dust here. Why push this incomplete technology into the gaming industry? The rich are hard at work making themselves even richer.
Don't let these greedy corporations dictate how you should enjoy your games, people, just don't. They are shaping the future of PC gaming for the worse (a rich man's hobby).

Nothing in this test is shoving a "true way of gaming" down anyone's throat; are you daft?
Is 4K with no DLSS "true gaming"? I guess everyone running 1080p at high settings and a high refresh rate isn't a true gamer!

Stick to surfing and less posting, goof.
 

InvalidError

Titan
Moderator
If games are heading in this direction, RT can consume much more of each new transistor budget and improve at a much faster pace. I would not be surprised to see the 5-series jump 5x.
Throwing more transistors at RT means also throwing more die area at it on top of everything else that gets scaled up, which is going to get absurdly expensive very fast if you aim for a 5X gen-on-gen improvement.

Current word is 2-2.6x for the 5k-series. Given that Nvidia's previous 2.x-times claims from the 3k- to 4k-series included DLSS 3 frame generation on top of something closer to a 50% improvement in raw throughput, I suspect the preliminary 5k-series projections are more of the same. Hopefully without the same-tier price increases this time.
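As a quick sanity check on that reading (both multipliers assumed, not Nvidia's numbers):

```python
# Sanity check: reading the marketed 2-2.6x as raw gain times frame
# generation. Both multipliers below are assumptions, not Nvidia figures.
raw_uplift = 1.5                 # assumed ~50% raw throughput gain
fg_uplift = (1.4, 1.7)           # assumed Frame Generation multiplier range
combined = (raw_uplift * fg_uplift[0], raw_uplift * fg_uplift[1])
print(f"{combined[0]:.1f}x-{combined[1]:.2f}x")  # ~2.1x-2.55x, near the 2-2.6x claims
```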
 
Feb 21, 2023
At the rate sub-$300 GPUs have been improving since the GTX 1050 (double the performance every five years), and given how the RTX 4090 is struggling with full RT, the days of full RT for everyone are 15-20 years away, assuming "entry-level" progress doesn't slow down even further.

Faking it will be necessary for a very long time to come.
Maybe so, but that doesn't change the fact that I can get a flat panel TV today for $300 that blows the original $20k ones away. Like I said before, progress takes time, but you have to start somewhere.

If someone had written off Pong as a waste, we'd never have all the nice things we have now.
 

InvalidError

Titan
Moderator
Maybe so, but that doesn't change the fact that I can get a flat panel TV today for $300 that blows the original $20k ones away. Like I said before, progress takes time, but you have to start somewhere.
The technology for making large-format TVs is still getting cheaper because there is plenty of room to improve a process dealing with micron-sized structures. The technology for making higher-density chips with practically atomic-scale structures is getting more expensive each year, faster than density gains can offset transistor-count increases, and we are hitting practical limits on some critical structures such as SRAM. Consumer computing may be about to hit a brick wall where future performance scales in one dimension only: how much you are willing to spend to throw more silicon at whatever it is you want to do.
 

DavidLejdar

Prominent
Sep 11, 2022
Some are presenting realism as a feature. But is in-game jumping and splashing in puddles realistic? Hardly ever. And then there is the topic of acoustics, where there rarely seems to be any attempt to model sound according to the environment it occurs in.

Not that I personally expect total realism in video games. For example, in "AER: Memories of Old" I like the (cartoonish) art style. I just mean to point out that there is more to physical realism than how the reflections of a glass of water in sunlight are depicted.
 
The technology for making large-format TVs is still getting cheaper because there is plenty of room to improve a process dealing with micron-sized structures. The technology for making higher-density chips with practically atomic-scale structures is getting more expensive each year, faster than density gains can offset transistor-count increases, and we are hitting practical limits on some critical structures such as SRAM. Consumer computing may be about to hit a brick wall where future performance scales in one dimension only: how much you are willing to spend to throw more silicon at whatever it is you want to do.
Not wrong, but there is another path: optimization. I'm pretty sure we could get twice the performance (at least) out of current silicon and processes if they were better utilized. Better programming would be one thing; dedicated compute units and fewer bottlenecks would be another.
 