News: RTX 4090 Gets Just 16 FPS in Cyberpunk 2077 RT Overdrive Preview


s997863

Distinguished
Aug 29, 2007
143
10
18,685
"Overdrive is the new Crysis"
Except that Crysis 1 was a fun game, at least until the aliens showed up. Enough so to get the highest score on PC Gamer. Cyberpunk is not.
It would be more apt to say "Overdrive is the new PhysX or DirectX 10", i.e. "not worth ANY performance hit, so just turn it off".
 
Feb 21, 2023
9
22
15
The technology to make large-format TV is still getting cheaper because there is plenty of room to improve a process that is dealing with micron-sized structures. The technology to make higher-density chips with practically atomic-scale structures is getting more expensive each year faster than density can offset transistor count increases and we are hitting practical limits on some critical large structures such as SRAM. Consumer computing may be about to hit a brick wall where future performance will scale in one dimension only: how much you are willing to spend to throw more silicon at whatever it is you want to do.
Moore may be dead, but his law isn't quite yet. There is plenty of room for improvement in process yields, MCMs, etc. If you or I knew more than the folks at NVIDIA, then we'd be competing against them. They obviously have plans for the future of this tech.

As for how much money I'm willing to spend, people have become spoiled over the price of electronics lately. Back in 1993, my 486DX2 66 cost me over $4000. Adjusted for inflation, that's more than $8000 today.

The thing is that you don't NEED the bleeding-edge features to enjoy games. So what if there's a 120" 8k full microLED monitor and a $15k GPU that can pump out fully-raytraced games at that resolution? You probably can't afford a Bugatti either. It doesn't stop you from enjoying the things you can afford.

PC gaming has always been an expensive hobby. I happen to be in a position in my life where I can afford it. For the rest, there's the upcoming RTX4060 and medium quality settings at 1080p.
 
  • Like
Reactions: KyaraM
Moore may be dead, but his law isn't quite yet. There is plenty of room for improvement in process yields, MCMs, etc. If you or I knew more than the folks at NVIDIA, then we'd be competing against them. They obviously have plans for the future of this tech.

As for how much money I'm willing to spend, people have become spoiled over the price of electronics lately. Back in 1993, my 486DX2 66 cost me over $4000. Adjusted for inflation, that's more than $8000 today.

The thing is that you don't NEED the bleeding-edge features to enjoy games. So what if there's a 120" 8k full microLED monitor and a $15k GPU that can pump out fully-raytraced games at that resolution? You probably can't afford a Bugatti either. It doesn't stop you from enjoying the things you can afford.

PC gaming has always been an expensive hobby. I happen to be in a position in my life where I can afford it. For the rest, there's the upcoming RTX4060 and medium quality settings at 1080p.
True - but a lot of that cost reduction comes from economy of scale. Also, at the time a 486 was $4K, you could still perfectly enjoy a 386 for a third of the price, even though it had a quarter of the top performance. Heck, you could still use a 286 comfortably - you just couldn't play Doom!
The problem is that games have gotten extremely bloated, and operating-system optimization has gone out the window in the consumer space. Remember how many people complained that Windows 95 on their 486 required 8 MB of RAM instead of the 4 they usually had? And that the system was slower than DOS + Win3.11? Yeah, same crap we have today. People are focusing on the (extremely expensive) top range, when the mid-range is perfectly good enough.
 
D

Deleted member 1353997

Guest
Glad to see billionaire companies are hard at work to shove "true ways of gaming" down our throats.
Is that what you said when gaming transitioned from 2D to 3D? Or when anti-aliasing was introduced? Or when we got shaders? HDR rendering? Tessellation? Subsurface scattering?

If the rapid evolution of PC gaming graphics disturbs you so much, why not just buy a console and be done with it? You can get a PS5, Xbox Series X, or Switch for far less than an RTX 4090. People who spent that kind of money obviously want to see a return on their investment. Besides, nobody really needs an RTX 4090. Cyberpunk 2077 will run on weaker GPUs if you're willing to lower the graphics. And if you're patient enough, it will run in Overdrive at 4K with acceptable performance on your budget RTX 7050 when it releases in ~6 years.

What are you complaining about? That you don't get to enjoy new technologies that improve graphics quality without having to spend money on a new GPU? Welcome to the real world.
 

Ar558

Proper
Dec 13, 2022
228
93
160
I disagree with the audiophile comparison and the last four words of your post. True, fully ray-traced games are the "holy grail" of photo-realism. The fact that you can "hardly tell" is just a matter of how much work developers have put into faking it. Once the hardware eventually catches up, it will be affordable for everyone. Until then, there's nothing stopping you from continuing to play with lower realism settings. You just sound like you're bitter and whiny. I remember back when the first 16:9 widescreen TVs were over $20k. They have to start somewhere or progress won't ever come.

Stop complaining about the "rich gamers" instead of thanking them for funding your future of better affordable realism in games.

While the audiophile comparison isn't exact, it is analogous: if developers can "Fake it" and produce way better performance, why would we need the real thing (or at least care about it until it can be done at high FPS)? I'm not bitter, but rich gamers have ruined the GPU market by paying Nvidia's inflated prices. If the max anyone would pay for a GPU were $800 rather than $2k, then the 60 and 70 class cards would still be affordable. But the rich don't care that they are f**king the whole thing up for everyone else.
 
  • Like
Reactions: mhmarefat

mo_osk

Reputable
Nov 13, 2020
33
16
4,535
Those RTX ON/OFF screenshots make me realize how little I care about raytracing. The difference is minuscule and irrelevant. Certainly not worth cutting your FPS in half.


I understand why you're saying that, but you're missing out on what ray tracing offers. First of all, you can't really see what it brings in a still image; the dynamic real-time lighting and shadowing the technique allows isn't going to look obviously different without motion.

Secondly, there are a lot of techniques that give a good-enough approximation of what ray tracing makes possible while using simple rasterization. The problem is that it demands a lot of work, and most studios aren't capable of outputting something that looks like Cyberpunk 2077. Or at least not at this scale.

What widespread hardware ray tracing capability would do for the video game industry is completely change the way levels are built; the number of artists and developers who have to work on a scene to make it look plausibly lit would change dramatically. Not all innovation is about how it looks; sometimes it's about how it's made. But that's harder to sell.
 
D

Deleted member 1353997

Guest
if developers can "Fake it" and produce way better performance, why would we need the real thing
Because it's expensive for the production side.

In the simplest case, you have static lighting. This means the light sources will never move or change. In that case, you can just use ray tracing to bake the lighting and ship that information with the game. That baked data increases the size of the game, costs the level designer time, and the bake has to be redone every time you make a change in the level.

But in some cases you need dynamic lighting. What if you have multiple lightbulbs that can be turned on or off individually?

In this case, level designers have to spend countless hours placing and adjusting additional light sources so the scene looks somewhat realistic when a certain light bulb is on or off. Why? Because that's how you get indirect lighting. It's the reason why your room isn't pitch black during the day, even though the only source of light is a single small window. The light that comes in through the window bounces off your floor and walls and illuminates parts of your room that aren't directly illuminated by your window.
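To make the bounce idea concrete, here's a minimal sketch in Python (a toy example invented for illustration; the 2D scene and all names are hypothetical, not code from any engine) of how a bake can accumulate direct light plus one bounce of indirect light per lightmap texel:

```python
import random

# Toy 2D "room": a strip of wall texels lit by a single window.
# Purely illustrative; a real bake traces millions of rays in 3D.
WINDOW_POS = (0.0, 1.0)   # window in the top-left corner
FLOOR_Y = 0.0             # the floor the light bounces off
WALL_X = 1.0              # the far wall we are baking
N_TEXELS = 8              # lightmap resolution along the far wall
N_BOUNCE_RAYS = 256       # indirect samples per texel
FLOOR_ALBEDO = 0.5        # fraction of light the floor re-emits

def direct_light(p):
    """Inverse-square falloff from the window; no occluders in this toy."""
    d2 = (p[0] - WINDOW_POS[0]) ** 2 + (p[1] - WINDOW_POS[1]) ** 2
    return 1.0 / (1.0 + d2)

def bake_wall_lightmap():
    """Bake direct light plus one floor bounce for each wall texel."""
    lightmap = []
    for i in range(N_TEXELS):
        texel = (WALL_X, (i + 0.5) / N_TEXELS)  # point on the far wall
        direct = direct_light(texel)
        # One bounce: sample random points on the floor, which re-emit
        # part of the direct light they receive toward the wall texel.
        indirect = 0.0
        for _ in range(N_BOUNCE_RAYS):
            floor_pt = (random.random(), FLOOR_Y)
            bounce = FLOOR_ALBEDO * direct_light(floor_pt)
            d2 = (texel[0] - floor_pt[0]) ** 2 + (texel[1] - floor_pt[1]) ** 2
            indirect += bounce / (1.0 + d2)
        lightmap.append(direct + indirect / N_BOUNCE_RAYS)
    return lightmap

# The baked values ship with the level; move the window or repaint the
# floor and the whole bake must be redone -- exactly the cost described above.
print([round(v, 3) for v in bake_wall_lightmap()])
```

The point of the sketch is the bounce loop: indirect light is just direct light re-emitted from other surfaces, and baking it offline is what hardware ray tracing would let developers skip.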

This Digital Foundry video about the Metro Exodus Enhanced Edition explains this pretty well:
View: https://www.youtube.com/watch?v=NbpZCSf4_Yk&t=1376s
 
Last edited by a moderator:

PEnns

Reputable
Apr 25, 2020
609
644
5,760
Once the hardware eventually catches up, it will be affordable for everyone.

Stop complaining about the "rich gamers" instead of thanking them for funding your future of better affordable realism in games.

LOL, thanks for the good laugh!!

The good old "drizzle-down economy" foisted on the gullible by RayGun is alive and well.
 
D

Deleted member 2838871

Guest
Is that what you said when gaming transitioned from 2D to 3D? Or when anti-aliasing was introduced? Or when we got shaders? HDR rendering? Tessellation? Subsurface scattering?

If the rapid evolution of PC gaming graphics disturbs you so much, why not just buy a console and be done with it? You can get a PS5, Xbox Series X, or Switch for far less than an RTX 4090. People who spent that kind of money obviously want to see a return on their investment. Besides, nobody really needs an RTX 4090. Cyberpunk 2077 will run on weaker GPUs if you're willing to lower the graphics. And if you're patient enough, it will run in Overdrive at 4K with acceptable performance on your budget RTX 7050 when it releases in ~6 years.

What are you complaining about? That you don't get to enjoy new technologies that improve graphics quality without having to spend money on a new GPU? Welcome to the real world.

Exactly. It's the world we live in today.

I once bought a flagship card (1080 Ti) for a retail price of $699.99... and just paid $1749.99 for a 4090. Did I need it? Of course not. But I justified it by the fact that the 4090 is a pretty darn impressive generational leap over the 3090.
 

colossusrage

Prominent
Jun 8, 2022
55
60
610
Many don't seem to understand that RT is coming whether they like it or not. Developers will adopt the technology as the standard so they can stop using baked-in lighting and shadows, which take up a lot of time. It may not happen for two or more GPU generations, but RT will eventually become the standard lighting, with no option to turn it off. Due to the lack of hardware that can run full RT, developers are left to implement RT only halfway, which many times doesn't improve graphics to a noticeable extent. Scream all you want about how RT doesn't make a difference; you will eventually see it's no different from any other technology that has been introduced and become standard over the past 30 years.
 

colossusrage

Prominent
Jun 8, 2022
55
60
610
"Overdrive is the new Crysis"
Except that Crysis 1 was a fun game, at least until the aliens showed up. Enough so to get the highest score on PC Gamer. Cyberpunk is not.
It would be more apt to say "Overdrive is the new PhysX or DirectX 10", i.e. "not worth ANY performance hit, so just turn it off".
I actually enjoyed both. Crysis was probably more fun overall, but CP2077 really wowed me with how much detail and scale the game had. I haven't played it in over a year, but I'm looking forward to going back to it now that I have better hardware. BTW, I played Crysis Remastered with a 3080 12GB at 1440p on "Can it Run Crysis" mode and still only got 45 fps. That game is ridiculous.
 

InvalidError

Titan
Moderator
Moore may be dead, but his law isn't quite yet. There is plenty of room for improvement in process yields, MCMs, etc. If you or I knew more than the folks at NVIDIA, then we'd be competing against them. They obviously have plans for the future of this tech.

As for how much money I'm willing to spend, people have become spoiled over the price of electronics lately. Back in 1993, my 486DX2 66 cost me over $4000. Adjusted for inflation, that's more than $8000 today.
The difference between the 486 30 years ago and now is that back in the 486's day, the cost of putting transistors on silicon was halving every ~18 months, so we got chips about twice as complex and 2-3X as fast at the same or lower prices every ~1.5 years despite inflation. Today, the cost of transistors is almost flat, and we get the current situation of mostly stagnant performance per dollar, albeit at currently still inflated prices. AMD and Nvidia see the writing on the wall: buyers' next GPU is likely to be their last for a while, and they want to wring every dollar they can out of it.
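To put rough numbers on that (an illustrative sketch only; the cost figures are made up, not actual foundry pricing):

```python
# Illustrative only: cost per transistor under the old ~18-month
# halving regime vs. today's nearly flat curve.
def cost_per_transistor(start_cost, years, halving_months):
    halvings = (years * 12) / halving_months
    return start_cost / (2 ** halvings)

start = 1.0  # arbitrary starting cost unit
for years in (1.5, 3.0, 4.5, 6.0):
    old = cost_per_transistor(start, years, 18)  # 486-era scaling
    print(f"after {years} yrs: old regime {old:.3f}x, flat regime ~1.000x")

# Old regime: double the transistors for the same money every ~18 months.
# Flat regime: double the transistors ~= double the silicon cost, which
# is why performance per dollar stagnates.
```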
 
D

Deleted member 2838871

Guest
CP2077 really wowed me with how much detail and scale the game had. I haven't played it in over a year, but I'm looking forward to going back to it now that I have better hardware.

It's funny you mention that because that's exactly what I just did. I've never played it... but did purchase and install it yesterday evening when I was setting up my PC. I did a system wipe when I set up the new CPU and GPU to get rid of the crap I never used and add new stuff.

I remember reading all the complaints about performance when CP2077 came out, which is why I've waited this long to pull the trigger. Looking forward to diving right into it.
 
D

Deleted member 2838871

Guest
The difference between the 486 30 years ago and now is that back in the 486's day, the cost of putting transistors on silicon was halving every ~18 months, so we got chips about twice as complex and 2-3X as fast at the same or lower prices every ~1.5 years despite inflation. Today, the cost of transistors is almost flat, and we get the current situation of mostly stagnant performance per dollar, albeit at currently still inflated prices. AMD and Nvidia see the writing on the wall: buyers' next GPU is likely to be their last for a while, and they want to wring every dollar they can out of it.

You're not wrong. I ran my 7700k/1080 Ti system from 2017 to 2021... then went to a 10900k/3090 rig, and yesterday upgraded the same rig with an 11900k/4090. Can't do much else without building an entirely new PC, which I don't plan on doing till 2027 or 2028.
 
Glad to see billionaire companies are hard at work to shove "true ways of gaming" down our throats. Makes me wonder what was nvidia doing with introduction of RT with 20 series GPUs when years later even 4090 is biting dust here? Why enter this incomplete technology into gaming industry? The rich are hard at work to make themselves even richer.
Don't let these greedy corporations dictate how you should enjoy your games people, just don't. They are shaping the future of PC Gaming for the worse (rich man's hobby).

It never ceases to amaze me how whiny some people get on this forum every time Nvidia introduces a new technology, feature, or GPU.

Don't like ray tracing? Don't use it! Don't like how expensive the new GPUs are? Don't buy them! You hate Nvidia? Don't do business with them!

For crying out loud, if you don't have anything helpful to say, what's the point of clogging up the forum with "booo, Nvidia bad!!!!!!"
 

hannibal

Distinguished
Glad to see billionaire companies are hard at work to shove "true ways of gaming" down our throats. Makes me wonder what was nvidia doing with introduction of RT with 20 series GPUs when years later even 4090 is biting dust here? Why enter this incomplete technology into gaming industry? The rich are hard at work to make themselves even richer.
Don't let these greedy corporations dictate how you should enjoy your games people, just don't. They are shaping the future of PC Gaming for the worse (rich man's hobby).

They start selling their next-gen solutions that can actually run this ;)
It is business as usual, you see!

Well, FLR 3.0 will help owners of older-tech cards somewhat. But in reality, real ray tracing is still years away! We are moving toward it little by little, though.
 

InvalidError

Titan
Moderator
Well, FLR 3.0 will help owners of older-tech cards somewhat.
The problem with DLSS3 (or any other frame generation tech) when the base frame rate is under 40fps is that you have a good amount of time to notice artifacting and prediction errors, especially if those anomalies occur in bursts across multiple generated frames.
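The arithmetic behind that, as a quick sketch (assuming interpolation-style frame generation that inserts one generated frame between each pair of rendered frames):

```python
# How long a generated (and possibly artifact-laden) frame stays on
# screen, assuming one interpolated frame per pair of rendered frames.
for base_fps in (20, 30, 40, 60, 120):
    output_fps = base_fps * 2           # frame generation doubles frame rate
    frame_time_ms = 1000 / output_fps   # display time of each frame
    print(f"base {base_fps:>3} fps -> output {output_fps:>3} fps, "
          f"each generated frame visible ~{frame_time_ms:.1f} ms")

# Below a 40 fps base, a generated frame sits on screen for 12.5+ ms --
# long enough for prediction errors to register, especially when several
# bad frames arrive in a row.
```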
 

husker

Distinguished
Oct 2, 2009
1,209
221
19,670
It cannot be overstated how important it is to first make an engine like this that can do a fully ray-traced game. It is needed to spur on the development of mathematical techniques that will potentially increase performance without requiring multiple generations of hardware to come and go. On a planet without mountains, who's going to manufacture mountain climbing gear?
 

elforeign

Distinguished
Oct 11, 2009
97
130
18,720
Those RTX ON/OFF screenshots make me realize how little I care about raytracing. The difference is minuscule and irrelevant. Certainly not worth cutting your FPS in half.

While the hardware necessary to play with raytracing is very expensive, the argument that the difference is minuscule and irrelevant is just a flat-out lie.

Raytracing looks fantastic for the elements it's supposed to highlight, and it's doubly amazing that we have gotten to the point where such graphics can be enjoyed in real time, rather than in a pre-rendered sequence.

It's like hating on the invention of sliced bread; some people just want to eat a sandwich.
 
Last edited:

mhmarefat

Distinguished
Jun 9, 2013
41
42
18,560
It's amazing how RT in its current form is looked upon as the "evolution" of PC gaming graphics when in reality, so far, exactly the opposite has happened for many titles (including big titles like WITCHER 3). Thanks to DLSS and FSR (offspring of RT), new games are terribly optimized, because "economically" it does not make sense for developers to spend resources on optimization when people will just use ever-improving upscaling features and won't know, and won't even notice, what is happening underneath.

Another mind-boggling thing is how people ignore the elephant in the room: RT is tanking the performance of MANY GENERATIONS OF GRAPHICS CARDS for little to zero (questionable) visual improvement, and AFTER YEARS it has finally ended up at 16 fps on a 4090, yet such a technology must be accepted as the way forward? Why is such unfinished technology out of the labs, heavily marketed and shipped as game patches and FREE GAMES no less, when it should just be a tech demo? Who is the target audience for this? Morons who buy a 4090 only to satisfy a pathetic sense of "superiority" over "lowly" console gamers? Is this the target audience of the tech industry these days?

Is that what you said when gaming transitioned from 2D to 3D? Or when anti-aliasing was introduced? Or when we got shaders? HDR rendering? Tessellation? Subsurface scattering?

If the rapid evolution of PC gaming graphics disturbs you so much, why not just buy a console and be done with it? You can get a PS5, Xbox Series X, or Switch for far less than an RTX 4090. People who spent that kind of money obviously want to see a return on their investment. Besides, nobody really needs an RTX 4090. Cyberpunk 2077 will run on weaker GPUs if you're willing to lower the graphics. And if you're patient enough, it will run in Overdrive at 4K with acceptable performance on your budget RTX 7050 when it releases in ~6 years.

What are you complaining about? That you don't get to enjoy new technologies that improve graphics quality without having to spend money on a new GPU? Welcome to the real world.
Did anti-aliasing take years to develop only to end up offering 16 fps on $2k+ graphics cards? And what do you have to say about technologies such as chromatic aberration, which the majority don't like yet are imposed upon them anyway? And how can you use the word "RAPID" to describe the advancement rate of RT?

RT should've stayed in the labs until common sense decided it was ready. Yet because of corporate power, it exists NOW in the industry, offering very little, asking for a lot (money and power), fooling everyone, and benefiting basically no one but the vanity of the rich, the foolish, and the corporate slave.
 

PlaneInTheSky

Commendable
BANNED
Oct 3, 2022
556
759
1,760
While the hardware necessary to play with raytracing is very expensive, the argument that the difference is minuscule and irrelevant is just a flat-out lie.

Raytracing looks fantastic for the elements it's supposed to highlight, and it's doubly amazing that we have gotten to the point where such graphics can be enjoyed in real time, rather than in a pre-rendered sequence.

It's like <Mod Edit> on the invention of sliced bread; some people just want to eat a sandwich.

LTT did a test, and most of their staff could not tell raytraced and non-raytraced settings apart in games.

I often think the non-raytraced versions of games actually look better. The raytraced shadows often look softer than the rasterized ones and they lose character.

You might think raytracing is amazing, and more power to you if you enjoy it, but not everyone likes it.

To me, raytracing offers small and meaningless differences; it doesn't improve my enjoyment of the game, and the hit on performance is not justifiable for me at all.

 
Last edited:
  • Like
Reactions: Ar558