Control Ultimate gets DLSS 3.7 support and new 'Ultra' settings — cuts performance nearly in half

I could buy into whatever temporal stability improvements come with the new mode, if only they weren't tied to such a performance hit. When I played Control after getting my 3080, temporal instability was the source of the only graphical anomalies I noticed while just playing the game.

Remedy sadly seems to be behind the curve when it comes to upscaling implementations on Northlight, as Alan Wake 2 doesn't have XeSS or FSR 3.x either. Hopefully this will change with FBC: Firebreak, or at least with Control 2 and the Max Payne remakes.
 
I'm obviously blind, as I don't see a difference in the images 🤣
My reaction too - they're too low resolution.

Plus, given the changes referred to (new DLSS and more rays/pixel), I suspect these are not the sort of improvements you'll notice much in static images. Probably a bit more noticeable when moving around.
 
They just talk about a few more rays per pixel - so no, it's not a big path tracing update.

More likely just a bit more stable, less grainy RT thanks to more rays per pixel.
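For what it's worth, the "less grainy" part follows from basic Monte Carlo math: averaging more rays per pixel shrinks noise roughly with the square root of the sample count. A toy Python sketch (illustrative only, with made-up noise numbers; nothing to do with Northlight's actual renderer):

```python
import random
import statistics

def estimate_brightness(samples_per_pixel, true_value=0.5):
    """Average several noisy ray results for one pixel (noise level is assumed)."""
    hits = [true_value + random.gauss(0, 0.2) for _ in range(samples_per_pixel)]
    return statistics.mean(hits)

def pixel_noise(samples_per_pixel, trials=2000):
    """Spread of the per-pixel estimate across many frames = visible grain."""
    estimates = [estimate_brightness(samples_per_pixel) for _ in range(trials)]
    return statistics.stdev(estimates)

# Grain shrinks like 1/sqrt(N): going from 1 to 8 rays per pixel
# cuts the noise to roughly 35% of what it was.
for n in (1, 2, 4, 8):
    print(f"{n} rays/pixel -> noise ~ {pixel_noise(n):.3f}")
```

That 1/sqrt(N) relationship is also why the feature is so expensive: halving the grain costs 4x the rays.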
Thank you.
So this means a 5070 Ti without DLSS is capable of:
"Using 8 RT samples per pixel further reduces performance, this time to 15.6 FPS".
And this is RT, not even PT. OK. Wow. The same effects that people here cannot even detect. You turn it on, and you drop to 15.6 FPS (from a mighty 47 FPS), in a game specifically optimized for Nvidia hardware, no less.
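To put those numbers in frame-time terms (my own arithmetic using the FPS figures quoted above):

```python
def frame_time_ms(fps):
    """Convert frames per second to milliseconds spent per frame."""
    return 1000.0 / fps

base = frame_time_ms(47.0)    # ~21.3 ms per frame before the ultra RT setting
ultra = frame_time_ms(15.6)   # ~64.1 ms per frame with 8 RT samples per pixel

# The extra samples roughly triple the cost of every single frame.
print(f"baseline: {base:.1f} ms, 8 spp: {ultra:.1f} ms, "
      f"extra: {ultra - base:.1f} ms ({ultra / base:.1f}x)")
```

In other words, the card spends over 40 extra milliseconds on each frame for an effect most posters here can't spot in the screenshots.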

I'm obviously blind as i don't see a difference on the images 🤣
You're not alone. Even experts have a hard time distinguishing PT (which Nvidia has told us is the REAL RT) on/off, let alone "fake RT". If a game is entirely made out of glass (like Control), you can see the character's reflection on the glass surfaces, for "15.6 FPS".
 
Tbf I'm tired of all this scaler and ray tracing talk. Wow, you can tank a card with calculations it was not really made for. Impressive.

Ray tracing / path tracing / whatever you call it will always be a simplified implementation of actual proper light transport, which is practically impossible because the formula runs to infinity. Of course we don't need infinite bounces, but all this talk is just Nvidia marketing; nobody really needs this. It hardly makes games prettier, just easier to program, and it spends a ton more resources. I do not want my graphics card to deprioritize gaming graphics performance to make way for lighting-engine cores. So what's next, fluid simulation cores? Differential equation cores? This was never the approach, for a reason: it is insanely expensive and really only benefits lazy developers.

Sure, ray tracing is a cool technology, but it's not new, it's not groundbreaking, Nvidia didn't even do the bulk of the work making it possible, and it's immensely expensive and unfeasible anyway. Usually in gaming we make the best of what we have; we don't suggest people go out and buy a full-fledged real-time physics simulator. We could implement all kinds of cool physics into games, it would just tank performance. Many old games have more than good enough lighting to be immersive, and most of it was made with clever programmers and a curious mind. This idea of brute-forcing next-gen graphics is outright stupid to me.

When you can demonstrate an algorithm that gets very close to the real results without spending 150 watts on lighting, or without implying that I need to buy a light simulator to get light in my game, then we can talk.

You can hardly claim to be anything but ignorant if you think VSR spending 200 watts upscaling a YouTube video is amazing... wtf, people.

In essence it is a question of ingenuity vs. brute force. Brute force is the expensive one, both in dollars and in watts. I am, to the core, not impressed whatsoever. It is a slippery slope for the entire industry, imo. And is it really a benefit to the gamer when "all you need" is an insanely expensive number cruncher, to make way for developers with zero clue about game engines and graphics? It's lazy, it's stupid, and it's not to the benefit of the consumer.

If in doubt about what to think, consider that real fluid dynamics simulation is psycho difficult in practice even with endless resources. Look at Formula 1 as an example: again and again the numbers did not line up with practical results (and these people have $100M computers for this). So should we just throw all ingenuity out the window and start doing fluid simulations the "physics" way? It does not make sense :)

I wonder, for the lulz, how Jensen's ray tracing would react to a double slit 😀 That is to say, even this implementation is far from the "real" algorithm. It probably has no clue that light has wavelike properties, and that different wavelengths interact differently.
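The "formula runs to infinity" point is real, but truncation converges fast, which is exactly why finite-bounce RT works at all. A toy sketch (illustrative Python with made-up light probability and reflectance, not any engine's actual code):

```python
import random

def trace(depth, max_depth=4, reflectance=0.7):
    """Toy path trace for one ray: each bounce attenuates by the surface
    reflectance until we hit a light or run out of allowed bounces."""
    if depth >= max_depth:
        return 0.0                                    # truncate: deeper bounces count as black
    emitted = 1.0 if random.random() < 0.1 else 0.0   # assumed 10% chance this bounce hits a light
    return emitted + reflectance * trace(depth + 1, max_depth, reflectance)

# Each extra allowed bounce adds a term weighted by reflectance**depth,
# so the truncated series closes in on the "infinite" answer quickly.
for d in (1, 2, 4, 8):
    avg = sum(trace(0, max_depth=d) for _ in range(20000)) / 20000
    print(f"max {d} bounces -> average radiance ~ {avg:.3f}")
```

With 0.7 reflectance the missing energy shrinks geometrically, so a handful of bounces already captures most of the infinite sum; the open question is only whether that's worth the wattage.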
 
Remedy just released the March 2025 update to Control Ultimate Edition that adds DLSS 3.7 support, new 'ultra' settings, and some other extras. The ultra ray tracing adds support for multiple ray bounces and can cut performance nearly in half — or more if you max out the setting!

Control Ultimate gets DLSS 3.7 support and new 'Ultra' settings — cuts performance nearly in half : Read more
Is there a reason why the comparison images look like screenshots taken at 360p?
 
As a guy who bought Control on release and owns all the DLC and the season pass, effectively making my copy the Ultimate Edition, I DON'T GET THE UPDATE! WTF is that!! I so regret buying Control, AW, AW2 and every other game from this thief of a developer. But fear not! They stole this one; let's see how many of their future games end up on my PC without me paying a cent. Rockstar just made a similar update for GTA V, and naturally, as a GTA V owner, I get the update for free. That is why Rockstar will always get my money. I've just got two words for REMEDY........... ELAMIGOS GAMES!!!
 
Tbf I'm tired of all this scaler and ray tracing talk. Wow, you can tank a card with calculations it was not really made for. Impressive. [...]
Well said, sir, but unfortunately Nvidia's marketing reaches more ears than your post. I was looking forward to the HDR in this update. But imagine if ray tracing were the standard practice for rendering light and Nvidia had then invented rasterized lighting (if that is the term for standard lighting). Nvidia would be selling it as "almost identical with 80% more FPS", and everybody would be using rasterized lighting, praising Nvidia's ingenuity.
 
Tbf I'm tired of all this scaler and ray tracing talk. Wow, you can tank a card with calculations it was not really made for. Impressive. [...]

I'm looking forward to your self-developed, highly realistic, super efficient and easy-to-program game engine. Please don't forget about us little guys down here at Tom's Hardware when you make it to the top.
 
I think this goes to show how much closer AMD has gotten to Nvidia in RT performance. Neither is super playable, but they are within 7 FPS of each other at worst when talking native.