News Intel details game-boosting frame generation tech that applies a different technique — ExtraSS uses extrapolation instead of AMD and Nvidia's appro...

Interesting. Frame generation needs lower latency, and I guess this is one of the only ways to decrease latency in the whole process of generating the frames. I can’t wait to see what the latency looks like.
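For a rough sense of why extrapolation helps latency, here's a back-of-the-envelope sketch (my own illustrative numbers, not measurements of ExtraSS, DLSS 3 or FSR 3): interpolation has to hold back the newest rendered frame until the in-between frame has been shown, while extrapolation only predicts forward from frames it already has.

```python
# Toy latency comparison at a 60 fps base; illustrative numbers only.
frame_time_ms = 1000 / 60  # ~16.7 ms between real rendered frames

# Interpolation: the newest real frame is delayed so a generated frame can
# be inserted before it, adding roughly one frame time of latency.
interpolation_added_latency_ms = frame_time_ms

# Extrapolation: the generated frame is predicted forward from frames that
# already exist, so nothing has to be held back.
extrapolation_added_latency_ms = 0.0

print(f"interpolation adds ~{interpolation_added_latency_ms:.1f} ms")
print(f"extrapolation adds ~{extrapolation_added_latency_ms:.1f} ms")
```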
 
Nothing beats horsepower. Replacing the horses with goats is not going to work. Hopefully the industry keeps working on improving both software and hardware to provide better frame rates in the future.

I am afraid these frame-boosting techniques will lead to more sloppy software, sloppier programming, and smaller improvements in hardware.
 
Nothing beats horsepower. Replacing the horses with goats is not going to work. Hopefully the industry keeps working on improving both software and hardware to provide better frame rates in the future.

I am afraid these frame-boosting techniques will lead to more sloppy software, sloppier programming, and smaller improvements in hardware.
The notion that a lot of 'sloppy' software is responsible for slow frame rates is wildly off base. The realism of rendering has been marching steadily upward for years; that is the root cause. Programmers aren't getting worse; their job is getting harder.
 
The notion that a lot of 'sloppy' software is responsible for slow frame rates is wildly off base. The realism of rendering has been marching steadily upward for years; that is the root cause. Programmers aren't getting worse; their job is getting harder.
Yeah, right... One of the big issues is that game engines like Unreal are built to let artists just drop stuff in, in a wide variety of silly formats and setups, without optimization. Artists and designers are over-prioritized in many workflows, and programmers hardly deal with the low-level guts of things anymore.

Another big issue is that "modern" programming paradigms of the past two decades have been the death of hardware-centric optimization.
 
But think about applications where the highest visual fidelity isn't paramount, such as consoles (and handhelds like the Steam Deck) and iGPU setups where power constraints mean max detail levels aren't possible. Having a more streamlined technique is the better way to go.
 
But think about applications where the highest visual fidelity isn't paramount, such as consoles (and handhelds like the Steam Deck) and iGPU setups where power constraints mean max detail levels aren't possible. Having a more streamlined technique is the better way to go.
There's nothing inherently wrong with the technique, especially for the use case that BlurBusters detailed, which is effectively boosting a game from a base 100 fps to very high refresh rates. It's just that it unfortunately also doubles as an excuse for increasingly lazy devs to stay that way.
 
Nothing beats horsepower. Replacing the horses with goats is not going to work. Hopefully the industry keeps working on improving both software and hardware to provide better frame rates in the future.

I am afraid these frame-boosting techniques will lead to more sloppy software, sloppier programming, and smaller improvements in hardware.
While I don't disagree that we need better hardware, we absolutely need good frame generation technology too. Right now, the way Nvidia has been pushing frame generation as a universal improvement is toxic and bad for everyone. You need a good frame rate to start with before frame generation is a good technology.

The reason I say it's necessary, though, is that high refresh rates are a big deal, but there's no hardware that can truly drive them. We're going to have 360 Hz 4K screens soon, and we're probably 3+ generations away from driving that even with the highest-end hardware. If we can get $300 GPUs up to 90-120 FPS at 4K, frame generation can start to pick up the slack to everyone's advantage (a 120 fps base tripled by frame generation is exactly 360 fps).
 
Nothing beats horsepower. Replacing the horses with goats is not going to work. Hopefully the industry keeps working on improving both software and hardware to provide better frame rates in the future.

I am afraid these frame-boosting techniques will lead to more sloppy software, sloppier programming, and smaller improvements in hardware.
Unfortunately (or not), it's a technology that's only going to become more prevalent, especially once the console upgrades hit, since it's a core part of their strategy along with AI-enhanced upscaling.

The good news is that the tech is constantly improving image-quality-wise, and as long as you're hitting a native frame rate that's roughly half of the output refresh, latency isn't really an issue in all but the most twitchy of twitch shooters with low TTK. Latency-reduction technology essentially acts as a frame rate limiter, so it's actually not so bad.

The problems really start showing when a game runs at a native 30 and only goes to 60 -- which is an exception to the half-native point I made, because 30 is just too low a base. 40 would work at a 60 output because 40 is exactly halfway between 30 and 60 in terms of frame time, which is what matters more, but go beyond 60 from a 40 base and you have issues in basically everything.
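That frame-time point is easy to check with quick arithmetic (Python here just as a calculator):

```python
# Frame time in milliseconds for common base frame rates.
for fps in (30, 40, 60):
    print(f"{fps} fps -> {1000 / fps:.2f} ms per frame")

# 30 fps = 33.33 ms and 60 fps = 16.67 ms; the midpoint is 25 ms,
# which corresponds to exactly 40 fps.
midpoint_ms = (1000 / 30 + 1000 / 60) / 2
print(f"midpoint: {midpoint_ms:.2f} ms -> {1000 / midpoint_ms:.1f} fps")
```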

I play PC games on my LG C1, so I run a 117 fps cap, and I have a 4090, so usually my native frame rate is 60+. Most games at the upper end of demanding run between 60-80 fps and then get boosted to 100-117.

I've actually been having quite a bit of fun using the DLSS 3 frame gen injection mods that replace FSR or even "upgrade" standard DLSS 2 so those games support frame gen. Praydog just released one for Red Dead 2, which works nicely, and the one for the Dead Space remake is fantastic. It fixes that game's stutter, as does the one for Jedi Survivor, which oddly works better than that game's official DLSS 3 frame gen implementation.
 
The notion that a lot of 'sloppy' software is responsible for slow frame rates is wildly off base. The realism of rendering has been marching steadily upward for years; that is the root cause. Programmers aren't getting worse; their job is getting harder.
I beg to differ here. It is true that the realism of rendering has gone up, but the question is: do you really need it to look so realistic? These folks are selling a game, not art pieces. Look at all the AAA titles that supposedly deliver UE 5 "realism". More often than not, they fall flat because the requirements to run the game are simply too high. Which genius decided they should shut out 70% of gamers by imposing super-high graphics requirements? I feel this is a self-inflicted problem. Does Baldur's Gate look super good? I doubt that, and yet it sells like hot cakes. I just feel game developers in general have drifted away from making a fun game toward milking an old franchise or title and using the visuals to distract people from the fact that it is a boring game.
 
So they basically made their own implementation of the Blur Busters method, which BB was trying to get everyone to adopt: https://blurbusters.com/frame-generation-essentials-interpolation-extrapolation-and-reprojection/
Not exactly, or so I think. What your link seems to advocate for is VR-style reprojection based on the actual viewing position. That requires the program to tell the graphics driver where the player is looking at the time the approximated frame is being generated.

I just skimmed the whitepaper, but they indeed seem to be going for a naive extrapolation scheme. That would work without requiring application-level changes, but at the expense of more potential glitches - which become more pronounced as the base framerate drops (i.e. when you need frame generation the most)! On the plus side, naive extrapolation has the potential to smooth object motion, which reprojection doesn't.
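Roughly, "naive extrapolation" means something like the toy sketch below (my own illustration under that reading of the whitepaper, not the paper's actual algorithm): take the newest real frame, assume each pixel keeps moving the way it did between the last two real frames, and push the pixels forward -- with no idea where the camera or objects will actually be.

```python
import numpy as np

def extrapolate_frame(curr_frame, motion):
    """Toy forward-extrapolation sketch (illustration only, not ExtraSS itself).

    curr_frame: newest real frame, shape (H, W, 3).
    motion:     per-pixel (dy, dx) offsets, shape (H, W, 2), describing how each
                pixel moved between the two previous real frames; we simply
                assume that motion continues for one more step.
    """
    h, w = curr_frame.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    # Predicted landing spot of each pixel in the generated frame.
    dst_y = np.clip(ys + np.round(motion[..., 0]).astype(int), 0, h - 1)
    dst_x = np.clip(xs + np.round(motion[..., 1]).astype(int), 0, w - 1)
    out = curr_frame.copy()                  # holes fall back to the last real frame
    out[dst_y, dst_x] = curr_frame[ys, xs]   # scatter pixels along their motion
    return out
```

The disocclusions and wrong guesses this produces (the holes and overlaps above) are exactly the glitches that get worse as the base frame rate drops, whereas reprojection re-warps toward the player's actual camera pose -- which is why it needs the application to hand that pose over.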
 
The notion that a lot of 'sloppy' software is responsible for slow frame rates is wildly off base. The realism of rendering has been marching steadily upward for years; that is the root cause. Programmers aren't getting worse; their job is getting harder.
That's not true at all.

First, there's a lot of sloppy software out there -- there's an almost endless list of games, including AAA titles, that are released in a horrible state, not just when it comes to gameplay/quest bugs but also to poor rendering performance. Most of them eventually get fixed by post-launch patching, but some are just badly designed and can't be optimized, or the developers don't give a damn once they have your money.

Realism has been improving, but at what cost? Does my PC with an RTX 4090 really need to pull 670 W from the wall socket continuously so it can barely squeeze past 60 FPS at 1080p in Cyberpunk 2077? Sure, it looks realistic, but it's a boring game about being an unemployed nobody in a horrible world.

Finally, programmers are certainly getting worse because:

1. They are pushed to increase productivity, which past a certain point can only come at the expense of code quality.
2. They are increasingly reliant on third-party code without a good understanding of how it works, or sometimes even how to use it correctly -- a prime example being complex game engines such as Unreal and Unity, which take years to master and require domain-specific knowledge (3D rendering, GPU architectures).
3. They are increasingly read-only (meaning they can't read other programmers' code and loudly complain when they have to do it), because that requires knowing low-level stuff and being able to recognize patterns and algorithms, while many only know how to find and copy/paste stuff from SO or use pre-built packages and deal with high-level concepts only.
4. There are no universal certifications and licenses for software engineers like there are for civil engineers, for example -- literally anyone can say "I am a developer" even if the only thing they can do is put some HTML + JS together.

Source: I am an avid gamer and a software engineer with 15 years of experience in software development and 30 years of experience in the PC industry, with a basic understanding of electronics.
 