Question: Why are games more and more demanding when there is no graphical improvement?

NorbertPlays

Proper
Jul 31, 2023
85
62
110
Assassin's Creed Odyssey compared to AC Mirage, the graphics look the same
No they don't. I mean sure, thematically and in terms of art direction they're similar, but Mirage has more detail (both in geometry and textures), a more advanced rendering pipeline, higher world density, etc. Also note the minimum specs for Odyssey target 720P/30 whilst Mirage targets 1080P/30 (2.25 times as many pixels).
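To put a number on that last point, here's a quick throwaway calculation of my own (purely illustrative, not from the official spec sheets) showing how pixel counts scale between common target resolutions:

Code:
// Pixel-count ratios between common target resolutions (illustration only).
#include <iostream>

int main() {
    const double p720  = 1280.0 * 720.0;
    const double p1080 = 1920.0 * 1080.0;
    const double p1440 = 2560.0 * 1440.0;
    const double p2160 = 3840.0 * 2160.0;

    std::cout << "1080p vs 720p:  " << p1080 / p720  << "x the pixels\n"; // 2.25x
    std::cout << "1440p vs 1080p: " << p1440 / p1080 << "x the pixels\n"; // ~1.78x
    std::cout << "4K vs 1440p:    " << p2160 / p1440 << "x the pixels\n"; // 2.25x
}

So just moving the minimum-spec target from 720p to 1080p more than doubles the pixels that have to be shaded every frame, before any other rendering changes are counted.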
 
No they don't. I mean sure, thematically and in terms of art direction they're similar, but Mirage has more detail (both in geometry and textures), a more advanced rendering pipeline, higher world density, etc. Also note the minimum specs for Odyssey target 720P/30 whilst Mirage targets 1080P/30 (2.25 times as many pixels).

Those would be valid points were the game not very bland in the look of its environments. It's something that has been pointed out in reviews, and after seeing some footage of it, I have to agree. It is also pretty uninteresting in its story and characters. About the only thing that looked appealing to me was the combat moves, but that is not enough to sustain a game.

AC has been done to death. Right now Avatar: Frontiers of Pandora and the soon-to-release Star Wars Outlaws, both made by Massive Entertainment, appear to be Ubisoft's best titles. I'd prefer they hadn't abandoned Splinter Cell, but at least they're putting out SOME good content. They also need to bring Watch Dogs back to a serious tone and can the silly stuff.
 

NorbertPlays

Proper
Jul 31, 2023
85
62
110
Those would be valid points were the game not very bland in the look of its environments.
That's like saying the Blu-ray version of High School Musical 3 doesn't look better than the DVD version because it's an unnecessary sequel to a terrible film; both versions may be unwatchable garbage, but in terms of visual fidelity the former is still objectively better than the latter. The OP asked why Mirage had higher system requirements than Odyssey when they looked "the same", and I pointed out that while they looked superficially similar Mirage has a number of objective technical improvements; I didn't say either looked interesting or that they were good games, which is an entirely different and completely subjective issue.
 

35below0

Respectable
Jan 3, 2024
1,727
744
2,090
Diminishing returns.

The new game is more demanding and difficult to render, but the visual improvements are small enough not to be very noticeable.

The increase in resolution alone will push requirements up considerably. Whether it looks better is neither here nor there.
 
That's like saying the Blu-ray version of High School Musical 3 doesn't look better than the DVD version because it's an unnecessary sequel to a terrible film; both versions may be unwatchable garbage, but in terms of visual fidelity the former is still objectively better than the latter. The OP asked why Mirage had higher system requirements than Odyssey when they looked "the same", and I pointed out that while they looked superficially similar Mirage has a number of objective technical improvements; I didn't say either looked interesting or that they were good games, which is an entirely different and completely subjective issue.

Yeah, sorry, I guess it was a bit off topic, and you're right. I was just saying the graphical improvements you're referring to are largely wasted on such a bland-looking game, that's all. Avatar: Frontiers of Pandora, however, is a good example of a game that makes full use of every bit of the high-tech graphics they've thrown at it. Mirage by comparison reminds me very much of Revelations, which was another very bland-looking AC game.

Ubi have made a lot of bad decisions with their games, but at least with Massive's latest two titles they finally have a finger on the pulse of what players want. So it wasn't your point I was disputing so much as Ubi choosing to try to "polish a turd" with big graphics improvements that don't show well because of how bland Mirage's core look is.
 
While the environments in a lot of games "look" the same, that ignores the fact that there are a lot of really expensive minor improvements being done behind the scenes that eat a LOT of GPU horsepower.

For example, things like light diffusion, which offer *very* minimal graphical improvement outside of a handful of specific situations, are ridiculously expensive for a GPU to calculate. So you see next to no improvement in overall fidelity, but the work is still being done if the option to use this feature is enabled.
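To give a rough sense of why diffuse lighting is so costly, here's a deliberately simplified, hypothetical cost model (my own illustration, not how any real engine budgets its frame): the extra work scales with samples times bounces per pixel, even though most of those samples only subtly soften the image.

Code:
// Hypothetical cost model for Monte Carlo diffuse lighting (illustration only).
// Real engines differ; the point is that cost scales with samples x bounces per pixel.
#include <cstdint>
#include <iostream>

int main() {
    const std::uint64_t pixels  = 1920ull * 1080ull; // one 1080p frame
    const std::uint64_t samples = 16;                // hemisphere samples per pixel (assumed)
    const std::uint64_t bounces = 2;                 // indirect bounces (assumed)

    // Direct lighting only: roughly one shading evaluation per pixel.
    const std::uint64_t directWork  = pixels;
    // Adding diffuse GI: every pixel traces samples * bounces extra rays.
    const std::uint64_t diffuseWork = pixels * samples * bounces;

    std::cout << "Extra shading work for diffuse GI: "
              << static_cast<double>(diffuseWork) / static_cast<double>(directWork)
              << "x\n"; // 32x in this made-up configuration
}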

Now, if you lower graphical settings to use the equivalent graphical features from a previous game, you should see somewhat similar performance. But the "Ultra" preset for a game released now is "always" going to run slower than that same preset from a game a few years ago, regardless of how the two games "look".
 

Eximo

Titan
Ambassador
I agree with diminishing returns. Once something looks 90% as good as photoreal, each additional percent is harder and harder, essentially requiring full path tracing and full simulation of a good chunk of the physics to achieve proper results. That's one of the reasons Nvidia puts so much effort into ray tracing; it really is the easiest way to increase realism.

For me, I am distracted a lot by games that try to look good but don't quite get there. I tend to prefer stylized graphics over realism, since I am less taken out of the experience when things aren't supposed to look real. It is why face animations are so important, and when it looks goofy it is very off-putting.

The drive towards ever increasing visual fidelity is almost a detriment to the gaming industry. They spend so much of their time and resources on making high poly count models and textures. Or paying celebrities for face capture, not to mention microtransaction assets. With gameplay often a lot lower on the priority list than it should be.
 

Deleted member 2969713

Guest
Diminishing returns.

The new game is more demanding and difficult to render, but the visual improvements are small enough not to be very noticeable.

The increase in resolution alone will push requirements up considerably. Whether it looks better is neither here nor there.
This is the answer.

Frankly, the rasterization techniques of the last generation were good enough, and I wish video game developers would stick to those and focus on high resolutions and frame rates instead of pushing extremely expensive techniques for very little visual gain. Instead of getting 4K 60 games on current consoles, we're often getting sub-1080p (native) resolutions and 30 FPS, with the end result of worse visuals.
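As a back-of-the-envelope comparison (my own numbers, just for scale), the gap between those two targets is enormous in raw pixel throughput:

Code:
// Pixel throughput: 4K/60 vs 1080p/30 (back-of-the-envelope illustration only).
#include <iostream>

int main() {
    const double fourK60 = 3840.0 * 2160.0 * 60.0; // pixels shaded per second at 4K/60
    const double fhd30   = 1920.0 * 1080.0 * 30.0; // pixels shaded per second at 1080p/30

    std::cout << "4K/60 pushes " << fourK60 / fhd30
              << "x as many pixels per second as 1080p/30\n"; // 8x
}

And that 8x is against native 1080p/30; against the sub-1080p internal resolutions many current titles actually render at, the gap is even wider.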

I agree with diminishing returns. Once something looks 90% as good as photoreal, each additional percent is harder and harder, essentially requiring full path tracing and full simulation of a good chunk of the physics to achieve proper results. That's one of the reasons Nvidia puts so much effort into ray tracing; it really is the easiest way to increase realism.

For me, I am distracted a lot by games that try to look good but don't quite get there. I tend to prefer stylized graphics over realism, since I am less taken out of the experience when things aren't supposed to look real. It is why face animations are so important, and when it looks goofy it is very off-putting.

The drive towards ever increasing visual fidelity is almost a detriment to the gaming industry. They spend so much of their time and resources on making high poly count models and textures. Or paying celebrities for face capture, not to mention microtransaction assets. With gameplay often a lot lower on the priority list than it should be.
100% agree, couldn't have said it better myself.
 
Come to think of it, it's not so much that my above comment was off topic; it just addressed the other half of the OP's complaint, namely that the high-tech graphics features in some games have only a limited visual effect.

So I stand by what I said, that it's largely due to how graphically complex the game's environments are to begin with. I'll add, though, that even the hardware components of your PC, including its display, have an impact as well.

I also think it cannot be ignored that some devs do not implement some graphical features as well as others do. Lastly, the optimization of a game takes time to do right, and many are clearly taking shortcuts there, more so lately.
 
I agree with diminishing returns. Once something looks 90% as good as photoreal, each additional percent is harder and harder, essentially requiring full path tracing and full simulation of a good chunk of the physics to achieve proper results. That's one of the reasons Nvidia puts so much effort into ray tracing; it really is the easiest way to increase realism.

For me, I am distracted a lot by games that try to look good but don't quite get there. I tend to prefer stylized graphics over realism, since I am less taken out of the experience when things aren't supposed to look real. It is why face animations are so important, and when it looks goofy it is very off-putting.

The drive towards ever increasing visual fidelity is almost a detriment to the gaming industry. They spend so much of their time and resources on making high poly count models and textures. Or paying celebrities for face capture, not to mention microtransaction assets. With gameplay often a lot lower on the priority list than it should be.
Pretty much this. And yes, ray tracing is really the next step in graphical fidelity; all those expensive-to-compute lighting effects get done "for free" as part of the algorithm.
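To make "part of the algorithm" concrete, here's a toy sketch of my own (one hard-coded sphere, nowhere near a production renderer): once you're tracing rays anyway, shadows and reflections are just extra calls to the same trace() routine, instead of separate rasterization tricks like shadow maps or screen-space reflections.

Code:
// Toy ray tracer (illustration only; a real engine or path tracer is far more involved).
// The point: shadows and reflections reuse the same trace() call from each hit point.
#include <algorithm>
#include <cmath>
#include <iostream>
#include <optional>

struct Vec3 {
    float x, y, z;
    Vec3 operator+(Vec3 o) const { return {x + o.x, y + o.y, z + o.z}; }
    Vec3 operator-(Vec3 o) const { return {x - o.x, y - o.y, z - o.z}; }
    Vec3 operator*(float s) const { return {x * s, y * s, z * s}; }
};
float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
Vec3 normalize(Vec3 v) { return v * (1.0f / std::sqrt(dot(v, v))); }

struct Ray { Vec3 origin, dir; };
struct Hit { Vec3 point, normal; };

// Hard-coded scene: one sphere at (0, 0, -3) with radius 1.
std::optional<Hit> trace(const Ray& r) {
    const Vec3 center{0, 0, -3};
    const Vec3 oc = r.origin - center;
    const float b = dot(oc, r.dir);
    const float c = dot(oc, oc) - 1.0f;
    const float disc = b * b - c;
    if (disc < 0) return std::nullopt;
    const float t = -b - std::sqrt(disc);
    if (t < 0.001f) return std::nullopt;
    const Vec3 p = r.origin + r.dir * t;
    return Hit{p, normalize(p - center)};
}

const Vec3 kLightDir = normalize({1, 1, 0.5f});

float shade(const Ray& ray, int depth) {
    const auto hit = trace(ray);
    if (!hit) return 0.1f;                                      // background brightness

    float color = std::max(0.0f, dot(hit->normal, kLightDir));  // simple diffuse term

    // Shadow: one more ray toward the light from the hit point.
    const Ray shadowRay{hit->point + hit->normal * 0.01f, kLightDir};
    if (trace(shadowRay)) color *= 0.2f;

    // Reflection: one more ray bounced off the surface, shaded recursively.
    if (depth > 0) {
        const Vec3 refl = ray.dir - hit->normal * (2.0f * dot(ray.dir, hit->normal));
        const Ray reflRay{hit->point + hit->normal * 0.01f, normalize(refl)};
        color += 0.3f * shade(reflRay, depth - 1);
    }
    return color;
}

int main() {
    const Ray primary{{0, 0, 0}, normalize({0, 0, -1})};        // one pixel's camera ray
    std::cout << "brightness = " << shade(primary, 2) << "\n";
}

Compare that with a rasterizer, where shadows typically need shadow maps and reflections need cube maps or screen-space techniques, each with its own cost and artifacts.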

But regardless, we are in fact at the point of diminishing returns, and have been for well over a decade now. Notice how all the new features we've gotten over the past decade plus have done little in terms of in-game graphics: HDR, new AA modes, VRR, and so on. This is because there's basically nothing left to do that isn't comically expensive to compute.

Basically: we're graphically tapped out, and you aren't going to see significant improvements until we get games that utilize ray tracing across the board, which we're still a ways from having enough horsepower to do. Combine that with the fact that we're at the end of die shrinks (more so for CPUs, but GPUs aren't far behind) and we're going to stagnate graphically for a good half decade, maybe longer.
 

Deleted member 2969713

Guest
What bugs me the most about modern AAA graphics is the use of TAA and FSR/DLSS upscaling from low resolutions, making everything a blurry, smeary, artifact-y mess. I played DKC: Tropical Freeze at 4K using CEMU (having dumped the game from disc, I'm not a pirate), and turned off anti-aliasing. That game at 4K, with its relatively simple rendering techniques, looks leaps and bounds better than modern games afflicted with TAA and upscaling.
 
What bugs me the most about modern AAA graphics is the use of TAA and FSR/DLSS upscaling from low resolutions, making everything a blurry, smeary, artifact-y mess. I played DKC: Tropical Freeze at 4K using CEMU (having dumped the game from disc, I'm not a pirate), and turned off anti-aliasing. That game at 4K, with its relatively simple rendering techniques, looks leaps and bounds better than modern games afflicted with TAA and upscaling.
TAA came about because supersampling is still comically expensive; emulators can kind of brute-force a higher internal resolution, but expecting modern 3D games to offer that as an option is a bit much. Pretty much all forms of anti-aliasing have downsides; TAA generally hits the sweet spot for image quality versus performance.
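For a rough idea of where the cost difference comes from, here's a simplified sketch of the general idea (not any particular engine's implementation; real TAA also reprojects the history buffer with motion vectors and clamps it to fight ghosting):

Code:
// Supersampling vs. the TAA idea, per pixel (simplified illustration only).
#include <iostream>
#include <vector>

struct Color { float r, g, b; };

// Supersampling: shade N samples per pixel EVERY frame -> roughly N times the shading work.
Color supersamplePixel(const std::vector<Color>& samples) {
    Color sum{0, 0, 0};
    for (const Color& s : samples) { sum.r += s.r; sum.g += s.g; sum.b += s.b; }
    const float inv = 1.0f / static_cast<float>(samples.size());
    return {sum.r * inv, sum.g * inv, sum.b * inv};
}

// TAA: shade ONE (jittered) sample per pixel, then blend it into last frame's result,
// so the average accumulates across frames instead of within a single frame.
Color taaPixel(Color history, Color current, float blend = 0.1f) {
    return {history.r + (current.r - history.r) * blend,
            history.g + (current.g - history.g) * blend,
            history.b + (current.b - history.b) * blend};
}

int main() {
    const std::vector<Color> samples{{1, 0, 0}, {0, 1, 0}, {0, 0, 1}, {1, 1, 1}};
    const Color ss  = supersamplePixel(samples);                // 4x the shading cost this frame
    const Color taa = taaPixel({0.5f, 0.5f, 0.5f}, {1, 0, 0});  // 1x cost, converges over time
    std::cout << ss.r << " vs " << taa.r << "\n";
}

The temporal accumulation is also where the blur and smearing complaints come from: the history buffer never perfectly matches the current frame once things move.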

And yes, either Switch emulator is a better Switch than the Switch itself.
 

Deleted member 2969713

Guest
TAA came about because supersampling is still comically expensive; emulators can kind of brute-force a higher internal resolution, but expecting modern 3D games to offer that as an option is a bit much. Pretty much all forms of anti-aliasing have downsides; TAA generally hits the sweet spot for image quality versus performance.
I think I actually prefer no anti-aliasing over TAA, especially at high resolutions. I don't mind jaggies; I do mind blurriness and motion artifacts. The problem is that games like Jedi: Survivor rely on TAA for things like hair and foliage to look right.