News: Gaming industry insiders say cutting-edge graphics cost too much to make for AAA games

We can have both.
See Horizon Forbidden West on PC (2024).
Good story, good ambiance, immersive, totally bug-free, and graphically wonderful. Just about a perfect game, and a total success.
All without demanding an RTX 8090 drawing 1500 W with 150 GB of VRAM. :)

Beautiful and well optimized doesn't have to mean the latest super-expensive hardware.
 
The problem is the modern cost of gaming.

$1300 to build a PC for 4K (as Tom's Hardware did)
$700 to build a PC for 1920x1080 (as Tom's Hardware did)
$700 for a PS5 Pro
$450 for an Xbox Series X or PS5

The vast majority of the market isn't going to be able to take advantage of extremely high fidelity, especially in FPSs and other fast-moving content, and especially not with ray or path tracing, and it's only going to get worse as the cost of GPUs continues to increase. AI will hopefully help streamline costs by taking a lot of man-hours out of environments and NPCs, but that means little if people are still playing at low resolutions and frame rates.
 
The problem is the modern cost of gaming.

$1300 to build a PC for 4K (as Tom's Hardware did)
$700 to build a PC for 1920x1080 (as Tom's Hardware did)
$700 for a PS5 Pro
$450 for an Xbox Series X or PS5

The vast majority of the market isn't going to be able to take advantage of extremely high fidelity, especially in FPSs and other fast-moving content, and especially not with ray or path tracing, and it's only going to get worse as the cost of GPUs continues to increase. AI will hopefully help streamline costs by taking a lot of man-hours out of environments and NPCs, but that means little if people are still playing at low resolutions and frame rates.
That's the opposite of the problem. Expensive hardware should sell fewer units, meaning fewer people can run high-quality graphics, meaning studios would try to make less demanding games. But the news says otherwise: they want better graphics, but the cost of *creating* those graphics is too high.
 
It possibly also comes down to optimization. If you look back at games like Assassin's Creed III, Unity, and the Uncharted games, they had good graphics without needing a huge amount of hardware, at least by today's standards. I recall Unity was supposed to have been very buggy when it came out, which is too bad, because years later I played through it and it was a great game. It seems like developers used to be much better at optimizing games, or programs in general, to run on less hardware. But maybe those habits have gone away as we've gotten multicore CPUs, ever-increasing amounts of RAM, and bigger/faster GPUs.
 
See Horizon Forbidden West on PC (2024).
Good story, good ambiance, immersive, totally bug-free, and graphically wonderful. Just about a perfect game, and a total success.
All without demanding an RTX 8090 drawing 1500 W with 150 GB of VRAM. :)

Beautiful and well optimized doesn't have to mean the latest super-expensive hardware.
 
The problem is the modern cost of gaming.

$1300 to build a PC for 4K (as Tom's Hardware did)
$700 to build a PC for 1920x1080 (as Tom's Hardware did)
$700 for a PS5 Pro
$450 for an Xbox Series X or PS5

The vast majority of the market isn't going to be able to take advantage of extremely high fidelity, especially in FPSs and other fast-moving content, and especially not with ray or path tracing, and it's only going to get worse as the cost of GPUs continues to increase. AI will hopefully help streamline costs by taking a lot of man-hours out of environments and NPCs, but that means little if people are still playing at low resolutions and frame rates.
I don't think 4K necessarily needs to be part of the equation. If I put on HD (720p) film/TV or even a DVD (~480p typically), I'm not guessing whether it was filmed with real people. Put on a modern "photorealistic" game at 4K, such as the new Indiana Jones one or the others mentioned, and you can tell it's virtual. Graphics seem to have entered an uncanny valley. It will take an enormous amount of resources on both the hardware and software sides to get out of the valley, and ray tracing isn't a magic bullet, only part of a comprehensive solution.

Meanwhile, pushing the graphics boundary does nothing for gameplay, and the industry is oversaturated with games that nobody wants to play or pay for.

I don't think the entry-level cost of getting into gaming (1080p) is such a big deal. It would be great if GPU and other hardware prices improved, but even the post-Cezanne APUs alone (e.g. the 8700G, or a 6800H in a mini PC) have credible performance. You can find ways to get under the $700 mark, such as a refurb office PC plus a low-profile GPU. You can pick up a discounted Steam Deck for under $400 and play at 720p. And any general-purpose PC is useful; there are folks who are going to be using it all day, every day.

But the games? Is anyone here paying hundreds of dollars a year for AAA titles? There is a massive amount of F2P, giveaways, and discounts, before you even consider the effects of piracy and emulation. 90% of the gaming industry could collapse overnight, and we would still be good... FOREVER.
 
The NYT article is still basically useless and based on an inaccurate hypothesis. Graphics isn't what drives up the cost of these games and makes them unsustainable. I'd imagine most live-service games, which regularly churn out content, probably cost more to make than most of these games with advanced graphics. The way most of the big studios work is where the problem really comes in.

There's a tendency to hire a bunch of bodies to finish up productions (this is also a driver of install size). Many of the long-in-development games have had a toxic cycle of iterating and throwing out work because there weren't clear goals. There's also the habit of pouring huge development costs into a game with limited appeal (see Avatar). These things are management-related and have nothing to do with advanced graphics.

The real cost increases are largely related to the scale of the games as well as the extensive use of mocap. Games also used to reuse assets a lot more, which keeps install sizes down and doesn't require hiring as many artists. At the end of the day, the current gaming industry is what you get in any industry when the money people take over.
 
Outside of games that are cinematic... 2000s-2010s graphics were good enough for the rest.
I'm not much of a gamer, so maybe my opinion doesn't count for much, but there were a couple of recent examples of well-known franchises where I considered picking up an earlier installment made for PS4, then decided to pass after seeing gameplay videos with graphics almost on par with what a PS3 could have managed. Conversely, I've bought a couple of PS5 exclusives just to see what the console can do. Also, when it comes to fighting games, I'm much more a fan of realism (Tekken) than of cartoony styles (Street Fighter).

In some sense I wonder if the gaming market hasn't already sorted itself to a point where you can't be very successful on PlayStation or Xbox without graphics at a certain level, because most of the people who don't care about that stuff are on Nintendo.

P.S. I think one reason PCMR types should appreciate better upscaling technologies is that they enable game developers to raise the minimum bar without forgoing the iGPU market.
 
Another thing to consider with Ubisoft, for example: look at all the stuff they do that players don't like. Removing older games like The Crew, the various DRM schemes they use. I went to fire up Assassin's Creed 1 recently and the game was running terribly. It turned out I had to modify my hosts file because the game was trying to phone home to old servers that no longer exist. Once I did that, the game ran much better.
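For anyone hitting the same thing, the fix is just a hosts-file entry that points the dead server at localhost so the game stops waiting on it. On Windows the file lives at C:\Windows\System32\drivers\etc\hosts; the hostname below is purely illustrative, since the actual server the game tries to reach will differ:

    # Illustrative entry only -- substitute the hostname your game actually queries.
    # Pointing it at localhost makes the lookup fail fast instead of hanging the game.
    127.0.0.1    old-activation-server.example.com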

But really, make games people care about. Like the Indiana Jones game, one of the best games I've played in a very long time. Companies should also drop the attitude that you don't really own your games. I get their thinking, but I don't agree. And don't make them always-online.
 
I'm still waiting for TAA to be fixed. It's been a decade since it was introduced, and it's still as blurry and ghostly as ever. In fact it's gotten worse in recent years, with TAA now being required to render things correctly... Honestly, I think graphics have regressed due to these temporal processes.

Call me old-fashioned, but give me clear, high-definition graphics and MSAA and I'll be happy. I'm okay with a few jagged edges over TAA's Vaseline-on-the-lens effect lol
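For anyone wondering why the blur and ghosting are so hard to eliminate, here's a rough sketch of the temporal accumulation at the core of TAA (illustrative only, not any particular engine's implementation, and the blend factor value is just a commonly cited ballpark): every frame blends a reprojected history buffer with the new sample, and the same low blend factor that smooths out aliasing also smears fine detail and drags stale history around when motion vectors miss.

    // Illustrative sketch of TAA's per-pixel temporal blend (not a real engine's code).
    // history = this pixel's color reprojected from last frame via motion vectors
    // current = this frame's freshly shaded (jittered) sample
    // alpha   = how much of the new frame to accept each time, typically ~0.05-0.1
    struct Color { float r, g, b; };

    Color lerp(const Color& a, const Color& b, float t) {
        return { a.r + (b.r - a.r) * t,
                 a.g + (b.g - a.g) * t,
                 a.b + (b.b - a.b) * t };
    }

    Color resolve_taa(const Color& history, const Color& current, float alpha) {
        // A low alpha means the output is mostly last frame's data: great for hiding
        // aliasing and noise, but it also low-pass filters real detail (the blur),
        // and when the reprojection is wrong (disocclusion, fast motion) the stale
        // history lingers for many frames (the ghosting). Real implementations
        // clamp the history toward the neighborhood of the current sample to fight
        // this, which is exactly the part that's hard to get right.
        return lerp(history, current, alpha);
    }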
 
I'm still waiting for TAA to be fixed. It's been a decade since it was introduced, and it's still as blurry and ghostly as ever.
I'm pretty sure nobody is working on plain TAA anymore. The industry's solution is to use a neural pass, like DLSS. I think TAA has enough tricky corner cases that fusing it with other techniques, via a model that's optimized to pick the best strategy in each situation, is the way they decided to go.
 
P.S. I think one reason PCMR types should appreciate better upscaling technologies is that they enable game developers to raise the minimum bar without forgoing the iGPU market.
Except it's already happening.
Upscaling has downsides, and games like Indiana Jones wouldn't run on an iGPU (and MH Wilds is for sure not running on any iGPU).

Technological advancement "should" bring benefits to lower-end systems (that was the point of DLSS), but devs instead use it as a crutch to skip optimizing their games, so lower-end systems still struggle even with the tech and the games still look ugly.
 
I have seen so many technological wonders come and go through the years, and it always seems to be the push on the hardware side of the market that ends up killing the whole market.

Back in the '70s we had quad surround and about four competing formats: Matrix 1, Matrix 2, SQ, QS.
On the software side, the public learned real quick that you had to buy media matching the kind of decoder you owned, and you never seemed to have the right one anyway. That hardware market dried up.

We did the same thing with surround starting in the 1990s: Dolby Surround, then Dolby Pro Logic, then Dolby Digital, then DTS, and the list goes on and on.
Then, even if you had all the surround formats covered, snap, now you need HDMI, so you started all over again. That's where I saw the home theater market dry up. It was not the software side; it just became a never-ending game of staying current to watch a movie in surround at home.

Now we're on to PC gaming, and again, for the most part the software side could keep rolling, but the hardware side is getting too unreal to be real. $$$

Each one of the improvements I listed did bring a huge improvement to its market, but the hardware side made too big a demand: upgrade or you're just left behind.

I love all the eye candy that AAA titles bring to the market, but jumping from an RTX 3090 to a 4090 and now up to a 5090 is beyond me, even if you have that kind of money. Most of us don't, and I consider myself part of the group that always buys just under the top tier, but even that isn't cheap.

Example: Indiana Jones and the Great Circle. Don't have ray tracing? Oh well!

Scrap it and start over.

New people who might jump in and start the PC gaming journey may never take the plunge when, marketing-wise, their brand-new PC is ready for the scrap heap in two years.
 
"It's very clear that high-fidelity visuals are only moving the needle for a vocal class of gamers in their 40s and 50s. But what does my 7-year old son play? Minecraft. Roblox. Fortnite."

Right, and 7-year-olds don't have any money to buy anything.

Well-made games with high-end graphics still get a good return on investment. The industry is losing its collective shirt chasing live-service trends in hopes of being the next Fortnite, which is a fool's errand.
 
Personally, I'm far more interested in opponent AI, storyline, and escapism than I am in graphics. The gaming industry seems to be allergic to improving things that aren't clearly cinematic (cutscene animations, acting, and the like). The idea that AAA studios can't afford to keep pushing graphics isn't much of a concern, because graphics have been good enough for the past decade or so.

The problem here is that I think AAA is complaining about graphics costs so they can shift more resources to live services, sequels, and other locked-down experiences. I think gaming peaked in the 2000-2013 period (PS2, Xbox, PS3, Xbox 360). AAA needs to refocus on what that era did right before the indies eat their lunch. We had such great variety in gaming experiences during that period, with many games encouraging their users to improve the experience for themselves and others. Today it seems like AAA just regurgitates the relatively few game designs and IPs that worked well in the past: no risk-taking, very few new IPs, locked-down experiences, and a fixation on graphical fidelity.

I hope AAA gaming changes. I don't want to buy a 5090. I can get photorealistic graphics just by walking away from my computer. AAA needs to focus on the non-graphical stuff.
 
See Horizon Forbidden West on PC (2024).
Good story, good ambiance, immersive, totally bug-free, and graphically wonderful. Just about a perfect game, and a total success.
All without demanding an RTX 8090 drawing 1500 W with 150 GB of VRAM. :)

Beautiful and well optimized doesn't have to mean the latest super-expensive hardware.
HFW is an interesting case because the base game had to run on the PS4, but the expansion did not. Nothing in the base game is particularly taxing, but when you get to the expansion hub it's significantly more demanding, and the performance drop was enough for me to notice when I played it.

In general, though, I've noticed that games developed only for the PS5 tend to run very well and look good. This is probably related to the focus on fixed hardware and getting the most they can out of it. It also means that when the PC port is done, the graphics can be scaled up for more powerful hardware, but the baseline is already very good.