So far Nvidia has not commented on or announced a game-ready driver for Hogwarts Legacy. Intel already released one on Wednesday and AMD announced one coming soon, but the latest Nvidia driver from this week does not mention Hogwarts Legacy in its description (though it added a game profile). Nvidia also isn't answering questions from game review websites/magazines or from users on Twitter... it's all tumbleweeds over there.
As it turns out, the game looks amazing but has its fair share of performance issues. It's loaded with geometry and detail, and stuff moves and is animated everywhere (as you'd expect from Hogwarts). It's made in Unreal Engine 4, which is known to be a bit iffy with micro stutters due to how it handles shader compilation and LODs.
It really is next-gen in terms of system RAM and VRAM use. With my 13700K/4080 I play the game at 1440p with everything on Ultra + Raytracing on Ultra with DLSS on Quality (sometimes I add Frame Generation, sometimes not; still playing around with it). My system RAM fills up to 24 GB, and VRAM usage sits between 12 and 13 GB quite frequently with Raytracing enabled. Over 12 GB of VRAM is insane for 1440p...
AND... here I come to the point I want to make. The 3080 10GB is basically obsolete for this game if you want to play at 1440p with Raytracing. And the brand-new 4070 Ti is also not ideal for 4K with Raytracing. So, is Nvidia now grumpy about the fact that the game is highlighting the downsides of its low-VRAM approach? And therefore not giving the game the love it gives every other new game embracing Nvidia tech? After all, Hogwarts Legacy uses all the fancy and shiny Nvidia tech: DLSS 2.0, DLSS 3.0 Frame Generation, and Raytracing...
Discuss....