Question: FPS Drops Dramatically on RTX 3070

IsraelHP

Hello community!!!

I would like your help. I recently upgraded my gaming PC, but I've noticed some stability problems while gaming. This is my current build:

Ryzen 5 5600G
RTX 3070
16 GB RAM (3200 MHz)
600 W PSU
Gaming monitor @ 4K

So far this has happened with only three games: Dead Space Remake, A Plague Tale: Requiem, and Hogwarts Legacy. While playing, I get FPS drops from ~50 down to 5-10 FPS, particularly in Hogwarts. After some time it runs fine again. It's important to mention that it also happens when I change the graphics settings, particularly in Dead Space. Finally, I found some artifacts in Plague Tale: textures not loading correctly, showing a colored grid instead.

What do you think? Is my GPU in a bad state? It's used.

Thanks!
 
I don't know about Dead Space Remake or Plague Tale Requiem, but I do know what the problem is in Hogwarts Legacy. Steve Walton of Hardware Unboxed/Techspot fame did a HUGE performance review, and it would seem that cards like the RTX 3070 and RTX 3080, with only 8GB and 10GB of VRAM respectively, are crippled to the point that their performance becomes similar to that of the RX 5700 XT.
Your card can't even keep up with the RTX 3060, because it has 12GB of VRAM and yours only has 8GB. There is no fix for this because you can't add the VRAM needed to get more performance out of that game. If you want decent FPS, you have no choice but to run the game at 1080p.

I don't know why you chose to pay more for the RTX 3070 with 8GB than you would've paid for the RX 6700 XT with 12GB, but that decision has now come back to bite everyone who made it in the posterior.

I knew that this would happen sooner or later, and I said so. It's a HUGE reason why I chose the RX 6800 XT over the RTX 3080. I'd rather pay less for 16GB of VRAM than pay more for only 10GB. People said "Nah, nVidia's tiny VRAM buffers will be just fine!" and I said "Oh sure, just wait..." and sure enough, here we are.
 
  • Like
Reactions: IsraelHP
^ I am not sure I am biting on that scenario. One of the other issues that we have run into with graphics cards is them having all this onboard memory that the card itself can't utilize properly. IIRC this was an issue with the GTX 960 and newer NVIDIA GPUs, as well as some of the current AMD offerings. You can stuff as much VRAM as you want on a card, but the GPU's ability to work with it quickly and effectively is obviously in question.

To the OP: what brand, and how old, is that PSU? The 3070 is 'supposed' to have a 650W unit. I personally run a 600W on mine, but the balance of the system and the quality of the gold-rated supply work out fine for it.
 
  • Like
Reactions: IsraelHP
Hello community!!!

I would like your help. I recently upgraded my gaming PC, but I've noticed some stability problems while gaming. This is my current build:

Ryzen 5 5600G
RTX 3070
16 GB RAM (3200 MHz)
600 W PSU
Gaming monitor @ 4K

So far this has happened with only three games: Dead Space Remake, A Plague Tale: Requiem, and Hogwarts Legacy. While playing, I get FPS drops from ~50 down to 5-10 FPS, particularly in Hogwarts. After some time it runs fine again. It's important to mention that it also happens when I change the graphics settings, particularly in Dead Space. Finally, I found some artifacts in Plague Tale: textures not loading correctly, showing a colored grid instead.

What do you think? Is my GPU in a bad state? It's used.

Thanks!

Your PSU is 600 watts, while the recommended PSU for the 3070 is 650 watts.

Hogwarts does have issues that need updates to the GPU drivers and the game code; nothing to do here but wait for the patches.

The 3070 is a good 1440p GPU; it's just not a 4K GPU.
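
For a rough sense of the numbers (ballpark figures, not measurements from this system): the 3070's TGP is about 220W and the 5600G is a 65W part, so with maybe 50W for the board, RAM, drives and fans you're looking at roughly 220 + 65 + 50 = 335W sustained. On paper a 600W unit covers that, but Ampere cards are known for millisecond transient spikes well above TGP, which is part of why NVIDIA pads the recommendation to 650W, and why an aging unit can misbehave even at "enough" watts.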
 
  • Like
Reactions: IsraelHP
I don't know about Dead Space Remake or Plague Tale Requiem, but I do know what the problem is in Hogwarts Legacy. Steve Walton of Hardware Unboxed/Techspot fame did a HUGE performance review, and it would seem that cards like the RTX 3070 and RTX 3080, with only 8GB and 10GB of VRAM respectively, are crippled to the point that their performance becomes similar to that of the RX 5700 XT.
Your card can't even keep up with the RTX 3060, because it has 12GB of VRAM and yours only has 8GB. There is no fix for this because you can't add the VRAM needed to get more performance out of that game. If you want decent FPS, you have no choice but to run the game at 1080p.

I don't know why you chose to pay more for the RTX 3070 with 8GB than you would've paid for the RX 6700 XT with 12GB, but that decision has now come back to bite everyone who made it in the posterior.

I knew that this would happen sooner or later, and I said so. It's a HUGE reason why I chose the RX 6800 XT over the RTX 3080. I'd rather pay less for 16GB of VRAM than pay more for only 10GB. People said "Nah, nVidia's tiny VRAM buffers will be just fine!" and I said "Oh sure, just wait..." and sure enough, here we are.
Apparently I decided wrong. I had a 3060 but changed it for a 3070 to get better performance at 4K.
 
  • Like
Reactions: Avro Arrow
^ I am not sure I am biting on that scenario. One of the other issues that we have run into with graphics cards is them having all this onboard memory that the card itself can't utilize properly. IIRC this was an issue with the GTX 960 and newer NVIDIA GPUs, as well as some of the current AMD offerings. You can stuff as much VRAM as you want on a card, but the GPU's ability to work with it quickly and effectively is obviously in question.

To the OP: what brand, and how old, is that PSU? The 3070 is 'supposed' to have a 650W unit. I personally run a 600W on mine, but the balance of the system and the quality of the gold-rated supply work out fine for it.
You could be right... but Steve Walton has been doing this for so long that his review was one of the ones that prompted me to buy an HD 4870. To be fair, I was just as moved by Fedi and Florian's article here on Tom's Hardware.

Over the years, Steve Walton has proven himself to be one of the best PC performance testers in the world, and while I have questioned some of the more subjective aspects of some of his tests, his numbers have always been solid.

I lack the arrogance required to question his numbers or his know-how. It would be like me trying to say that I don't believe a story done by Steve Burke. I'm just not arrogant enough to consider myself to be in their league.
 
  • Like
Reactions: IsraelHP
^ I am not sure I am biting on that scenario. One of the other issues that we have run into with graphics cards is them having all this onboard memory that the card itself can't utilize properly. IIRC this was an issue with the GTX 960 and newer NVIDIA GPUs, as well as some of the current AMD offerings. You can stuff as much VRAM as you want on a card, but the GPU's ability to work with it quickly and effectively is obviously in question.

To the OP: what brand, and how old, is that PSU? The 3070 is 'supposed' to have a 650W unit. I personally run a 600W on mine, but the balance of the system and the quality of the gold-rated supply work out fine for it.

Seems to be the PSU. It's more or less 10 years old. I will change it for a new one with no less than 650W. I hope this works.
 
Seems to be the PSU. It's more or less 10 years old. I will change it for a new one with no less than 650W. I hope this works.
I really don't think it's the PSU because I've never seen a PSU cause low frame rates. If the PSU can't handle the card, the system will do a hard-reset crash. Power delivery is an all-or-nothing thing.

If the card tries to draw more than the PSU can provide, it doesn't say "Oh sorry, I'll slow down and draw less than I need!", it just takes the whole system down in a big crash.

I would definitely agree with changing a 10-year-old PSU (unless it's REALLY high-end, like a 1000W 80+ Gold-certified model), but I don't see that improving your card's performance. This is especially true considering that the legendary Steve Walton has already identified the problem.
 
  • Like
Reactions: IsraelHP
Apparently I decided wrong. I had a 3060 but changed it for a 3070 to get better performance at 4K.
It's not your fault; you thought you were doing the right thing. The thing is, when you're buying a video card, never choose one that seems low on VRAM. You kinda gotta do your homework, because things like this happen more often than you might imagine. To avoid nVidia's shenanigans, I have only had Radeon cards since 2008, when I bought my first HD 4870. Since then, I've never had to pay nVidia's exorbitant prices, suffer their lower performance at non-halo price points, or fall for their VRAM scams.

When you get your next card, just keep that in mind and above all else, do your homework. These things have become horrifically expensive and the last thing that you want to do is spend that much money without getting what is truly best for you. I would recommend looking at Tom's Hardware reviews but also reviews from sites like Techspot, TechPowerUp and Guru3D. On YouTube, you'd want to look at channels like Gamers Nexus and Hardware Unboxed. Hardware Unboxed offers video versions of the reviews that they post on Techspot.

Look at many sources and always remember that you are buying hardware and that is what you should be paying for, nothing else. I say that because a lot of people bought nVidia cards because of DLSS. The problem here is that they're paying for software, not hardware.

Software can be created, changed, updated or otherwise improved over time. The same is not true about your video card as it will die with the same GPU and VRAM with which it was born.

To give an example, I'd rather buy a Radeon RX 6800 XT with 16GB of VRAM and wait for FSR to become as good as DLSS than buy an RTX 3080 with 10GB, knowing that no amount of software will ever be able to increase my VRAM. Fancy software can always be duplicated and improved but not the physical card, the hardware that you paid so much for.

I hope that what I'm saying makes sense because I love this hobby and hate to see people get fleeced. I just wish that I had happier information to give you.
 
  • Like
Reactions: IsraelHP
You could be right... but Steve Walton has been doing this for so long that his review was one of the ones that prompted me to buy an HD 4870. To be fair, I was just as moved by Fedi and Florian's article here on Tom's Hardware.

Over the years, Steve Walton has proven himself to be one of the best PC performance testers in the world, and while I have questioned some of the more subjective aspects of some of his tests, his numbers have always been solid.

I lack the arrogance required to question his numbers or his know-how. It would be like me trying to say that I don't believe a story done by Steve Burke. I'm just not arrogant enough to consider myself to be in their league.


With all respect for HU, and the 'arrogance' to speak my mind: they are just human too. I am not attempting to say or imply that I have more experience or a vaster resource of hands-on knowledge... but I am also aware that new games are often not optimized and have issues (cough, CP2077), and I am old enough to recall when people realized the folly of things such as 60-series Nvidia cards being unable to utilize VRAM that was added purely for marketing purposes.
 
  • Like
Reactions: IsraelHP and KyaraM
I don't know about Dead Space Remake or Plague Tale Requiem, but I do know what the problem is in Hogwarts Legacy. Steve Walton of Hardware Unboxed/Techspot fame did a HUGE performance review, and it would seem that cards like the RTX 3070 and RTX 3080, with only 8GB and 10GB of VRAM respectively, are crippled to the point that their performance becomes similar to that of the RX 5700 XT.
Your card can't even keep up with the RTX 3060, because it has 12GB of VRAM and yours only has 8GB. There is no fix for this because you can't add the VRAM needed to get more performance out of that game. If you want decent FPS, you have no choice but to run the game at 1080p.

I don't know why you chose to pay more for the RTX 3070 with 8GB than you would've paid for the RX 6700 XT with 12GB, but that decision has now come back to bite everyone who made it in the posterior.

I knew that this would happen sooner or later, and I said so. It's a HUGE reason why I chose the RX 6800 XT over the RTX 3080. I'd rather pay less for 16GB of VRAM than pay more for only 10GB. People said "Nah, nVidia's tiny VRAM buffers will be just fine!" and I said "Oh sure, just wait..." and sure enough, here we are.
Holy crap, you have NO idea how annoyed I'm getting at this review being thrown around. I literally get TWICE the FPS in RT with my 3070Ti compared to what he gets... and that's comparing his 1080p result to my 1440p!!! Yes, all ultra settings, including RT. No DLSS. No RT also gives me 70+ FPS in Hogsmeade, and more in Hogwarts. He either royally messed up something in his test system, or didn't test with the day-1 patch, or the community patch does more than smooth out frame drops; or all of the above. In any case, something is very different between our systems, to the point that, I say it again, I have twice the FPS at a higher resolution than he has at 1080p with RT. And it's buttery smooth. All I did was install the community patch, which only adds, in a single .ini file, certain UE4 parameters the devs for some reason didn't put in. The issue with this game is NOT VRAM; it is literally the programmers being too incompetent to follow even the most basic UE4 guidelines!

So stop running around spouting this nonsense already. It is literally the game, not the card. Simple as that.

@IsraelHP let me know if you want to try the patch yourself. There is either an installer, or you can add the parameters yourself if you are distrustful, which would be understandable. Let me know if you want the link. It worked great on my 3070Ti.
 
  • Like
Reactions: IsraelHP
I just found out that there was a patch for the game last night that apparently fixes some FPS issues and other bugs, though the stuttering issues most people seem to have are still being investigated. I will see what it does when I'm home.
 
I really don't think it's the PSU because I've never seen a PSU cause low frame rates. If the PSU can't handle the card, the system will do a hard-reset crash. Power delivery is an all-or-nothing thing.

If the card tries to draw more than the PSU can provide, it doesn't say "Oh sorry, I'll slow down and draw less than I need!", it just takes the whole system down in a big crash.

I would definitely agree with changing a 10-year-old PSU (unless it's REALLY high-end, like a 1000W 80+ Gold-certified model), but I don't see that improving your card's performance. This is especially true considering that the legendary Steve Walton has already identified the problem.

Yep, after doing some research I can confirm my PSU is working totally fine for now. It's almost 6-7 years old. I would change it, but not now.

In the end, the problem was related to the drivers. Maybe I did not properly clear out the old drivers when I switched from the 3060 to the 3070.
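
In case it helps anyone doing the same swap: before reinstalling, it's worth checking for leftover display driver packages. A minimal sketch of what I mean, assuming Windows (pnputil ships with the OS; DDU in safe mode is the more thorough route):

import subprocess

# Ask Windows for every third-party driver package it knows about.
out = subprocess.run(
    ["pnputil", "/enum-drivers"],
    capture_output=True, text=True, check=True,
).stdout

# Print the package blocks that belong to the Display class, e.g. old
# NVIDIA packages left behind after a GPU swap. One can then be removed
# with: pnputil /delete-driver oemNN.inf /uninstall
for block in out.split("\n\n"):
    if "Display" in block:
        print(block)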

It's not your fault; you thought you were doing the right thing. The thing is, when you're buying a video card, never choose one that seems low on VRAM. You kinda gotta do your homework, because things like this happen more often than you might imagine. To avoid nVidia's shenanigans, I have only had Radeon cards since 2008, when I bought my first HD 4870. Since then, I've never had to pay nVidia's exorbitant prices, suffer their lower performance at non-halo price points, or fall for their VRAM scams.

When you get your next card, just keep that in mind and above all else, do your homework. These things have become horrifically expensive and the last thing that you want to do is spend that much money without getting what is truly best for you. I would recommend looking at Tom's Hardware reviews but also reviews from sites like Techspot, TechPowerUp and Guru3D. On YouTube, you'd want to look at channels like Gamers Nexus and Hardware Unboxed. Hardware Unboxed offers video versions of the reviews that they post on Techspot.

Look at many sources and always remember that you are buying hardware and that is what you should be paying for, nothing else. I say that because a lot of people bought nVidia cards because of DLSS. The problem here is that they're paying for software, not hardware.

Software can be created, changed, updated or otherwise improved over time. The same is not true about your video card as it will die with the same GPU and VRAM with which it was born.

To give an example, I'd rather buy a Radeon RX 6800 XT with 16GB of VRAM and wait for FSR to become as good as DLSS than buy an RTX 3080 with 10GB, knowing that no amount of software will ever be able to increase my VRAM. Fancy software can always be duplicated and improved but not the physical card, the hardware that you paid so much for.

I hope that what I'm saying makes sense because I love this hobby and hate to see people get fleeced. I just wish that I had happier information to give you.

I appreciate the time and dedication you put into this answer. I love it!! Thanks for this, and I totally agree with you: I failed my homework. In the future I will do better research before spending the bucks.

Holy crap, you have NO idea how annoyed I'm getting at this review being thrown around. I literally get TWICE the FPS in RT with my 3070Ti compared to what he gets... and that's comparing his 1080p result to my 1440p!!! Yes, all ultra settings, including RT. No DLSS. No RT also gives me 70+ FPS in Hogsmeade, and more in Hogwarts. He either royally messed up something in his test system, or didn't test with the day-1 patch, or the community patch does more than smooth out frame drops; or all of the above. In any case, something is very different between our systems, to the point that, I say it again, I have twice the FPS at a higher resolution than he has at 1080p with RT. And it's buttery smooth. All I did was install the community patch, which only adds, in a single .ini file, certain UE4 parameters the devs for some reason didn't put in. The issue with this game is NOT VRAM; it is literally the programmers being too incompetent to follow even the most basic UE4 guidelines!

So stop running around spouting this nonsense already. It is literally the game, not the card. Simple as that.

@IsraelHP let me know if you want to try the patch yourself. There is either an installer, or you can add the parameters yourself if you are distrustful, which would be understandable. Let me know if you want the link. It worked great on my 3070Ti.

Yes, the game is not 100% optimized.

I just found out that there was a patch for the game last night that apparently fixes some FPS issues and other bugs, though the stuttering issues most people seem to have are still being investigated. I will see what it does when I'm home.

Yep, apparently it works for me, in addition to my clean driver install :)
 
With all respect for HU, and the 'arrogance' to speak my mind: they are just human too. I am not attempting to say or imply that I have more experience or a vaster resource of hands-on knowledge... but I am also aware that new games are often not optimized and have issues (cough, CP2077), and I am old enough to recall when people realized the folly of things such as 60-series Nvidia cards being unable to utilize VRAM that was added purely for marketing purposes.
I didn't call you arrogant and didn't mean to imply that. I just said that I wasn't arrogant enough to think that I know better than Steve Walton, which is why I wasn't questioning his results. I assure you that it wasn't in any way aimed at you.

If you watched the entire video (which is, admittedly, long), I don't think that you would question him either. It's very clear that in Hogsmeade, the VRAM buffers of the RTX 3070 and RTX 3080 are crippling those cards. It's very clear because the RTX 3060 is outperforming both of them by a large margin. There's no way that it's a result of bad optimization, because bad optimization would affect the RTX 3060 just as badly, if not worse, since it's a weaker GPU. The only variable between them that could explain this is the fact that, for reasons I still don't understand, the RTX 3060 has more VRAM than the RTX 3070 and RTX 3080.

There's a clear line drawn in the results that says "Only cards with at least 12GB will survive this," and I don't think that any amount of software optimization can replace a lack of VRAM.
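
For anyone who would rather check this on their own system than argue over benchmarks, here is a minimal sketch of a VRAM logger, assuming the nvidia-ml-py (pynvml) package. Run it in the background while the game stutters and watch whether "used" sits pinned at the card's limit:

import time
import pynvml  # pip install nvidia-ml-py

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(gpu)          # bytes
        util = pynvml.nvmlDeviceGetUtilizationRates(gpu)   # percent
        print(f"VRAM {mem.used / 2**20:6.0f} / {mem.total / 2**20:.0f} MiB | GPU {util.gpu:3d}%")
        time.sleep(1.0)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()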
 
  • Like
Reactions: punkncat
Holy crap, you have NO idea how annoyed I'm getting at this review being thrown around. I literally get TWICE the FPS in RT with my 3070Ti compared to what he gets... and that's comparing his 1080p result to my 1440p!!! Yes, all ultra settings, including RT. No DLSS. No RT also gives me 70+ FPS in Hogsmeade, and more in Hogwarts. He either royally messed up something in his test system, or didn't test with the day-1 patch, or the community patch does more than smooth out frame drops; or all of the above. In any case, something is very different between our systems, to the point that, I say it again, I have twice the FPS at a higher resolution than he has at 1080p with RT. And it's buttery smooth. All I did was install the community patch, which only adds, in a single .ini file, certain UE4 parameters the devs for some reason didn't put in. The issue with this game is NOT VRAM; it is literally the programmers being too incompetent to follow even the most basic UE4 guidelines!

So stop running around spouting this nonsense already. It is literally the game, not the card. Simple as that.
Well, if there is a patch, great. It doesn't change the fact that 8GB of VRAM is not enough for a card as potent as the RTX 3070 Ti, or that 10GB is not enough for a card as potent as the RTX 3080. Hell, the GTX 1080 Ti had 11GB, and so too did the RTX 2080 Ti. If what you say is true (and I'm not one to call someone a liar unless I can prove it), I'm happy about it, because I want people to enjoy gaming.

Going off the deep end as you did, however, is not appropriate for adult conversation. I honestly don't care how "annoyed" you were about it, because all I did was post a video done by a well-known and VERY well-respected benchmarker. Until shown otherwise, I have no choice but to trust his results. I broke no rules posting it, and I've read nothing in the TOS that says:

"If the great and entitled KyaraM should ever (god forbid) be annoyed by something, that user and that user alone is allowed to fly off the deep end."

Just pointing out to me that there is a patch and that it was the devs' fault would've done a better job than flying off the handle like a lunatic and making demands of me that you're in no position to make. It's called being a grownup.
 
Well, if there is a patch, great. It doesn't change the fact that 8GB of VRAM is not enough for a card as potent as the RTX 3070 Ti, or that 10GB is not enough for a card as potent as the RTX 3080. Hell, the GTX 1080 Ti had 11GB, and so too did the RTX 2080 Ti. If what you say is true (and I'm not one to call someone a liar unless I can prove it), I'm happy about it, because I want people to enjoy gaming.

Going off the deep end as you did, however, is not appropriate for adult conversation. I honestly don't care how "annoyed" you were about it, because all I did was post a video done by a well-known and VERY well-respected benchmarker. Until shown otherwise, I have no choice but to trust his results. I broke no rules posting it, and I've read nothing in the TOS that says:

"If the great and entitled KyaraM should ever (god forbid) be annoyed by something, that user and that user alone is allowed to fly off the deep end."

Just pointing out to me that there is a patch and that it was the devs' fault would've done a better job than flying off the handle like a lunatic and making demands of me that you're in no position to make. It's called being a grownup.
Posting a test with very odd results - which the tester himself even admits are very odd! - without questioning it, and ignoring people telling you otherwise (in more than one topic, even), is still a bit short-sighted, considering that there are quite a few videos backing me up that you could have looked up. Some are an entire week old already.

Also, my thoughts on HUB's review: it doesn't matter how reputable someone is. We are all human, and humans make mistakes. There is always the possibility of some weird issue being present in his test system, outside of user error, as well. Still, tests like that paint a very nonsensical picture for a long time afterwards, and that is a problem for future customers, since it distorts reality. But since you say you would or might listen to people who also conducted benchmarks on this and are well-known...

Good thing I come prepared!


It is in German, but I think the graphs themselves speak a clear language, especially together with my own observations, which pretty much align 100% with this. They didn't use Hogsmeade as the testing site here and didn't test the 3070Ti specifically; however, considering that the 3070 with the same amount of VRAM is in there, as is the 2080Ti, and that the 2080Ti also severely underperformed in HUB's benchmark, you can infer a lot from these findings nonetheless. They also tested before the patch, and on the first page of the review a graph shows that the average FPS itself didn't change much.

I will only talk about 1440p here and compare PCGH to my own findings, because to me that is most relevant and it is the testing I personally did, too. Also, first things first: memory use really went down a couple percent for me, especially without RT, so there definitely was an impact from the official patch there, which is also reflected by the comparison of the 6600XT on the first page of the article.

However, the 1080p results are there in the link above for you to see and compare with HUB, and they are quite clear. Also, they didn't use the community patch. I tested the game last night with the new patch at Ultra quality, RT on, RT off, DLSS Quality on, off, and even with some settings adjusted at the end just to see how much they change. I got very similar performance as before, but with smoothed-out framerates. There were still some drops in specific places in Hogsmeade, but they existed with all settings; I think there is something weird in that area and it's not the RT itself that goes wonky there. You just feel it more. Oddly enough, my RT-off, all-settings-Ultra average result got slightly lower, by about 4 FPS, to 54 FPS instead of 58 FPS; however, the drops were cushioned so well that the exact average FPS isn't really relevant here. The minimum was around 43 FPS or something and not very noticeable. These results match PCGH's almost 1:1 for the game. I think what caused it were either differences in conditions (weather etc.) or the fact that the highs were also smoothed out and lowered.

In the RT on, all Ultra, DLSS off run, average FPS was 38, a tiny bit higher than before; most of the time lows dropped to 26 FPS (though that happened rarely; most of the time it went to 31) and highs were at 45. However, there were two drops at the exact same position (and also the same position as without RT) down to 16. Again, this is that specific area next to the entrance to the Three Broomsticks that underperforms on any settings, so yeah. It also doesn't happen every time; it happened about twice in 10 passes, and mostly it's down to 26. PCGH didn't test raw RT performance, but my findings and their findings for RT with DLSS Quality are a different matter.

When turning DLSS on, my FPS went up to about 50-52 FPS on average, which is in line with what PCGH shows for the 3070 and 2080Ti. I never saw less than 33 FPS under any circumstances; highs went to 65 FPS. Looking at the PCGH results, this is very comparable. Now, how does the vaunted RTX 3060 12GB fare? It gets 32.8 FPS on average and 26 FPS lows... with DLSS. Yeah. It outperforms the 3070Ti so hard indeed. Same picture in 1080p. Now, to test against pre-patch results.


Unfortunately, they only tested the 4090 and 7900 XTX here, but the results are quite similar, especially for the 4090. If anything, it was the Radeon card that gained a little from the patch; the GeForce results seem to be much the same.

Lastly, I also tested some settings for myself to see if I could get an average of 60 FPS in Hogsmeade with DLSS and RT. Setting sky quality to Low and turning off RT occlusion, which doesn't do anything visually, is enough to achieve that. I can live with turning off/reducing two useless settings... material quality and particle effects can go to High and fog to Low without issues, for nearly 70 FPS average, without making the game look much worse. Like PCGH, I set the Field of View slider to +20. Also, V-Sync (in-game; otherwise it ran in the driver with G-Sync enabled as well), film grain, and the setting above that, which I forgot, were turned off because they frankly look weird, but they shouldn't have much impact and most people turn them off anyway. So that should still be comparable for most people.
Oh, also, I set the DLSS sharpening slider to 0.06, which restored the looks for me to DLSS-off levels. I don't think there was an impact, and if there was, it was minimal and downwards rather than up.

They also tested different CPUs from different generations, both AMD and Intel, on the next page. They said it's a weird mix because Denuvo was acting up when testing more than 5 systems, but it should still help to gauge your own CPU's capabilities in the game.
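
By the way, for anyone who wants to reproduce these average/1%-low numbers from their own runs, here is a minimal sketch, assuming a PresentMon-style CSV capture with a MsBetweenPresents column ("capture.csv" is a placeholder name):

import csv

# Load per-frame times (milliseconds) from a PresentMon-style capture.
frametimes = []
with open("capture.csv", newline="") as f:
    for row in csv.DictReader(f):
        frametimes.append(float(row["MsBetweenPresents"]))

# Average FPS: total frames over total time.
avg_fps = 1000.0 * len(frametimes) / sum(frametimes)

# 1% low: average FPS over the slowest 1% of frames.
frametimes.sort()
worst = frametimes[-max(1, len(frametimes) // 100):]
low_1pct = 1000.0 * len(worst) / sum(worst)

print(f"avg {avg_fps:.1f} FPS, 1% low {low_1pct:.1f} FPS")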

Ugh, I think that's all for now... this got longer than I thought...

EDIT:
A very quick search on YouTube also shows that my performance, not HUB's, is indeed normal for this card, including this guy benchmarking in Hogsmeade with different settings and resolutions:
https://www.youtube.com/watch?v=mKRXBoyu7MI
 
  • Like
Reactions: AgentBirdnest
To all having this problem with 8GB or less VRAM, here is the real fix:

Install Ascendio 11.1 with RT boost off and other options as you wish.

Then modify Engine.ini as follows (assuming these shorthand names are the usual UE4 texture-streaming cvars, which live under [SystemSettings]):

[SystemSettings]
r.Streaming.LimitPoolSizeToVRam=1
r.Streaming.PoolSize=3072

100+ FPS with no drops on a 3070Ti at 1440p High with RT on and DLSS on Auto.

The game has a bug that floods the VRAM, I think 🤷🏻‍♂️
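
If you'd rather script the edit than do it by hand, a minimal sketch (the path is the usual UE4 layout and is my assumption for this game; back the file up first):

import os

# Usual UE4 user-config location; assumed for Hogwarts Legacy - verify on your install.
ini_path = os.path.expandvars(
    r"%LOCALAPPDATA%\Hogwarts Legacy\Saved\Config\WindowsNoEditor\Engine.ini"
)

tweak = (
    "\n[SystemSettings]\n"
    "r.Streaming.LimitPoolSizeToVRam=1\n"
    "r.Streaming.PoolSize=3072\n"
)

# Append rather than overwrite, so existing settings are kept.
with open(ini_path, "a", encoding="utf-8") as f:
    f.write(tweak)
print("Patched", ini_path)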
 
  • Like
Reactions: KyaraM
To all having this problem with 8GB or less VRAM, here is the real fix:

Install Ascendio 11.1 with RT boost off and other options as you wish.

Then modify Engine.ini as follows (assuming these shorthand names are the usual UE4 texture-streaming cvars, which live under [SystemSettings]):

[SystemSettings]
r.Streaming.LimitPoolSizeToVRam=1
r.Streaming.PoolSize=3072

100+ FPS with no drops on a 3070Ti at 1440p High with RT on and DLSS on Auto.

The game has a bug that floods the VRAM, I think 🤷🏻‍♂️
It does have a VRAM bug, yes. They tackled it a couple of days ago in a big update, explicitly fixing memory overflow. The Ascendio fix improved things for me on my older 3070Ti that I run in my secondary PC, but the update really did the rest. Ascendio is the community patch I mentioned above. I use DLSS Quality, and except in Hogsmeade, where I have somewhat lower FPS on that card (which I attribute to my CPU, considering that it makes no difference whether I use DLSS or not, and the CPU in question is a 12100F), I don't really drop below 60 FPS either, and the game runs pretty smooth.

Improved VRAM usage, especially for video cards with reduced memory.

https://www.gamestar.de/artikel/hogwarts-legacy-update,3391017,seite2.html