Well, if there is a patch, great. It doesn't change the fact that 8GB of VRAM is not enough for a card as potent as an RTX 3070 Ti, or that 10GB is not enough for a card as potent as an RTX 3080. Hell, the GTX 1080 Ti had 11GB, and so did the RTX 2080 Ti. If what you say is true (and I'm not one to call someone a liar unless I can prove it), I'm happy about it, because I want people to enjoy gaming.
Going off the deep end as you did, however, is not appropriate for adult conversation. I honestly don't care how "annoyed" you were about it, because all I did was post a video made by a well-known and VERY well-respected benchmarker. Until shown otherwise, I have no choice but to trust his results. I broke no rules posting it, and I've read nothing in the TOS that says:
"If the great and entitled KyaraM should ever (god forbid) be annoyed by something, that user and that user alone is allowed to fly off the deep end end."
Just pointing out to me that there is a patch and that it came from the devs would have done a better job than flying off the handle like a lunatic and making demands of me that you're in no position to make. It's called being a grownup.
Taking a test with very odd results - which the tester himself admits are very odd! - at face value without questioning it, while ignoring people telling you otherwise in more than one thread, is still a bit short-sighted considering there are quite a few videos backing me up that you could have looked up. Some are already a week old.
Also, my thoughts on HUB's review: it doesn't matter how reputable someone is. We are all human, and humans make mistakes. There is also always the possibility of some weird issue in his test system outside of user error. Still, tests like that paint a very nonsensical picture for a long time afterwards, and that is a problem for future customers since it distorts reality. But since you say you would, or might, listen to people who also ran benchmarks on this and are well-known...
Good thing I come prepared!
On this page of our Hogwarts Legacy tech review, we show you our benchmarks with Radeon and GeForce graphics cards
www.pcgameshardware.de
It is in German, but I think the graphs speak for themselves, especially together with my own observations, which align pretty much 100% with this. They didn't use Hogsmeade as the testing site here and didn't test the 3070 Ti specifically; however, considering that the 3070 with the same amount of VRAM is in there, as is the 2080 Ti, and that the 2080 Ti also severely underperformed in HUB's benchmark, you can infer a lot from these findings nonetheless. They also tested before the patch, and a graph on the first page of the review shows that the average FPS itself didn't change much.
I will only talk about 1440p here and compare PCGH to my own findings, because to me that is most relevant and it is the testing I personally did, too. Also, first things first: memory use really did go down a couple of percent for me, especially without RT, so there definitely was an impact from the official patch there, which is also reflected in the 6600 XT comparison on the first page of the article.
However, the 1080p results are in the link above for you to see and compare with HUB, and they are quite clear. Also, they didn't use the community patch. I tested the game last night with the new patch at Ultra quality, RT on and off, DLSS Quality on and off, and even with some settings adjusted at the end just to see how much they change. I got very similar performance as before, but with smoothed-out framerates. There were still some drops in specific places in Hogsmeade, but they existed with all settings, and I think there is something weird in that area and it's not RT itself that goes wonky there; you just feel it more. Oddly enough, my RT-off, all-settings-Ultra average dropped slightly by about 4 FPS, to 54 FPS instead of 58 FPS; however, the drops were cushioned so well that the exact average isn't really relevant here. It was around 43 FPS minimum or something and not very noticeable. These results match PCGH's almost 1:1 for this game. I think what caused it were either differences in conditions (weather etc.) or the fact that the highs were also smoothed out and lowered.
In the RT on, all Ultra, DLSS off run, average FPS was 38, a tiny bit higher than before; lows dropped to 26 FPS (though that happened rarely, most of the time it only went to 31) and highs were at 45. However, there were two drops at the exact same position (and also the same position as without RT) down to 16. Again, this is that specific area next to the entrance to the Three Broomsticks that underperforms on any settings, so yeah. It also doesn't happen every time; it occurred about twice in 10 passes, and mostly it's down to 26. PCGH didn't test raw RT performance, but my findings and their findings for RT with DLSS Quality are a different matter.
When turning DLSS on, my FPS went up to about 50-52 on average, which is in line with what PCGH shows for the 3070 and 2080 Ti. I never saw less than 33 FPS under any circumstances, and highs went to 65 FPS. Looking at the PCGH results, this is very comparable. Now, how does the vaunted RTX 3060 12GB fare? It gets 32.8 FPS on average and 26 FPS lows... with DLSS. Yeah. It outperforms the 3070 Ti so hard indeed. Same picture in 1080p. Now, to test against pre-patch results.
Hogwarts Legacy releases soon, and PCGH checks which hardware is really needed and how good the game already looks and runs.
www.pcgameshardware.de
Unfortunately, they only tested the 4090 and the 7900 XTX here, but the results are quite similar, especially for the 4090. If anything, it was the Radeon card that gained a little from the patch; the GeForce results seem to be more or less the same.
Lastly, I also tested some settings for myself to see if I could get an average of 60 FPS in Hogsmeade with DLSS and RT. Setting sky quality to Low and turning off RT occlusion, which doesn't do anything visually, is enough to achieve that. I can live with turning off/reducing two useless settings... material quality and particle effects can go to High and fog to Low without issues for nearly 70 FPS average, without making the game look much worse. Like PCGH, I set the field-of-view slider to +20. Also, V-Sync (the in-game one; otherwise it ran through the driver with G-Sync enabled as well), film grain, and the setting above it, whose name I forgot, were turned off because they frankly look weird, but they shouldn't have much impact and most people turn them off anyway. So that should still be comparable for most people.
Oh, also, I set the DLSS sharpening slider to 0.06, which restored the look for me to DLSS-off levels. I don't think there was an impact, and if there was, it was minimal and downwards rather than up.
They also tested different CPUs from different generations, both AMD and Intel, on the next page. It's a weird mix, which they said was because Denuvo was acting up when testing more than five systems, but it should still help you gauge your own CPU's capabilities in the game.
Ugh, I think that's all for now... this got longer than I thought...
EDIT:
A very quick search on YouTube also shows that my performance, not HUB's, is indeed normal for this card, including this guy benchmarking in Hogsmeade with different settings and resolutions:
https://www.youtube.com/watch?v=mKRXBoyu7MI