News Halo Infinite Benchmarked: Master Chief Eats Tons of VRAM

VforV

Respectable
BANNED
Oct 9, 2019
Our test sequence uses one of the first outdoor areas of the game, cleared of enemies so that we can simply run the same route each time without the threat of dying. Note that performance can be quite a bit higher in other areas of the game.
Yes, performance actually varies so much that I would say your test is really not indicative of how horribly the game runs in the most demanding parts...

Your test, with the older i9-9900K and a less demanding area cleared of enemies, gets this:
[benchmark chart]


And here is Hardware Unboxed on a Ryzen 9 5950X, doing runs in one of the most demanding parts of the game, full of enemies... can you see how big the difference is?


What I don't understand is why the results are reversed: Nvidia is stomping AMD in HUB's test, while in your case AMD is above Nvidia...
 
I've been playing the multiplayer at ultrawide 3440x1440 and it's been great; however, my setup is high-end (5800X + 6800 XT). And I have SAM on, which is supposed to give an 8% boost in performance from what I've seen posted elsewhere.
 

wifiburger

Distinguished
Feb 21, 2016
And... Hardware Unboxed shows better FPS for Nvidia, and I trust Hardware Unboxed way more.

At least Hardware Unboxed explained where they ran the benchmark and showed the run on video.

This is not the first time I've seen crazy FPS numbers on Tom's Hardware in favor of the 6800/6900, but in reality / in other reviews they fall behind the 3080/3090.

Not to mention crap AMD drivers, which are crazy inconsistent with FPS across various games.
 
And... Hardware Unboxed shows better FPS for Nvidia, and I trust Hardware Unboxed way more.

At least Hardware Unboxed explained where they ran the benchmark and showed the run on video.

This is not the first time I've seen crazy FPS numbers on Tom's Hardware in favor of the 6800/6900, but in reality / in other reviews they fall behind the 3080/3090.

Not to mention crap AMD drivers, which are crazy inconsistent with FPS across various games.

You may want to slow down on the fanboying, bruh.

A comment from Steve on the YouTube video.



My performance with SAM on matches what Tom's is showing.

View: https://www.youtube.com/watch?v=gFLrRdU5f-M
 

wifiburger

Distinguished
Feb 21, 2016
Did they use the Windows Store version of the game?

Because the Nvidia driver optimization for the game is not active with the MS Store version!

Good move by AMD, lol.

"...as this is an AMD-promoted game. Sadly, that code was for the Microsoft Store..."
 

Blacksad999

Reputable
Jun 28, 2020
Yes, performance actually varies so much that I would say your test is really not indicative of how horribly the game runs in the most demanding parts...

Your test, with the older i9-9900K and a less demanding area cleared of enemies, gets this:
[benchmark chart]


And here is Hardware Unboxed on a Ryzen 9 5950X, doing runs in one of the most demanding parts of the game, full of enemies... can you see how big the difference is?


What I don't understand is why the results are reversed: Nvidia is stomping AMD in HUB's test, while in your case AMD is above Nvidia...

Yeah, I was curious about this myself. All other benchmarks I've seen from reputable places largely have the same results as HWU, with Nvidia ahead by a fair margin.
 
Yes, performance actually varies so much that I would say your test is really not indicative of how horribly the game runs in the most demanding parts...

Your test, with the older i9-9900K and a less demanding area cleared of enemies, gets this:

And here is Hardware Unboxed on a Ryzen 9 5950X, doing runs in one of the most demanding parts of the game, full of enemies... can you see how big the difference is?

What I don't understand is why the results are reversed: Nvidia is stomping AMD in HUB's test, while in your case AMD is above Nvidia...
I did have ReBar enabled, and while having enemies and fighting them can reduce performance, that also introduces more variability and death. Ultimately, I pick an area of the game I can test repeatedly in as efficient a manner as possible. Testing Battlefield 2042 was an absolute nightmare, even with bots, so I wanted to avoid that this round. We're not the first to post results, but other sites use different hardware, different settings, different test areas. They're all a valid snapshot of what you're likely to see.
It would be greatly appreciated if the 3070 featured in these benchmarks. It's notable by omission. Are you guys missing a test card?
I grabbed representative cards where it's pretty easy to fill in the gaps.

RTX 3090, skip the 3080 Ti and 3080, RTX 3070 Ti, skip the 3070 and 3060 Ti, RTX 3060, skip most of the RTX 20 series (because it overlaps the 30-series), but include the RTX 2060. That's all. Then I matched up similar AMD cards (I skipped the 6800 and 6800 XT but included the 6700 XT, mostly to see if 12GB did fine for AMD, and it did). Every card takes more time to test, and you eventually hit saturation levels where more data doesn't really tell you much. It would be potentially more useful to test other areas of the game, or multiplayer, rather than adding more GPUs (IMO).
 

Blacksad999

Reputable
Jun 28, 2020
I did have ReBar enabled, and while having enemies and fighting them can reduce performance, that also introduces more variability and death. Ultimately, I pick an area of the game I can test repeatedly in as efficient a manner as possible. Testing Battlefield 2042 was an absolute nightmare, even with bots, so I wanted to avoid that this round. We're not the first to post results, but other sites use different hardware, different settings, different test areas. They're all a valid snapshot of what you're likely to see.

I grabbed representative cards where it's pretty easy to fill in the gaps.

RTX 3090, skip the 3080 Ti and 3080, RTX 3070 Ti, skip the 3070 and 3060 Ti, RTX 3060, skip most of the RTX 20 series (because it overlaps the 30-series), but include the RTX 2060. That's all. Then I matched up similar AMD cards (I skipped the 6800 and 6800 XT but included the 6700 XT, mostly to see if 12GB did fine for AMD, and it did). Every card takes more time to test, and you eventually hit saturation levels where more data doesn't really tell you much. It would be potentially more useful to test other areas of the game, or multiplayer, rather than adding more GPUs (IMO).

Did you use the Gamepass version for testing? It's not using the updated profile for Nvidia that the Steam version is using, which obviously would change the results.
 

wifiburger

Distinguished
Feb 21, 2016
Did you use the Gamepass version for testing? It's not using the updated profile for Nvidia that the Steam version is using, which obviously would change the results.

I asked the same question earlier. I mean, he even said he killed all the enemies so he could go back to the start of the map to record FPS.

That's the biggest fail ever as a reviewer. How is removing 50+ enemies from the map indicative of GPU performance numbers?
 
Did you use the Gamepass version for testing? It's not using the updated profile for Nvidia that the Steam version is using, which obviously would change the results.
As noted in the article, I got a game code from AMD for the Microsoft Store version. It was not Gamepass, but the full permanent version of the game. I don’t know if that makes a difference. If the MS Store version runs worse because of a lack of driver optimizations, that’s what I tested. As noted in the text, I also used resolution scaling rather than changing the desktop resolution, which also impacts performance. Overall, the results are valid for what they show, but other versions of the game may change performance. That's always the case for driver updates and game patches.
 
I asked the same question earlier. I mean, he even said he killed all the enemies so he could go back to the start of the map to record FPS.

That's the biggest fail ever as a reviewer. How is removing 50+ enemies from the map indicative of GPU performance numbers?
How is running the same path but with enemies in different places, with varying explosions and other effects, representative of performance? You could have good runs and bad runs and a margin of error of 5% or more. My testing was <1% deviation between runs. If you could run the exact same test every time but with enemies, performance would be lower but should scale in the same way.

I list the specifics of how I test so that people can understand what was done. There are infinite ways to conduct tests, and no one single way is perfect. This is not a failure on the part of my testing, it’s just the way things are.

I can tell you that I went into this with no expectations and I didn’t try to make AMD look good or bad, nor did I try to make Nvidia look good or bad. I found an area with typically lower performance than some of the early zones, then confirmed results were repeatable, and then commenced testing the eight GPUs. It’s what I do with any game, including those with a built-in benchmark. Check your own bias at the door.


EDIT: For the record, I just ran a longer test sequence, running around the test area WITH and WITHOUT enemies. (I had actually checked this before, but didn't keep the results.) This was all at 1080p ultra on an RTX 3090. Here are the full results, including percentiles:

[results table attachment]

You'll note that the 99th percentile FPS is lower than in my original testing on the first test run, but that's because it was the first run of the level -- caching of data comes into play. The second test (Clear 2) matches closely on 99th percentile, but both runs have a higher average FPS because I ran through some less complex areas over the course of 90 seconds (vs. 30 seconds of only the more demanding area). Now look at the battle runs. I died on the second and had to cut that short, but instead of a <1% variation on average fps, it's now 3% and there's a 15% spread in 99th percentile fps. Lots of active enemies and fighting to not die means things are less controlled. However, the average FPS of the "battle" runs compared to my original data is within 3% if I exclude the (2) "death run".

Bottom line is that if you think the testing I did is total fail, because there weren't enemies on the level, this proves how little you know about the way games work. 90% of the graphics workload tends to be pretty much static, rendering the level and all the other stuff. The last 10% can include rendering other NPCs and such, but it usually doesn't change things more than a few percent. So why did I test without enemies? And why did I fully disclose the way I test? Just to make sure people understood how benchmarking works. As one final example, I did a test on the first part of the game, running through the doomed starship. The average FPS for that sequence was 172.3, with a 113.0 99th percentile FPS. Testing in that area was far less demanding, but it would be equally valid because it still represents part of the game. I use areas that tend to be lower in FPS just so people don't get their hopes up.
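For anyone who wants to sanity check numbers like these against their own runs, here's a minimal sketch of how average FPS, 99th percentile FPS, and run-to-run spread can be derived from a frame-time log. It's illustrative only, not the actual capture tooling used for the article, and the frame-time values in it are made up:

```python
# Minimal sketch (not the actual capture tooling used for the article): derive
# average FPS, 99th percentile FPS, and run-to-run spread from per-frame render
# times in milliseconds. Real captures would come from a tool like PresentMon,
# OCAT, or CapFrameX; the frame-time values below are made up for illustration.
import math

def percentile(values, pct):
    """Nearest-rank percentile: smallest value with at least pct% of samples at or below it."""
    ordered = sorted(values)
    rank = math.ceil(pct / 100.0 * len(ordered))
    return ordered[max(rank - 1, 0)]

def summarize_run(frame_times_ms):
    """Return (average FPS, 99th percentile FPS) for one benchmark run."""
    # Average FPS = total frames rendered / total seconds elapsed.
    avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
    # 99th percentile FPS comes from the slowest ~1% of frames: take the 99th
    # percentile frame time, then convert that frame time back to FPS.
    p99_fps = 1000.0 / percentile(frame_times_ms, 99)
    return avg_fps, p99_fps

def spread_percent(values):
    """Percent spread between the best and worst value across runs."""
    return (max(values) - min(values)) / min(values) * 100.0

# Three hypothetical passes over the same test route (frame times in ms):
runs = [
    [8.3, 8.1, 9.0, 8.6, 12.4, 8.2, 8.8, 8.5],
    [8.4, 8.2, 8.9, 8.7, 11.9, 8.3, 8.6, 8.4],
    [8.2, 8.0, 9.1, 8.5, 12.8, 8.1, 8.9, 8.6],
]
per_run = [summarize_run(r) for r in runs]
for i, (avg, p99) in enumerate(per_run, 1):
    print(f"Run {i}: {avg:.1f} fps average, {p99:.1f} fps 99th percentile")
print(f"Average-FPS spread across runs: {spread_percent([avg for avg, _ in per_run]):.1f}%")
```

Feed it frame times exported from a capture tool and you can see for yourself how the spread widens when a run includes combat or a death, just like the battle runs above.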
 

Arbie

Distinguished
Oct 8, 2007
On the game itself, for those getting interested: It's called Halo Infinite because you run through infinite numbers of identical ship levels, shooting almost interchangeable enemies with almost interchangeable weapons, all vanilla in flavor and effect. So, if those ship levels aren't infinite, at least they feel like it. I was bored in an hour. It's like playing the original Doom with much better graphics but less character. There are many games just like this now; all dreck.

Try Zombie Army 4 for hugely more fun. Or Bulletstorm, or the recent Wolfensteins.
 

blppt

Distinguished
Jun 6, 2008
I'm actually impressed with the 6900XT here. If memory serves, this is the second game where it outperforms the 3090 (Hitman 3 being the other) at 4k, and that is even more impressive when you consider that the 6900XT has less memory bandwidth than the 3090.
 

VforV

Respectable
BANNED
Oct 9, 2019
I did have ReBar enabled, and while having enemies and fighting them can reduce performance, that also introduces more variability and death. Ultimately, I pick an area of the game I can test repeatedly in as efficient a manner as possible. Testing Battlefield 2042 was an absolute nightmare, even with bots, so I wanted to avoid that this round. We're not the first to post results, but other sites use different hardware, different settings, different test areas. They're all a valid snapshot of what you're likely to see.
Yeah, ReBar is one difference, then the PC specs, especially the CPU, and the most important one: the section of the game tested and how it was tested.

You say your test is valid, but for me it is not. Why? Because I care about the worst-case scenario, the most demanding part of the game as it is when I'm playing, with the enemies and all that jazz, not a controlled, sterile environment. And it shows the difference...

The truth is HUB made the best benchmark (without ReBar; Steve said a new test is coming with it ON, and he did not use the MS Store version, which is the worst, as he specifically says), because it's the closest representation of what we, all gamers, get in those demanding parts of the campaign. Look at his video and how he did the run: set on easy, running through all the enemies in that section, without dying. There is something to learn from that.

I don't care if I have 100 fps or 120 fps in less demanding zones, but I do care if I get 45 fps or 60 fps in the most demanding parts of the game. It's the difference between having smooth gameplay and not.

I'm sorry, but not all tests are equal, and not all testers are either; some are better and some need to get better. And you can do better, so that in the future your tests are better too and your work is not in vain. It's just the cold, hard truth.
 

omega215d

Distinguished
Aug 13, 2014
And... Hardware Unboxed shows better FPS for Nvidia, and I trust Hardware Unboxed way more.

At least Hardware Unboxed explained where they ran the benchmark and showed the run on video.

This is not the first time I've seen crazy FPS numbers on Tom's Hardware in favor of the 6800/6900, but in reality / in other reviews they fall behind the 3080/3090.

Not to mention crap AMD drivers, which are crazy inconsistent with FPS across various games.

And then you fanboys get wrecked.
View: https://www.youtube.com/watch?v=gr9TORbHA2M


Tom's Hardware has been giving Nvidia high marks for their GPUs, so stop accusing them of being biased in AMD's favor with this one.
 
I'm actually impressed with the 6900XT here. If memory serves, this is the second game where it outperforms the 3090 (Hitman 3 being the other) at 4k, and that is even more impressive when you consider that the 6900XT has less memory bandwidth than the 3090.

AMD has always been faster in Halo games. I think it might have to do with the game engine itself liking AMD's architecture more. Also, some games are more "sensitive" to bandwidth than others. You can see how cards like the 6600 XT act up when moving up in resolution. In some games, at 1440p, cards that have more bandwidth, like the RX 5700 XT or RTX 3060, can close their gap with the RX 6600 XT.
 

wifiburger

Distinguished
Feb 21, 2016
And then you fanboys get wrecked.
View: https://www.youtube.com/watch?v=gr9TORbHA2M


Tom's Hardware has been giving Nvidia high marks for their GPUs, so stop accusing them of being biased in AMD's favor with this one.
I personally don't care, and most won't care, because AMD GPUs are notorious for pushing high FPS in certain less demanding scenes of certain games, which doesn't represent real gameplay but looks good on FPS charts!

And... here Steve didn't actually enable ReBar; it needs Nvidia Inspector, the Halo profile, and changing 3 flags...