And thus the other shoe just dropped: Nvidia is providing "guidance" for its affiliated partners, which I suspect they're required to follow under NDA or lose future access to products. This is why I'm going to wait until Steve at GN or Jay over at J2C does fully independent, non-affiliated, unguided, no-NDA testing. Steve was talking about how their team is looking to buy or borrow cards for the testing and will report back to people about it.
The whole point of me writing about this (I didn't have to) was to disseminate information on the matter. Nvidia hasn't required anything, and as the guidance rightly notes, you can't measure MFG using any other tools right now. We're dependent on software to get this data; otherwise you'd have to use high-speed cameras and spend 10X the effort, for little gain, frankly.
This is no different than AMD's AFMF / AFMF2, incidentally, except the only metrics you can get there have to come from the drivers — not even AMD's own PresentMon branch OCAT gives frametimes with AFMF enabled (last time I checked). It's effectively useless to try to provide proper benchmarks of AFMF2 outside of experiential reporting and video capture, which can turn into a massive time sink.
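To make the software dependency concrete, here's a minimal sketch of how a frame-time capture typically gets reduced to the headline numbers reviewers quote (average FPS and 1% lows). The `MsBetweenPresents` column name is an assumption based on PresentMon 1.x CSV output; newer builds rename several columns, and with frame generation enabled the driver-reported values are exactly the data whose trustworthiness is in question.

```python
import csv
import statistics

def frametime_stats(frametimes_ms):
    """Summarize per-frame times (milliseconds) into the usual
    review metrics: average FPS and 1% low FPS."""
    n = len(frametimes_ms)
    avg_fps = 1000.0 / statistics.mean(frametimes_ms)
    # 1% lows: average FPS over the slowest 1% of frames
    worst = sorted(frametimes_ms, reverse=True)[:max(1, n // 100)]
    low_1pct_fps = 1000.0 / statistics.mean(worst)
    return avg_fps, low_1pct_fps

def load_capture_csv(path, column="MsBetweenPresents"):
    """Pull per-frame times from a PresentMon-style capture CSV.
    NOTE: the column name is an assumption (PresentMon 1.x);
    check your tool's actual output headers."""
    with open(path, newline="") as f:
        return [float(row[column]) for row in csv.DictReader(f)]

# Synthetic demo: 99 smooth ~10 ms frames plus one 50 ms hitch.
demo = [10.0] * 99 + [50.0]
avg, low = frametime_stats(demo)
print(f"avg: {avg:.1f} fps, 1% low: {low:.1f} fps")
```

The point of the demo data is that averages hide hitches: one 50 ms frame barely moves the average (~96 fps) but craters the 1% low to 20 fps, which is why frame-time capture, not an FPS counter, is the standard for this kind of testing.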
GN and J2C don't have a monopoly on information, and they're only independent insofar as getting paid directly from YouTube and others makes you "independent." I consider myself about as independent as I could be. No one at Future ever tells me how I should do my testing; they know enough to realize I know what I'm doing. I provide equal opportunity for AMD, Nvidia, Intel, Asus, MSI, etc. to send me information; what I do with it is my own decision. And the day I'm told I
have to write something positive about a particular company is the day I go looking elsewhere.
There's this mistaken impression among some that "old journalism" is biased and stuff like YouTube isn't, but in my experience a lot of the YouTube stuff ends up being highly opinionated and subjective, and it chases views harder than traditional coverage ever did. It's like 10X more biased. There are good and bad YouTubers for sure, and good and bad traditional coverage, but the idea that a salaried individual writing for a site like Tom's Hardware is somehow more biased is laughable. Maybe I should go full YouTube and get paid 10X as much (if the channel gets big enough...). That's Jarred's 2 Centz, anyway. LOL
This is something Tim from HUB has mentioned consistently regarding frame generation. The most recent example I recall was Stalker 2, where the input was already somewhat slow natively, so using frame generation didn't feel any different from native. This has to add a layer of nightmare for any reviewer, because all you can really do is convey your experience with the technology in a single title rather than offer any sort of general recommendation.
The more I poke at games that use Unreal Engine 5, the more annoyed I get. If you have extreme hardware, it can work well enough, but I look at something like Stalker 2 or MechWarrior 5 Clans and compare that with Indiana Jones and the Great Circle and I'm shocked at how poorly UE5 tends to run. And not just poor performance, but the latency always feels worse in my experience. The games are playable, and humans are able to adapt to the higher latency (console gamers have been doing it for decades), but when you swap between certain engines, the difference in image fidelity, performance, and latency can at times be striking. Or stryking in your case.
😛