> Just change the names 😛

Names of dinosaurs would be a better choice this time around.
Aidivn HForce TheBigOne
Aidivn HForce NumberTwo
Aidivn HForce CheaperthanNumberTwo
Aidivn HForce AffordableOne
That depends on what fps you're talking about.
150 to 155 fps? Meh. 30 to 35 fps? That would be significant.
As InvalidError points out, it's mostly the 4GB 5500 XT cards that saw large improvements in performance from PCIe Gen4, because the lack of VRAM leads to situations where more data has to go over the PCIe bus ... plus it's limited to an x8 link width. I don't think anyone has shown games where Gen3 vs. Gen4 x16 slots result in more than a 1-2% improvement, and even then that's only if you're not fully GPU bottlenecked.
Shouldn't be much of an issue for GPUs, since NVMe at 4.0 x4 is 8 GB/s, only half of 3.0 x16's 16 GB/s. I think it'll start to become significantly more important once the DirectStorage API (or RTX IO, in Nvidia's branding) is put into the mix.
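Those bandwidth figures follow from PCIe's per-lane signaling rates. A quick back-of-the-envelope sketch, using the commonly quoted approximate per-lane throughputs after 128b/130b encoding overhead (the helper function here is just for illustration):

```python
# Approximate effective PCIe bandwidth per lane, in GB/s, after 128b/130b encoding:
# Gen3: 8 GT/s  -> ~0.985 GB/s per lane
# Gen4: 16 GT/s -> ~1.969 GB/s per lane
PER_LANE_GBPS = {3: 0.985, 4: 1.969}

def pcie_bandwidth(gen: int, lanes: int) -> float:
    """Approximate one-direction bandwidth in GB/s for a PCIe link."""
    return PER_LANE_GBPS[gen] * lanes

print(f"PCIe 3.0 x16: {pcie_bandwidth(3, 16):.1f} GB/s")  # ~15.8 GB/s, the ~16 GB/s cited
print(f"PCIe 4.0 x4:  {pcie_bandwidth(4, 4):.1f} GB/s")   # ~7.9 GB/s, the ~8 GB/s cited
```

So a Gen4 x4 NVMe drive can't even saturate half of a Gen3 x16 GPU slot, which is why storage alone isn't a reason to upgrade the GPU link.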
> wouldn't being able to limit the pcie slot in the bios be able to balance the tests on a ryzen platform?

I am! But obviously anyone buying this card hopefully isn't focused on 1080p gaming. Even the 2080 Ti was basically overkill at 1080p (unless you enable ray tracing). As for PCIe Gen3 vs. Gen4, I'm planning on testing with the Ryzen 9 3900X and Gen4, but that's a different platform and CPU than the i9-9900K and Gen3. It might make a difference at 4K, though. Maybe? Like, when you're GPU limited at 4K, the added PCIe bandwidth might be useful. Next week ... start the 7-day countdown to launch day! 😀
> When you do the benchmarks with an RTX 3080 (and later, when you get the 3070 and 3060), please consider using 1440p 144Hz with G-SYNC.

Why? Using whatever-sync introduces a handful of extra variables to test repeatability that you don't have to worry about with vsync off.
> Forget Crysis Remastered. Everyone just wants to see an MSFS 2020 benchmark.

Why? M$ FS 2020 is not CPU or GPU bound; it is DirectX 11 bound. Until M$ and the company they worked with get their act together, there's no need to worry about it. Now, seeing if one of the new CPUs with built-in graphics can handle Crysis Remastered, and comparing it to a system with the new GPUs, would be nice - sweet.
> Now seeing if one of the new CPUs with graphics built in can handle Crysis Remastered will be nice, and compare it to a system with the new GPUs - sweet.

Will it run Crysis Remastered on "Will it run Crysis?" settings?
> No USB-C on this one?

VirtualLink has been discontinued.
> VirtualLink has been discontinued.

Alt-mode sickness. Pushing 50 different standards down a single plug is madness and asking for trouble. Now we have a "universal" standard where some ports and cables are more universal than others, so you have to keep tabs on which non-universal 'universal' port goes with which non-universal 'universal' cable to get things to work right.