So much ignorance in this thread, oh my goodness...
This is a hardware/driver problem with Nvidia and their AIBs (add-in board partners). Period.
New World can be coded like crap, much like any school/uni project can. It should not be able to override or bypass HARDWARE protections. How is that hard to understand?
In Jay's video, he only found a bug with Afterburner running on EVGA cards; he should've been using EVGA's own Precision X1 instead. And no one here has any idea how the game itself was coded. And it shouldn't matter. Power virus or not, a game must not be able to kill a GPU so easily.
Yeah, I think there are a number of people here who lack understanding of how modern computers work. Hardware and its drivers simply shouldn't allow software to exceed the limits the hardware was designed for. If that happens, it's a problem on the hardware manufacturer's end, not the software developer's. If a game manages to make a graphics card exceed its power limits, that's the fault of the hardware maker, either Nvidia or its board partners like EVGA. Even EVGA claimed the problem came down to inadequate soldering on their end, though the solder joints may merely be the point of failure, with some other part of the hardware design at fault for letting power draw reach levels the solder couldn't handle.
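For what it's worth, those limits aren't abstract: the driver exposes them directly through NVML. Here's a minimal sketch, assuming the nvidia-ml-py (pynvml) bindings are installed and the card of interest is GPU index 0, that reads back the power limit the driver is supposed to enforce:

```python
# Query the driver-enforced power limits via NVML.
# Assumes the nvidia-ml-py package (pip install nvidia-ml-py) and an
# Nvidia GPU at index 0; adjust the index on multi-GPU systems.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

# NVML reports all power values in milliwatts; convert to watts.
enforced_w = pynvml.nvmlDeviceGetEnforcedPowerLimit(handle) / 1000.0
default_w = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(handle) / 1000.0
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)

print(f"enforced limit: {enforced_w:.0f} W (default {default_w:.0f} W, "
      f"allowed range {min_mw / 1000.0:.0f}-{max_mw / 1000.0:.0f} W)")

pynvml.nvmlShutdown()
```

Raising that limit, even within the vendor-set range, normally requires elevated privileges, which is exactly the point: an ordinary game running in user space shouldn't be able to push a card past these caps.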
If anything, people should be happy that this faulty hardware got exposed while it is still under warranty, rather than having the same faults triggered by other software a few years down the line. Games are only going to get more demanding over time, and if it's not New World exposing these problems, it's likely to be some other piece of software eventually. I imagine there is already other software out there that can trigger the same issue, but people are more likely to encounter it in this game simply because it's a new release with a large player base, something that owners of a new high-end graphics card are very likely to try.
That's fine for users with unlimited disposable income, but it does nothing to address the immediate issue, and most of us are on a budget. Odd response.
Anyone with limited income shouldn't even be considering an RTX 3080 or 3090, especially given how the pricing and availability of these cards have been since launch. Even someone who managed to snag one at Nvidia's suggested MSRP would still be paying at least $700 for a 3080, $1,200 for a 3080 Ti or $1,500 for a 3090. The 3090 in particular was an incredibly poor-value card at launch, being only slightly faster than the 3080 at more than double the price. And cards of this level aren't even necessary to run any game on the market today. These are luxury items that people pay a premium for to get minor visual improvements, not something anyone needs to play games.
It's probably Nvidia's fault; they likely had to fully open the taps to stay competitive with RDNA2. 😀
That might not be an unreasonable assessment. The power draw of the RTX 3080, 3080 Ti and 3090 is beyond what has been standard for "enthusiast-level" graphics cards for years. Cards in this class have typically topped out around 300 watts, but even the Founders Edition variants carry 320-350 watt TDPs, and many real-world gaming scenarios exceed those levels. Partner cards with heavier factory overclocks can push past 400 watts. The board partners may simply not have had enough experience testing power delivery systems at this level.
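The gap between rated TDP and real-world draw is easy to observe for yourself. A rough sketch along the same lines as above (again assuming pynvml and GPU index 0) that polls board power once a second while a game is running:

```python
# Poll instantaneous board power via NVML, once per second.
# Assumes nvidia-ml-py is installed and the GPU of interest is index 0;
# run alongside a game to see how draw compares to the enforced limit.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
limit_w = pynvml.nvmlDeviceGetEnforcedPowerLimit(handle) / 1000.0

try:
    while True:
        draw_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0  # mW -> W
        flag = "  <-- above enforced limit" if draw_w > limit_w else ""
        print(f"board power: {draw_w:6.1f} W / limit {limit_w:.0f} W{flag}")
        time.sleep(1.0)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```

Keep in mind that one-second polling will miss the millisecond-scale transient spikes that only scope-level measurements catch, which is part of why excursions like these can go unnoticed until something fails.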
And it wouldn't be surprising at all if Nvidia pushed the clock rates for these parts higher than initially planned once they discovered that AMD was actually going to have competitive products at the high end again. The 3080 could easily have been configured as a sub-300 watt part if Nvidia weren't worried about cards like the 6800 XT outperforming it, or Intel's upcoming cards, for that matter.