The infamous GPU stress-testing app's update is in its final beta stage, and version 2.0 will hit general release shortly.
New FurMark Version 2.0 Arrives After 16 Years
> I didn't even realize the 1.x was still relevant.

The thing about FurMark is that it seems to be both highly parallel and 100% compute-bound. So it's basically guaranteed to find a GPU's peak temperature, since there's no bottlenecking on memory, occupancy, or anything else to throttle it.
| Make | Model | GFLOPS | GB/s | FLOPs/byte |
|---|---|---|---|---|
| AMD | RX 6950 XT | 38707 | 576 | 67.2 |
| AMD | RX 7900 XTX | 46690 | 960 | 48.6 |
| Intel | UHD 770 | 825 | 90 | 9.2 |
| Intel | A770 | 17203 | 560 | 30.7 |
| Nvidia | RTX 3090 Ti | 33500 | 1008 | 33.2 |
| Nvidia | RTX 4090 | 73100 | 1008 | 72.5 |
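The last column is just peak compute divided by peak memory bandwidth (GFLOPS ÷ GB/s), i.e. how many FLOPs the GPU can spend per byte of bandwidth before memory becomes the bottleneck. A quick sketch to reproduce the figures (the specs are the ones tabulated above):

```python
# Arithmetic intensity each GPU can sustain before going memory-bound:
# peak GFLOPS / peak GB/s = FLOPs available per byte of bandwidth.
specs = {
    "RX 6950 XT": (38707, 576),
    "RX 7900 XTX": (46690, 960),
    "UHD 770": (825, 90),
    "A770": (17203, 560),
    "RTX 3090 Ti": (33500, 1008),
    "RTX 4090": (73100, 1008),
}

for model, (gflops, gbs) in specs.items():
    print(f"{model:>12}: {gflops / gbs:.1f} FLOPs/byte")
```

A purely compute-bound workload like FurMark never comes near these ratios of memory traffic, which is why it can keep the ALUs pegged on any of these cards.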
> The original FurMark was punishing as hell, can't even imagine 2.0

I'll bet the main difference is Vulkan support. Depending on how well they implemented the OpenGL version, there might not be much performance difference between it and the Vulkan path. If there is, then you're right that Vulkan could pose a more serious fire risk!
> I didn't even realize the 1.x was still relevant.

People used to fry graphics cards with it! In my first two days with the 6700 XT, I used FurMark to burn in the card and to lose some coil whine in the process, because while you can set the resolution and change the FPS behavior, the power is always at max. My card draws a little more than 175 W when gaming; in FurMark it goes way beyond that limit!
> I didn't even realize the 1.x was still relevant.

Most definitely. It still heavily punishes a graphics card; more than once I've found a defective one in a build with it. The card would work fine on Unigine Superposition pretty much indefinitely, but as soon as it ran FurMark it would crap its pants, letting me know that it's not 100% stable.
> How is it that a GPU stress-test from 2007 can still be relevant anyway?

It just does a lot of shader arithmetic. No matter how fast a GPU is, a benchmark like that can saturate it - although you might get absurd framerates, like 900 fps.
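Since the work per frame is fixed, a faster GPU just finishes each frame sooner and the framerate climbs; at 900 fps a frame takes barely over a millisecond, but the ALUs are saturated the whole time. A quick back-of-envelope sketch (the fps values are illustrative, not measurements):

```python
# Frame time shrinks as fps climbs; the GPU stays saturated either way,
# it simply churns through more frames of the same fixed workload.
for fps in (30, 60, 900):
    frame_time_ms = 1000 / fps
    print(f"{fps:>4} fps -> {frame_time_ms:.2f} ms per frame")
```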
> If it was designed to stress GPUs from 2007, then surely it's a cakewalk for 2023 GPUs?

You could also say that about cryptomining. Newer GPUs do it faster, but a GPU can never do it too fast.
> If it's still stressful for 2023 GPUs, then how on Earth did 2007 GPUs not instantly catch fire while rendering at 1 frame per hour?

Because GPUs typically ramp up their fans and then throttle their clock speeds as their temperature increases or their power limit is exceeded.
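That fan-first, throttle-second behavior is a simple control loop. A minimal sketch of the idea, where the temperature limit, fan step, and clock step are made-up illustrative numbers rather than any vendor's actual algorithm:

```python
# Toy model of GPU thermal management: spin the fans up first,
# then shed clock speed if temperature still exceeds the limit.
TEMP_LIMIT_C = 95   # hypothetical throttle point
FAN_MAX_PCT = 100

def manage(temp_c, fan_pct, clock_mhz):
    """Return adjusted (fan_pct, clock_mhz) for one control tick."""
    if temp_c > TEMP_LIMIT_C:
        if fan_pct < FAN_MAX_PCT:
            fan_pct = min(FAN_MAX_PCT, fan_pct + 10)   # fans up first
        else:
            clock_mhz = max(300, clock_mhz - 50)       # then drop clocks
    return fan_pct, clock_mhz

print(manage(100, 50, 2500))   # hot, fans have headroom -> (60, 2500)
print(manage(100, 100, 2500))  # hot, fans maxed -> (100, 2450)
print(manage(80, 100, 2500))   # under limit -> unchanged
```

A 2007 card under FurMark simply sat at its thermal limit with low clocks and screaming fans; it never had to "keep up" with the workload to stay safe.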