News Diablo IV PC Performance: We're Testing a Bunch of GPUs

Okay, I'm done for now! All the charts have been updated, and I reran some tests as well. DLSS 3 is mostly a bust, though! LOL. It increases minimum fps but often drops the average fps. Oops!

No RT at launch? Boo!

Looking forward to playing when I get home in a few hours... as said upthread my first journey into the Diablo universe since the original in 1997. 🤣 I never played D2 or D3.
 
Pretty much. I always get a laugh out of the "I'm building a PC for WoW what kind of hardware do I need?" threads...

Ummm... anything made in the last 20 years? 🤣 🤣 🤣
I ran the initial D4 beta on a Xeon E3-1245 CPU, 16GB of memory, and a GT 1030.

I got 30 fps at 1080p low.
 
How do the 4070TI and 7900XT fare? I'm buying one of these two GPUs this week, and if the 7900XT is better, I might actually go that route.
The 7900 XT can run the game at 1440p at 144 Hz with max settings.
I'm not sure about 4K, but it's probably on a chart there.

Is the testing using the high res texture pack?

I know it's not part of the test, but memory usage would be interesting. I'm curious about page file usage, really, as I think this game is leaking all over the place.
 
I noticed a memory leak looking in HWiNFO, as it tracks minimum/maximum usage of virtual memory. I hadn't noticed anything in the actual game itself, but maybe I didn't play long enough.
 
I only hit 90% today, although I haven't played long.

(attached screenshot: memory usage figures)

Those figures tell me it's not RAM but the page file it's using. Could explain why my NVMe runs at 60C while I'm playing.
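If anyone wants to check that theory with numbers instead of eyeballing HWiNFO, here's a minimal sketch (Python with psutil; the "Diablo IV.exe" process name is an assumption) that logs the game's working set versus commit charge every 30 seconds. If the commit column keeps climbing while the working set stays flat, it's the page file growing, not RAM:

```python
# Rough, hypothetical sketch for checking the "page file, not RAM" theory:
# log the game's working set vs. commit charge over time with psutil
# (pip install psutil). The process name "Diablo IV.exe" is an assumption.
import time
import psutil

PROCESS_NAME = "Diablo IV.exe"  # assumed executable name; adjust to match yours

def find_game():
    """Return the first running process whose name matches PROCESS_NAME."""
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] == PROCESS_NAME:
            return proc
    return None

def main():
    game = find_game()
    if game is None:
        print(f"{PROCESS_NAME} not found; start the game first.")
        return
    print("time_s, working_set_GB, commit_GB")
    start = time.time()
    while game.is_running():
        mem = game.memory_info()
        # On Windows, rss is roughly the working set (physical RAM) and vms maps
        # to the commit charge, which is what spills to the page file under pressure.
        print(f"{time.time() - start:7.0f}, {mem.rss / 1e9:6.2f}, {mem.vms / 1e9:6.2f}")
        time.sleep(30)

if __name__ == "__main__":
    main()
```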
 
Funny no one talks about it. If it was AMD it would be front page news. One would think with so much of the market being Nvidia it would be investigated.
They also ignore that AMD has the fastest CPUs across the board and will continue to use Intel CPUs for all their testing purposes every year.
Also if there's a negative piece of news to be had about AMD they'll make sure to blow it up on their front page.
 
It's a shame it doesn't work very well without a constant internet connection, considering it's a single-player game with a few multiplayer enhancements.

Agreed, I get server-based authentication/validation for multiplayer to avoid cheating.

But today it's all just data mining of solo-player customers for profit. No intrinsic value to the user. The good old days of single-player offline play are missed indeed.
 
Funny no one talks about it. If it was AMD it would be front page news. One would think with so much of the market being Nvidia it would be investigated.

They also ignore that AMD has the fastest CPUs across the board and will continue to use Intel CPUs for all their testing purposes every year.
Also if there's a negative piece of news to be had about AMD they'll make sure to blow it up on their front page.

Go Team Red! 👍👍
 
They also ignore that AMD has the fastest CPUs across the board and will continue to use Intel CPUs for all their testing purposes every year.
Also if there's a negative piece of news to be had about AMD they'll make sure to blow it up on their front page.
No, AMD CPUs are well represented on our best CPUs page. For GPU testing, I went with Intel because it was the fastest at the time. Non-X3D didn’t make sense for switching the test PC again. And while the 7000-series X3D chips are now faster overall, retesting 50-ish GPUs on a new platform is not something I want or plan to do. I’ll have to see what Zen 5 and Lunar Lake look like to decide if it’s time to update again next year, but a 5-10 percent delta at 1080p isn’t enough to start over on testing.

I stayed with i9-9900K for two years for the same reason, 10900K and 11900K weren’t enough to warrant a PC change, and neither was Ryzen 3000/5000. At least, not from the perspective of seeing major changes. RTX 40-series was enough to warrant a switch to reduce CPU bottlenecks last year, and 13900K was the best at the time.
 
No, AMD CPUs are well represented on our best CPUs page.
👍 👍

Let me just say I'm having a blast with D4... Blizzard and their dungeon crawlers always seem to hook me. First WoW... and now D4.

It's just unfortunate I have to go to work in a couple hours but there's always tomorrow. 🤣
 
I get 4K @ 108 fps on a stock-clocked RTX 3080 Ti without DLSS. With DLSS Quality I'm at 4K @ 160 fps. Basically power limited at 350 watts. These are maximums; the 99th percentile drops a bit. This is a 10900K system. I can't compare averages, as I don't know what they ran as a demo.

Basically I am locked at 4K60; it never drops below 60 fps, with or without DLSS. I can turn on DLAA and still get 4K60.

AMD cards are not going to be happy with FSR 2. It's badly blurred compared to DLSS Performance mode. With DLSS you get a near-native-looking picture. So the FSR 2 results look good fps-wise, but the image quality is not as good, even on the highest quality settings.

https://youtu.be/tZ-uvHKR-aE


Here in this video the RTX 3080 Ti is averaging 108 fps in the beta, at 4K, with max background FPS at 60 (same as my system) and DLSS off. With frame generation, the RTX 40-series cards are the fastest and smoothest-playing cards in this game.
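For anyone trying to square maximums and percentile drops with the averages in the article, here's a quick sketch of how average fps and 1% lows are usually derived from a frame-time log (tools like PresentMon or CapFrameX export one; the exact convention varies by tool, and this version just averages the slowest 1% of frames):

```python
# Hypothetical sketch: turn a frame-time log (milliseconds) into the average-fps
# and 1%-low figures benchmarks usually quote. Conventions differ between tools;
# this version averages the slowest 1% of frames.
def fps_metrics(frame_times_ms):
    total_s = sum(frame_times_ms) / 1000.0
    avg_fps = len(frame_times_ms) / total_s
    slowest = sorted(frame_times_ms, reverse=True)
    worst_1pct = slowest[: max(1, len(slowest) // 100)]
    low_1pct_fps = 1000.0 / (sum(worst_1pct) / len(worst_1pct))
    return avg_fps, low_1pct_fps

# Example: mostly ~9.3 ms frames (~108 fps) with a few 20 ms hitches.
sample = [9.3] * 990 + [20.0] * 10
print(fps_metrics(sample))  # roughly 106 fps average, 50 fps 1% lows
```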
 
No, AMD CPUs are well represented on our best CPUs page. For GPU testing, I went with Intel because it was the fastest at the time. Non-X3D didn’t make sense for switching the test PC again. And while the 7000-series X3D chips are now faster overall, retesting 50-ish GPUs on a new platform is not something I want or plan to do. I’ll have to see what Zen 5 and Lunar Lake look like to decide if it’s time to update again next year, but a 5-10 percent delta at 1080p isn’t enough to start over on testing.

I stayed with i9-9900K for two years for the same reason, 10900K and 11900K weren’t enough to warrant a PC change, and neither was Ryzen 3000/5000. At least, not from the perspective of seeing major changes. RTX 40-series was enough to warrant a switch to reduce CPU bottlenecks last year, and 13900K was the best at the time.
That's too many paragraphs to say you make sure to set the conditions in such a way that there will always be an "Intel Inside" your systems. You had the 9900K, then the 12900K, and now the 13900K in your test systems; are you gonna tell me it's a coincidence that AMD was never featured?
 
That's too many paragraphs to say you make sure to set the conditions in such a way that there will always be an "Intel Inside" your systems. You had the 9900K, then the 12900K, and now the 13900K in your test systems; are you gonna tell me it's a coincidence that AMD was never featured?
There are still websites using the AMD 7700X as their test CPU. Hell, one website I know of still uses the 10900K/11900K. Others use 5900X/5950X CPUs. Once they change the CPU, they have to redo all the tests for 50-odd GPUs.

Hell, one website just added the new test PC results in with the old test PC results because they don't care.

Be thankful Tom's Hardware does care.
 
Be thankful Tom's Hardware does care.
I don't much care what brand CPU is used in testing... the high end processors are close enough in performance that you can get a pretty good idea of where you stand no matter what CPU you are running.
 
I don't much care what brand CPU is used in testing... the high end processors are close enough in performance that you can get a pretty good idea of where you stand no matter what CPU you are running.
In some games one CPU does better than others. Some reviews use this to make a GPU look like it performs better.

Take the review "Ryzen 7 5800X3D vs. Core i9-12900K in 40 Games." We see that both CPUs are equal, within 1% of each other. Both systems use the RAM they would likely be paired with for gaming.

On average in our test suite at 1080p, the 5800X3D is ~9% faster than the 12900K which costs 30% more, and ~7% faster than the Core i9-12900KS which costs a whopping 64% more. The 5800X3D even manages to carve out a 3% lead in the face of the heavily-overclocked 12900KS, making it both the fastest gaming chip in our test suite and a better value for gaming than the Core i9 and Core i7 models. Source
The difference is down to the RAM each system uses. Here both systems use DDR4.

Another way to make CPUs look better is to lower the resolution so the cache improves performance.

480p-720p

Now onto the GPUs, 50 Game Benchmark: RTX 3080 12GB vs. RX 6900 XT.

Thing is though, it's the same story when looking at rasterization performance. Surely they trade blows, but overall it's unlikely you'd be able to spot a difference between these two products when actually gaming.

Playing games at 4K, the 3080 was 5% faster on average, and typically we deem anything 5% or less to be a tie.

So with the SAM reviews, forget that Intel was faster at the time: the test system is now AMD, with maximum overclocks on the memory side, in games that benefit from SAM. AMD was also overclocking its own CPUs via drivers in systems with an AMD GPU.
 
If you think "We're Testing a Bunch of GPUs" is a pointless comment, then you really need to think about what you are publishing here.
I would have been wise to include a /sarc tag. This discussion has blown up quite a bit.

In any case, as has been noted more than once I think, this was "testing in progress." The article I replied to is not the article as it exists now, even the headline is changed.

One commenter early on even asked, "where are the benchmarks?"