News Diablo IV PC Performance: We're Testing a Bunch of GPUs


Deleted member 2838871

Guest
The game is well designed, looks good, and everyone can play at decent settings. Later they will add ray tracing. A stark contrast to console ports.

I’m having a blast. I’ve got a few hours into the game and just got level 16 playing a Sorcerer.

It definitely has a big WoW feel to it which is fine. I was a big caster player in WoW (mage/priest/warlock) so a lot of the talents and gameplay are reminding me of it.

Necromancer is definitely the next character… it’s just unfortunate work and real life have to get in the way. 😂😂
 

zx128k

Reputable
I’m having a blast. I’ve got a few hours into the game and just got level 16 playing a Sorcerer.

It definitely has a big WoW feel to it which is fine. I was a big caster player in WoW (mage/priest/warlock) so a lot of the talents and gameplay are reminding me of it.

Necromancer is definitely the next character… it’s just unfortunate work and real life have to get in the way. 😂😂
I'm a Sorcerer as well. Currently near level 50.
 

Thunder64

Distinguished
Mar 8, 2016
114
161
18,760
That's a lot of paragraphs to say you set the conditions so there will always be an "Intel Inside" your test systems. You had the 9900K, then the 12900K, and now the 13900K; you're gonna tell me it's a coincidence that AMD was never featured?

I didn't think much of it until you brought up using the 12900K and 13900K, as if there is some big difference there. Yet the 9900K to Zen 3 is not a big enough difference? :rolleyes:
 

Elusive Ruse

Commendable
Nov 17, 2022
375
492
1,220
I didn't think much of it until you brought up using the 12900k and 13900k, as if there is some big difference there. Yet the 9900k to Zen 3 is not a big enough difference? :rolleyes:
At this point, any discerning reader should make a mental note of adding an Intel product placement disclaimer to every article they read on Tom's.
 
  • Like
Reactions: Thunder64

zx128k

Reputable
At this point, any discerning reader should make a mental note of adding an Intel product placement disclaimer to every article they read on Tom's.
Intel controls the CPU market, so using an Intel CPU makes sense. I would like to see both, because SAM affects performance on AMD GPUs, but AMD's GPU market share is small: less than 10%.

[attached image]


 
Last edited:
"Incidentally, DLSS 3 Frame Generation didn't seem to be working right. Or maybe we just needed to fully exit and restart Diablo IV for it to take proper effect."

Good to see professionals at work… not.
I mean WTF?! Don’t be lazy, just restart the damn game ffs!

What a joke 👎
[attached image]
That was written after initial testing. I've updated the text to clarify the situation, but right now, Frame Generation doesn't seem to work properly. Average fps was lower on every single RTX 40-series GPU. That's after double-checking with a full restart of the game.

Next time try not to be a jerk about it. This was "testing in progress" until 3am after launch, so definitely not lazy. My best guess is that the final code had some updates so that Nvidia's earlier test results were no longer applicable. That or Nvidia's use of higher levels of upscaling was at play.
 
I didn't think much of it until you brought up using the 12900k and 13900k, as if there is some big difference there. Yet the 9900k to Zen 3 is not a big enough difference? :rolleyes:
For gaming, since I have XMP enabled, the 9900K was within a few percent of the 5900X. The same applied to 10900K and 11900K. We were waiting for a more significant bump in gaming performance, and while the 5800X3D certainly qualified, by the time it was available, it made more sense to wait.

For the RTX 4090, there was a relatively large difference, and we had an extra 13900K available. Plus we weren't using DDR5 memory for the 12900K setup, so a full update seemed reasonable. AMD sent a Ryzen 9 7950X over as well, which I used for some of the AMD reviews. The problem is trying to get everything tested within the time limits of a review, and if you look at the 7900 XT/XTX results, it was still no real difference.

[attached benchmark charts]


But some people (specifically, you and Elusive Ruse) are missing the point of the GPU reviews. It's not really about the CPU, other than having something close to the fastest available. A less than 10% delta at 1080p effectively doesn't matter, because unlike our CPU reviews, I do full testing at 1440p and 4K, where the CPU has far less of an impact; outside of Microsoft Flight Simulator, we'd be looking at a <3% difference between most CPUs. And you can reference the CPU reviews to decide whether you want to upgrade to a Zen 4 X3D instead of the 13900K, or whether a Core i5/Ryzen 5 or whatever will suffice for your needs.
 
  • Like
Reactions: zx128k

zx128k

Reputable
Diablo 4 seems to run well on most hardware. There are issues, but for the most part you don't need the best CPU or GPU to play. The real performance test will come when Blizzard adds ray tracing to the game. Expect the ray tracing to be the light kind that runs well even on AMD cards. Blizzard made a game that looks good and doesn't need a tower of power to run.

The "tower of power".

[attached image: Sega "Tower of Power"]
 
Someone was asking about some lower-end AMD hardware, I think? Or maybe not. Either way, I found an R9 270 2GB card, which should be faster than a GTX 660, so I tested it and added it to the 720p/1080p Low chart:

[attached chart]
Technically, that's below minimum spec (which lists the R9 280), but it still clears 30 fps for the most part. Other areas of the game may struggle more; YMMV.
 

Thunder64

Distinguished
Mar 8, 2016
114
161
18,760
For gaming, since I have XMP enabled, the 9900K was within a few percent of the 5900X. The same applied to 10900K and 11900K. We were waiting for a more significant bump in gaming performance, and while the 5800X3D certainly qualified, by the time it was available, it made more sense to wait.

For the RTX 4090, there was a relatively large difference, and we had an extra 13900K available. Plus we weren't using DDR5 memory for the 12900K setup, so a full update seemed reasonable. AMD sent a Ryzen 9 7950X over as well, which I used for some of the AMD reviews. The problem is trying to get everything tested within the time limits of a review, and if you look at the 7900 XT/XTX results, it was still no real difference.

[attached benchmark charts]


But some people (specifically, you and Elusive Ruse) are missing the point of the GPU reviews. It's not really about the CPU, other than having something close to the fastest available. A less than 10% delta at 1080p effectively doesn't matter, because unlike our CPU reviews, I do full testing at 1440p and 4K, where the CPU has far less of an impact; outside of Microsoft Flight Simulator, we'd be looking at a <3% difference between most CPUs. And you can reference the CPU reviews to decide whether you want to upgrade to a Zen 4 X3D instead of the 13900K, or whether a Core i5/Ryzen 5 or whatever will suffice for your needs.

AnandTech gave Zen 3 an "Editor's Choice Gold" award, yet that was not a big enough deal? Give me a break.

Zen3 gets a gold award. No question.
 

zx128k

Reputable
AnandTech gave Zen 3 an "Editor's Choice Gold" award, yet that was not a big enough deal? Give me a break.
13900K
Editor's Choice (TechPowerUp)
Guru3D Recommended
PCMag Editors' Choice, Best of the Year 2022. Note AMD doesn't get best of the year.

12900K
Guru3D Recommended (note most AMD CPUs are recommended as well; only the 7800X3D is higher, and the 5800X3D is recommended)
5-star review out of 5
PCMag Editors' Choice, Best of the Year 2021. Note AMD doesn't get best of the year.

AnandTech
7950X3D: spot the award
Ryzen 9 7900X3D: spot the award
7900/7700: spot the award
Burnout issues
5800X3D review: spot the award
Award: this was after the 11900K but before the 5800X3D and 12900K. The 11900K was not a good CPU release from Intel.

I really had to look for the award, given the only CPU that came close to Intel for game performance was the 5800X3D, and that got no award. Cherry-picking, are we?
 
Last edited:

Thunder64

Distinguished
Mar 8, 2016
114
161
18,760
13900K
Editor's Choice (TechPowerUp)
Guru3D Recommended
PCMag Editors' Choice, Best of the Year 2022. Note AMD doesn't get best of the year.

12900K
Guru3D Recommended (note most AMD CPUs are recommended as well; only the 7800X3D is higher, and the 5800X3D is recommended)
5-star review out of 5
PCMag Editors' Choice, Best of the Year 2021. Note AMD doesn't get best of the year.

AnandTech
7950X3D: spot the award
Ryzen 9 7900X3D: spot the award
7900/7700: spot the award
Burnout issues
5800X3D review: spot the award
Award: this was after the 11900K but before the 5800X3D and 12900K. The 11900K was not a good CPU release from Intel.

I really had to look for the award, given the only CPU that came close to Intel for game performance was the 5800X3D, and that got no award. Cherry-picking, are we?

You put together all of those links just to defend Intel? Shill.
 
13900K
Editor's Choice (TechPowerUp)
Guru3D Recommended
PCMag Editors' Choice, Best of the Year 2022. Note AMD doesn't get best of the year.

12900K
Guru3D Recommended (note most AMD CPUs are recommended as well; only the 7800X3D is higher, and the 5800X3D is recommended)
5-star review out of 5
PCMag Editors' Choice, Best of the Year 2021. Note AMD doesn't get best of the year.

AnandTech
7950X3D: spot the award
Ryzen 9 7900X3D: spot the award
7900/7700: spot the award
Burnout issues
5800X3D review: spot the award
Award: this was after the 11900K but before the 5800X3D and 12900K. The 11900K was not a good CPU release from Intel.

I really had to look for the award, given the only CPU that came close to Intel for game performance was the 5800X3D, and that got no award. Cherry-picking, are we?
The other problem with AnandTech's CPU testing is that most of those gaming tests are fundamentally useless. 360p Low? Nonsense! It has almost no bearing on the way people actually play games. The low resolution means texture sizes and cache hits are radically different than at 1080p! The same applies to 1440p and 4K Low: minimum quality settings fundamentally change what the games are doing. There are plenty of games where a big chunk of the fps drop in going from medium to ultra settings comes from all the additional geometry and effects added to the rendering process.

So of the four tests that Ian ran for each game, three don't actually convey practical data. And in the one test that is useful (1080p max), there's far less of a difference between the CPUs. Focusing there, here are the numbers comparing AMD's 5900X (usually its fastest for gaming) versus the i7-10700K (which is basically equal to an i9-9900K):

Chernobylite: 3% lead
Civilization VI: 51% lead (highly questionable benchmark, given how much of an outlier this is)
Deus Ex Mankind Divided: 1.5% loss
Final Fantasy XIV: 11% lead
Final Fantasy XV: 5% loss
World of Tanks: 0.5% loss
Borderlands 3: 3% lead
F1 2019: 0.6% loss
Far Cry 5: 8% lead
Gears Tactics: 10% lead
Grand Theft Auto V: 5% lead
Red Dead Redemption 2: 1% loss
Strange Brigade: 1% loss
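For anyone sanity-checking those figures, each entry is just the relative fps delta between the two CPUs. A quick sketch of the arithmetic, using hypothetical fps values (the real per-game numbers are in AnandTech's review):

```python
def percent_delta(amd_fps: float, intel_fps: float) -> float:
    """Relative fps difference of the AMD chip vs. the Intel chip.

    Positive values are an AMD lead, negative values a loss.
    """
    return (amd_fps - intel_fps) / intel_fps * 100

# Hypothetical fps pairs, for illustration only.
print(f"{percent_delta(103.0, 100.0):+.1f}%")  # +3.0%, i.e. a "3% lead"
print(f"{percent_delta(98.5, 100.0):+.1f}%")   # -1.5%, i.e. a "1.5% loss"
```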

So, outside of Civ6, it wasn't a massive change in performance. And I often wonder about some of Ian's results, since he's usually focused on automated testing, which can cause anomalies he wouldn't notice because he's not actually watching the screen while testing the games.

I also tested with XMP and DDR4-3600 memory, and when I checked Ryzen 9 5900X against Core i9-9900K and Core i9-10900K, the difference was minimal. As in, at 1080p Ultra (sometimes with DXR), here's what I measured back in January 2021:

[attached chart]

The 5900X was 0.5% faster than the 10900K and 3.5% faster than the 9900K at 1080p with a 3090. With a 6800 XT, it was effectively tied with the 10900K (0.2% slower) and 1.9% faster than the 9900K. With a 3080, it was 0.9% faster than the 10900K and 2.8% faster than the 9900K. With an RX 6800, it was again tied with the 10900K (0.4% faster) and 1.8% faster than the 9900K. The margins shrink to basically nothing once we drop below the RX 6800 level. Is that sufficient data?

When I considered that I had benchmarked over 60 GPUs on the 9900K, and that if I swapped to a new test system, I would need to retest everything, I decided to stick with the 9900K for another year. Then I skipped the 11900K as well. Only with a major change (12900K) did I decide to upgrade and retest, which I did at the start of 2022. It took months to get everything retested!

Then I swapped again because of the RTX 4090 ("too fast" for the 12900K). I'm still finishing the last few tests/retests with that one for the GPU hierarchy, several months later. (I tossed two months of testing when I realized I had VBS enabled.)

If you want to be an AMD or Intel or Nvidia fanboy, fine. But don't expect others to buy into your world view. They're all fine for various purposes. And I will still go on record with saying I've had far more issues and oddities using Ryzen PCs than with Intel PCs.

The first and second generation Ryzen motherboards and CPUs were, IMO, decidedly not awesome. I had two 2700X CPUs die, along with a Ryzen 7 1700. I also had about 10 X370/B350 motherboards die (seriously, by the time I was done with CPU testing a few years later, I only had two fully functional X370 boards left!), and three X470 boards died.

Do you know how many Intel CPUs and motherboards have died in that same timeframe? Not. One. Every CPU and mobo still worked when I switched from PC Gamer to Tom's Hardware. That's anecdotal evidence, but that experience has always left me a bit gun-shy of switching to an AMD system as my primary testbed.
 
  • Like
Reactions: zx128k

zx128k

Reputable
No one plays below 1080p these days with a decent GPU. I remember the 720p reviews for the 5800X3D (here, here, here) because the cache worked better the lower the resolution. Yet with Intel reviews, no one cared about 720p.

Look at the same sites, for example here: the 10900K is reviewed at 1080p, so the 720p stuff was just for the AMD CPUs. TechPowerUp is not a good example, as they always used 720p. Overclock3D didn't (here): no 720p in their 10900K review. AnandTech dropped all the way down to 480p and so on for their 5800X3D review. Same with the 11900K review: 1080p. The 10900K review only went down to 1080p. Funny how reviews make this change just for the AMD 5800X3D. Then they do it again for the 7800X3D: 720p and lower. Same with the 7000-series reviews. The 12900K: back to 1080p. The 13900K: they changed the review to include 720p and lower.

Then there is the memory for the Intel CPUs; the memory speeds used raise the question of whether the tests are fair.
  • DDR5-5600B CL46 - Intel 13th Gen: "can go over DDR5-6000 just fine with XMP," and many systems have DDR5-7200 kits. DDR5-6000 is a good DDR5 speed.
  • DDR5-5200 CL44 - Ryzen 7000: "can't go over DDR5-6000 or RIP SoC; >1.3 volts and the CPU burns." Above DDR5-6000 is hard to get working, and there is a sweet spot for RAM frequency on AMD systems. Even so, some people have issues getting DDR5-6000 to work.
  • DDR5-4800 (B) CL40 - Intel 12th Gen: "better off with DDR4." Lowers Intel performance.
I know it's stock RAM, but these platforms can safely run at higher speeds. Ryzen 7000 without tuning the RAM makes no sense: a massive performance reduction, and no one will run it at stock. Intel and AMD can both do at least DDR5-6000, and everyone runs faster RAM these days. These are also not very fast kits timings-wise; there are DDR5-6000 CL36 kits.
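The timings complaint can be put in numbers: first-word CAS latency in nanoseconds is the CAS cycle count divided by the actual clock, and DDR makes two transfers per clock. A quick sketch comparing the stock speeds mentioned above against a DDR5-6000 CL36 kit (the exact kit pairings here are my illustration, not from the reviews):

```python
def cas_latency_ns(mt_per_s: int, cl: int) -> float:
    """First-word CAS latency in nanoseconds.

    DDR does two transfers per clock, so the actual clock in MHz is
    MT/s / 2, and latency = CL / clock = CL * 2000 / MT/s nanoseconds.
    """
    return cl * 2000 / mt_per_s

# The stock speeds discussed above vs. a faster DDR5-6000 CL36 kit.
for name, mts, cl in [("DDR5-5600 CL46", 5600, 46),
                      ("DDR5-5200 CL44", 5200, 44),
                      ("DDR5-4800 CL40", 4800, 40),
                      ("DDR5-6000 CL36", 6000, 36)]:
    print(f"{name}: {cas_latency_ns(mts, cl):.1f} ns")
# The DDR5-6000 CL36 kit lands at 12.0 ns; the stock kits are all ~16-17 ns.
```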

 
Last edited:

zx128k

Reputable
This does look fun but I'm not sure I can bring myself to pay £60 for a game.
£60 is just the base version. It will be £89.99 or £78.99 for most players, who were after that 4-day early release so they could start leveling to 100 faster than anyone else. Players are level 100 before the release on the 6th, which is a very big deal in this game.

It is a good game, unlike Redfall. Nvidia bundled that one with GPUs, lol.
 
Last edited:
£60 is just the base version. It will be £89.99 or £78.99 for most players, who were after that 4-day early release so they could start leveling to 100 faster than anyone else. Players are level 100 before the release on the 6th, which is a very big deal in this game.

It is a good game, unlike Redfall. Nvidia bundled that one with GPUs, lol.
That's even worse. I guess it's not an enjoy the journey kind of game.
 

Deleted member 2838871

Guest
That's even worse. I guess it's not an enjoy the journey kind of game.

It is what you make it. If you want to no-life power-grind to 100 like some people are doing, go for it. I know I'm enjoying the journey and am not taking days off work just to see how fast I can get to level 100. Currently level 22 on my Sorcerer, and I've enjoyed every minute of it.
 
  • Like
Reactions: Flayed