Dying Light: Performance Analysis And Benchmarks

Status
Not open for further replies.

alidan

Splendid
Aug 5, 2009
5,303
0
25,780
There is one setting in the game, I believe it's draw distance, that can halve the game's FPS, or even drop it to a third of what you would get with it at minimum, and from what people have tested, it impacts gameplay in almost no meaningful way.

What was that set to?
Did you change it per benchmark?
Was this before or after they patched it so that even max draw distance renders less far than it used to?

I know my brother's 290X (I don't know if he was running 1920x1200 or 2560x1600) was benching SIGNIFICANTLY higher than what's shown here.

When you do benchmarks like this in the future, would you mind going through 3 or 4 setups, trying to get them to play at 60 FPS, and listing which options you had to tick to get there? It would be SO nice to have an in-depth analysis for games like this, or Dragon Age, which I had to restart maybe 40 god damn times to dial in the best mix between visuals and FPS...
 

rush21hit

Honorable
Mar 5, 2012
580
0
11,160
Q6600 default clock
2x2GB DDR2 800mhz
Gigabyte g31m-es2L
GTX 750Ti 2GB DDR5
Res: 1366x768

Just a piece of advice to anyone in about the same boat as me (old PC + new GPU, wanting to play this): disable the Depth of Field and/or Ambient Occlusion effects (this also applies to most recent titles). You'll be fine with your new GPU and its latest driver. Mine stays within the 40-60 FPS range without any input lag, while running everything else on the Very High preset, just without those effects.

Those effects are the culprits behind performance drops, most of the time.
 

Cryio

Distinguished
Oct 6, 2010
881
0
19,160
Was core parking taken into account when benchmarking on AMD hardware? It makes no sense that the FX-4170 is faster than the FX-9590.

The game runs rather meh on my 560 Ti and FX-6300 @ 4.5 GHz, but once I mess with the core affinity in Task Manager, my GPU hits 99% usage and all is right with the world.
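The affinity trick described above can also be scripted instead of clicked through Task Manager each launch. A minimal sketch of the mask arithmetic, assuming an FX-6300 whose six integer cores pair up as modules (0,1), (2,3), (4,5), so pinning to one core per module avoids the shared-resource contention; the core numbering and the executable name are assumptions, not from the article:

```python
# Sketch: build a CPU affinity bitmask for pinning a game process.
# Assumption: on an FX-6300, cores (0,1), (2,3), (4,5) share a module,
# so selecting cores 0, 2, 4 gives one core per module.

def affinity_mask(cores):
    """Return a bitmask with one bit set per selected logical core."""
    mask = 0
    for core in cores:
        mask |= 1 << core
    return mask

one_core_per_module = affinity_mask([0, 2, 4])
all_six_cores = affinity_mask(range(6))

print(hex(one_core_per_module))  # 0x15
print(hex(all_six_cores))        # 0x3f

# On Windows the same mask can be applied at launch via cmd, e.g.:
#   start /affinity 15 DyingLightGame.exe
# (start /affinity takes the mask in hex without the 0x prefix;
# the executable name here is a guess.)
```

Task Manager's affinity checkboxes set exactly this mask, but it resets every time the process restarts, which is why scripting it is handy.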
 

ohim

Distinguished
Feb 10, 2009
1,195
0
19,360
Just shows how badly they optimized for AMD hardware... no wonder everything runs faster on Intel. This comes from an Intel CPU user, BTW.
 

xpeh

Distinguished
Jun 25, 2011
341
0
18,790
Typical Nvidia GameWorks title. Anyone remember Metro: Last Light? Unplayable on AMD cards until 4A issued a game update a few months later. Can't make a card that competes? Pay off the game developers.
 

Grognak

Reputable
Dec 8, 2014
65
0
4,630
@xpeh - Couldn't agree more. A 750 Ti beating a 270X? A 980 better than a 295X2? I'm gonna stay polite, but this is beyond ridiculous. This is pure, unabashed, sponsored favoritism.

Edit: After checking some other sites, it seems the results are all over the place. Some are similar to Tom's, while others appear relatively neutral regarding both GPU and CPU performance (though the FXs still struggle against a modern i3).
 

Empyah

Reputable
Mar 25, 2014
27
0
4,530
Sorry guys, but you messed something up in these tests, because my 290X is getting higher averages than your 980 (I have a 4930K and view distance at 50%). Everybody knows by now that you are heavily biased towards Nvidia and Intel, to the point where it stops being sad and starts being funny how obvious it is. But if this test was done on purpose, we have found a new low today. C'mon guys, we're in the same boat here: we love hardware and want competition.
 

silverblue

Distinguished
Jul 22, 2009
1,199
4
19,285
Looks like very high CPU overhead with the AMD drivers, and really poor use of multiple cores in the CPU test. The former can be solved more easily; the latter sounds like the developers haven't yet grasped the idea of more than two CPU cores working on a problem. Is this game really heavy on the L3 cache? That could explain the major issues for the 9590, with its attempts to use more cores and its higher clock speeds counting for naught (and/or the CPU being throttled), because the L3 cache is poor on FX CPUs and that would have a detrimental effect on things. It'd be worth testing an A10-7850K (which has no L3) alongside a Phenom II X4/X6 (which has good L3) to see whether the presence or absence of L3 makes any sort of difference to performance.
 

caj

Distinguished
Where are the anti-i7/Hyper-Threading fanboys? All we hear is that the i7 and Hyper-Threading are a waste for the future, and here is the perfect example of how a game can thoroughly use an i7's resources with the correct coding. Hell, my i7-870 with a 7850 can pull medium settings at 1080p in this game. There's no such thing as a PC just for gaming. You'll always want to do more with it, which is why you bought a PC over a console in the first place.
 
I would say that GameWorks affected the game so Nvidia would perform better, but...
Turning a GTX 980 into a GTX 960 with an AMD CPU (who would want to cripple their own product?) is the product of another factor:
lazy console porting and optimization...
 

neieus

Distinguished
I'm not debating the performance difference between Intel and AMD, because the vultures have picked that horse's bones clean. I'm confused because when I game, I never seem to experience the terrible sub-par performance found in most of these articles.
 

razor512

Distinguished
Jun 16, 2007
2,130
68
19,890
From my experience with the game, it performs a little better with an overclocked Phenom II X6 than with any of the FX CPUs that I have tried. It seems that the game does not know how to properly spread its most demanding tasks across the core modules first, before relying on the shared resources.
 

ykki

Honorable
Tom's, I think you should dedicate a page in these types of articles to listing the settings available in the game and whether turning each one down/up affects the game in a positive or negative way.
For example, people have been asking about the draw distance setting.
 
This game looks... interesting.
I too would like to see a target FPS selected, then an indication of what settings tweaks can be made to get there, and what effect they have on appearance.
 

Eggz

Distinguished
Great review! This game wasn't even on my radar as something to play until I read this write-up.

The graphics also seem impressive, and capable of challenging my 780 Ti at 1080p. Thanks for the info!
 

InvalidError

Titan
Moderator

If the game actually used more than four cores, there would be a significant difference between 8-core and 4-core AMD chips. Since AMD's CPUs score about the same regardless of core count, the Intel chips are pulling ahead mostly due to IPC.
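That inference can be phrased as a tiny Amdahl-style sanity check. This is a sketch with made-up numbers (the 10 ms / 20 ms work split and the 4-thread cap are hypothetical, purely to illustrate the reasoning, not measurements from the article):

```python
# Hypothetical model: if a game scaled past 4 cores, extra cores would cut
# CPU frame time; if 4- and 8-core chips score the same, the game is capped
# at 4 threads and per-core speed (IPC x clock) decides the result.

def cpu_frame_time_ms(serial_ms, parallel_ms, threads_used):
    """Amdahl-style split: serial work plus parallel work spread over threads."""
    return serial_ms + parallel_ms / threads_used

# Suppose the game only ever spreads its work across 4 threads:
four_core_chip = cpu_frame_time_ms(10.0, 20.0, threads_used=4)   # 15.0 ms
eight_core_chip = cpu_frame_time_ms(10.0, 20.0, threads_used=4)  # still 15.0 ms

# Matches "same score regardless of core count": the extra cores sit idle.
assert four_core_chip == eight_core_chip
```

Under this model, the only lever left is how fast each individual core runs, which is exactly where the Intel chips' IPC advantage shows up.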
 

Dustin Mock

Distinguished
Aug 15, 2014
11
0
18,510
Strange; how is my CrossFire 4GB 290X setup getting a steady 60 FPS with very high textures @ 4K while the author is getting upper 20s?

Either the author isn't smart enough to know how to turn on AFR mode in his CrossFire profiles (not a good trait for a PC reviewer), or he intentionally tried to get horrible results (also not a good trait for a PC reviewer).

Any chance of an explanation for this?
 