Dying Light: Performance Analysis And Benchmarks

Are people really this dumb? Nobody said anything in this article about a GTX 980 being better than a 295X2 in pure performance. It's obvious that this game doesn't like CrossFire: compare the 290X to the 295X2 and you'll see the 295X2 is only about 3-5 fps higher in minimum/average frame rates. That CLEARLY points to a lack of CrossFire support, or extremely poor scaling if it's working at all, and other games' benchmarks have shown similar results. If you actually look at everything, this game just isn't optimized for AMD hardware, and I mean GPUs, not CPUs.

When AMD actually makes a new CPU architecture instead of rebranding the ancient 32nm FX-x100 series for three years in a row, I'll hop on the benchmark-fixing conspiracy wagon. Until then, AMD processors are SERIOUSLY falling behind; this isn't the only test showing a lowly dual-core i3 with Hyper-Threading beating an FX-9590 in most tests.
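For anyone who wants that scaling point made concrete, here's a minimal sketch of how multi-GPU scaling is usually estimated from average fps. The numbers below are hypothetical, purely for illustration, not the article's results:

```python
# Minimal sketch: estimating multi-GPU scaling from average fps.
# These fps values are hypothetical and are not taken from Tom's Hardware's charts.
r9_290x_fps = 42.0   # hypothetical single-GPU average fps
r9_295x2_fps = 46.0  # hypothetical dual-GPU average fps (only a few fps higher)

# An ideal dual-GPU result would be roughly double the single-GPU fps.
scaling = r9_295x2_fps / (2 * r9_290x_fps) * 100
print(f"Effective scaling: {scaling:.0f}% of ideal")
# ~55% here; anything close to 50% means the second GPU is contributing almost
# nothing, i.e. CrossFire is effectively not working in that scene.
```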
 
Uh... why does everything perform so badly compared to the TechSpot analysis? Aside from that, the graphs are arranged so that a GeForce card is always at the top. In the TechSpot analysis the R9 270 consistently beats the GTX 750 Ti and is on par with the GTX 660, yet the R9 270 is left out of the lower-settings graphs here. It's not as if that card is a lot more expensive than the 750 Ti, so there's no reason to leave it out. At medium detail, where for whatever reason the R9 270X loses, suddenly the 270X is added to the graph, making the GeForce cards look better. The bias is way too transparent.

If you want reliable and honest performance graphs, go check out TechSpot and forget these crappy analyses by TH.

http://www.techspot.com/review/956-dying-light-benchmarks/page3.html
 
One question: why did you use a Core i3-3220? It's not in your best picks and it's not a current i3. Why not a G3258, an i3-4160, or an i3-4330? The first two are in your best picks, and when you overclock a G3258 it performs almost on par with an i3-4330... so?
 

Nope. I posted this on a PC gaming forum and everyone there was talking about how crappy this benchmark was and how far TechSpot's numbers are from reality.
TechSpot's benchmark was run inside a tower, where draw distance doesn't come into play.
So Tom's Hardware's benchmark is closer to real-world gaming than TechSpot's.
Not to mention that TechSpot obviously didn't have the 1.4 performance patch, because it was released after TechSpot's performance review.
 
Disgusting analysis. One could simply argue that Nvidia had their drivers optimized for this game before launch. Still, I don't believe this.
 
Seeing TechSpot's benchmarks makes me wonder what's going on behind the scenes with Tom's results.

It makes absolutely no sense for a 4.7GHz FX-9590 to score the same as or lower than a 3.5GHz FX-6300, much less a Bulldozer-derived 3.6GHz FX-4170.

290X same as a 960 at 1080p? Right.

Regardless of Nvidia or AMD, this game seems poorly optimized, even with TechSpot's correct measurements.
 
This was a nightmare for me. I bought the game on Steam, and after all that waiting for the download, the game says I cannot run it... My hardware meets the minimum specs. It seems the game did not like one of my pieces of hardware. Hey developers! How about you let me and my computer decide whether I can run the game, like, I don't know, EVERY OTHER GAME OUT THERE.
 


I didn't know about the patch. Even if that's true, it still does not address the fact that results are selectively presented so that GeForce cards are always on top. There are also the Guru3D benchmarks:

http://www.guru3d.com/articles_pages/dying_light_vga_graphics_performance_review,7.html

Look for yourselves which ones seem more consistent with each other...
 
It's really obvious that people don't read the comments before posting their own comment, especially here.

Repeating an earlier, more in-depth post: performance results for this game will vary between sites. There is no built-in benchmark, so reviewers just pick a spot to test.

Some spots are hard on the hardware, others aren't. It's as simple as that. Tom's picked a spot that is rough on the hardware, other benchmarks didn't.

Understanding that is a basic part of knowing how to read benchmarks. The only time you should expect consistency in game benchmarks is when the game has a standalone benchmark with preconfigured settings (e.g. Metro: Last Light, Batman, Tomb Raider, GRID Autosport, etc.).
 
This game was so awful. Seriously, it was bad; The Evil Within was better than this but still pretty awful. Terrible AI and mechanics; it's basically doing frustrating jumping puzzles all day for very little reward. Skip this and pick up Far Cry 4.
 
So in summary, the game is your typical Nvidia GameWorks title: poor AMD optimization all around. Not only does AMD have to worry about Intel pushing developers toward code paths that gimp AMD CPUs, it also has to fend off Nvidia gimping the game engine against AMD video cards.

The fact that they made this game run fine on the 8-core AMD processor in the consoles but not on AMD desktop processors, which are VERY similar, is pretty striking.
 


That means the game is not for you.
 


I don't know why people like to assume that "since the consoles have AMD hardware, it must run well on a PC with AMD hardware." Unlike PCs, consoles get low-level optimization that is not possible on PC. That will probably change with DX12 and Vulkan, but I still think a console's low-level API can do much more, because the hardware is completely static.
 

Guru3D used the same patch as TechSpot, patch 1.2:
After the initial launch issues, we figured, let's wait a little while until most of the stuff is rooted out by some patches. Based on patch 1.2 it's that time, so hence today we will present you our performance review of this title.
They also found strange results:
The GTX 690 easily beats the R9 295X2 in their tests...

Since there is no built-in benchmark, every site ran its own test, just running around in the game.
It seems Tom's Hardware's benchmark run was harder for AMD hardware to handle.
Should they pick a different run because this one hurts AMD hardware's performance? No. Since it's in the game and everyone who plays will experience it, it's good to know that your PC doesn't have issues; it's the game. Should they also do one or two additional runs? Maybe. I would love to see more benchmarks of the same game at different times and places in the game, but it would be time-consuming...
 
There's been a new patch released that has again improved performance, so this data is not entirely valid; it's 10-20 fps more on maxed-out details.
 
A game does not have to support SLI in order to work properly. I have been playing all my games in SLI for years and have never had a problem. On the very rare occasion that you do run into a problem, simply go into your NVIDIA Control Panel and disable one of the cards. Gee... that was hard, wasn't it?
 

While most games will work without explicit SLI/CF support, performance scaling is often grossly unpredictable even with games that explicitly support it.
 
Some of you completely miss the fact that NV has dedicated itself to BETTER DirectX 11 drivers, while AMD went Mantle at the expense of DX11 and DX12 (and decent Linux drivers; they are worse at all three right now). Consoles, Mantle, etc. cost them valuable resources (roughly 30% layoffs over the time spent creating these) and cash for R&D on their CORE products. This isn't a developer slowing AMD down; it's AMD management slowing down R&D where it SHOULD be spent (drivers, CPU IPC, GPU watts, etc.).

NV said consoles would distract AMD's R&D attention from its core products, and they weren't lying. We see it in AMD's drivers, the throttling at the 290X launch, a FreeSync driver arriving AFTER the monitors touting that feature, and CrossFire FreeSync support coming a "claimed" month or two later. Both should have been ready, but AMD has 30% fewer people to do all this now, and far less money (roughly no yearly profits). We just have to face it: AMD management has been making bad decisions for years. Can we get Jerry to come back? :) I love that guy. Maybe he could get Dirk back too for more CPU help (they worked well together).
 
HardOCP ran tests too, and their results were closer to Tom's Hardware's:
"In this graph we raised the settings to the highest in-game settings to compare all video cards at equal settings. The GeForce GTX 980 is 28% faster than the AMD Radeon R9 290X. The GeForce GTX 970 is 20% faster than the AMD Radeon R9 290. In fact, the GeForce GTX 970 is even faster than the AMD Radeon R9 290X, by 15%."
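For what it's worth, here's a minimal sketch of how "X% faster" figures like those are typically derived from average fps. The numbers below are made up for illustration; they are not HardOCP's measurements:

```python
# Minimal sketch: turning two average-fps results into an "X% faster" figure.
# The fps values are hypothetical, purely for illustration; they are not
# HardOCP's data.
gtx_980_fps = 64.0   # hypothetical average fps
r9_290x_fps = 50.0   # hypothetical average fps

# Relative speedup of card A over card B, in percent.
speedup = (gtx_980_fps / r9_290x_fps - 1.0) * 100.0
print(f"GTX 980 is {speedup:.0f}% faster than the R9 290X")  # 28% with these numbers
```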
 
There must be something wrong with the testing methodology, because TechSpot's CPU results are the complete opposite of what we see here. To quote them:
"As always, a Core i5 or Core i7 is going to net gamers the most reliable performance, though the AMD FX-8000 and FX-9000 series were just as good in Dying Light. Based on our experience, AMD Phenom and Intel Pentium/Celeron processors are going to deliver less reliable performance with more periods of noticeable frame lag."

Source:
http://www.techspot.com/review/956-dying-light-benchmarks/page5.html

I don't believe there is anything contrived here, unless the copy sent for testing was tampered with. I think Tom's Hardware might want to try testing this again.
 




I kinda have to agree with you. Dying Light has a new patch, and that seems to improve fps on AMD hardware. Anyway, I see that my i7-870 can still take a good beating.
 