Dying Light: Performance Analysis And Benchmarks

These barely-researched articles need to stop. Don't even post the 295X2 if it's obvious that CrossFire/SLI isn't working; you might as well post the Titan Z in that case, unless your intent was just to make AMD look bad. And why not come away with a real conclusion at the end? How about calling out the developer for this mess? This game isn't 'demanding,' it's poorly optimized. A 960 as powerful as a 290X? You're better than this, Tom's.
What if you shuffled the writers around so they don't get cozy with developers, huh?
 


http://www.hardocp.com/article/2015/03/10/dying_light_video_card_performance_review/7

"The issue here, unlike NVIDIA, is that AMD hasn't updated its drivers in more than three months! That means no new CrossFire profiles for games like this which have been released since AMD's last driver update. That is poor support from AMD on new games that need new CrossFire profiles. Shame on AMD for backsliding on CrossFire support in games."

It's not Tom's, HardOCP's, etc.'s fault that AMD hasn't released a driver update in three months, when this game, as he notes in the review, came out BEFORE that last driver. It is the review website's job to REPORT reality. Reality is that AMD is strapped for cash: a third of its employees have been let go in the last 3-4 years, management has no idea how to run the company, and the console wins led to much of the cutting of workers (you can't do all this stuff without people to DO the actual work, like drivers and engineering), since the R&D that went to consoles should have been spent on CORE products (CPUs, GPUs, and drivers for those GPUs). I could go on, but you should get the point: AMD IS WEAKER than ever before. There is a reason the bean counters at Nvidia said taking on consoles would cost their core products, hence Jen passed (GOOD management).

You're blaming the messenger instead of the company itself for slacking off. But this really shouldn't be a surprise: they are battling on too many fronts with no resources to fight with (i.e., no profits for R&D), and it's about to get worse as Intel moves down and ARM moves up the food chain. Making matters worse (on top of APUs getting squeezed to death), you have NV gaining share and stuffing the channel on AMD's side of the GPU market, hence the delay of their next card until Q3 most likely (or later if they continue to have sales problems against NV's 900 series). It gets worse still if NV puts out the Titan X shortly, piling onto all of AMD's current problems and making them look even worse to the public at large in GPUs.

What AMD needs is an IPC king of a CPU, sans GPU, to take some wind out of Intel's sails and retake the crown in CPUs. You can charge a premium for THAT product, and it would take Intel a while to respond, since they are purely aiming at stopping ARM from marching up the food chain by improving their integrated graphics and lowering power INSTEAD of giving us gamers IPC and more than 5% per CPU revision. AMD has a golden opportunity here if they'd remove the GPU and dedicate that die space entirely to CPU. Gamers just turn off the integrated GPU anyway and buy a card. They need to win back the gamers who pay PREMIUM prices for CPUs AND GPUs. APUs are losers when ARM vs. Intel will squash any chance of premium margins/profits, and consoles barely pay the interest on their debt. They have the engineers (Papermaster etc.) to make a great CPU again, but they'd better get it out quickly while Intel is concentrating on ARM's armada of friends.
 


SLI is working with my setup. You should probably ask AMD why they did not release a proper driver for the game. HardOCP also tested the game, and they called AMD out for not releasing drivers for almost three months.
 
I'm using an AMD FX-8320 and I got a massive improvement in FPS by disabling cores 4, 5, 6 & 7 in the BIOS. I don't really know why this worked (maybe latency or poor task allocation?), but now I'm getting 60 FPS with only very occasional drops (I used to get 40-50).
 


The game is most likely trying to utilize the extra cores and failing, so disabling them helps speed things up, I guess?
 

I suppose it could be an extreme case of cache thrashing between a module's two cores. One way this could happen is if they used helper and worker thread pairs but set the threads' CPU affinity wrong for AMD's desktop CPUs, so the helpers end up thrashing caches on the wrong module instead of preloading data for the thread they were meant to assist.
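To make that concrete, here is a minimal, purely hypothetical sketch (this is not code from the game; workerTask/helperTask and the assumption that logical CPUs 0 and 1 share an FX module are mine) of pinning a worker thread and its prefetch helper to the two cores of the same module with the Win32 SetThreadAffinityMask call:

// Hypothetical helper/worker pairing, not taken from Dying Light.
// Assumption: on FX-series CPUs, logical processors 0/1, 2/3, ... share a module,
// so both threads below are pinned to module 0. Pinning the helper to a core in a
// different module (e.g. mask 0x4) would have it warming the wrong caches, which is
// the kind of mis-pairing speculated about above.
#include <windows.h>
#include <thread>
#include <cstdio>

void workerTask() { /* game logic / simulation work would run here */ }
void helperTask() { /* speculative loading / decompression for the worker */ }

int main() {
    std::thread worker(workerTask);
    std::thread helper(helperTask);

    // Affinity masks are bit fields: 0x1 = logical CPU 0, 0x2 = logical CPU 1.
    SetThreadAffinityMask(worker.native_handle(), 0x1);  // module 0, core 0
    SetThreadAffinityMask(helper.native_handle(), 0x2);  // module 0, core 1

    worker.join();
    helper.join();
    std::printf("done\n");
    return 0;
}

If a game shipped with masks like these hard-coded for Intel's core numbering, disabling half the cores in the BIOS (as described above) could incidentally undo the damage, which would fit the behavior reported here.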
 


Yeah, my friend just told me his frame rate is dipping in Old Town; he has a 280X + an AMD processor (forgot which one),

and I am running Old Town just fine with my i3-4130 and my 260X.
 

And why exactly are you showing only one side of the story? From your own link:

"However, as of right now, SLI is still a no-go for us. We know other gamers have reported SLI not working issues as well, and for others only minor improvements in performance when it does work. This is a shame, we really need SLI to play at 4K.

---------------------

Note that AMD CrossFire did NOT work at all, just like SLI, at default settings."


Don't act like only AMD is dropping the ball. The fact remains that TH puts nVidia on top and AMD on the bottom when it's convenient. They should put up all the graphs for every test so we have a clear picture, rather than selectively leaving out and adding in cards with a GeForce always on top.

Also from HOCP:
Keep this in mind though, NVIDIA has been on top of optimizations in this game and has delivered new drivers for this game.

Obviously nVidia will have the drivers ready in time, while AMD has to wait until the game is released to even start optimizing. Calling this solely AMD's fault is ignorance. I didn't hear anyone blaming nVidia for bad support when the HD 7970 was killing the GTX Titan when Tomb Raider came out.
 


Is there any source for this, or is it pure speculation based on it being a TWIMTBP title? I mean, is there an official statement from AMD that they did not get access to the game until it actually came out? If AMD really didn't have access to the game, they would probably talk about it, like what happened with Arkham Asylum. As for the Tomb Raider case, Nvidia made an official statement that they didn't have the driver ready at launch.
 
Q6600 default clock
2x2GB DDR2 800mhz
Gigabyte g31m-es2L
GTX 750Ti 2GB GDDR5
Res: 1366x768

Just a piece of advice to anyone in about the same boat as me (old PC + new GPU, wanting to play this): just disable the Depth of Field and/or Ambient Occlusion effects (this also applies to most recent titles), and you'll be fine with your new GPU and its latest driver. Mine stays within the 40-60 FPS range without any input lag while running everything else on the Very High preset... just without those effects.

Those effects are the culprits behind performance drops, most of the time.

Dude, that is almost identical to my setup, and I was wondering how it would do with a game as new and nice-looking as this. Your post is much appreciated. :) It's crazy how these old Core 2 Quads are still keeping up!
 


http://techreport.com/news/26515/amd-lashes-out-at-nvidia-gameworks-program
 
This week, in the Adventures of Fanboy: AMD vs. Nvidia
 


And from your own link, even techreport thought it might be BS:
"Curiously, however, Evangelho's story includes no statement from Nvidia—nor does it indicate that Nvidia was asked to comment. The story also makes some odd claims. It asserts, for example, that AMD's Mantle API "doesn't require" AMD graphics hardware to function and "will work equally well on Nvidia cards." (AMD told me at GDC that, with DirectX 12 now on the horizon, we're "probably not going to see" Mantle on Nvidia hardware.) Evangelho cites Mantle's purported vendor-agnosticism as evidence that AMD "clearly waves a banner of open-source development and ideals."

We've asked Nvidia to comment, and we're currently awaiting a response from the company. For what it's worth, though, a former Nvidia software engineer, John McDonald, sounded off on Twitter yesterday about this story. He wrote:

It is extremely frustrating to see an article criticizing work you did at a former employer and not being able to comment that the person who you are quoting from was just completely full of unsubstantiated [expletive]. Thanks, Forbes. . . . [A]nd while I never did, and certainly do not now, speak for nvidia, let me say that in the six years I was in devtech I *never*, not a single time, asked a developer to deny title access to AMD or to remove things that were beneficial to AMD.

Perhaps that's a hint of the response we'll receive from Nvidia. In any case, we'll update this story as soon as we hear back."

Mantle does NOT work on NV cards, but the article referenced from Forbes says it does... LOL. AMD is just making excuses for its lack of driver updates. You can turn off GameWorks features, too. Of course NV will be best at their own stuff (when it's used in a game), just like AMD was best (for a bit... LOL) at TressFX hair etc. You didn't see NV whine over that; they just worked on their drivers and ended up winning at it :) GameWorks adds special stuff for NV cards to make games better on their hardware. Shocker. AMD seems to be complaining that NV has the R&D cash to pay for making games do special effects for their hardware. Well, join the profit-making companies and you will too. They bet on consoles to get that same leverage (hoping the wins would lead to more games being optimized for their hardware by default), but when those didn't explode out of the gate, games kept being made for PC/mobile first (it was still the same at GDC 2015 two weeks ago, with 56% of developers making their next game for PC first). When you make a game for PC and NV owns 70% of discrete (up from 65% last year), you tend to lean toward them first by default and probably optimize for #2 afterward, especially with #2 being broke. It's just common sense: you aim at the largest market that will see your work.

Read the 2nd post in the comments on your link... I couldn't have said it better myself 😉 My point before was that AMD isn't on top of drivers like they used to be, and hasn't been for quite some time. HardOCP did an article about both sides ages ago showing AMD being 6+ months behind on optimizing for some games, whereas NV releases day-one drivers for pretty much all top new releases. That is what happens when you don't lay off 30% of your engineers and when you have the profits to pay for R&D and drivers. That is what happens when you don't divert resources to console development, which steals R&D from your CORE products, just as Jen noted it would (and it has for AMD). AMD management made the wrong bet, and they are paying for it now. That isn't fanboy crap (as the OP said); it's facts and financials. Those quarterly reports aren't lying. Unless AMD comes up with some killer product that has PRICING leverage, I don't see how they stop the continuing slide in drivers, product performance, etc. AMD's response to Jen's comment was that Nvidia "seems butt-hurt over the console loss." I'm sure Jen just laughed, knowing he'd be killing them in drivers a year or two later as he spent all those resources on DX11/12, as anandtech etc. have shown recently (and even Linux is decent on NV; again, having cash to spend on this stuff helps).
 
One more point: FreeSync monitors are OUT, but the driver is due late this week, MAYBE. How does that happen? Exactly as I laid out above: NO money, no R&D/drivers. When you're the one pushing this tech, your driver should be out on day one, right? The CrossFire support driver won't be out for another month after that (AMD's words); again, ridiculous. Will either even be on time (though they're already late, IMHO) for their own dates?
 

Let me just put some things here from this article:
http://www.extremetech.com/gaming/183411-gameworks-faq-amd-nvidia-and-game-developers-weigh-in-on-the-gameworks-controversy

On page 2, the main problem is stated...
- GameWorks restricts developer access to the source code
- Developers agree that access to the source code is often vital.
- nVidia charges developers for access to the source code

Let me tell you what this actually translates to: if something is working well on an nVidia card through the GameWorks API, but developers are having trouble getting it to work right on an AMD card, the developers have to pay nVidia for access to the source code to fix things, or AMD has to do it for the developer. The problem is obvious here...

Also, your Mantle argument is nonsense. Mantle is actually open. It was made to counter GameWorks. Not by putting restrictions on nVidia, but rather by trying to make nVidia's restrictions irrelevant. It's nVidia that refused to support it... From PC World:
http://www.pcworld.com/article/2894036/mantle-is-a-vulkan-amds-dead-graphics-api-rises-from-the-ashes-as-opengls-successor.html

Although AMD has had unexpected success with developers adopting Mantle, it was unlikely to become a standard without the support of its chief competitor Nvidia.

One thing's for sure, while Nvidia ignored Mantle it can’t ignore Vulkan

There's a reason Mantle was 'dropped' and transformed into Vulkan, OpenGL's successor: Vulkan's API is largely a mirror of Mantle, and DX12 also adopted ideas from Mantle. Again, it's about making nVidia's closed, proprietary stuff irrelevant.

Back to the Extremetech article... Page 3:
Nvidia’s dedication to great experiences is narrowly constrained to great experience on Nvidia hardware, whereas Valve is more concerned with building code that runs well on every system.

From the conclusion:
Everyone we spoke to also recognized that GameWorks libraries are more difficult for AMD to optimize and that the company has a legitimate reason to be concerned about this. For better or worse, the GameWorks program has the potential to shift the industry towards a development and optimization model that’s closer to the Nvidia way of doing business than the AMD equivalent. AMD, with its emphasis on standards compliance and collaborative effort, believes this is a bad thing.

Take that for what you will. It's obvious to me... AMD actually wants what's best for the industry, thinking that's the way they can grow as a company, while nVidia wants more for itself and less for everybody else, thinking that's how it can grow as a company.
 


That is more about their concerns with GameWorks. But did they specifically say they didn't have any access to Dying Light prior to launch? Because we are talking about CrossFire support here, not about features that run slower on AMD hardware. Surely AMD is able to provide a CF profile even without needing to optimize the GameWorks stuff.
 

AMD used to release drivers every month up to around 2012. Before that time nVidia released drivers every 6 months or so. Things have changed.
Yes, AMD has not released a new driver since Omega. But what exactly is the problem here? nVidia's SLI drivers are not working either as stated above, so why the hate on AMD? Also, nVidia's latest driver of February is giving performance increases to games like Metro: Last Light which is a game from 2013. Even if they mean the redux, that was released in August 2014 so they're 5 months late. Are we going to go around blaming nVidia for being late as well, or are we simply being extra harsh just because it's AMD?

As for the optimizations being late, I've explained and given enough information about why GameWorks, by nVidia's doing, makes it hard for AMD to release proper drivers on time. If you wish to blame it all on AMD, suit yourself. It just shows you're unwilling to face what's happening.
 


Why only talk about one side of the coin? From the article itself:

Do AMD and Nvidia need access to developer source code?

AMD: Being able to see and share source code access is very important to our driver optimization process.

Nvidia: Having source code is useful, but it’s just one tool in our toolbox. There are many, many things we can do to improve performance without touching it.

Developers say: They’re both telling the truth.

Why did Nvidia build GameWorks?

AMD: As a means of locking developers into using their own software solutions through business contracts, thereby controlling more of the market and making it more difficult for AMD to compete.

Nvidia: To give developers better access to NV’s advanced rendering capabilities and to free them up to focus on other aspects of game development.

Developers say: Hurting AMD probably wasn’t the primary motive.

About Mantle: Nvidia has political reasons not to support Mantle. BUT even if they wanted to support it, there is no way to make it work when AMD has been denying access to Intel and Nvidia from the very beginning. They said they couldn't give access because Mantle was still in beta, but at the same time they had no problem giving game developers access to use Mantle, even in commercial games, making AMD the only hardware capable of running Mantle. Intel kept asking and eventually gave up because they already have access to DX12, while AMD kept talking about opening the spec to others but never really did it.
 
Because whether it was the intention or not:

Could GameWorks be used to harm AMD (and by extension, AMD gamers)?

AMD: Absolutely yes. Games that are part of the GW program have been much harder for us to optimize.

Nvidia: Theoretically yes, but that’s not the point of the program or the reason we developed it.

Developers say: AMD has valid reason to be concerned.
 
AMD used to release drivers every month up to around 2012

They stopped that because monthly releases did not actually address their problems; in fact, it made things worse in the eyes of the public. There was one time AMD had to release 6 hotfix drivers in a single month. What's the point of releasing a driver every month when they have to patch it 6 times in the same month?

Before that time nVidia released drivers every 6 months or so.

When? What year?

nVidia's SLI drivers are not working either as stated above, so why the hate on AMD?

SLI is working for me. Without SLI I can't even get 60 FPS, though the scaling probably isn't high; without SLI my FPS ranges from the mid 30s to the mid 40s, and with SLI enabled I can hit 60 FPS.

Also, nVidia's latest driver of February is giving performance increases to games like Metro: Last Light which is a game from 2013. Even if they mean the redux, that was released in August 2014 so they're 5 months late

Nothing out of the ordinary. Sometimes I see one game being improved with each driver release, like DiRT: Showdown, where the game totally favored AMD hardware from the get-go. In that game Nvidia cards were a joke; the 680 was barely faster than the 6970.

http://techreport.com/review/23150/amd-radeon-hd-7970-ghz-edition/6

Did we ever hear Nvidia complaining about it? Instead they just kept optimizing their drivers rather than making noise like AMD. Anyway, Dying Light is not the only game to come out in 2015. What is AMD's excuse for not releasing drivers for the games that have come out so far?
 