Crytek Adopts AMD's Mantle API for CRYENGINE



All GCN-based GPUs will support DX12. On Nvidia's side, they said all their DX11-capable GPUs will support DX12, which means Fermi and later. Some people are already asking why AMD didn't extend DX12 support to the HD 5000 and 6000 series, since Nvidia did with all their DX11 GPUs.

http://techreport.com/news/26199/directx-12-to-support-existing-hardware-first-games-due-in-late-2015
 


You know that giant AMD or Nvidia logo you see at the loading screen of most games these days? That's why AMD cares about Mantle. If you hear that a game you want supports Mantle, Mantle is advertised with the game (a la BF4), and you want the "best experience," you may very well be inclined to buy an AMD card at your next upgrade.

That's how marketing works. If you had asked an AMD rep, "would you rather see Mantle or DX12 supported in a game and which would be more advantageous for AMD," the answer would be clear.

My point was simple: whether developers adopt Mantle on a large scale, and thus whether it is ultimately a success, will depend on how relevant they think it is compared to DX12. Since DX12 will be a universal API and Mantle is a proprietary one, that decision will probably come down to marketing deals (between AMD and devs) and how well DX12 performs.

If I build a game and it runs equivalently or better on DX12, I'll be hard-pressed to justify potentially alienating Nvidia players by promoting it as supporting Mantle.
 


...the hell are you talking about, boy?

I know exactly what Mantle does, and how it works - there's no need to try to condescendingly correct something I never said. That being said, you are incorrect when you say that existing hardware is well-optimized; far from it.

As for Crytek having only three games, all three of those are legendary for their own reasons. Why are you whining about it, other than to be a troll?

 




Gotta love how whenever a feature gets added, people call it "Mantle being added," even though we don't know yet how it will be implemented or whether it's even the same thing.

So shouldn't we instead say DX12 is DX11 + Mantle, and Mantle is just Glide, therefore DX12 = DX11 + Glide?
 
DX12 by Microsoft involves Mantle, at least according to SemiAccurate: http://semiaccurate.com/2014/03/18/microsoft-adopts-mantle-calls-dx12/

Sorry, on mobile so no option from Tom's to quote properly.

I see no proof from this site whatsoever that DX12 is DX11+Mantle.

What's ignorant about that statement is that, due to the nature of APIs, it simply couldn't be true. It would mean that MS magically found a way to provide Nvidia cards with Mantle support, which would mean a rewrite of the API.

I get the comparison; they're trying to achieve the benefits of Mantle for all cards, and I hope they succeed. But that doesn't mean they just plugged Mantle into DX.
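
To make that concrete, here is a minimal hypothetical sketch in C++ (none of these class or function names come from CRYENGINE, Mantle, or DirectX; they're made up for illustration). An engine typically hides each graphics API behind a common interface, and each backend is a separate implementation; you don't get a Mantle path on non-GCN hardware, or a DX12 path at all, by "plugging" one API into another.

```cpp
#include <memory>

// Placeholder for whatever the engine batches up per draw (buffers, state, etc.).
struct DrawBatch {};

// Engine-facing abstraction; every name here is hypothetical.
class IRenderBackend {
public:
    virtual ~IRenderBackend() = default;
    virtual bool Init() = 0;
    virtual void Submit(const DrawBatch& batch) = 0;
};

// Driver-managed, higher-overhead path (works on any DX11 card).
class D3D11Backend : public IRenderBackend {
public:
    bool Init() override { /* e.g. D3D11CreateDevice(...) */ return true; }
    void Submit(const DrawBatch&) override { /* immediate-context draw calls */ }
};

// Explicit, low-overhead path; only works on AMD GCN hardware.
class MantleBackend : public IRenderBackend {
public:
    bool Init() override { /* Mantle device/queue setup, GCN required */ return true; }
    void Submit(const DrawBatch&) override { /* record and queue explicit command buffers */ }
};

// A DX12 backend would be a third, separate implementation sharing Mantle's
// *ideas* (explicit submission, less driver overhead), not its code.
std::unique_ptr<IRenderBackend> CreateBackend(bool hasGcnGpu) {
    if (hasGcnGpu) return std::make_unique<MantleBackend>();
    return std::make_unique<D3D11Backend>();
}

int main() {
    auto backend = CreateBackend(/*hasGcnGpu=*/true);
    backend->Init();
    backend->Submit(DrawBatch{});
}
```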
 
The biggest issue with Gsync is that you need a new, expensive monitor. Love the idea, but I hope DX12 adds more.
 

I didn't say it was a stupid thing to do, only that the benefit of doing so seems limited; all it really does is give them a boost during a short period until the newer versions of OpenGL and DirectX start being adopted by newer hardware, and that boost will only come on AMD hardware. Since Nvidia and Intel are extremely unlikely to ever adopt Mantle, it seems like a dead end outside of a fairly brief short-term window. Don't get me wrong: as a vehicle for much-needed change, Mantle may spur some great things from OpenGL and DirectX, and it may give AMD (and Mantle developers) a bit of an initial advantage, since lessons learned with Mantle may still apply to OpenGL and DirectX's new features too. I'm just dubious about how much the effort is really worth now that OpenGL and DirectX are already on the way to catching up; it kind of seems like Mantle's days are already numbered.
 


ROFL... IN AN ENGINE and USED IN A GAME are two totally different things. Support in the engine doesn't mean Mantle just works in your game automatically; YOU still have to do the work. It just means you CAN use Mantle in your new game. You can have support for PhysX in ALL engines, but it doesn't get you much in games; YOU still have to do the work. AMD paid for this advertisement, it's up to devs to use it, and the answer is still no, they won't. ONLY a company paid by AMD will go through the effort, which gets you ZERO extra dollars. It makes some AMD gamers happy (and only a fraction of AMD users can even use it, and high-end cards get basically nothing, so it's worse than I'm saying; it's really only for crap CPUs), but gets you NOTHING from even those users. A grin is about all you're going to get in return for the extra dev time to support a niche audience. You will instead see devs target OpenGL and DirectX, which aim at FAR more users (including the other 2/3 of the discrete market and Intel's CPUs too).
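
A rough sketch of that point, with a purely made-up engine API (not CRYENGINE's actual interface): even when the engine ships a Mantle backend, the game still has to opt in, try to initialize it on the player's machine, and keep a fallback path working.

```cpp
#include <cstdio>

enum class RendererApi { D3D11, Mantle };

// Stubbed "engine" entry points, purely for illustration.
bool EngineSupportsMantle() { return true; }  // the engine ships the backend...
bool InitRenderer(RendererApi api) {          // ...but init can still fail on a given machine;
    return api == RendererApi::D3D11;         // stubbed here so the Mantle path fails
}

void InitGameRenderer(bool gameOptedIntoMantle) {
    // Game-side work: opt in, test the path, and fall back. None of it is automatic.
    if (gameOptedIntoMantle && EngineSupportsMantle() && InitRenderer(RendererApi::Mantle)) {
        std::puts("Using Mantle path (needs per-game integration, testing, tuning).");
        return;
    }
    InitRenderer(RendererApi::D3D11);  // the path every player can run
    std::puts("Using D3D11 path.");
}

int main() { InitGameRenderer(/*gameOptedIntoMantle=*/true); }
```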

I challenge someone to tell me WHY any dev that wasn't paid by AMD would waste months (or any time) on a product that makes them nothing but a few happy AMD users. We're all waiting to hear why you'd do this for under-10% gains on anything high end, which is what REAL gamers buy. You're chasing an audience that can't afford a decent GPU or CPU, so how will they buy your extra-work $60 Mantle game? You act as though game devs are just tripping over themselves to help AMD get an API off the ground for ZERO financial gain. That only happens if you PAY me, thus erasing the ZERO-gain part and making it worth my time. Common sense, people; it isn't rocket science. NOBODY likes working for free, which is what this is.

With $8 million for Frostbite, I'm not sure how many engines AMD can afford to support, and even after doing it, nobody has to use it at all. I'd rather see AAA drivers debut on AMD again 😉 and a REAL Gsync alternative that you are SELLING today. I'd rather see AMD spend on CPUs again, not this APU crap I'll never use or recommend as they give up and chase the shrinking low end. The second the word GAMING comes out of a user's mouth, I say DISCRETE back to them, without fail, followed by "avoid APUs, which hold back your future GPUs for life" :) Neither side gives you a worthy upgrade today, and you're better off moving sockets for the better tech surrounding your NEW CPU socket. A new board is $50-200. For most that's $120 or under, and if it makes it 3+ years, just drop a new board in with the newer tech, which is a FAR better option than surviving longer in a board that will likely die a year later anyway after you plop in the CPU upgrade. They know how to produce crap that dies fairly early after the warranty; with 30 years of making these, the statistics tell them just how good to make them. Solid caps didn't bring us longer life; they just botched another component, so it's still tough to get what you're supposed to from the better caps. The military-grade boards maybe get you 5 years, but you pay for that, so it's no loss to them giving you longer.

You can make these pipe-dream comments when they're reality, as in: let me know when you can't count the Mantle-optimized games on ONE hand. I won't really be impressed until I can't count that list on both hands and both feet. I'm not even impressed by the number of games using PhysX yet, and how many years has that been out? It's a bonus feature that MAYBE gets used occasionally, but it doesn't make me buy a card by itself, nor would Mantle without a list of 100 games you can't play any other way. The fact is, even when those hit (if... big if), I'll only be ~10% slower and can still play, because only a dev smoking crack would make a MANTLE-only game (you have to include DX and/or OpenGL or go broke). A driver update gets you that puny percentage on many games... I knew this wouldn't amount to much when they said "we wouldn't do Mantle for 5%." If that comment had been 25%, I'd have said WOW, bring it on, expecting 30-50%. You say 5% and we get <10% for the gamers that matter (discrete GPUs + gaming CPUs, meaning HIGH end). I'm almost dying laughing over here 😉 You seem to forget OpenGL has this stuff already via extensions, and DX12 is coming. Since OpenGL has it already, it was clearly in their heads around the same time AMD conceived Mantle, and I doubt DX didn't know both were coming either. MS was working on this before yesterday... LOL.

AMD did nothing here but waste money fixing something that was already on its way to being fixed by OpenGL/DX. You HAD to add better threading etc. to both as we move to even MORE cores and quad-core becomes kind of low-end/mainstream this Xmas and beyond. Which implementation is best doesn't even matter (let's pretend Mantle is the best for now), as AMD has no funding to push Mantle and certainly no market clout to do it either, with only 1/3 of discrete and Intel showing no sign of losing the low end to AMD while they ramp up their own GPUs in Haswell/Broadwell etc. (they still suck, but everything down here sucks; they can barely do 1366x768... ROFL, and not many games manage even that without dropping settings). It would be different if they could get 50% of the low end, or over 50% in discrete, but that won't happen before Mantle dies via DX/OpenGL. Well, it will never happen without AMD being bought first, or massively funded by some fool (the Arabs can't fund themselves now, so there's not much more blood from that turnip for AMD; they delayed their fab long ago due to being cash-strapped themselves). My dad's last board died within a month of the warranty ending (well, it had boot problems; you boot until it finally REALLY boots... LOL). Granted, his is a heavily used PC (but a dust-free house, a cool PC, etc.; it's built to last), but three years plus a month? SPC at work :) Statistics don't lie much.
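
For what "better threading" actually buys, here's a self-contained sketch with made-up types (no real Mantle or D3D12 calls): in the low-overhead model, worker threads record their own command lists in parallel and only the final submission is serialized, instead of every draw funnelling through one driver context.

```cpp
#include <cstddef>
#include <cstdio>
#include <thread>
#include <vector>

// Stand-in for an API command buffer; real APIs record actual GPU commands.
struct CommandList {
    std::vector<int> draws;
    void RecordDraw(int id) { draws.push_back(id); }
};

int main() {
    const int workerCount = 4;
    std::vector<CommandList> lists(workerCount);
    std::vector<std::thread> workers;

    // Each worker records draw commands independently; there is no shared,
    // locked driver context, which is where the CPU-side win comes from.
    for (int w = 0; w < workerCount; ++w) {
        workers.emplace_back([&lists, w] {
            for (int i = 0; i < 1000; ++i)
                lists[w].RecordDraw(w * 1000 + i);
        });
    }
    for (auto& t : workers) t.join();

    // Submission stays on one thread, but it only hands over pre-built lists
    // rather than doing the expensive per-draw work itself.
    std::size_t total = 0;
    for (const auto& cl : lists) total += cl.draws.size();
    std::printf("Submitting %zu pre-recorded draws from %d threads\n", total, workerCount);
}
```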

Crytek didn't add it for fun, they were paid. They aren't in the business of doing free favors; nobody is.
 
I love it, but pretty soon here Microsoft will get enough momentum built up to make DX at least somewhat comparable, meaning that Nvidia will have the advantage once again, at least if Gsync takes off the way it should. That being said, I would much rather have an API that actually makes a difference for mid/high-end hardware, especially for those of us who actually have good CPUs.
The biggest issue with Gsync is that you need a new, expensive monitor. Love the idea, but I hope DX12 adds more.

I don't understand your point. Mantle requires a new card for most people. DX12 will most likely require Win8/Win9, so that's another wad spent to get it. You can't even buy an AMD Gsync competitor (i.e., FreeSync), as it isn't a product yet, and when you can (if ever), you'll need a new monitor for that too, as ZERO desktop monitors support it so far. Most people don't have 1440p/1600p monitors now anyway (less than 2%), so we all probably have a monitor upgrade in the not-too-distant future, so who cares? You think you have a special one that will last for life? Most cheap monitors don't make it past 6 years (not many splurge here), and the tech has moved on in many ways by that time, making me want a new monitor anyway (I buy a new one or add another every 5-7 years). That is, 1440p/1600p/2160p now, and most color gamuts etc. have improved over the last 6-10 years, making a new monitor an awesome gaming upgrade for a ton of people now that we have cards that might actually be able to push 1440p+ (I'm talking 20nm here, as those will arrive right when MANY Xmas models have Gsync built in).

Every user in this forum has a monitor upgrade in store at some point or they will no longer compute on a PC :) I GUARANTEE IT. Every electronic device you own is on its way to death just like us humans from the day we're born 😉

Anyone shouting "but you have to buy a monitor" apparently doesn't understand "yeah, but they will ALL die, so get over it" :) Newsflash, you'll need a new EVERYTHING soon unfortunately.
 


I don't think that's going to happen. The problem is that AMD intends to make Mantle a key advantage of their GPUs over the competition. If AMD were really serious about other GPU makers adopting Mantle, they would have invited them from the very beginning, when they started developing it. They said they will release a public SDK for Mantle later this year, but that doesn't mean Nvidia and the others can use Mantle straight away; they still have to study the API before adapting it to their architectures. Also, since Mantle has been built with the GCN architecture in mind, some features may need hardware that doesn't exist in Nvidia or other GPUs. If Mantle really were easy to implement on other GPUs, then why didn't AMD give their HD 5000 and HD 6000 series Mantle support?
 


DX12 will likely provide an equivalent boost for "non-Mantle" cards, though that's yet to be seen. The reason the HD 5xxx and HD 6xxx cards didn't get Mantle support is that AMD was already planning, on their dev roadmap, to drop major driver support (meaning new profiles, etc.) for them within the next year.

I know everyone thinks these companies love their customers, but the truth is that, like any company, they're in it for the money, and that means keeping customers buying new products. (This is NOT meant as trash talk; it's just how business works.) Supporting older cards with new technologies gives users of those cards less incentive to upgrade, and upgrades are what fund the development of new GPUs.

The fact is, the longer AMD supports older cards, the less often people upgrade, and the less money AMD makes. For better or worse, this is the nature of the consumer product business.
 


This could be the reason, though it's not from AMD themselves but from Johan Andersson's interview with Tom's Hardware:

Johan: Mantle requires a certain set of key functionality of the GPU, so it can’t be supported on older architectures before AMD’s GCN architecture.

http://www.tomshardware.com/reviews/johan-andersson-battlefield-4-interview,3688-8.html
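
That quote describes a capability gate. Here's a minimal hypothetical sketch of the general pattern (the GpuInfo fields and the specific checks are illustrative, not AMD's actual criteria): the Mantle path is enabled only when the GPU reports the required architecture and features, which is exactly what pre-GCN parts like the HD 5000/6000 series would fail.

```cpp
#include <cstdio>
#include <string>

// Illustrative only; a real implementation would query the driver for this.
struct GpuInfo {
    std::string architecture;   // e.g. "GCN" vs. the older pre-GCN designs
    bool hasRequiredFeatures;   // stand-in for the "key functionality" Mantle needs
};

bool CanUseMantle(const GpuInfo& gpu) {
    // Pre-GCN parts (HD 5000/6000 class) fail this check, matching the quote above.
    return gpu.architecture == "GCN" && gpu.hasRequiredFeatures;
}

int main() {
    GpuInfo hd7970{"GCN", true};       // GCN card: Mantle-capable
    GpuInfo hd6970{"pre-GCN", false};  // older card: falls back
    std::printf("HD 7970: %s\n", CanUseMantle(hd7970) ? "Mantle path" : "DX11 fallback");
    std::printf("HD 6970: %s\n", CanUseMantle(hd6970) ? "Mantle path" : "DX11 fallback");
}
```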
 


Nothing in my statement said Mantle was expensive to develop. In fact, nothing in my statement addressed cost at all.

What I did address was the obvious marketing advantage it would give AMD should Mantle be adopted into a game. Mantle doesn't NEED an ultra-expensive marketing campaign; it IS its OWN marketing campaign. If it makes it into a game, the game developers will market it for AMD. This in turn will incentivize people to go with an AMD card, as it already has for many with games like BF4.

This isn't rocket science, and it isn't a conspiracy theory, either. It's the same thing as the AMD or Nvidia logo you see ("The way it's meant to be played") at the loading screen of almost every game. It doesn't MAKE you use either brand, but it strongly suggests one or the other is optimal.

Mantle is just that: a strong suggestion to the consumer. And who knows, those with Mantle-enabled games and cards may well have a better experience.

But the full-on adoption of Mantle will depend on developers' incentive to support both DX12 *and* Mantle in a given game, which will in turn be affected by the average gains from DX12. If DX12 provides 90-100% of the boost that Mantle does for both AMD and Nvidia cards, Mantle is less likely to be adopted, because DX12 alone will make more sense from a development standpoint.

I work in software development. We make a lot of things, including powerful engines used in today's apps. We don't market the engine publicly; we instead wait for adoption of the engine to market itself, much like the Facebook and Twitter APIs. AMD will do the same thing with Mantle, but if it's not adopted, or if DX12 is equivalent, it will be irrelevant.



Good call. I knew I read that somewhere. I'm willing to bet AMD drops support for those cards once the next gen of their new line comes out.
 