What will the Mantle API do better / differently than DirectX?


devilgodspider

Now, before I start, I'm going to say what I always say on all my questions: if there's already a thread like this that you know of, please post it here.

Now, I'm buying an R9 280X, and the thing I like most about it is the Mantle API. I'd like to know, in simple terms, what exactly the Mantle API will do in my favor over, for example, a GTX 770.

Info/rules (in my case, of course):
Before anything, let's make this straight: you CAN, at will, also turn this into a GTX 770 vs R9 280X thread, but you will have to play by the rules:
NO PhysX or TXAA/MSAA arguments, as they are both purely aesthetic, and honestly I don't care about AA at all, since I will play everything @ 1080p, maybe more once 2K/3K/4K prices come down a notch. ShadowPlay is a valid argument, but still not a strong one in my opinion, as I still prefer Dxtory/FRAPS (depends on the game and what I'm doing with my recordings), and it's something I've been working with for a long time. Anyway, I really like both cards, but the R9 280X is cheaper in my country, and I am more fond of its 3GB as more future-proof, in case I do go above 1080p or get very mod-intensive with games.
With that out of the way, I want to know what the Mantle API's strong points are and what it will do significantly better, or what extras it has, over DirectX/NVIDIA cards.
Oh, and I almost forgot: I'll take overclocking arguments as well in the GTX 770 vs R9 280X debate, but NO TDP or price arguments; I've had it with those too... 😛
 


Oh, so the media was going by this too. Yeah, that's one of the earliest articles, from back at the initial reveal.

For some reason those rumors persisted up to the November presentations. It was never proprietary; I remember hearing it wasn't proprietary at the initial reveal, so I didn't even focus on them.

What is true is that it's currently GCN-only, because of course AMD doesn't have access to Nvidia's internals. Johan reconfirmed again that it's meant to be an open API just like OpenGL, and not meant to replace DirectX or OpenGL; those APIs can still be used and will probably remain as a fallback compatibility option.

Apps that don't need top performance and don't push the limits of the DX API (funny thing: I meant to say "push the limits of the hardware," as that's what's commonly said, but I quickly realized the catch) don't have to use Mantle if they don't like it for some reason.





As noted, I'm not an expert on tearing. If I look closely I can see some weirdness, but I didn't want to look into it or think about it further, for the sole reason that I'd rather not become able to spot it all the time.

But yes, one of the main reasons I went with 144 Hz is eye health. My eyes just don't hurt at all on this screen compared to 60 Hz; I used to last 3 hours, and now I can last for 10 (tried that only a few times, when following a big esports event, etc.).

Also, I was sure you were going to mention that. It actually doesn't matter if the game isn't stable at 120; just having the higher refresh rate does the trick. That's just a fact, and I think if you see a difference, it's just a perceived one.

But that still doesn't invalidate the point, because it's the perceived difference that matters and needs fixing; though only if it's genuine. If you're just looking at something and blaming it on tearing, then no.
 


“I think at this stage it makes sense for us to develop Mantle, at least in its current form, because nobody knows our hardware at the lowest level better than we do. So for us to have to do that for alternative graphics hardware [would be] almost impossible,” said Ritche Corpus, AMD’s director of software alliances and developer relations.

So Ritche Corpus, AMD’s director of software alliances and developer relations, should be ignored then?
 
It is proprietary, period. If it is 'close to the metal', which we all agree it is, then you can't change the metal very much, and hence it can't work on more than one architecture. In fact, it may even limit the evolution of AMD's GPUs, as they will be locked to the metal. Every article states that it is proprietary. It was thought that AMD would be leveraging the fact that they have both next-gen consoles, but later conversations have indicated that those platforms are not going down the Mantle route. You need an abstraction layer between the software and the differing hardware so that it all looks the same to the software. Sounds like DX? Yes it does. DX just needs to be improved.
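To make the abstraction-layer point concrete, here's a minimal sketch of the idea, assuming nothing about the real DirectX or Mantle interfaces (every name in it is hypothetical): the game codes against one interface, and each architecture supplies its own translation to the metal. That indirection is exactly what a close-to-the-metal API trades away.

```cpp
#include <cstdio>

// What the game sees: one interface, identical across all GPUs.
class IDevice {
public:
    virtual ~IDevice() = default;
    virtual void draw(int vertexCount) = 0;
};

// Each vendor/architecture supplies its own translation to "the metal".
class VendorADevice : public IDevice {
public:
    void draw(int vertexCount) override {
        std::printf("vendor A command stream: draw %d vertices\n", vertexCount);
    }
};

class VendorBDevice : public IDevice {
public:
    void draw(int vertexCount) override {
        std::printf("vendor B command stream: draw %d vertices\n", vertexCount);
    }
};

// The game never knows (or cares) which hardware it is running on.
void renderFrame(IDevice& dev) { dev.draw(3); }

int main() {
    VendorADevice a;
    VendorBDevice b;
    renderFrame(a);  // same game code...
    renderFrame(b);  // ...different metal underneath
}
```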
 

please explain lol
 


I'm not familiar with those quotes, because I've watched the material and seen no indication of it being proprietary; quite the contrary. But then, a month later, the AMD main driver developer and Johan cleared it up, and that overrides any previous words and quotes the media might have gotten, many times over.

I wasn't even discussing it in any forums and wasn't even paying attention to the "proprietary" discussion, because that just wasn't something Johan ever said. There was the Nvidia conference with Carmack as well; they've kept saying it's open, many times over by now.
 
Stewox is excited about Mantle. There is no harm in that, but we do have to be reasonable here. PR and marketing are not the same as the final product. You can't take everything at face value.

What I can't stand is someone getting angry at me for not jumping on the bandwagon before something is released. We need to see something before we can recommend it. Even once one game has used Mantle, we may still want to be a little cautious about what we recommend, as it may not be representative of what most games will be like (good or bad).

It may be exciting to talk about, but Mantle is not ready yet. We'll find out how good it is when we see it. Demos do not count. There are lots of demos for lots of products. They show you the good about a product; they don't show you the bad. Mantle may speed up certain things by 10 times, but if you take away one bottleneck, you often find yourself with another. The end result may not change much at all, or it might change a lot. We do not know yet.
 
“I think at this stage it makes sense for us to develop Mantle, at least in its current form, because nobody knows our hardware at the lowest level better than we do. So for us to have to do that for alternative graphics hardware [would be] almost impossible,” said Ritche Corpus, AMD’s director of software alliances and developer relations.
What he said just means that they can only do it with AMD cards and not with Nvidia's, because obviously they only know how they made their own chips and not Nvidia's, which makes Mantle proprietary and open at the same time. All they need to make it run on Nvidia cards is information about those chips.
 


Stewox is not referring to marketing. AMD's presentation is pretty technical, even though it does contain a rather mind-boggling demo.



There is no product to recommend at this stage. What I would personally recommend, though, is to keep an open mind and understand the following two crucial facts:

1) Today's GPUs are capable of much more than what we get to see, because they are being bottlenecked by the APIs (DirectX or OpenGL).

2) Making those ineffective APIs produce acceptable performance requires horrible amounts of unnecessary developer time, causes thousands of bugs, and even requires under-the-table deals between publishers and graphics card vendors to inject game-specific code into the *drivers* (!)

(I don't need to tell you what that last part means for indie game studios. But even that aside, doesn't it make anyone shudder that we are actually getting our game software piecemeal and from multiple vendors now?)



Mantle does not simply remove one bottleneck. A new version of DirectX could probably do that (if MS had a mind to, that is). What it does is nothing less than replace APIs that were out-evolved by the hardware years ago. This means that game developers will at last be able to remove the bottlenecks by themselves.
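As a rough illustration of what "removing the bottlenecks by themselves" could mean, here's a minimal sketch, assuming a hypothetical thin API (this is not actual Mantle or DirectX code): a "thick" API revalidates state inside every draw call, while a "thin" one lets the application pay that cost once by recording a command buffer and then replaying it cheaply every frame.

```cpp
#include <cstdio>
#include <vector>

struct DrawCmd { int vertexCount; };

// "Thick" API style: the driver revalidates state on every single call,
// every frame, which is where much of the CPU overhead comes from.
void drawImmediate(const DrawCmd& c) {
    // ...imagine state validation, hazard tracking, etc. happening here...
    std::printf("immediate draw: %d vertices\n", c.vertexCount);
}

// "Thin" API style: the application records commands once, paying the
// validation cost up front, then replays them cheaply each frame.
struct CommandBuffer {
    std::vector<DrawCmd> cmds;
    void record(const DrawCmd& c) { cmds.push_back(c); }  // validate once, here
    void submit() const {                                  // cheap replay
        for (const DrawCmd& c : cmds)
            std::printf("buffered draw: %d vertices\n", c.vertexCount);
    }
};

int main() {
    drawImmediate({3});  // thick style: full per-call cost, every time

    CommandBuffer cb;    // thin style: record once...
    cb.record({3});
    cb.record({6});
    for (int frame = 0; frame < 2; ++frame)
        cb.submit();     // ...per-frame CPU cost is just the replay loop
}
```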
 
Press releases, marketing, and even tech demos are not all that different. They all serve the same purpose: getting people to use the product.

And just because they give devs more control does not mean they can remove bottlenecks at will. Many bottlenecks are hardware-bound, and not even possible to remove through software. We do not know what we will get when all the Mantle excitement ends and we see something real. We have clues, but only clues.

I expect some gains, but I do not expect multiple times better performance than today. We really should be cautious about our expectations for the API at this point.
 


I can only recommend watching the presentation:

https://youtube.com/watch?v=QIWyf8Hyjbg

It's not exactly short, but well worth it.
 

I've watched it and many more.

I'm not saying Mantle won't improve things. I'm saying to temper your expectations until we see it. Demos have a way of showing the best of everything.

Once you add in physics, AI, networking, input, and all the other things that go into a game, once-spectacular demos turn into reality.

It's cool to get excited about the possibilities. It is not cool to get upset with someone for being cautious in what to expect from it.
 
Well, this sure spread fast :|
I should really stop making such important topics, lol.
Anyway, this whole thing confuses me now :\ The bottom line (for me) is:

I really don't have much money right now... Nvidia seems to have the pricier cards at the moment, and quite honestly they don't seem very innocent in their pricing practices (as seen in their "sudden" price drop when the R7/R9 series came out).
I feel AMD has more support for its customers and more interest in evolving the technology than in how much money they can make out of it.
And I am pretty sure that G-Sync monitors will be a lot pricier than normal ones; I would not be surprised to see Nvidia milk this.

So yeah, I'm getting an R9 280X. And to clarify my use of Battlefield 4 as my avatar (it's not what I'll mostly be playing; I'm thinking of playing more unusual types of games over FPS): it was purely random. I could put up another, but I'm too lazy for that :3
 


There is nothing wrong with going with AMD. The G-Sync monitors are supposed to cost $100 more than their non-G-Sync counterparts, at least at first. I don't know if you consider that pricey or not. I don't, not for what you get.

Mantle may be good. It may be great, or it may end up having problems. We don't know, so I would not base my purchase on that alone. Even if it is great, it is likely only going to be in one game for quite some time.

If you aren't getting a monitor anytime soon, then G-Sync is not an option and should be ignored in your purchase. At that point, get whatever is the best bang for the buck. Due to litecoin or bitcoin mining, I do not even know which that would be. It may still be Nvidia.
 


$100 more seems a lot pricier to me, bystander :| And yeah, I'm really not thinking of buying a new monitor any time soon... I've been with this one for a long time, even if it was a Sanyo on sale :)
 


I can agree to that. But the demo itself is not the most important part IMO. What matters to me is that someone is finally tearing down the old mouldy constructs that the industry as a whole had erroneously grown to accept as unshakable facts of life.



From that list, networking will probably move to the top of the list of troublemakers. I'm just glad the graphics API is about to fall away from the pole position.



We enthusiasts ask for forgiveness. We have this nagging fear that too much negativity might discourage the makers of this still fragile hope.
 


Just watch the videos; everything is there.




I said I didn't know about it for a month, but then I looked into it. This article he linked is the one I hadn't seen, that's all. It's also obsolete and doesn't really matter anymore, so there's no need to look at other old articles about the same thing.
 

Which is what Nvidia said about PhysX when they locked out AMD a few years ago, and how did that turn out?


I have watched the videos, which is why I take the "wait and see" approach, because as others have already noted, PR demos are one thing and do not necessarily translate into reality. If they did, then AMD would be the leaders in PhysX-style physics, wouldn't they? After all, they were the first to have showcase demos.
 
Of course no one would give a competitor information about their technology...
I don't think anyone except Nvidia themselves is interested in PhysX, and AMD certainly is not.
I understand why it failed (don't tell me it didn't); I watched all those unrealistic demos and played the supported games, with their extreme explosions and unrealism, so I turned it off.

The Mantle PR show was pretty technical and theoretical, so let's wait for real-life benchmarks and experiences.
 

Given that there are over 400 games with PhysX, I don't think you can consider it a failure. GPU-accelerated PhysX is used sparingly, but PhysX is alive and well.

http://physxinfo.com/index.php?p=gam&f=all
 
I was thinking about the number of games with hardware support where getting Nvidia could make sense (IMHO, getting a GPU for a single game is stupid; I'm playing M&B Warband, Skyrim, and BF4 right now 😀).

Back to Mantle: Mantle will probably fail likewise, unless DICE does a great job implementing it in BF4 and shows its potential to other developers.
 


Actually, AMD were very interested, but they couldn't afford to buy Ageia when it was up for sale, and some years before that ATi were the first to show a working demo of GPU-based physics.

http://www.engadget.com/2007/11/21/amd-tosses-around-the-idea-of-acquiring-ageia/

http://www.hardocp.com/article/2006/06/05/ati_takes_on_physics/