Mantle API to do better / differently than DirectX?


devilgodspider

Now, before I start, I'm going to say what I always say on all my questions, if there's a thread like this already that you know of, please post it here.

Now, I'm buying an R9 280X, and the thing I like most about it is the Mantle API. I would like to know, in simple terms, what exactly Mantle will do in my favor over, for example, a GTX 770?

Info/Rules (in my case ofc):
Before anything, let's make this straight: you CAN, at will, also turn this into a GTX 770 vs R9 280X debate, but you will have to play by the rules:
NO PhysX or TXAA/MSAA arguments, as they are both very aesthetic, and honestly I don't care at all about AA since I will play everything @ 1080p, maybe more once 2K/3K/4K prices come down a notch. ShadowPlay is a valid argument, but still not so strong in my opinion, as I still prefer Dxtory/FRAPS (depends on the game and what I'm doing with my recordings) and it's something I've been working with for a long time. Anyway, I really like both, but the R9 280X is cheaper in my country, and I am more fond of the 3GB as being more future-proof in case I do go above 1080p or get very mod-intensive with games.
With that out of the way, I want to know what Mantle API's strong points are and what it will do significantly better, or what extras it has, over DirectX/NVIDIA cards.
Oh, and I almost forgot: I do take overclocking arguments as well in the GTX 770 vs R9 280X debate or w/e, but NO TDP/price arguments, I've had it with those too... 😛
 


Whoops, meant "not that bad", my bad.
 


What's wrong with Nvidia? We have the best drivers and software, and probably the easiest OC program, EVGA Precision X; plus EVGA ACX coolers are the best on the market for air cooling at the moment :)
 

Even if Mantle sucks, which I doubt, that doesn't force you to use Nvidia. AMD cards still use DX too.

Though Nvidia does have some nice features, including G-sync.

When was the last time you looked at prices, anyway? In most places, AMD 280Xs are more expensive than 770s. Bitcoin, litecoin, or whatever mining is going on has caused AMD cards to be very expensive in most places.
 


First off, this thread just reached 100 posts! Thank you so much to everyone who has posted here so far! 😀

Anyway, about the whole price thing, I was wrong about the nVidia price; it's 320€ (the cheapest branded one). I live in Portugal, and since the R9 280X launched here, the website I'm getting it shipped from (shipping is FREE too, from there and from amazon.co.uk on some products) still has it at 289,90€ (MSi TwinFrozr R9 280X 😉 ). And I know AMD cards still support DirectX, but I said that because my main reason for buying it is the Mantle API coupled with the price tag.

As for EvgaLover's post, nVidia might have that, but at the price I'm getting here where I live, there's little to no reason for me to buy a GTX 770 :3 (especially with my budget; interestingly, my build is better than the $800 build that Tom's Hardware did in the Q4 2013 Build A PC marathon. Mine has an 850W XFX 80+ PSU, an i5 3570 and a 2TB Seagate; everything else I think is the same. Over here it's currently at 865,52€, already including shipping costs.)

EDITED (had some errors). When I say better, it's because while $800 converts to way below 800€, one must also account for US taxes. So I normally figure $ to € as xxx$ + 100 = xxx€.
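Just to put that rule of thumb in code (this is only my rough shelf-price heuristic from above, not a real exchange rate, and the function name is made up):

```python
# Rough heuristic: a US $ shelf price tends to land about 100 units
# higher as a Euro price tag once taxes/import costs are factored in.
# This is a forum rule of thumb, NOT a currency conversion.
def usd_to_eur_shelf_price(usd):
    return usd + 100

print(usd_to_eur_shelf_price(800))  # the $800 build -> ~900 EUR tag
```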
 
Here we go again :)

http://imgur.com/a/4WiVM




I kind of feel bad about the leak; I mean, it's just 7 days to the official announcement. Kind of a weird taste in my mouth while linking these pics...
 
The question is, what does "Up to 45% faster" mean?

Will it be that some spots are up to that much faster?
Will it mean that certain hardware configs are that much faster?
Will it be minimums, maximums or averages?

We see these types of quotes in driver updates all the time, but they never seem to be the average.
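To show why "up to" and "average" can be so far apart, here's a quick sketch with invented per-scene FPS numbers (not real benchmark data): one best-case scene is enough to justify an "up to 45%" claim even when the typical gain is much smaller.

```python
# Hypothetical per-scene FPS for the same benchmark, old vs new path.
# All numbers are made up purely for illustration.
baseline = [60, 55, 70, 48, 62]    # FPS per scene, old path
improved = [66, 58, 77, 69.6, 65]  # FPS per scene, new path

gains = [(n / o - 1) * 100 for o, n in zip(baseline, improved)]

peak = max(gains)                  # what marketing quotes: "up to X%"
average = sum(gains) / len(gains)  # closer to what you actually feel

print(f"up to {peak:.0f}% faster, but only {average:.0f}% on average")
```

One CPU-bound scene jumping 45% makes the headline, while the run as a whole moves far less.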
 


It could mean anything.
Marketing isn't meant to tell the whole truth, just the best-sounding bits, with not enough information to properly figure out whether it's bull or not.

Just like this slide from Nvidia's press conference.
[Nvidia press conference slide]


It's got 192 CUDA cores...
It's really a quad core; in terms of CUDA it's between a GT 630 and a GT 640, so nothing that special there.
 
"Up to 45%" can mean something like: +10% all the time, and at one point in the game it's 45% faster. I think something like 15-25% should be possible, and it's said to remove some CPU bottleneck (correct me if I'm wrong 😀)
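A toy way to picture that CPU-bottleneck point (my own simplification with invented numbers, not how Mantle actually works internally): if each frame takes roughly the longer of the CPU submit time and the GPU render time, then cutting CPU-side draw-call overhead only speeds up frames where the CPU was the limit.

```python
# Toy frame-time model: a frame takes max(cpu_ms, gpu_ms).
# All timings are hypothetical, purely to illustrate the idea.
def fps(cpu_ms, gpu_ms):
    return 1000 / max(cpu_ms, gpu_ms)

# CPU-bound scene: lots of draw calls, the GPU waits on the CPU.
print(fps(cpu_ms=25, gpu_ms=15))        # ~40 FPS
print(fps(cpu_ms=12.5, gpu_ms=15))      # halve CPU overhead -> ~67 FPS

# GPU-bound scene: the same CPU saving changes nothing.
print(fps(cpu_ms=10, gpu_ms=20))        # 50 FPS
print(fps(cpu_ms=5, gpu_ms=20))         # still 50 FPS
```

That would also fit the "up to" wording: big gains in draw-call-heavy spots, little to none where the GPU is already the bottleneck.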
 
I think I've probably mentioned this already; whether I did or didn't, I probably shouldn't, because it goes off-topic, kinda...
Does anyone think that, because of the 8-core AMD Jaguar APUs in the consoles, we might see the 8-core chips from AMD take a significant leap in gaming performance in comparison with the i5 3570K? (I know it's really against that or the i7 3770K, but the i7 is way too expensive for me, and between the i5 3570 and FX 8350 there's like a 30€ price difference.) Just passing through 😛
 


I know, though that slide only sounds impressive because it's a 5W mobile part for tablets and phones.