AMD Mantle: A Graphics API Tested In Depth

Status
Not open for further replies.

cleeve

Illustrious


You're missing two really important bits of info here:

1. Both AMD and Nvidia routinely advertise performance increases in games with new driver releases, and...

2. These percentages are "best case scenarios": you might see "UP TO" this kind of performance increase, which means real people almost never see them.

For instance, if a Radeon 240 sees a 30% increase at 1080p in that game, and gets 13 instead of 10 FPS, they label it a 30% increase and move on. A Radeon 270X might see no increase at all, but 30% is what gets published.
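The arithmetic behind that example is worth spelling out, since a big relative gain on a low baseline is a small absolute one. A quick sketch (the FPS figures are the illustrative ones from the post, not measurements):

```python
def relative_gain(before_fps, after_fps):
    """Percentage increase from before_fps to after_fps."""
    return (after_fps - before_fps) * 100.0 / before_fps

# A "30% increase" on a 10 FPS baseline is only 3 FPS in absolute terms.
print(relative_gain(10, 13))  # 30.0
print(13 - 10)                # absolute gain: 3 FPS
```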

As a hardware reviewer, after years of seeing these announced driver improvements, I don't recall seeing a real-world example that lived up to the claim in the driver notes. Well, maybe once or twice, but it's the rare exception, not the rule.

 

KevTheGuy

Reputable
Jul 18, 2014
3
0
4,510
Alright, I think I have a handle on the basics of Mantle. What now?
For the sake of professional image, please remember that Alright is Alwrong. All right?
[/nazi]
To the meat of the article: it does look like Mantle will help some of AMD's weaker CPUs some of the time, but if that isn't what you have, Mantle does not make a large enough difference to influence buying decisions (sort of like PhysX; it only really matters in a few cases). If your system is one of those cases, it's a pretty substantial difference, but for many people it won't be.

Mantle does help lower-end CPUs in multiplayer. I took some screenshots to prove this:
https://www.flickr.com/gp/125423318@N05/518s16

The server had ~40 players in it and I played at 720p to remove GPU bottlenecks. It's not a perfect test, but I did it to show someone how the FX 6300 performs in BF4, so yeah :p
 

mapesdhs

Distinguished


Others would then just say it's an unrealistic test scenario (720??), that the real
solution is for AMD to develop better CPUs and just make better drivers for all their
cards, spend the money where it benefits everyone, not just a few in a limited no. of
games, on specific grades of hw, for narrow usage cases.

We had a similar situation back in the days of Athlon2/Phenom2 and P55. People asking
for advice on what kind of upgrade would help boost gaming performance on systems with
lesser Athlon64 X2s, Athlon2 X4s, etc. All sorts of info posted on how an Athlon2 or
Phenom2 upgrade would help, especially when oc'd, but the reality was that in just
about every case, a stock i5/i7 gave far better performance anyway, in some cases as
much as 40% better (eg. Far Cry 2), so the sensible conclusion should have been to just
switch platforms given it was already obvious AMD was falling way behind. Example ref.

To me, Mantle is just AMD's way of trying to make up for the fact that they have weak
CPUs, but it's such a narrow case solution. I mean, you have to test at 720 to
demonstrate the difference? Really??

IMO they've followed up bad new CPU designs with even sillier direction decisions. Pity
nobody at AMD has the guts to say, hang it, let's go back to what we know worked well,
sort out a proper 8-core Phenom2, improve the design, shrink the process, lower the
power, improve the IPC. Then they might have had something usefully competitive, but
instead we have the 8350 which has an IPC no better than the best Ph2 that was released
18 months earlier, and a power consumption high enough to cook your dinner. Don't even
get me started on the FX 9Ks...

The fact that AMD is in this mess makes the following comment from a Jan/2013 toms 8350
review all the more relevant: "Our benchmark results have long shown that ATI's
graphics architectures are more dependent on a strong processor than Nvidia's." So
it's not as if AMD didn't know about the issue.

Thus the answer atm is simple: if the best value/performance GPU at any one time
happens to be an AMD, then fine, get one (driver fun not withstanding), but stick it on
an Intel board to get the most out of it, even an old one. Using it with an AMD CPU in
a new build is just wasting potential performance.

Many of my earlier builds were AMDs, and I have lots of their CPUs (3400+, 6000+, 7850,
X2 250, X4 635, X4 640, several X4 965BE, Ph2 1090T), but IMO Mantle is just diverting
resources away from where they really need to go, and that's better AMD CPUs. The
longer that doesn't happen, the worse the competitive CPU space becomes at the same
time, so everyone loses.

At the end of the day, Mantle is not a universal gain for all AMD users, and that's
going to hurt them in the long run if they focus on it at the expense of improving
their CPU tech.

Ian.

 

InvalidError

Titan
Moderator

You do not need multiple identical configurations, since reproducibility on the test PC depends only on how closely the other players manage to duplicate their track; their position on the test PC will never be any more accurate than the position reporting rate and timing of the game itself allows.

As far as the test PC is concerned, it makes no difference if the other PCs are running 800x600 with all non-essential stuff at lowest or disabled - all they need to be is repeatable.
 


Funny, I don't remember writing that. ;)
 

InvalidError

Titan
Moderator

It's that annoying thing with forums remembering stuff I start replying to but never actually post, then automatically re-quoting it when I reply to something else... it ends up with the wrong starting quote block.
 

somebodyspecial

Honorable
Sep 20, 2012
1,459
0
11,310


No, I'm not missing #1. I didn't say both sides don't do this, just that AMD isn't doing work on DX like they should TODAY (your own results show their DX drivers suck, as I noted before in my post to you; they're working on MANTLE). The point was they need to GET BACK to doing DX improvements so they can make the same claims now! Nvidia is upping perf right now to discount Mantle's effect, and it affects more than just Mantle games.

When I had a PC business for 8 years, I tested every new model (current gen I was selling, I mean) against the new branches of drivers to see if it was worth telling my customers to update. When most perf claims came (usually they only claim huge gains at new driver branches, not much in between except for specific launch titles), they usually came pretty close, and were pretty dead on when they mentioned which model was used to GET those numbers, and even across other models in the same gen. So I beg to differ, based on my years of testing. I.e., NV lists on their blog the model and scenario they were using to achieve the scores listed. You will see that in those games with the card they listed (usually the top card, of course, or the top plus an SLI pair). No argument there; of course the test case works. But you are making a blanket statement that "up to" basically always means NEVER...LOL.

Over the last two years, though, I'll admit NV did nothing, because they were so far in front until AMD's Never Settle drivers, as HardOCP showed in their driver testing articles. It's clear that when they have no reason to give you free perf, neither side will (well duh, just good business to hold back when you can). But until DX12 hits, and while Mantle is still being pushed by AMD, we'll keep getting DX/OpenGL enhancements that affect a LOT of situations, as Ryan says:
http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-33750-Driver-Analysis-Single-GPU-and-SLI-Tested
"In our SLI results I was personally able to see up to 36% performance scaling in a couple of titles with the GTX 780 Ti and GTX 770 cards. Single GPU results in our choice of games showed gains up to 16% with the GeForce GTX 750 Ti and up to 26% with the GeForce GTX 780 Ti."
"And for those users that see better than the 5%+ gains in a favorite gaming title, NVIDIA comes off looking like a savior. For the rest we can look at the 337.50 driver as the first step in NVIDIA's goal of continued DirectX 11 optimization that will continue for as long as AMD touts the Mantle API as a key selling point."

3 different cards, and multiple situations (in SLI and out). Unfortunately they chose too powerful a CPU to really show Mantle-like benefits (it acts like Mantle, since they were improving mostly CPU-limited situations in DX11, as Ryan discussed with NV). This is a major driver-branch change I'm talking about, where they were out in the press claiming improvements. You will see them in those cases, as they KNOW we're going to test them. Tons of review sites tested the 337.50 betas after the claims.

http://www.tomshardware.com/news/nvidia-geforce-337.50-driver-benchmarks,26473.html
You proved it yourself...ROFL.
"We actually got some very interesting results for our 1080p performance tests. Tomb Raider and Battlefield 4's performance went up with similar amounts to Nvidia's claims."
Is this why you guys went and bumped the res for high end to NOT 1080p? ;) Unlike your testing, NV aims at WHERE WE PLAY (the majority of us, anyway). Anandtech did the same, and thus got no improvement from NV's new driver back then (purposely hog-tied the GPU so no improvement that affects the CPU could ever be seen...ROFL). Ryan Smith never ceases to amaze me with AMD slant.

Another comment from your tests above:
"The biggest surprise was Star Swarm, as we got a very big performance jump (almost 60 percent) -- much higher than Nvidia's claim of 'up to 21 percent'."

Well gee, you even blew away NV's own claims...LOL. You chose a less powerful CPU, as you guys noted it had better results (and the 4770K isn't exactly wimpy). Basically NV is giving Mantle-like results for any game that's CPU-limited at all (hence you should drop the res to 1080p more often). This is my point: AMD should be doing the same, NOT the Mantle crap in which EVERY game needs special coding. You would have seen even better results with an i3 vs. an i7, which the driver is really aimed at, just like Mantle. NV could have chosen a low-end CPU to REALLY showcase the Mantle-like perf from lower driver overhead. But even they went with a 3960, so not exactly evil there. They chose their WORST CASE SCENARIO, correct? Again, your blanket statement is incorrect, right? Not always best case.

http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/66075-nvidia-337-50-driver-performance-review.html
Many came to the same results, and, more directly to my point, HardwareCanucks says:
"By simply reducing latent driver overhead NVIDIA has empowered developers by deftly avoiding the complications associated with programming Mantle into a game engine."

EXACTLY. My point wasn't that Nvidia is the only company CAPABLE of putting out "magical" drivers; it was that all the R&D spent on Mantle should be spent on AMD putting out "MAGICAL" drivers of their own. I never said AMD can't do it; I said they SHOULD do it. Mantle is dead, if not by NV's driver improvements, then by DX12 next year.

More importantly, since I guess you missed it before in my previous post to you:
"Can you retest the 290x/780ti in 1080p? It's clear from your results that all of the CPUs at the top are waiting on the 780 Ti because you raised the resolution (for whom, I don't know? <4% of the planet?). It makes the benchmarks here pointless, as you can't see the CPUs spread their wings. All of the scores are essentially the same, and less than 4% run above 1920x1200, so what is the point (and most of those people have 2 or more cards to play above that, according to the Steam survey)? 1080p would be more interesting, and then you wouldn't have an FX-4170, i3-3220, A10-7850 etc. running like an i7-4770 due to GPU limits. i3-3220s don't score like 4770s unless your graphics is holding you back. Drop the res and rerun at least the 780 Ti/290X tests in ONE of the games, please. You pretty much killed the whole point of the high-end benchmarks."
 

KevTheGuy

Reputable
Jul 18, 2014
3
0
4,510


Others would then just say it's an unrealistic test scenario (720??), that the real
solution is for AMD to develop better CPUs and just make better drivers for all their
cards, spend the money where it benefits everyone, not just a few in a limited no. of
games, on specific grades of hw, for narrow usage cases.

We had a similar situation back in the days of Athlon2/Phenom2 and P55. People asking
for advice on what kind of upgrade would help boost gaming performance on systems with
lesser Athlon64 X2s, Athlon2 X4s, etc. All sorts of info posted on how an Athlon2 or
Phenom2 upgrade would help, especially when oc'd, but the reality was that in just
about every case, a stock i5/i7 gave far better performance anyway, in some cases as
much as 40% better (eg. Far Cry 2), so the sensible conclusion should have been to just
switch platforms given it was already obvious AMD was falling way behind. Example ref.

To me, Mantle is just AMD's way of trying to make up for the fact that they have weak
CPUs, but it's such a narrow case solution. I mean, you have to test at 720 to
demonstrate the difference? Really??

Ian.

I know it's unrealistic :P
I'll get you some other screenshots at 1080p Ultra on a full server if I can
 

KevTheGuy

Reputable
Jul 18, 2014
3
0
4,510


Others would then just say it's an unrealistic test scenario (720??), that the real
solution is for AMD to develop better CPUs and just make better drivers for all their
cards, spend the money where it benefits everyone, not just a few in a limited no. of
games, on specific grades of hw, for narrow usage cases.

Ian.

Here it is:
https://flic.kr/s/aHsjZMLkh6
I couldn't run it at Ultra because of the memory issue. The stutter was insane.
 

childofthekorn

Honorable
Jan 31, 2013
359
0
10,780


Don't forget they're also working with DX12 and OpenGL 5.0 (the future version that was kicked around, not officially stated) as well. Their main talking points (outside of marketing) were to disrupt the market into pushing DX/OpenGL to deliver performance with less overhead. Regardless of whether that's true, it was perfect timing, given the announcements a few months after Mantle's demonstrations.

Mantle was intended to act like the APIs already found in consoles; there would be no reason for the PS4, or the Xbox for that matter, to use Mantle, as it would be redundant. It's to complement the port from console to PC, which AMD has struggled with for years (Saints Row 2 was one hell of a game but performed horribly). Furthermore, Intel wasn't necessarily turned away; AMD postponed the release to other manufacturers because they haven't perfected the product for their own setup yet. They keep getting a hard time for having a majority of open-sourced software, but as soon as they want to make sure it's done prior to release, everyone cries foul.
 

thebigbug

Distinguished
Nov 7, 2011
52
0
18,640
I feel this test is a poor representation of Mantle. In my opinion, it is (at the very least) incomplete. It doesn't give the full picture.

The point of Mantle is to reduce CPU overhead. You're not going to see much difference when you're taxing the GPU more than the CPU. The point of using Mantle is to get rid of the CPU bottleneck, so the most apparent gains can be seen on either a lower end CPU or when the settings are pushed down in pursuit of a higher framerate, commonly seen with people with the 120hz/144hz monitors. I myself am one of those people. I have a 3570K and an R9 290. With my 120hz Lightboost-enabled monitor, I find the smoothness of the game to be much more visually pleasing than higher graphical settings. By putting everything in Battlefield 4 on low (except for Textures and Meshes, which are on Ultra), I see a huge improvement. With DirectX, I would get an AVERAGE of 120fps. With Mantle, I get a MINIMUM of 120fps (average is somewhere between 140 and 160).
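The average-vs-minimum distinction described above is easy to quantify from a frame-time log (Fraps, mentioned elsewhere in this thread, can dump one); this sketch uses made-up frame times for illustration, not real measurements:

```python
def fps_stats(frame_times_ms):
    """Average and minimum FPS from a list of per-frame render times (ms)."""
    avg_frame = sum(frame_times_ms) / len(frame_times_ms)
    worst_frame = max(frame_times_ms)  # the longest frame sets the minimum FPS
    return 1000.0 / avg_frame, 1000.0 / worst_frame

# Illustrative numbers: both runs average roughly 8.3 ms per frame (~120 FPS),
# but one run has a single 16.7 ms spike.
smooth = [8.3] * 100
spiky = [8.2] * 99 + [16.7]

print(fps_stats(smooth))  # average and minimum are both ~120 FPS
print(fps_stats(spiky))   # similar average, but minimum drops to ~60 FPS
```

This is why two cards (or APIs) with the same average FPS can feel very different: the minimum is set by the worst frame, which averages hide.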

I don't mean to be rude, but you were testing the wrong thing. That's not where the improvements are in moving to Mantle, and no one seems to get this. Show a test focusing on the removal of a potential bottleneck, and you'll have a more complete review than anyone else out there.
 
MANTLE, a few points.

What most people fail to understand is that it's not just about raw performance. Game developers have constant DRIVER issues due to the wide variety of PC hardware, as well as other technical issues that DX11 and previous have.

These problems add to development costs beforehand and technical support costs afterward, as well as any costs from problems that reduce the number of games sold due to reviews and word of mouth.

Game developers first sat down and listed all their main problems; then Mantle was designed around those issues, which also means no support for older hardware, further reducing problems.

*There's no guarantee that DX12 will solve all the problems that Mantle will address (Mantle is a work in progress), and additionally, Mantle isn't as locked down for code as Microsoft's DX12 is. Also, considering the issues Microsoft has been having overall, and the poor return gaming brings to their bottom line, it's hard to say what we can expect.

Mantle thus has the potential to be much BETTER than DX12 if the code is fully open and gets more constant updates. The problem, of course, is that game developers aren't going to want to develop for multiple APIs. So why do Mantle AND DX12? Sure, we see games in development currently, but that's because DX12 is a ways away; plus, there's a fairly simple way to convert Mantle to DX12, so many of those games may not end up as Mantle at all.

I would really, really love to see Mantle end up as the ONLY choice for PC, Steam/Linux, XB1 and PS4 but I can't see how that would ever happen unless Microsoft decides to bow out of PC gaming.

(Steam/Linux and the PS4 are actually pretty good options for the future. I don't see Sony objecting if Mantle got better than their current API. Since Steambox PC's use Linux they would require OpenGL or Mantle so I can see Mantle there at some point.)

Currently (once Steambox is out) we're looking at a cross-platform game having to use as many as FIVE different APIs, such as:
1) DX12 - Windows
2) OpenGL or Mantle - Steambox
3) PS4's API
4) XBOX ONE's API (similar to DX12 but not identical)

That's only four, but on Windows I don't know if they would be DX12-only or would require DX11 or earlier to support more hardware. We currently see games that can use DX11 or DX9; however, I'm just not sure how DX12 works with backwards compatibility.

So if Microsoft would just ah heck off at some point, once Mantle improves, we could theoretically get developers using Mantle only. They would LOVE that.
 

Avocade

Honorable
Apr 12, 2013
1,002
0
11,460
I'm going to keep this short: I think Mantle is stupid, just as I thought PhysX was stupid. DirectX 12 is for both manufacturers, while Mantle is for AMD only. Exclusivity splits and kills markets and communities. Not to mention it further adds to development time: games are developed and set up for consoles and PC, then have to have Mantle's API injected and rearranged to run off it. I think Mantle is AMD trying to shift the market in a really stupid way. It's shitty that the market favors Intel and Nvidia, but creating exclusivity isn't the way to go; at least PhysX is an option, not a requirement.
 

tyr8338

Honorable
Mar 17, 2012
9
0
10,510
Very disappointing that you didn't test Battlefield in multiplayer mode. Let's be honest, single player is neither important nor CPU-bottlenecked - it's really short and terrible anyway - and if anyone plays BF4 they do it in multiplayer, and there Mantle gives a huge boost because it's very CPU-intensive with destruction all around. Same case with Hardline: on my friend's system with an Athlon X4 at 4 GHz, when I switched from DirectX to Mantle he got a stable 50-60+ fps, while before, fps often dropped to 30 and it was barely playable. I understand testing in multiplayer is hard, but other sites can do it reliably: http://pclab.pl/zdjecia/artykuly/chaostheory/2014/02/mantle/charts/bf4_mp_cpu_radeon_mantle.png http://pclab.pl/zdjecia/artykuly/chaostheory/2014/02/mantle/charts/bf4_mp_cpu_radeon_dx.png
On an FX-8350 OC'd to 4700 MHz, fps goes up from 55 on DirectX to 80 with Mantle, so it's a huge improvement.
 

somebodyspecial

Honorable
Sep 20, 2012
1,459
0
11,310


NO, devs have said the PS4 has no need for this API, as theirs is already good, and MS will wait for DX12. ZERO hope of consoles using Mantle. Mantle won't take over anything. Valve will push OpenGL, as it works on ANY SteamOS box. No point in pushing their devs to Mantle; they won't waste the time, as Gabe isn't stupid. He wants Linux OpenGL gaming to work everywhere, be easily portable to anything else, and be playable by everyone. DX11 already solves many problems Mantle addresses: see the BF4/Thief scores, and the Star Swarm scores, where NV now wins all 3. Mantle's not needed. All the Mantle games are only coming because AMD is still writing checks. A dev has no desire to write that code for free, with no EXTRA price expected on top of a game when they sell it, and for a FEW cards, as you note. Total waste of time if AMD doesn't write you a check. Reviews of the Mantle games haven't been good; it's the same on all sides. The games have problems because they push out betas and expect us to beta test them for a year while they patch them to death. Mantle hasn't changed that at all.

AMD says devs sat down and said that stuff. And DX12/OpenGL were already in progress for years, so I'm not sure they weren't all working on the same issues anyway. NV has proven you can fix draw calls etc. with DX11, or they wouldn't have taken over BF4, Thief, and Star Swarm with it.

The only API you listed above that works everywhere is...OpenGL. Xbox One is the only odd man out, and they can fix that if forced, as DX dies a slow death on Win9. Mantle wouldn't win even if MS just left the field today; OpenGL would. Mantle will never be open (even if you could call it that at some point), because NV will never use it: it would hinder them, since it's about getting AMD's latest features used, so they'll always have an issue in Mantle games. Since NV owns 65% of discrete (maybe more now; AMD's quarter sucked, lost share to Intel, probably to NV also), they will stay DX (until weakened) and push OpenGL heavily off Windows (mobile etc.), so at some point they can sell you CPUs for full desktops along with their discrete cards (that's the point of Denver: steal from Intel/x86 down the road) on a system with ONLY FREE operating systems on board (Linux, SteamOS, Android). I'm sure a SteamOS port to ARM (for Denver etc.) is in the works as we speak. Valve will want a piece of the game sales going on over on Android for sure.

Steam boxes already have OpenGL; why re-invent the wheel when the new thing only runs on a few pieces of hardware, vs. the original (OpenGL) that already works everywhere and that most devs know inside out? It has had the ability for mass draw calls for years; people just didn't know how to use it, as NV explained on stage a while back. A few lines of code sped up a dev's code by 4x, and he said he could easily get 8x with a bit more effort.
https://www.youtube.com/watch?v=-bCeNzgiJ8I&html5=1
The draw call topic is covered in this NV speech from steam dev days. Great vid.
"In this session, Cass Everitt and John McDonald from NVIDIA will talk about some newer extensions to OpenGL and how they can reduce (or even eliminate) driver overhead. We'll discuss where performance goes, how to effectively profile GL, as well as specific extensions such as bindless rendering and MultiDraw commands."
 

Justin C

Honorable
Jul 27, 2013
8
0
10,510
The following video is 100% reproducible on my system, and should be with any 280X.

http://youtu.be/503tHTZX6WI

So much for no loss in image quality if using Mantle...

Rig used for video:
AMD FX-8320
HIS Radeon 280X
Windows 7 x64
Video recorded with fraps using borderless screen mode and BF4 set to ultra

 

ohim

Distinguished
Feb 10, 2009
1,195
0
19,360


It's not useless; it just does its job, like DX or OpenGL... but it will benefit the lower-end CPUs. I don't call that useless.
 

jlwtech

Honorable
Mar 8, 2012
58
0
10,630
This is possibly the most tantalizing and intriguing article I have read in a long time.
Thank You Toms Hardware.
And Thank You, AMD, for Mantle.

PS:
I hope this catapults AMD to new heights, putting pressure on Nvidia and Intel, so that we, the consumers, get even more performance for our dollar!
 

mapesdhs

Distinguished


If you want more performance for your dollar spent on an AMD GPU, then put it on an Intel board.
No need for Mantle at all. Mantle is an attempt by AMD to solve a problem created by AMD, namely
poor GPU performance when an AMD card is used with an AMD CPU.

Ian.


 

jlwtech

Honorable
Mar 8, 2012
58
0
10,630


 

Astrix_au

Honorable
May 4, 2014
41
0
10,530
I still believe Mantle has the more stable fps. DICE did break Crossfire with my two 290Xs on Mantle with the July 8 patch, but one card is fine. BTW, I use a 4770K.

All other online benchmarks have shown Mantle to have lower frame times with 290Xs, especially in Crossfire. But sadly, with the one update that came out July 8, DICE broke CF Mantle.

Mantle wasn't created to benefit low-end CPUs; that's just something that has come out of it. The problem with DX11 is that it is bloated, the GPU needs to communicate a lot more with the CPU and the game, and performance isn't always at 100%; it dips more. Mantle was designed to give more direct access to the GPU, giving more control over performance; the CPU help just came from this.
Mantle was designed to get the most out of a 290X or two. I was open to buying either two 780 Tis or two 290Xs, and I saw the 780 Tis weren't able to stay at 100% all the time, especially when playing at high refresh rates at 135% resolution scale to get extra detail on 1080p screens and be able to use that power, as the screens can do 120-144 Hz.
Now with the ASUS Swift we'll have native 1440p at 144 Hz ;)
With Mantle, 135% resolution gives me 130-150 fps; with max variable fps set to 120, my GPU will be pumping at 100% and not constantly dipping down to 75% like DX11. So I get a constant 120 fps at 135% res; that's roughly 1440p at 120 Hz/120 fps downscaled to 1080p, which makes it look amazing.
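The 135% figure roughly works out to 1440p if, as the post's numbers suggest, the resolution scale multiplies each axis of the 1080p render target (illustrative arithmetic only):

```python
def scaled_resolution(width, height, scale_pct):
    """Per-axis resolution scale: each dimension is multiplied by the percentage."""
    factor = scale_pct / 100
    return round(width * factor), round(height * factor)

print(scaled_resolution(1920, 1080, 135))  # (2592, 1458): close to 2560x1440
```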

When DX12 comes out, designed with low-level access in mind, I think it will finally have the hardware control which, in my opinion, DX11 is lacking.

 

Ronshere

Reputable
Mar 25, 2014
50
0
4,660
Mantle...Shmantle.
Gaming is going 3D. It's part of the total virtual-reality experience.
AMD and Mantle do not address that fact.
AMD needs to compete with Nvidia or it's dead.
 

jlwtech

Honorable
Mar 8, 2012
58
0
10,630


Isn't AMD already competing with Nvidia?
They do make the fastest GFX card money can buy, the r9 295x2.
Not to mention they offer better performance per dollar at almost every price point. The r9 290 is a great example of that: It offers slightly better performance, and more memory, than the GTX 780 for $100 less.

3D gaming has been around for years. It's a niche thing. Gives me a headache.
Same with 3D movies; those have been around since the '50s. Also give me a headache.
3D is a great concept, but until they offer a truly 3D experience, and not these goggles/glasses that trick our brains into thinking it's 3D, it will never be mainstream. Using a 2D screen is not the answer. We need a 3D display (a cube, or something), or some kind of holographic display. Then it will go mainstream, and "flat" displays will be history.
 

childofthekorn

Honorable
Jan 31, 2013
359
0
10,780


If AMD lives up to their word and eventually opens up Mantle to any/all competitors like they said they would, would you feel different about it then?
 