AMD CPU speculation... and expert conjecture

This is one of the central reasons the 7850 has such poor value for its cost: there isn't enough memory bandwidth to keep 512 shader units busy. The 384 units in the 7700 and 7600 are a better match for 128-bit DDR3-2133 memory.
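Rough numbers to put that in perspective (my own back-of-the-envelope math, not from any review):

```python
# Back-of-the-envelope: peak bandwidth of a 128-bit (dual-channel) DDR3-2133 setup,
# and how thin it gets when spread across the iGPU's shader units. Illustrative only.
transfers_per_s = 2133e6            # DDR3-2133 = 2133 MT/s
bus_width_bytes = 128 // 8          # 128-bit interface = 16 bytes per transfer

peak_gb_s = transfers_per_s * bus_width_bytes / 1e9
print(f"Peak DRAM bandwidth: {peak_gb_s:.1f} GB/s")   # ~34.1 GB/s, shared with the CPU cores

for shaders in (512, 384):          # A10-7850K vs. A10-7700K/A8-7600 shader counts
    print(f"{shaders} shaders -> {peak_gb_s * 1000 / shaders:.0f} MB/s each")
```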
 

juanrga



Interesting viewpoint. Well, as mentioned plenty of times, MANTLE's main goal is to eliminate the CPU bottleneck. If you test it in situations where there is no CPU bottleneck, then there isn't anything to gain.

If you max everything in _your_ config, you migrate from a CPU bottleneck to a GPU bottleneck and MANTLE's gains vanish. Now add a second dGPU and you are back in a CPU-bottleneck situation.

What I am trying to say is that the common belief that MANTLE only helps weak CPUs is false. MANTLE can bring big gains to owners of top CPUs as well.
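A toy way to picture it (my own illustration, with made-up timings): treat frame time as whichever of the CPU submission time or the GPU render time finishes last. An API that only cuts CPU time helps exactly as long as the CPU is the limiter.

```python
# Toy frame-time model: the frame is gated by whichever side (CPU or GPU) is slower.
# All numbers are invented purely to illustrate the bottleneck argument.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000 / max(cpu_ms, gpu_ms)

# CPU-bound (many draw calls, modest settings): halving CPU overhead nearly doubles fps.
print(fps(cpu_ms=20, gpu_ms=10))   # 50 fps under the old API
print(fps(cpu_ms=10, gpu_ms=10))   # 100 fps once CPU overhead is halved

# GPU-bound (everything maxed on one card): the same CPU saving changes nothing...
print(fps(cpu_ms=20, gpu_ms=40))   # 25 fps
print(fps(cpu_ms=10, gpu_ms=40))   # still 25 fps
# ...until a second dGPU halves gpu_ms and the CPU becomes the limiter again.
print(fps(cpu_ms=10, gpu_ms=20))   # 50 fps
```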

It is also worth mentioning that when sites such as AnandTech test MANTLE using only an i7-4960X, I don't see those same people complaining about it the way they complain about AMD's slides. On the contrary, AnandTech testing CPU bottlenecks with a $1000 CPU is considered fair and independent rather than their usual sponsored-by-Intel 'review' style.
 

juanrga



MANTLE development started about four years ago, in collaboration with game developers. So unless you can explain how Microsoft built a time machine, traveled into the past, and shared their DX12 development for AMD to steal, your hypothesis doesn't stand up.

In fact:

We’ve spoken to several sources with additional information on the topic who have told us that Microsoft’s interest in developing a new API is a recent phenomenon, and that the new DirectX (likely DirectX 12) will substantially duplicate the capabilities of AMD’s Mantle.

Not only is Microsoft 'stealing' MANTLE, it is also 'stealing' slides from the MANTLE presentation:

https://twitter.com/repi/status/446787503953944576

As for your claim that MANTLE is DOA... how could it be, if it is inside DX12?
 


And only working for about 20% of the dGPU market.

MANTLE development started about four years ago, in collaboration with game developers. So unless you can explain how Microsoft built a time machine, traveled into the past, and shared their DX12 development for AMD to steal, your hypothesis doesn't stand up.

Except I can basically guarantee DX12 has been cooking for close to two years now. Or has everyone here forgotten my reaction to the "There will be no DX12" rumors a year ago? Or how I predicted [even before Mantle] that the next version would deal with driver overhead? There's a reason for that, you know.
 
http://blogs.nvidia.com/blog/2014/03/20/directx-12/

Speaking to a crowd of about 300 developers and press, Anuj Gosalia, development manager of DirectX at Microsoft, described DX12 as the joint effort of hardware vendors, game developers and his team. Our work with Microsoft on DirectX 12 began more than four years ago with discussions about reducing resource overhead. For the past year, NVIDIA has been working closely with the DirectX team to deliver a working design and implementation of DX12 at GDC.

EDIT

A few more tidbits:

http://www.extremetech.com/gaming/178904-directx-12-detailed-backwards-compatible-with-all-recent-nvidia-gpus-will-deliver-mantle-like-capabilitiesv

[Image: Microsoft's CPU comparison chart of DX11 vs. DX12 driver overhead]


Highlights the difference in driver overhead for DX11 and DX12, according to MSFT. Would certainly be a boost to AMD's architecture if the improvement is that much. The improvements on the multithreaded-rendering side should also help some.

Secondly:

http://www.extremetech.com/gaming/178831-balkanized-gaming-nvidia-gameworks-now-a-core-part-of-ue4-amd-counters-with-mantle-integration-in-cryengine

NVIDIA GameWorks is integrated into UE4, which is HUGE for NVIDIA if UE4 is used as much as UE3 was, considering NVIDIA would gain a significant GPU advantage in any UE4 games.
 
AMD Kaveri A10-7850K (the igpu) Overclocking – Unleashing GCN’s Potential
http://www.eteknix.com/amd-kaveri-a10-7850k-overclocking-unleashing-gcns-potential/
some of the bench results are kinda good. it only made the memory bw bottleneck look even worse. the injustice!! ;-;
it got me wondering, would it be possible to make the apu toggle between dual channel mode and quad channel mode, but with 4 memory slots (like the cheaper x79 motherboards)? when 2 sticks are installed, the imc would use 2 channels. when 4 sticks are installed, a bios option would enable a toggle between dual and quad channel operation. (some rough what-if numbers below)
or, what if amd raised the igpu's L2 cache by 1-2MB?
http://www.tomshardware.com/reviews/a10-7850k-a8-7600-kaveri,3725-3.html
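some rough what-if math (mine, not from the article; kaveri's imc is dual-channel only, so the quad-channel case is purely hypothetical):

```python
# what-if: dual- vs. quad-channel ddr3-2133 feeding kaveri's igpu.
# purely hypothetical -- the a10-7850k's memory controller is dual-channel only.
def peak_gb_s(channels: int, mt_s: int = 2133, bytes_per_channel: int = 8) -> float:
    return channels * mt_s * bytes_per_channel / 1000

print(peak_gb_s(2))   # ~34.1 GB/s, what the apu actually gets (shared with the cpu)
print(peak_gb_s(4))   # ~68.3 GB/s, roughly entry-level discrete gddr5 territory
```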

toms' a88x atx motherboard roundup
http://www.tomshardware.com/reviews/a88x-socket-fm2-motherboard,3764.html

asrock fm2a88x extreme6+ (try saying that in one breath) review
http://www.anandtech.com/show/7865/asrock-fm2a88x-extreme6-review

Nvidia talks DirectX 12 support
http://www.fudzilla.com/home/item/34279-nvidia-talks-directx-12-support
claims to have worked with microsoft for 4 years on this. despite that, i don't see how running forza5 on a gtx titan b.e. can be considered "sweet" if xbone(R)'s gpu is like a 7770 (with esram). :whistle:

AMD talks DirectX 12
http://www.fudzilla.com/home/item/34280-amd-talks-directx-12
4 years... isn't that how long the consoles were in development? i wonder if amd repurposed xbone(r)'s graphics api for pcs, iirc there was speculation like that a while ago. if nvidia is right, then the timeline matches. if nvidia is lying, both nvidia and microsoft are "retconning". :p
 


If built right, an APU system is perfect for gamers playing DOTA, SWTOR, or CS:GO. I did an A10 build for around $350 that was significantly faster than his Phenom X4 9550 and GT 430. Once dual graphics is ironed out more, you can have a pretty tidy gaming system for around $400.

I would be very surprised not to see DDR4 for Excavator.

 
4 years... isn't that how long the consoles were in development? i wonder if amd repurposed xbone(r)'s graphics api for pcs, iirc there was speculation like that a while ago. if nvidia is right, then the timeline matches. if nvidia is lying, both nvidia and microsoft are "retconning". :p

A lot of what's in Mantle isn't in the low-level graphics APIs for consoles; it's halfway in-between. What makes more sense is that MSFT knew it needed a middle-API for the XB1 (DX12), worked with AMD and NVIDIA to define the minimum hardware necessary to run said API (GCN/Fermi), used a GPU for the XB1 that met those specifications (GCN), then began work on the specifics of the API for future release.

Now, let's throw in the fact that AMD's Mantle supports only GCN GPUs at present, just like DX12...

Just saying, the timing of things is very suspicious right now, especially in light of AMD's earlier "There will be no DX12" comments, which seem downright disingenuous at present. Playing down DX12 for a market advantage with Mantle, perhaps? This would put NVIDIA in a very awkward position, since they'd need to support Mantle in the short term to keep performance parity with AMD, but doing so could kill DX12 and force them into supporting AMD's API. Wouldn't be shocked if DX12 gets bumped up a bit due to pressure from NVIDIA...
 


Even then, DDR is slow compared to what a GPU needs, especially if the number of draw calls starts to increase in games (Mantle/DX12). But using anything else is too expensive for the part.
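For a sense of the gap (rounded figures; the dGPU side assumes a typical 256-bit card running GDDR5 at 4.8 Gbps, purely for illustration):

```python
# Rough scale of the gap between APU system memory and a mid-range card's VRAM.
apu_ddr3 = 2 * 2133 * 8 / 1000         # dual-channel DDR3-2133 ~= 34 GB/s, shared with the CPU
dgpu_gddr5 = (256 // 8) * 4800 / 1000  # 256-bit GDDR5 at 4.8 Gbps ~= 154 GB/s, GPU-exclusive
print(f"APU: ~{apu_ddr3:.0f} GB/s  vs  dGPU: ~{dgpu_gddr5:.0f} GB/s")
```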
 

AMD’s Roy Taylor Started the Obituary of DirectX, but Microsoft Says Not So Fast
http://www.hardwarecanucks.com/news/amd-roy-taylor-directx12/
i think this was it.

seems like a marketing ploy to push card sales on amd's behalf. i'd say that it kinda worked a little bit. if dx12 is mantle, then it'd save nvidia and intel's face from printing "supported apis: mantle".
 

jdwii



If this is true, I have lost all faith in AMD, and I will argue with any AMD fan who says AMD is better than Intel or Nvidia. I have never heard of anything so unprofessional.
 


This was a talking point here about a year ago, with me alone (as per usual) saying this was bogus. And I can't believe AMD wasn't consulted ahead of time in the development of DX12 (which would have been idiotic for MSFT).

Either way, those comments look idiotic for AMD now.
 

there's no reason to put any faith in any corporation in the first place. they are businesses; they aim to make money.
imo, it's not unprofessional, it's common practice.
 

jdwii



Lying is unprofessional toward their consumers and their business partners. Just because it happens doesn't make it OK. However, you should never put any faith in any company.
 

Rum

Speculation has turned into accusation and conspiracy theories in this thread, which is pretty funny to read. Between Jdwii and Gamerk I get my fill of tinfoil-hat conspiracies; come on, really, guys??? We go from "thanks, AMD, for pushing M$ into developing DX12" to "AMD, OMG, you are terrible software thieves"... bipolar much?

You honestly think that AMD would lie about DX12, then spend money they don't really have on an API that would be an afterthought (save for the possibility of being multi-OS compatible in the future) once said DX12 was released? I would be more inclined to say that Nvidia is lying to make AMD look bad... wooden cards come to mind -_- lol...

If DX12 was in development for that long, then why didn't M$ and Nvidia come out and say so when Mantle dropped? Don't you think it would have been more financially fruitful for them to head it off at that point? Why did companies approach AMD for a developer-friendly API when most of the industry probably knew that DX12 would be coming? If AMD knew, would it not have been a better financial move to program and perfect their hardware for DX12 and sit tight for it to come out?

There are so many holes in these timelines and company statements. I just think it's funny you guys are getting your panties in a bunch over something that most likely isn't true to begin with. Carry on!
 
The DX12 announcement is because of Valve as well.

Valve said they got better performance from OGL in Linux than DX in Windows. COLOR ME SURPRISED! haha.

What better way to tell MS "we'll take away your XBox candy and Windows honey gamers, muahaha!" than stating they're faster and better.

Now, the idea might have been from 4 years ago and all, but since OGL has had extensions since like a bazillion years ago, it has always been there. AMD just played the "hey, look at me doing something no one knows already exists!" card.

Could be factually incorrect, but it's still funny to think the guys at Redmond got angry/fired up over MANTLE+Valve.

Cheers!
 

juanrga

More or less four years ago, developers asked Microsoft to develop an improved API with low-level access and other improvements. Microsoft ignored them. Then AMD developed MANTLE in collaboration with game developers (the same ones adopting MANTLE today).

Then AMD's Roy Taylor (formerly an Nvidia representative) shared Microsoft's attitude in public, saying "there is no DX12".

Microsoft replied to AMD with "we will continue developing DX", but never mentioned DX12 or low-level access, because that was not on the menu... then.

AMD announces MANTLE and Microsoft gets nervous. Besides being crushed by the PS4 on one side and Steam on the other, now the Xbox1 has to battle against PCs with MANTLE.

AMD claims their goal with MANTLE is to catalyze the gaming ecosystem, a.k.a. to force Microsoft and others to adopt something with the spirit of MANTLE, or even with MANTLE copy-pasted inside.

Microsoft decides to develop DX12 and meets with AMD.

AMD licenses MANTLE (and slides :) to Microsoft, who is now introducing it in the Xbox1. Microsoft claims that the Xbox1 will get a 20% boost per frame.

Since MANTLE is multiplatform capable, Nvidia only had to write the driver to support DX12. It is funny that Microsoft chose an Xbox1 game to illustrate the DX12 'alpha' on an Nvidia GPU. The GPU used was a Titan Black.

Nvidia admits that its direct work on DX12 started last year.

Of course, AMD likes DX12

[Image: AMD Direct3D 12 slide]
 


If said API helped them sell more hardware over a two year period, do they care if it dies out? Likewise, NVIDIA was likely under NDA prior to the GDC release, and MSFT, well, you know how they are.

Point being: I simply pointed out the POSSIBILITY that, rather than MSFT stealing Mantle, the reverse is true. I outlined how this could be possible. I can't prove it one way or the other. But I do sincerely doubt AMD had no clue about DX12 when they said there would be no DX12.
 

juanrga



You bolded the part where Nvidia, AMD, Microsoft, and game developers met four years ago, but you forgot to bold that Nvidia's work on DX12 began only last year:

While the broad feature-set of DX12 looks very much like a response to the Mantle initiative, Nvidia claims that it began discussions with Microsoft on the subject four years ago, with direct work on the API starting last year.

DX12 is a post-MANTLE phenomenon.

Isn't it a fantastic thing that when AMD announced MANTLE, some experts claimed in forums and blogs that DX11 was already very optimized and that, in the best case, MANTLE could provide a 10-15% gain over DX11, but now that MANTLE is inside DX12 the same experts claim how badly optimized DX11 was and how DX12 can bring 50% performance gains... Lovely!

The PR motto has changed from "Mantle is not needed" to "Mantle is dead because DX12 provides the same performance", and the "10-15%" has been multiplied by about 5x to "50%" in just a few weeks.
 
You bolded the part where Nvidia, AMD, Microsoft, and game developers met, but you forgot to bold that Nvidia's work on DX12 began only last year:

Because you define the specifications LONG before you begin any actual coding. Same thing at my job (defense): we spend 2-3 years just defining all the specs; no coding until the design, interfaces, documentation, and all that gets defined. The fact NVIDIA began actual work on the API last year means that was about when the specifications were finished. Which fits in nicely with the Mantle timeline, no?

Now that MANTLE is inside DX12 the same experts claim how badly optimized DX11 was and how DX12 can bring 50% performance gains... Lovely!

Best case: Infinite performance gains
Worst Case: Infinite performance loss
Typical case: Somewhere between the other two cases.

Which one are you talking about again?

Also, 50% utilization != 50% performance gains. Two different things.
 


I am all for reducing overhead and pushing the GPU toward being the bottleneck, and that is fine on lower-end systems. But using a top-end GPU that can push a game at max settings while only showing the presets that give the performance gains you want is cherry-picking, marketing 101.

No one is going to buy a top-end GPU to play on normal settings; they want to max games out. That's why people pay the premium price for top-end GPUs.

As for the AnandTech testing, it is pretty normal, as Mantle is supposed to be a GPU boost. Everyone was expecting it to come out and make GCN cards perform better, but as with my results with a decent CPU and max settings, the difference could be just margin of error.

I am a bit disappointed that Mantle doesn't give the boost people were hoping for. Then again, I don't think Mantle will last once DX12 is out.





I never would have thought MS would stop DX development, mainly because they are in the console business and a new DX can help them improve their next console, which it looks like it will.
 

juanrga



Do you mean MANTLE? Then yes, Nvidia began to work on the DX12 driver when MANTLE was ready and licensed to Microsoft.



Hint: check the emphasis (bold font) in the paragraph of my post that you omitted to quote.

Yes, it is not a 50% performance gain, it is 100%; my mistake.

Once again, the same experts who back then claimed that DX11 was very optimized, that API overhead was nonexistent, and that MANTLE could provide only a 10-15% performance gain in the best case, are now pretending that DX11 is badly optimized and that DX12 can provide a 100% performance gain. The same experts now forget to mention "best case" when referring to DX12. This is an excerpt from one of those experts:

And to prove that isn’t just marketing talk, they ported 3DMark from D3D11 to D3D12 in order to demonstrate the improved CPU scaling and utilization. The CPU time was roughly cut in half.

See it for yourself? "Best case" is only used against MANTLE.
 
Do you mean MANTLE? Then yes, Nvidia began to work on the DX12 driver when MANTLE was ready and licensed to Microsoft.

And now, you start making up facts (again) to try and prove your point.

And to prove that isn’t just marketing talk, they ported 3DMark from D3D11 to D3D12 in order to demonstrate the improved CPU scaling and utilization. The CPU time was roughly cut in half.

Emphasis mine. Cutting CPU time and improving scaling does not necessarily lead to any performance gain. The benchmark simply showed the reduction in overhead on one core (about 50% less core usage), and a slight improvement to loading on the other cores. The actual performance increase for that benchmark is likely on the order of 10-15%.
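To put rough numbers on that distinction (illustrative only, not measured data): how much halving CPU time buys depends entirely on whether the CPU work is hidden behind the GPU or sits in the frame's critical path.

```python
# Illustrative only: frame-rate gain from halving CPU time under two simple assumptions.
def speedup_when_cpu_halved(cpu_ms: float, gpu_ms: float, overlapped: bool) -> float:
    """Overlapped: frame time = max(CPU, GPU). Serialized: frame time = CPU + GPU."""
    before = max(cpu_ms, gpu_ms) if overlapped else cpu_ms + gpu_ms
    after = max(cpu_ms / 2, gpu_ms) if overlapped else cpu_ms / 2 + gpu_ms
    return before / after

# GPU-bound frame with CPU work fully hidden behind the GPU: halving CPU time gains nothing.
print(speedup_when_cpu_halved(cpu_ms=8, gpu_ms=16, overlapped=True))    # 1.0x
# Submission serialized with rendering: a 50% CPU cut yields ~20% more frames, not 50%.
print(speedup_when_cpu_halved(cpu_ms=8, gpu_ms=16, overlapped=False))   # 1.2x
```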

But again, you ignore the facts and fail at READING and COMPREHENSION.
 

juanrga



I already explained to you how MANTLE can improve the performance of a Haswell i5. My explanation also applies to the i7-4770K. This Haswell i7, paired with two 290Xs in CrossFire, sees a 23% framerate improvement thanks to MANTLE (ultra settings):

http://www.extremetech.com/gaming/175881-amd-mantle-benchmarked-the-biggest-innovation-in-gaming-since-directx-9/3

MANTLE improves performance in CPU-bottleneck situations. By testing only an i7-4960X plus a single GPU, Anand has just taken the worst case possible. So typical of them.

I don't know what your expectations about MANTLE were, but I made specific predictions about its performance and MANTLE fits them nicely. Developers like MANTLE too; that is why so many of them are adopting it.

Once again how will Mantle last when it is inside DX12?
 