AMD CPU speculation... and expert conjecture

Page 308

 

jdwii

Splendid


There is not one program in that comparison that does an IPC comparison. Cinebench uses the floating-point unit more than the integer unit. I recommend you read up on what these benchmarks actually measure before coming back and making statements; it's OK to be wrong, but learn from your mistakes.
 

jdwii

Splendid


SuperPI is an x87 floating-point benchmark, and more importantly, practically no modern programs use x87 anymore, which is why it's not relevant.
 

Cazalan

Distinguished
Sep 4, 2011
2,672
0
20,810
But SuperPI shows the biggest advantage for Intel CPUs. Do you really think hafijur will stop using it, even after the two dozen or so times (in several other threads too) he's been told over the last 6 months how useless a benchmark it is?
 

Simple answer, no.
 

Sigmanick

Honorable
Sep 28, 2013
26
0
10,530
Nice 8-bit Theater sprite, Palladin. I remember reading that on dial-up internet from 2001 to 2003.

VG247
So AMD has paid for DICE's endorsement, in a way. I'm speculating that the $5 million paid was to help cover the cost of re-coding from DirectX to Mantle, as well as the right to use Battlefield 4 imagery and benchmarks as often as AMD wants for promotional materials. Because of this, I'm curious how genuine the developers' comments at GPU14 were.

Either they are really excited about Mantle because it works, or because they got paid.

Thoughts?
 


You don't bring a knife to a gun fight... or in this case, happy thoughts to the Big Enterprise world. You need to dump tons of money on the stuff you want to succeed. Good or bad, time will tell. What I'm sure of now is that AMD needs to break the Intel/Nvidia status quo.

I won't condemn AMD for using cash to sway looks their way (benefit of the doubt long term), because that's the way the world of software works. We should condemn companies that play dirty, outside fair-trade laws, because they hurt our pockets and don't give a damn about it.

Cheers!
 

Cazalan

Distinguished
Sep 4, 2011
2,672
0
20,810


$5 million is chump change as far as marketing campaigns go, and they got results out of it too, which is even better. You can bet most gamers are following BF4 news, so it's a lot of extra free marketing up to and after the product hits the shelves.

Microsoft spent $900 million marketing the Surface.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


No. This was easy.



Who says that AMD has paid for endorsement? The link says something completely different:

Now, Fudzilla reports that a new deal between both parties will see Battlefield 4 given away with selected AMD cards via exclusive partners. It is said to have cost AMD between $5-8 million.

The deal is for including a copy of BF4 with some of the new cards.

In any case, this is what developers such as Timothy Lottes, creator of FXAA (Nvidia), were saying about DirectX well before anyone knew anything about MANTLE:

The real reason to get excited about a PS4 is what Sony as a company does with the OS and system libraries as a platform, and what this enables 1st party studios to do, when they make PS4-only games. If PS4 has a real-time OS, with a libGCM style low level access to the GPU, then the PS4 1st party games will be years ahead of the PC simply because it opens up what is possible on the GPU. Note this won't happen right away on launch, but once developers tool up for the platform, this will be the case. As a PC guy who knows hardware to the metal, I spend most of my days in frustration knowing damn well what I could do with the hardware, but what I cannot do because Microsoft and IHVs wont provide low-level GPU access in PC APIs. One simple example, drawcalls on PC have easily 10x to 100x the overhead of a console with a libGCM style API.

MANTLE provides that. In fact, Eurogamer speculates that MANTLE is the PS4's low-level API.
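
Lottes' 10x-to-100x draw-call figure is easy to put rough numbers on. Below is a minimal cost-model sketch in C++ (not real graphics code); the per-call costs and the draw count are assumptions picked to sit inside his quoted range:

#include <cstdio>

int main() {
    const double console_call_us = 2.0;    // assumed cost of one libGCM-style submit
    const double pc_call_us      = 50.0;   // assumed 25x overhead, inside the quoted 10x-100x
    const int    draws_per_frame = 2000;   // assumed scene complexity
    const double frame_budget_us = 16667;  // one 60 FPS frame

    const double console_total = console_call_us * draws_per_frame;
    const double pc_total      = pc_call_us * draws_per_frame;

    std::printf("console submits: %.0f us (%.0f%% of a 60 FPS frame)\n",
                console_total, 100 * console_total / frame_budget_us);
    std::printf("PC submits:      %.0f us (%.0f%% of a 60 FPS frame)\n",
                pc_total, 100 * pc_total / frame_budget_us);
}

With those assumptions, the console spends about a quarter of its frame budget issuing draws, while the same scene blows through six frame budgets on the high-overhead PC API. That gap is exactly why a lower-overhead path is attractive.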
 
^^ But again: you lose portability, even across vendors on the same platform. Portability and coding efficiency were the reasons we stopped using hand-coded assembly, for crying out loud!

Sure, if I coded to the metal, in pure assembly, and violated every good programming practice, I'm sure I could double the performance of every application I write. Then the next hardware revision changes one of my assumptions (and it can be something REALLY trivial, too) and ends up costing me more performance than I gained. Devs HATE going back and re-optimizing for a different arch, and that's what API's like MANTEL force you to do every time the hardware changes.
 
Also, I'm awaiting BF4's first Beta benchmarks, which should be hitting sometime today. So far, all we have are older Alpha benchmarks from back in June, though they do show performance similar to BF3:

https://linustechtips.com/main/topic/30410-battlefield-4-alpha-gpu-and-cpu-benchmarks/

Already getting ready for the "crappy PC port" arguments to start up again.
 


If no one adopts Mantle, you'll have a valid point. For now, at least we know the PS4 and AMD hardware will have it. Microsoft and nVidia will wait and see, I'm sure, whether the thing catches on with devs, but to remind you, that is still to be seen.

I have high expectations for BF4, to be honest. The MP demo from E3 was friggin' mind-blowing and I REALLY expect a lot. Especially since AMD is hyping it... well, for that reason I'm also trying to gather salt for every announcement, haha.

Cheers!
 

etayorius

Honorable
Jan 17, 2013
331
1
10,780
Carmack already claimed Mantle is a huge thing, along with SteamOS, but MS and Sony will definitely be very aggressive towards SteamOS and Mantle; it is not good news for either of them.

The good thing is, the big-name devs are loving the idea of Mantle and SteamOS. MS and Sony can go screw themselves as long as the devs support them.

http://www.vg247.com/2013/09/26/sony-and-microsoft-may-be-hostile-to-new-amd-tech-says-carmack/
 


The devs' PR departments, who have AMD splattered all over their products, love it.
 

blackkstar

Honorable
Sep 30, 2012
468
0
10,780


Yeah, because supporting Mantle on the Xbone, PS4, and GCN PCs is such a porting nightmare when you're already forced to support Nvidia PCs and older AMD GPUs! AMD has the majority of gaming systems running GCN.

Also, what is up with your second paragraph? You're assuming that "low level" means so low-level that you're manually filling registers yourself and writing assembly.

The API exists to be much lower-level than DX, yet high-level enough to account for differences between architectures. It trades compatibility with nearly everything else for more performance.

And your incessant nagging of "OMG THE ARCHITECTURE IS GOING TO CHANGE AND KILL MANTLE!!!" is completely invalid.

Hawaii is already a revision of GCN, and it did not break Mantle on GCN 1.0 cards. Your supposed "ZOMG HARDWARE REVISION OF DOOM, GAME DEVS GONNA HAVE TO RECODE THEIR GAMES FOR THE NEW VERSION OF GCN!!!!" has already happened before Mantle was even released.

You seem kind of confused on how Mantle works. Let me explain APIs to you.

The reason there's an API at all is to allow different implementations of it to work on different hardware. So instead of Microsoft writing the sole implementation via DirectX, you now have a way for AMD, Nvidia, and whoever else to ship a compatible version of the API's libraries for their own hardware.
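
To make that concrete, here's a hedged C++ sketch of the idea; the class and function names are hypothetical stand-ins, not anything from Mantle itself:

#include <cstdio>
#include <memory>

// The "API": a contract the game codes against. (Hypothetical names.)
struct GfxDevice {
    virtual void drawIndexed(int indexCount) = 0;
    virtual ~GfxDevice() = default;
};

// AMD's implementation for GCN hardware...
struct GcnDevice : GfxDevice {
    void drawIndexed(int n) override { std::printf("GCN path: %d indices\n", n); }
};

// ...and what another vendor could ship if it chose to implement the same API.
struct KeplerDevice : GfxDevice {
    void drawIndexed(int n) override { std::printf("Kepler path: %d indices\n", n); }
};

int main() {
    std::unique_ptr<GfxDevice> dev = std::make_unique<GcnDevice>();
    dev->drawIndexed(36);  // game code stays the same no matter whose implementation is loaded
}

The game only ever sees GfxDevice; whoever implements it owns the hardware details underneath.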

And then you follow it up with another amazing wankerK post, where you somehow try to paint it as a failure that a first-generation game on a brand-new console isn't performing exceptionally. Really, buddy?

I'd love for you to show me a console where the release games looked as good as the games at the end of that console's lifecycle.



I don't think you comprehend what Mantle and GCN are. Mantle is the ISA and GCN is the implementation. Do you know what else is an ISA? x86. And do you see game developers having to go back and re-optimize x86 code? No, you don't, because we have APIs and libraries that abstract it away.

DirectX is the CPU equivalent of Java: you have blobs that run in a contained environment with massive overhead, but it runs on nearly EVERY ISA out there (ARM, x86, MIPS, amd64, SPARC, Power, etc.).

Mantle is the CPU equivalent of C: libraries help maintain compatibility within the same ISA and with previous versions, and the overhead is much lower, but it only runs on one ISA (GCN), and if you want it on another ISA (Kepler, Maxwell, GT3, etc.) you have to write your own compatible libraries.

What you're suggesting AMD will do, forcing developers to go back and re-optimize, is the equivalent of Intel releasing Haswell and announcing: "We added new instructions and changed the architecture, so all existing assembly code and all existing x86 programs are no longer compatible and need to be tweaked!"
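
That Haswell analogy is also how it actually works on the CPU side: new instructions don't break old binaries, because programs detect features at runtime and fall back. A minimal sketch (GCC/Clang only; __builtin_cpu_supports is a real builtin there, but the two kernels are just illustrative stand-ins):

#include <cstdio>

void sum_scalar(const float* a, int n) { (void)a; (void)n; std::puts("plain x86 path"); }
void sum_avx2(const float* a, int n)   { (void)a; (void)n; std::puts("AVX2 path"); }

int main() {
    float data[8] = {0};
    if (__builtin_cpu_supports("avx2"))
        sum_avx2(data, 8);     // Haswell and newer take the new path
    else
        sum_scalar(data, 8);   // older chips keep running the old one
}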
 

etayorius

Honorable
Jan 17, 2013
331
1
10,780



Carmack and id have always been loyal to nVidia, and the same goes for Valve... if they say this is good news, you'd better believe them.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


If you continue attacking MANTLE, at least spell it right. Besides that:

MANTLE != coding to the metal
MANTLE != assembly

Also, devs and the tech press are loving MANTLE. The only critical voices I'm reading on forums are from people who are very pro-Intel/Nvidia.



Ok

http://www.gamegpu.ru/images/stories/Test_GPU/Action/Battlefield_4_Beta/test/bf_4_proz.jpg


And as predicted, FX-8 > FX-6 > FX-4. In fact, the 8-core offers about double the performance of the 4-core, which suggests the game scales well up to 8 cores. People who believed an i3 would be top-tier for next-gen were dreaming.
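
A quick Amdahl's-law check (just a sketch; the clock difference between the FX chips is ignored here) shows how strong a claim "about double" really is:

#include <cstdio>

// Amdahl's law: normalized frame time = serial fraction + parallel fraction / cores
double frame_time(double p, int cores) {
    return (1.0 - p) + p / cores;
}

int main() {
    for (double p = 0.80; p <= 1.001; p += 0.05) {
        double speedup = frame_time(p, 4) / frame_time(p, 8);
        std::printf("parallel fraction %.2f -> 8-core vs 4-core speedup %.2fx\n", p, speedup);
    }
}

Even a 90%-parallel game only gets about 1.5x from doubling the cores; a full 2x requires a parallel fraction near 1.0, i.e. near-perfect scaling.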

The GPU results are also interesting:

http://www.gamegpu.ru/images/stories/Test_GPU/Action/Battlefield_4_Beta/test/bf_4_1920_msaa.jpg


TITAN performance is disappointing. Any bets on the performance of the new R9 with the MANTLE version of BF4? Some developer said it will humiliate the TITAN.
 
http://gamegpu.ru/action-/-fps-/-tps/battlefield-4-beta-test-gpu.html

http://www.gamegpu.ru/images/stories/Test_GPU/Action/Battlefield_4_Beta/test/bf_4_proz.jpg


and

http://www.gamegpu.ru/images/stories/Test_GPU/Action/Battlefield_4_Beta/test/bf_4_intel.jpg


and

http://www.gamegpu.ru/images/stories/Test_GPU/Action/Battlefield_4_Beta/test/bf_4_amd.jpg


First off: notice how much better the FX-8350 does compared to the FX-8150. BF4 Beta clearly likes per-core performance (as expected). About the same as BF3 in that regard.

It's also very worrying that even with twice the number of cores, AMD's per-core workload is about the same as the i7-2600K's. And one core is clearly overworked (95% load), which can easily become the bottleneck that stalls the whole frame, costing significant FPS (see the toy model at the end of this post). I'd wager an OC would help this chip out a LOT.

I'm reasonably sure that is exactly what is happening to the FX-8150: one core being overworked, dragging down application performance. That's the downside of weak individual cores: if one gets overloaded, performance for the entire application suffers.

Second: note that the FX-6300 is clearly CPU-bottlenecked; it competes with the i5-760. I'd be VERY interested to see how an OC'd FX-6300 holds up, though.

Third: the game likes HTT a LOT. It helps keep the i3-2100 around FX-4300-level performance.

I'd like to see how the i3-3220 holds up, given how well the i3-2100 does.


So, AS PREDICTED, overall performance is about the same as BF3 as far as the CPU ordering goes.
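
On the overloaded-core point above: a frame can't finish before its busiest thread does, so one hot core caps the whole application. A toy model in C++ (every per-thread number here is made up for illustration):

#include <algorithm>
#include <cstdio>

int main() {
    // assumed per-frame work in ms: eight weak cores, one of them overloaded
    double weak[8]   = {15.8, 6, 6, 5, 5, 4, 4, 3};   // core 0 sits at ~95% of a 16.7 ms frame
    // assumed per-frame work in ms: four strong cores doing similar total work
    double strong[4] = {13, 13, 11, 11.7};

    double weak_frame   = *std::max_element(weak, weak + 8);
    double strong_frame = *std::max_element(strong, strong + 4);

    std::printf("8 weak cores:   gated at %.1f ms (~%.0f FPS cap)\n", weak_frame, 1000 / weak_frame);
    std::printf("4 strong cores: gated at %.1f ms (~%.0f FPS cap)\n", strong_frame, 1000 / strong_frame);
}

The eight-core setup does the same total work but gets gated at its one 95%-loaded core, which is the FX-8150 situation described above.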
 
And as predicted FX-8 > FX-6 > FX-4

FX-8350: 4GHz
FX-8150: 3.6 GHz
FX-6300: 3.5 GHz
FX-6100: 3.3 GHz
FX-4300: 3.8 GHz [AMD quads clearly bottlenecked at this point]

So yeah: The faster processor does better. Shocking!

Clock them at the same speed and call back; I'd wager the FX-6300 and FX-8350 would suddenly be performing about the same.

I really find it hard to defend an architecture when an AMD quad (FX-4300) at 3.8GHz is clearly CPU-bottlenecked, but an Intel quad at 3.3GHz (i5-2500K) clearly isn't. Or when an 8-core chip clocked at 4.0GHz (FX-8350) loses to a quad with HTT clocked at 3.4GHz (i7-2600K). Hell, in this case the quad with HTT has lower average CPU usage! (Which, again, I've been saying for years now.)

Now OC the 2600K to 4GHz and do the comparison again. Whoops, Intel is likely a good 10-15 FPS faster than AMD at the same clock. So what do you think will happen if Intel ever releases a 4GHz stock chip? Oh, that's right: AMD losing straight up.
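
The clock math behind that, as a sketch: the stock FPS baseline below is an assumption (not from the benchmarks above), and it assumes FPS scales linearly with clock when fully CPU-bound, which is optimistic:

#include <cstdio>

int main() {
    const double fps_at_stock = 70.0;               // assumed i7-2600K result at 3.4 GHz
    const double scaled = fps_at_stock * (4.0 / 3.4);
    std::printf("at 4.0 GHz: ~%.0f FPS (+%.0f over stock)\n", scaled, scaled - fps_at_stock);
}

Linear scaling on a ~70 FPS baseline gives roughly +12 FPS, consistent with the 10-15 FPS estimate.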


I'll say what I said three years ago: AMD's CPU architecture is not well suited for games, even ones that scale reasonably well.
 
Final point for a bit:

Remember the following: Task Manager's fastest update interval is 1 second, while a Vsync refresh window is 16ms. You can, in theory, have a single-threaded program that takes a core to 100% appear to "scale" to two cores at 50% each, if a low-level kernel task causes the thread to jump cores after 500ms. So be very careful when discussing scaling based on Task Manager numbers, because there's likely a LOT of core hopping going on.
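
Here's a toy version of that sampling artifact (pure arithmetic, no real threads; the 500ms hop is the assumption from the paragraph above):

#include <cstdio>

int main() {
    const double sample_ms = 1000.0;  // Task Manager's fastest update interval
    const double hop_ms    = 500.0;   // assumed: the kernel migrates the thread at 500 ms

    const double core0_busy = hop_ms;              // busy on core 0 for the first half
    const double core1_busy = sample_ms - hop_ms;  // busy on core 1 for the second half

    std::printf("core 0: %.0f%%, core 1: %.0f%%\n",
                100 * core0_busy / sample_ms, 100 * core1_busy / sample_ms);
    // One 100%-busy thread shows up as 50% + 50%: fake two-way "scaling".
}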

That being said, based on what I'm seeing here and on past trends, I'd imagine the thread breakdown is about the same as Crysis 3. There are clearly two threads doing more work than the others [Main + Render], based on the Task Manager loading characteristics. I'd also assume about a half-dozen helpers for the main render thread, similar to Crysis 3, each worth maybe 5-10% load on a single core. I'd need to confirm with GPUView whether that's what's going on, though.
 

jdwii

Splendid


It's actually quite amazing seeing a processor from '08 still game well, and the i7-920 came out for just $280-300.

 

anxiousinfusion

Distinguished
Jul 1, 2011
1,035
0
19,360


In my work with older computers, I've come to appreciate outdated hardware almost as much as the fancy new stuff. I'm currently contemplating enslaving a 450MHz Pentium III for server work.

I'm not at all surprised that the 920 is still going very strong for modern gaming.
 

noob2222

Distinguished
Nov 19, 2007
2,722
0
20,860


Hmm: FX-4300 + 20% for Kaveri = 40 fps average (so the FX-4300 sits around 33 fps here).
FX-8350 = 60 fps.
i5-760 = 45 fps.
So how is abandoning high-end CPUs for Kaveri a smart move again, and how is it going to reach any i5's performance?
 