AMD CPU speculation... and expert conjecture


juanrga



Your "at least 2.5 ghz" is your pure invention.

Your mention of core counts is irrelevant, because the 10-versus-7 score given by AMD is per core. Overall throughput was 80 versus 28.1.
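A quick sanity check of those figures (plain Python; the one assumption is that the overall number is simply the per-core score multiplied by the core count):

    # Check that the per-core and overall figures are mutually consistent,
    # assuming overall throughput = per-core score x core count.
    per_core_a, per_core_b = 10.0, 7.0    # per-core scores quoted by AMD
    overall_a, overall_b = 80.0, 28.1     # overall throughput figures

    print(overall_a / per_core_a)   # 8.0  -> implies an 8-core part
    print(overall_b / per_core_b)   # ~4.0 -> implies a 4-core part
    print(overall_a / overall_b)    # ~2.85x overall advantage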

Your "jaguar > pd" is another epic reading fail from you. What I wrote was "Jaguar IPC > Piledriver IPC" and the source is

http://www.extremetech.com/computing/174980-its-time-for-amd-to-take-a-page-from-intel-and-dump-steamroller



A relevant quote:

Bulldozer was without doubt an unmitigated failure. We know it.

It cost the CEO his job, it cost most of the management team its job, it cost the vice president of engineering his job. You have a new team. We are crystal clear that that sort of failure is unacceptable.
 

juanrga



AMD has been doing GPU acceleration for years; that is what its FirePro GPGPU brand is aimed at. During APU14, AMD showed several professional applications that do physics on GPUs: Simulia, THEIA-RT, Autodesk...

Autodesk does advanced physics simulations on GPUs:

http://www.fireprographics.com/ws/mae/autodesk/index.asp

Now, about games. First, don't forget Nvidia was accused of bribing game developers to use its PhysX instead of AMD's own technologies. Of course, Nvidia rejected the accusations:

http://www.xbitlabs.com/news/multimedia/display/20100311101148_Nvidia_Denies_Bribing_Game_Developers_for_Implementation_of_PhysX.html

http://www.theinquirer.net/inquirer/news/1596167/nvidia-denies-bribing-developers

Yeah, sure, and Nvidia has not been caught trying to play dirty again:

http://www.extremetech.com/extreme/173511-nvidias-gameworks-program-usurps-power-from-developers-end-users-and-amd

http://wccftech.com/nvidia-playing-dirty-gameworks-propriety-nature-pose-serious-danger-amd-dev-s-power-diluted/

http://linustechtips.com/main/topic/137965-developers-criticze-nvidias-gameworks-program-on-twitter-for-its-blackbox-nature/

But now that AMD has won the consoles, things have changed since the PhysX cross-accusations. Current PS4 games are already using the AMD GPU for physics processing, AI processing...

http://www.ps4hax.net/ps4hax-news/71234/naughty-dog-explains-ps4-cpu-and-memory-management.html

http://www.officialplaystationmagazine.co.uk/2013/07/01/cerny-on-ps4s-hardware-the-gpus-doing-many-tasks-not-directly-related-to-graphics/

http://www.craveonline.com/gaming/articles/681841-ps4s-compute-has-tons-of-untapped-potential-says-daylight-dev

In fact, one of the PS4 games not only uses the GPU for physics processing, with the same Havok engine as in the demo given by etayorius, but also uses the GPU for sound processing.
 

juanrga

CPU loads from the review of the new OpenGL game Wolfenstein: The New Order. The game scales well up to six cores, with an overall load of 63% at the tested resolution (the load would be higher at 1080p).

http://www.gamegpu.ru/images/stories/Test_GPU/Action/Wolfenstein_The_New_Order_/test/WolfNewOrder_amd.jpg


Here are the Intel loads:

http://www.gamegpu.ru/images/stories/Test_GPU/Action/Wolfenstein_The_New_Order_/test/WolfNewOrder_intel.jpg
 

wh3resmycar

@juan

Yep, I'm only talking about gaming. Those links that you've shared, as well as the other dude's, still prove my point. Eight years and counting, and the only thing they've done is make Lara Croft a Pantene endorser. That's the only consumer-grade tech they've released, and it took them eight years to develop.

Post another link with an AMD rep "talking" about crap and stuff, please.

Good luck taking AMD marketing talk as gospel.


Edit: And quit including Nvidia in this. This is not PhysX vs. Havok. I'm just pointing out AMD's failure to deliver on a tech they've been working on for a long time.
 

etayorius

You expect AMD to deliver this whole myriad of new tech and products (HSA, Mantle, TrueAudio, physics and AI) with their limited resources? They are not Intel, you know. AMD was trying to bring physics to Radeon cards through Havok back in 2009, but guess what? Intel bought Havok... and I am sure you know how that ended, right? So are you going to blame AMD for that?

Fast forward to 2011: AMD announced a partnership with Bullet, getting people on the team like Manju Hegde (Ageia), Erwin Coumans (Sony and Bullet Physics) and Takahiro Harada (who also seems to be another big name in physics). It's been two years, and this sort of thing does not pop out like a fart... you know?

I saw a few slides with AMD mentioning physics and AI just after the Mantle announcement, and no, I am not referring to the old slides or articles, so AMD may just have something up their sleeve regarding AI and physics. And even if they never release an equivalent to PhysX, nobody has cared thus far... PhysX was a nice gimmick, but a gimmick after all, and anything nVidia can do with GeForce and PhysX, AMD can do on Radeon GPUs without a PhysX-like gimmick "brand".

Whatever AMD does, I am sure it will be open source, most likely Bullet... There are rumors about AMD partnering with Rockstar, and a few claims online that GTA V will use Mantle. If this is true, it will be the biggest win for Mantle and an instant KO for nVidia. Still a rumor, though... but heck, a GTA V profile has been spotted hidden several times in the Catalyst drivers, along with some strange bits of code that seem to be AMD-hardware-specific. If AMD managed to pull in Rockstar, it would mean Bullet in GTA V would most likely use the GPU too. Still, take it with a grain of salt.

It makes me laugh when people claim AMD, with their limited resources, copied DX12 and somehow managed to steal and release it two years before MS (with their unlimited pockets) announced DX12. Hahaha, yeah right: how come no dev house knew about DX12 before the announcement? The only one claiming DX12 had been in development for four years is... NVIDIA, and we all know how fair and trustworthy they are. I smell butthurt coming from nVidia towards anything regarding Mantle; they even outright lied about their DX drivers beating Mantle. There are several articles regarding this massive lie, but I am not bothering this time. And I am not a fanboy: I like AMD and nVidia (I don't like Intel at all, however), but maybe AMD a little more, because they are less of jerks compared to nVidia and were really cool to have allowed me the TressFX gimmicks on my GeForce. nVidia are known to play very dirty and unfair whenever they are cornered.

I currently own a GTX 780 Ti with a GTX 470 for PhysX. I previously owned an FX 5200 (my very first dGPU), then a 7300 GT, then an 8800 GT; then I tried AMD for the first time with an HD 5770 (it was OK, that's about it), then went back to nVidia with the GTX 470, and I just bought the GTX 780 Ti.

nVidia cards serve me well, since I mod and play Skyrim heavily, and Skyrim is happier with nVidia GPUs; I plan to stay on GeForce until the next Elder Scrolls. As a side note, for all the guys who play Skyrim heavily on PC and like mods: I am the author of the mod SkyTEST - Realistic Animals & Predators, a very popular mod with almost 10,000 endorsements, in the top 100 all-time Skyrim Nexus mods.

 

blackkstar

As someone who uses Bullet Physics in Blender for complex physics simulations a few times a month: blow it out your whatever.
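(For the curious: Blender's rigid-body system is Bullet under the hood. A minimal sketch of that kind of setup, for Blender's Python console, assuming a default scene with an active mesh object selected:)

    import bpy

    # Register the active object (e.g. the default Cube) with the scene's
    # rigid body world, which Blender simulates with Bullet.
    obj = bpy.context.active_object
    bpy.ops.rigidbody.object_add()
    obj.rigid_body.mass = 2.0                      # kg
    obj.rigid_body.collision_shape = 'CONVEX_HULL'

    # Advance frames; Bullet integrates the physics at each step.
    scene = bpy.context.scene
    for frame in range(scene.frame_start, scene.frame_start + 50):
        scene.frame_set(frame)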

I mentioned this elsewhere, but the ARM ecosystem for consumers has a lot holding it back. The ARM ecosystem only consists of operating systems that are locked down. They are designed to prevent you from doing things the manufacturer doesn't want you to do, like install another OS.

x86's strong point is the fact that it is an open platform. When you buy an x86 laptop, you have no one sitting over you saying "you cannot use the root or administrator account" or "you can only install programs from these sources unless you modify your OS".

Let's be honest: people stick with x86 because of choice. I bought this laptop because I can run Windows and install whatever the heck I want on it. Do I want Photoshop from the bay? No worries! Do I want to dual-boot Gentoo? No worries!

Now look at ARM devices. Can you do that? No.

This is the largest barrier to ARM device penetration we will ever see. And the OEMs will be pushing it like crazy, because it means they can drop support for the OS, ensure you can't install another one on the device, and then provoke you into upgrading. That is the environment of consumer ARM, and it is far from consumer friendly.

The OEMs and content providers will constantly be sitting around the internet telling you that ARM is the future and it's going to free you from x86. And they will be there with locked down devices that depend on vendor support to be useful, when the vendor's main goal is to sell new products, not support existing ones.

The ARM ecosystem will have to change drastically before it comes even remotely close to supporting content creators.

Big content wants these disposable ARM devices because they make it extremely hard to pirate content. OEMs want them because they are disposable and locked down. Lenovo decides that your tablet model will have a locked bootloader and that you will not get a version of Android past 4.4? Tough luck. Buy a new tablet.

You have a five-year-old x86 laptop? Guess what: you can still use it!

The ARM cheap-device storm is coming, and they're going to want to take away our ability to create content and to do what we want with our devices, in the name of saving power, making devices 0.1 mm thinner, and planned obsolescence.

I am sort of rambling because I am drinking, but fuck it. This whole ARM ecosystem is trash, designed to do nothing but power crappy consumer electronics that serve as tools for content providers to shovel crap down your throats while the OEMs think of how they can make their latest and greatest product outdated trash.

But they don't do it by making things better; they do it by artificially removing features. The latest thing is phones without SD slots. But you know what? SD slots will come back in a few years, and all the OEMs will go "wow, you can double your storage for cheap, IT'S AMAZING!!!!", and everyone who doesn't know what an SD card is will drool over it and thank the OEMs for giving back the feature they took away.

That is the consumer ARM ecosystem, and Juan, you can shut up about it already. Show me one good ARM device with a good ARM OS that has good content-creation tools on it. You won't. You will find an army of shitty, planned-obsolescence trash devices meant to be replaced a year or two after purchase with some crappy device that is marginally faster, with slightly better specs.

The entire ARM ecosystem is a cancer on the world of computing, and the sooner everyone wakes up and realizes that the whole ARM/Android/iOS ecosystem exists solely to stop you from using your devices how you want, and to create devices that end up useless in two years, the better.

I don't know why the hell you keep bending over so hard for ARM. I don't care if ARM can do 50 times the work of x86 with 1/100th of the power; the ARM software ecosystem is utter shit. Grandstanding and hoping that ARM wins in all markets is like asking for someone to come along and tell you you're only allowed this version of this OS, and then you have to upgrade, and all sorts of nasty things.

So really, just drop it already. I'm so tired of having to fight this shit.
 

griptwister



Unless AMD provides a better, cheaper solution, I will be going Nvidia as well. I personally feel like Nvidia runs better. I'm not a fan of their business ethics, and the same goes for Intel, but I'm seeing now that these companies are top dog in the gaming market. While I enjoy what AMD is doing with gaming and technology, other companies provide better performance with less disappointment. My FX-8350 is nice and all... but I recently had the chance to run my 1440p monitor on an i5 2500K/R9 270X system, and the minimums were higher and the FPS was smoother.

@blackkstar: if I could, I would +1 you right now. I'm sorry, Juan, but no one in this thread seems to have any interest in your ARM topic.
 

jdwii

Just got the i5 4660 (edit: darn Intel names, I meant 4460) processor for my friend and ran benchmarks to compare it with mine: Fritz Chess, wPrime and Cinebench.
Seems like the Intel outperformed my old AMD by 40% per clock, per core (I also turned Turbo off). Actually, I was impressed with the AMD, since Steamroller in my testing is 12-15% faster per clock than my processor. AMD really only needs a 25-30% boost in performance per clock and they will be even with Intel. That could be as simple as including a 3rd AGU/ALU, sharing resources as little as possible, and improving their memory controller; then, who knows, maybe we will see some competition.
Edit again (added info): Seems like with all 6 of my cores going, in benchmarks such as HandBrake, my CPU is about even, maybe ±5%.
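To put rough numbers on the estimate above (a back-of-the-envelope sketch in Python; the 40% and 12-15% figures are the ones stated in the post, and the baseline is assumed to be a Piledriver-class FX):

    # Back-of-the-envelope check of the "25-30% boost" estimate,
    # taking a Piledriver-class FX as the 1.0 baseline.
    haswell_vs_pd = 1.40       # the i5 measured ~40% faster per clock, per core
    sr_vs_pd = (1.12, 1.15)    # Steamroller 12-15% faster than Piledriver

    for sr in sr_vs_pd:
        gap = haswell_vs_pd / sr - 1   # what Steamroller still has to close
        print(f"Steamroller at {sr:.2f}x PD still trails Intel by {gap:.0%}")
    # -> roughly 22-25%, the same ballpark as the 25-30% guess above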
 

wh3resmycar



Sure, in a tech demo. A technology stuck in development hell can never be better than a technology that actually exists and is available for public use. Looking good on paper doesn't count as a win. This "AMD is better because they're using open source" BS has been going on since 2008, and again, the only thing they've delivered is making Lara Croft a Pantene endorser.

Now, again, to the dude who's hallucinating that ARM will replace x86: I work in IT at a bank, and most of our infrastructure runs on x86 hardware and software (apart from some IBM mainframe software I do not want to touch or understand). I'd love to see the IT director's face if someone broke the news that everything needs to be rewritten for ARM. I'd imagine the CFO would probably jump out of the building once they factored in the cost in man-hours, new hardware and new software for this "ARM will replace x86" BS. And that's the chaos for one Fortune 500 company alone.


So yeah, ARM will definitely replace x86. /sarcasm
 
Well, in defense of TressFX, Lara's hair was only one part of the whole thing. You can do foliage physics simulation (grass, maybe trees) and other "furry type" surfaces (think of a lion or a dog) that PhysX can do (in theory). They just gave it a name for marketing purposes, but the key physics elements are present. You could say no one uses AMD as a physics engine for the same reasons 3DNow! wasn't used, even though it was better than MMX at the time: AMD is bad at negotiating deals because of its size and other non-technical reasons. But the key technical elements for any developer to choose them as a physics "provider" (PhysX is nVidia-provided tech, hence the word) are there.
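For a feel of what that kind of strand simulation boils down to, here is a toy Verlet-integration sketch (illustrative only; this is not AMD's TressFX code, which runs the equivalent steps in compute shaders across thousands of strands, with fancier constraints for bending, wind and collision):

    # Toy hair/grass strand: Verlet integration plus distance constraints.
    GRAVITY = -9.8        # m/s^2
    DT = 1.0 / 60.0       # timestep (60 fps)
    REST_LEN = 0.1        # segment rest length
    N = 16                # vertices per strand

    pos = [(0.0, -i * REST_LEN) for i in range(N)]  # strand hangs from its root
    prev = list(pos)

    def step(pos, prev):
        # Verlet: next = 2*pos - prev + accel*dt^2; the root vertex stays pinned.
        new = [pos[0]]
        for (x, y), (px, py) in zip(pos[1:], prev[1:]):
            new.append((2 * x - px, 2 * y - py + GRAVITY * DT * DT))
        # Relax distance constraints so segments keep their rest length.
        for _ in range(4):
            for i in range(N - 1):
                (x0, y0), (x1, y1) = new[i], new[i + 1]
                dx, dy = x1 - x0, y1 - y0
                d = (dx * dx + dy * dy) ** 0.5 or 1e-9
                corr = (d - REST_LEN) / d
                if i == 0:    # root pinned: move only the child vertex
                    new[1] = (x1 - dx * corr, y1 - dy * corr)
                else:         # split the correction between both vertices
                    new[i] = (x0 + dx * corr / 2, y0 + dy * corr / 2)
                    new[i + 1] = (x1 - dx * corr / 2, y1 - dy * corr / 2)
        return new, pos

    for _ in range(60):       # one simulated second
        pos, prev = step(pos, prev)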

Also, I found this: http://developer.amd.com/wordpress/media/2013/06/2101_final.pdf

It says it's from 2011. Is that company still with AMD or not?

Anyway, I do believe TressFX is a step in the right direction; now they just need to work on getting devs to start trusting AMD's tools/software for more varied physics simulations.

Cheers!
 

juanrga



Can you take an x86 phone with Windows Phone inside, delete the OS, and install Gentoo on the phone? No? Then stop pretending that this is a defect of the "ARM ecosystem".

The irony of all this is that you mention Gentoo when the Gentoo handbook has a section that explains how to install Gentoo on ARM hardware.

Here you have ancient benchmarks of old (32-bit) ARM hardware running Gentoo:

http://www.phoronix.com/scan.php?page=article&item=gentoo_linaro_odroid&num=1

And some pages ago I gave benchmarks of the Tegra K1 against AM1, both ARM and x86 running Ubuntu. I repeat the link. The ARM hardware ran the same OS and the same benchmarks as the x86 hardware: from molecular dynamics tests to ray tracing and WAV-to-MP3 encoding.
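(The "same OS, same benchmarks" point is easy to check for yourself; here is a trivial sketch that runs unmodified on ARM and x86 Linux. The numbers in the final comment are illustrative, not measurements:)

    import platform
    import time

    # The identical workload runs on any architecture Python supports;
    # only the reported machine string differs between ARM and x86.
    def workload(n=200000):
        return sum(i * i for i in range(n))

    start = time.perf_counter()
    workload()
    elapsed_ms = (time.perf_counter() - start) * 1000
    print(platform.machine(), platform.system(), f"{elapsed_ms:.1f} ms")
    # e.g. "armv7l Linux 45.2 ms" on an ODROID board,
    #      "x86_64 Linux 12.3 ms" on a desktop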

Stop posting nonsense about ARM and I will stop replying to and correcting it.



Several people here have shared their interest in ARM, and especially in the new K12 core.

And don't forget that during the recent APU14 Beijing conference, AMD said explicitly that its goal is to be "the leader in ARM" (exact quote). There was no similar claim about x86 during the entire conference.

Note: I gave links below to new x86 products. Nobody commented on that.
 

juanrga



Yeah, that pseudo-conspiracy nonsense also made me laugh. As I said then, Microsoft had no plans for DX12 and only started development of DX12 after Mantle was on track. In fact, we now have more insider info: Microsoft started development of DX12 after developers met with Microsoft and shared their Mantle demos. Some relevant quotes:

For this reason, many of the most experienced developers, Oxide included, had for years advocated a lighter, simpler API that did the absolute minimum that it could get away with. We believed we needed a teardown of the entire API rather than some modifications of current APIs. Admittedly, this was after advocating no API at all caused the hardware architects’ faces to pale a bit too much. But if we were to build something evil, at least we could make it the least evil possible.

It was this group of advocates who, with AMD, pioneered the development of Mantle. Mantle was not an API birthed by a hardware vendor, but rather a child born of developers and AMD to create a completely different class of API. AMD selected a small but expert group of developers to help advance it. The intention was not to develop the end-all solution for every developer, but rather to build something that didn’t block our studios from maximizing the very capable GPUs that AMD was building.

This group spent quite a bit of time with AMD going over and helping shape the API. Many of the features and structure of Mantle came from developers, not from AMD itself. For example, we could show that nearly every batch required at least some small data payload, so we built in a specialized fast path just for it.

Oxide still remembers the day we did our first tests. We watched as driver overhead, once the dominant chunk of our frame execution time, practically disappeared. We watched as the thick driver threads that often polluted our cache and stole our CPUs disappeared. We watched as the little driver overhead we had linearly scaled across our cores. We saw this, in spite of the fact that Mantle was a very new API, competing against established and optimized APIs. There are still many optimizations that both we and AMD have yet to make!

We heard nothing of the development of a new version of D3D until sometime after the early tests of Mantle were indicating that this radically different API model could work, and work well – data which we shared with Microsoft. What prompted Microsoft to redesign DX is difficult to pin down, but we'd like to think that it would never have happened if Mantle hadn't changed the game. We are delighted by DX12's architectural similarities to Mantle, and see it as a validation of the pioneering work that Oxide was privileged to be part of.

http://www.oxidegames.com/2014/05/21/next-generation-graphics-apis/
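To make the "small data payload" point concrete: the contrast is roughly between issuing a separate driver-mediated constant-buffer update before every draw and letting the few bytes of per-batch data ride along inside the draw command itself. A hypothetical sketch of the two submission models (invented names, not the real Mantle or D3D12 API):

    # Hypothetical command-buffer sketch of the per-batch "fast path"
    # described in the quote above; invented API, not actual Mantle/D3D12.
    class CommandBuffer:
        def __init__(self):
            self.commands = []

        def draw_classic(self, mesh, payload):
            # Classic path: a separate constants update per batch, then the
            # draw -- two driver-visible commands for every object.
            self.commands.append(("update_constants", payload))
            self.commands.append(("draw", mesh))

        def draw_fast_path(self, mesh, payload):
            # Fast path: the small per-batch payload is embedded in the draw
            # record itself -- one command, no separate update.
            self.commands.append(("draw_with_payload", mesh, payload))

    cb = CommandBuffer()
    for i in range(1000):
        cb.draw_fast_path(mesh="rock", payload={"transform_index": i})
    print(len(cb.commands))   # 1000 records instead of 2000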

The article also destroys other myths about Mantle.
 

jdwii

^ That's good news to hear, Juan. I hope Intel and Nvidia make their own APIs as well to compete with AMD, because as of right now I feel like when DirectX 12 comes out, no one will be using Mantle anymore. It would be nice to see OpenGL rewritten as well. We need a high-performance API that works on all platforms.
 

etayorius

The only way for Mantle to survive is for AMD to work fast and release all the docs and open-source it before DX12 arrives, and if possible bring it to Linux and Mac (I seriously don't think this will happen); that would be overkill against DX12.

Mantle already forced MS's hand; it worked... For all we care now, Mantle can either die or keep evolving.
 

vmN

Mantle was pretty much dead already at launch, for the simple reason that it isn't supported by any other big GPU manufacturer.

Mantle did, however, make M$ start working on their next API, DX12.

Developers will normally go after the big market. We need to be realistic: Microsoft is the provider of the biggest API in use worldwide.

It is a scumbag move from Nvidia, just ignoring an almost-ready API which has been shown to be better than DX11.
 


On that point, Mantle is actually quite the deal. You lose nVidia, but you get the consoles and (maybe? lol) Linux as dev ground.

The business case for Mantle is risky, we can't say it is not, but at least it has a sound base: more performance at the same price point (for consumers; give or take, given AMD's rep with drivers), and you get "easy" cross-platform support (that's the promise, at least).

I wonder if UE will add some Mantle support, since Frostbite already has it and other big players will add it, it seems. If UE adds Mantle support, then we can start thinking that Mantle is actually something to be very wary of; but yeah, until that happens, it's still too early to call it dead tech or not. At least we have games that support it :p

Cheers!
 

colinp



Er, I think you'll find that OpenGL in its various forms is the most widely adopted API.
 

8350rocks



Crytek is already adapting their engine to offer Mantle in the next iteration... I believe the "big backers" have picked it up, considering EA/DICE/BioWare are using it, plus Crytek and Star Citizen, among many other AAA-title developers. I would honestly be less concerned about UE picking it up, and more thrilled to see Blizzard working on it. If Blizzard picked it up, that would essentially mean the top three publishers are using it... which nets Mantle a potential foothold in ~65-70% of the games coming out over the next couple of years...
 


I was taking it from the POV that UE is the most widely used engine in games. I don't really know if it's going to stay like that for this gen and the next, but UE is, at least for me, the heavy hitter among the graphics engines out there; more than Crytek's and DICE's engines, at least. Publishers don't really make the call on what engine the dev houses will use (except EA, maybe?), so UE is still an nVidia stronghold/fortress. Once that is taken away from them, I'll say Mantle is sure-footed going forward, haha.

Now, I agree about Blizzard, though. Along with UE (the most widely used engine), Blizzard is an Intel (nVidia second) stronghold that needs to be conquered for AMD to gain good traction; not because of the engines Blizzard might develop, but because of the user base. For all the doom and gloom Diablo and WoW might be carrying around, they're still mammoths (user base... don't forget, lol), and I'm sure any game that comes out will be a great showcase... if they manage to convince Blizzard, of course.

All in all, Mantle is not dead, but it ain't the bread and butter I (we?) want (need?) it to be.

Cheers! :p
 

juanrga



Except that you ignore that the author makes it explicit that they will continue supporting Mantle, and he took care to explain why:

1) DX12 exposes only a subset of Mantle's performance.
2) "Strong interest in supporting platforms beyond Windows."
3) No word on whether DX12 is coming to Windows 7.
4) "From a business standpoint, it makes little sense to rely exclusively on Microsoft doing the right thing."
5) "Mantle is, for us, quite cheap to support."
 

juanrga



40 developer studios have joined the SDK beta:

http://www.anandtech.com/show/7985/amd-mantle-developer-private-beta-begins

And Microsoft is no longer the provider of the biggest API:

Windows and Direct3D exert less influence over the total universe of game development now than perhaps at any previous point in history.

http://www.extremetech.com/gaming/179010-who-needs-directx-amd-nvidia-and-intel-team-up-demonstrate-ultra-low-overhead-opengl
 