AMD CPU speculation... and expert conjecture

Status
Not open for further replies.

etayorius

Honorable
Jan 17, 2013
331
1
10,780
I don't think MANTLE was DOA; Activision and EA decided to support it fully... EA and Activision alone account for at least 70% of all the games out there, and devs have already claimed MANTLE will be stronger than DX12.

Like I previously said, if Rockstar has truly gone with AMD like the rumors claim... MANTLE will gain a massive amount of prestige.
 

jdwii

Splendid


Let's put this bluntly: there won't be a single PC game in the next 5 years that doesn't support DirectX. You can't say the same for Mantle or OpenGL (which needs to be rewritten).
 

jdwii

Splendid


Well, that's nice for that one dev, but at the end of the day where will the majority of their market share come from? Mantle as of yet only works on AMD; that's the main issue. DirectX 12 and OpenGL work on all three. The market won't let AMD control 100% of everything. Nvidia will still be kicking them, and Intel will be beating them in terms of sales for quite some time. I'm all for multiplatform competition, but the reality is Microsoft won 20 years ago.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


FYI, "total universe of game development" != "PC game".



Let me refresh your memory by telling you that you wrote "no one will be using mantle anymore"; thus I only need to quote one developer to show that what you wrote is untrue. Moreover, I am pretty sure that many of the 40 who joined the MANTLE beta think the same as the one I quoted; otherwise they wouldn't have joined.
 

jdwii

Splendid
^ I understand; I said I feel like no one will be using Mantle anymore. Learn to read right, or is English not your native language? Also note that most people on this site probably don't care about anything beyond PC gaming, or at the very least console + PC gaming.
 

noob2222

Distinguished
Nov 19, 2007
2,722
0
20,860


[attached benchmark screenshot: 63085.png]


Reading fail: the 1.5 GHz Jaguar scores 0.39 and the 2.2 GHz Puma scores 0.54, versus the 2.3 GHz Piledriver scoring 0.70. "What I wrote was 'Jaguar IPC > Piledriver IPC'."

Totally, because that 0.1 GHz is going to add like 1.5 to Puma's score, proving that "Jaguar IPC > Piledriver IPC".

extremetech ... rofl.

Calculating relative efficiency between Kabini, Kaveri, and Richland

Calculating efficiency with a bunch of unknown benchmarks ... as if all benchmarks scale linearly 1:1 with clock speed.

But let's apply their method and see how much truth is in their article.

Take the 0.39 score from the 1.5 GHz Jaguar and scale it up to Piledriver's 2.3 GHz (0.39 / 1.5 × 2.3 ≈ 0.60) ... yep, it's still wrong: Jaguar IPC is not greater than Piledriver IPC.
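As a quick sanity check of the linear clock-scaling method this post is criticizing, the arithmetic can be sketched in code (the function name is mine; the scores and clocks are the ones quoted in this post):

```python
# Naive clock-scaling estimate, as used in the disputed article's method:
# assume a Cinebench score scales linearly (1:1) with clock frequency.
def scale_score(score: float, clock_ghz: float, target_ghz: float) -> float:
    """Project a benchmark score to a different clock under linear scaling."""
    return score / clock_ghz * target_ghz

# Numbers quoted above:
jaguar_scaled = scale_score(0.39, 1.5, 2.3)  # Jaguar's 1.5 GHz score projected to 2.3 GHz
piledriver = 0.70                            # measured Piledriver score at 2.3 GHz

print(round(jaguar_scaled, 2))     # ~0.6
print(jaguar_scaled > piledriver)  # False: even with generous scaling, Jaguar trails
```

Even granting the article's optimistic linear-scaling assumption, the projected Jaguar score stays below the Piledriver result, which is the point being made here.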

And for the record, ARM stated the A57 is targeted at 2.5 GHz, and AMD stated their A1100 is ≥ 2 GHz.

Yeah, pure invention that 2.5 GHz is ≥ 2 GHz.

Let's look at something that isn't marketing for a comparison.

http://www.spec.org/cpu2006/results/res2012q1/cpu2006-20111219-19191.html

The 20 W dual-core 32 nm Xeon 1220L scores 72.6; the 25 W 8-core 28 nm A57-based A1100 scores 80 ... the ARM A57 sure is impressive ... when fluffed by marketing. Eight cores (25 W) to beat 2c/4t (20 W) end-of-life technology in real life, but marketing would like you not to know any of that information; it's a secret.

If you want to bring something to the conversation, try using something other than marketing hype.
 

szatkus

Honorable
Jul 9, 2013
382
0
10,780


This A10-4600 turbos to 3.2 GHz.
 

noob2222

Distinguished
Nov 19, 2007
2,722
0
20,860
Aside from the fact that you're not Juan, that's the entire point. You can't calculate theoretical IPC by guessing at the actual clock speed. That flaw easily accounts for the 2-8% by which Jaguar IPC supposedly beats Piledriver IPC according to a favorite ExtremeTech article.
 
[attached screenshot: 62710.png — Kabini Cinebench 11.5 result]

At 2.05 GHz the Kabini chip scores 0.51 in Cinebench 11.5.

[attached screenshot: 5SwxoTZ.png — Piledriver Cinebench 11.5 result]

At 2.0 GHz my Piledriver core scores 0.59; that is at least 15% more performance per MHz.
I ran this with only one module enabled, so the CPU turboed to 2.0 GHz.
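The per-clock claim above works out as follows (a minimal sketch using only the two scores quoted in this post; variable names are mine):

```python
# Performance per GHz from the two Cinebench 11.5 runs quoted above.
kabini_per_ghz = 0.51 / 2.05     # Kabini scoring 0.51 at 2.05 GHz
piledriver_per_ghz = 0.59 / 2.0  # one Piledriver module turboed to 2.0 GHz, scoring 0.59

# Relative per-clock advantage of Piledriver over Kabini.
advantage = piledriver_per_ghz / kabini_per_ghz - 1.0
print(f"{advantage:.1%}")  # about 18.6%, consistent with the "at least 15%" claim
```

So on these two single-threaded runs, Piledriver comes out roughly 18-19% ahead per clock, comfortably above the "at least 15%" stated in the post.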
 

8350rocks

Distinguished


Meh, I would wait on that NVidia upgrade...
 


The problem is AMD is approaching physics piecemeal. Rather than making one robust API that can be used for everything, they're making different APIs that each do one specific thing (TressFX, anyone?). That's why TressFX is already dead.

And yes, Havok craps out after so many objects; the CPU simply can't handle the strain. That's why I was a very early supporter of PhysX, back in the Ageia days. The problem is, having the GPU do physics while also doing rendering is too much for the GPU to handle. Hence why the PPU/second-GPU setup is the preferred method, but even then you're limited too much by the hardware.
 

Why spoiler the better-looking Intel numbers? It would be interesting to see how the 2500K (or another quad without HTT) looks... Also, why bench the older 2600K?

Hence why I hate GameGPU; they're biased toward AMD.
 


Devs don't get told about API development until the API gets finalized, which didn't happen until a few weeks before it was revealed. Hence the demo running DX12 at the reveal, which was done after the spec got finalized.

I'll say it again: AMD "borrowed" parts of the spec, claimed there would be no DX12 while everyone who knew otherwise was under NDA and couldn't refute the claim, then released the most performance-friendly parts early for their own financial gain. And because they released first, MSFT obviously "stole" from AMD, despite the fact that the API hadn't been publicly released yet for them to steal from in the first place!
 


Due to legacy software and non-Windows platforms. No one on Windows uses GL by choice. It's a piece of crap.

Seriously: junk it, because you can't fix it.
 

etayorius

Honorable
Jan 17, 2013
331
1
10,780




No, they did not; it is impossible for AMD to run ahead with an MS "idea" and manage to release it two years earlier with their extremely limited resources. Simply impossible... what you are saying is seriously a lie:

http://www.oxidegames.com/2014/05/21/next-generation-graphics-apis/


Oxide:
"We heard nothing of the development of a new version of D3D12 until sometime after the early tests of Mantle were indicating that this radically different API model could work, and work well – data which we shared with Microsoft. What prompted Microsoft to redesign DX is difficult to pin down, but we’d like to think that it would never have happened if Mantle hadn’t changed the game."

Even AMD had been begging MS to go low-level FOR AGES; it's known history... Gosh dude, you really, really, like... REALLY hate AMD, don't you? I'm getting tired of your same rant over and over and over and OVER... and over... It's just like Juanrga with ARM, but at least he is not on watch waiting for one corp or another to make the slightest move so he can piss on them.

I have a feeling the original DX12 was completely different from what MS announced recently; probably it was just a few more fancy features. When AMD announced MANTLE and devs yelled at MS about their ancient API, they had no choice but to rebuild it from the ground up with MANTLE-like features, hence why it will be released in late 2015 with games arriving early-to-mid 2016. There are even several articles claiming the contrary of what you are spreading: that DX12 is actually MANTLE disguised as a new DirectX version.

If that does not make sense to you, or you'd rather continue your anti-AMD rants and call Oxide a bunch of BS liars... here's what Oxide has for you:

Oxide:
"Oxide’s door is still open to other forward-looking APIs. Should Nvidia or Intel create an API that is as functional and straightforward to use as Mantle, we’d be more than happy to support it. Oxide doesn’t play favorites here. We recognize the passion of PC gamers. We recognize the amount of time and money players invest in their PC, and are committed to doing whatever we can to get the most out of whatever hardware our fans are running."

Just NO. You need to start giving AMD credit for all the new cool things going around in the PC industry... I have pissed on AMD too in this same thread, where everyone went ape on me saying I was spreading pro-Intel propaganda after I posted the "leaked" info about AMD skipping a Steamroller FX, and I have said before that I will not buy Intel ever again... but it's not like I hate them and want them to fail, the way you want AMD to just fade and wither.

It seems your "Mantle DOA" prediction was a complete fail. I have given you credit before, because you were right about Kaveri being Bulldozer 2.0 when I expected Kaveri to be on i5 2500 level, but on this topic you are just plain wrong. What's funnier is that you have no way of proving your claim that MANTLE stole/copy-pasted DX12 code. There is a good reason why AMD went ahead and got greedy with the consoles; that same move ties in with MANTLE.

If AMD copied DX12, why the hell is MANTLE here NOW, two years ahead of DX12, while also being more powerful and efficient, just as Oxide, DICE, and other devs claim?

Regarding TressFX, I don't think it's even dead as of now, and even Nvidia was able to ramp up performance thanks to TressFX being open source; Nvidia just had to sneak in and copy/modify the code for their GPUs. Time will tell if you were right on TressFX, and if you need to be given credit for that, I will give it. But regarding MANTLE... I don't think anyone will be crediting you for the prediction that MANTLE is a ripoff of DX12, just as your prediction that it was DOA failed miserably.

Perhaps you did not see how a lot of devs went ape against Nvidia and their horrible "GameWorks" scheme; I have not seen you comment on GameWorks even once... now THAT thing is EVIL... why not give some of your usual love for AMD to GameWorks too? Perhaps you could find a nice name like "MANTEL" for that abomination as well. How come MANTLE is so bad for devs, as you claim, yet you ignore GameWorks completely? Someone just posted a bench with the 290X against a lowly 770, and we all know the 290X even manages to beat the Titan in some benchmarks; this is the type of BS that really hurts devs and gamers. So for once, take a step back from AMD and direct your attention at GameWorks; it would be a nice change.
 
Devs don't get told about API development until the API gets finalized, which didn't happen until a few weeks before it was revealed. Hence the demo running DX12 at the reveal, which was done after the spec got finalized.

You and I both know this is complete bullsh!t.

Outside devs are often actively involved in the creation of new APIs, and there are many alpha-stage releases to those developers to play around with and submit comments back to the API's developer. Your version of events is illogical and breaks continuity. Hell, time machines start becoming a requirement if one thinks along that tangent long enough. AMD was announcing Mantle long before it was released, and if DX12 had been in development anywhere near as long as you claim, then Microsoft would have been announcing it as well. Feature sets like those aren't hidden away in some backroom closet. These companies are publicly traded; stock price is a driving factor in their marketing decisions, and they market these "developments" as ways to increase that same stock price.

The most you can possibly claim is that MS would have been developing "something" but wasn't solid on it until AMD started ramping up Mantle for public release. Then MS would have had to put out some news announcement to prevent a perceived loss of their premier market position.

Occam's razor being what it is and all.
 

griptwister

Distinguished
Oct 7, 2012
1,437
0
19,460
@8350rocks, I liked the tone of that comment... Maybe I will wait. Secretly hoping AMD pulls out stacked RAM and higher efficiency this year. GCN 2.1? I don't want to end up like the GTX 680 users and get a maimed 780.

I'm holding out for a GPU with a lot of VRAM that will hopefully run most games @ 60 FPS at 1440p. I might go for dual GPUs if budget permits. I can't hit the upgrade cycle on a GPU for Black Friday anyhow. :( But I should have money for a "higher" end card ;)

@etayorius, don't worry about gamerk316. I think he's just trying to dispel the BS and the fanboyism in this thread. I'm not as against Intel and Nvidia as I was previously, but Nvidia's whole thing with Watch Dogs makes me question their ethics. Questionable at best. I'm really curious to see what the CPU numbers look like.
 
But Nvidia's whole thing with Watch Dogs makes me question their ethics. Questionable at best. I'm really curious to see what the CPU numbers look like.

No company is ethical or has any kind of moral compass. The only duty of a company is to make money for its owners. What you often see is that companies in different positions make different moves based on what helps them most. Intel and Nvidia are both in strong dominant positions; they don't need to "become dominant", so they pursue tactics to maintain dominance. Becoming dominant involves inventing or creating products that are better than the current dominant ones; maintaining dominance involves ensuring nobody else makes or markets a product that beats yours. The difference is subtle, but it's important to note that preventing competition is often cheaper than beating competition. The non-dominant player cannot win by preventing competition, so they often try an open, competition-oriented approach that puts them on more equal footing with the dominant player. The dominant player gains nothing from open competition and will attempt a closed, proprietary approach that creates barriers for non-dominant players. It's the difference between attack and defense.

So if we were to magically reverse the positions of AMD and Intel, we would see AMD/ATI doing the same things as Intel/Nvidia are doing now. It's why I don't maintain loyalty to any company.

The above is also why Mantle MUST become open. And I mean open as in free for third-party vendors to implement. AMD is not in a dominant position; defending a product won't win them more market share or profit. Their only option is to attack by developing newer products and spreading them as fast as possible, which entails making any standards open and free to implement.
 
Mantle and Enduro finally work together. AMD is staggering, fumbling, and sometimes neglecting, but they are very slowly getting close to aligning the stars.
http://www.anandtech.com/show/8058/amd-catalyst-146-beta-drivers-now-available-adds-new-eyefinity-functionality-more
Maybe use one of these monitors with the new drivers? ;)
http://www.techpowerup.com/201230/28-inch-uhd-iiyama-monitor-with-1-ms-response-time-shows-up-online.html

mobile kaveri spec details
http://www.anandtech.com/show/8053/amd-kaveri-specs

AMD drops support for Microsoft’s Windows 8
http://semiaccurate.com/2014/05/26/amd-drops-support-windows-8/
LOL

@etayorius: the "limited resources" excuse only works for companies as small as, say, Oculus Rift's developers or Calxeda. Not for AMD. AMD is smaller, but they have vastly more resources (funding, experience, expertise, market penetration, market share, etc.) than small companies, even after their restructuring.
 

jdwii

Splendid


Yeah, I went with Nvidia over AMD this time around; I love my 770. Now I want to upgrade my 1100T to an FX-8350; if you look at Watch Dogs, it's actually 40% better. The truth is this new design is being used in the next-gen consoles too, and Piledriver still hangs in there; really, if they made a high-end Excavator part, it would probably beat the i7 in most next-gen games.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


And still, native English developers disagree with your "feelings"... :lol:

The quote about the "total universe of game development" was included because "PC gaming" is not an isolated island. Or do you believe that when I mentioned server roadmaps, it was because most people here care about servers? Nope. I mentioned the server roadmap because it was telling us that no FX Steamroller was coming to "PC gaming".
 