AMD CPU speculation... and expert conjecture


colinp

What's this all about then?

http://ir.amd.com/phoenix.zhtml?c=74093&p=irol-newsArticle&ID=1920283&highlight=

Will we finally learn about the roadmap for Excavator and beyond on 5th May?
 


1: Arma 2 is simply really badly coded, from everything I've seen. That's an entirely separate issue from API overhead.

2: Would be nice to have side-by-side frame latency graphs, or even better, a GPUView analysis. Still, the point is that driver side optimizations were able to squeeze out similar gains to adding a Mantle backend.

3: The ONLY thing Mantle is addressing directly is the overhead in the DX driver layer. Hence why AMD tends to benefit more than Intel (weaker cores are more sensitive to heavy threads bogging everything down); a rough back-of-envelope sketch of that overhead is below the list.

4: I find the "extra units on the screen" argument ironic, since we are now coming back to the "added value" of PhysX. Yes, it's the SAME. EXACT. ARGUMENT.
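
To put point 3 in perspective, here is a rough back-of-envelope sketch (Python; every number in it is an assumption I made up, not a measurement): if each draw call costs the CPU tens of microseconds going through the DX driver path, submission alone can cap the frame rate on a weak core, and halving that per-call cost, which is what a thinner driver/API layer is after, roughly doubles the CPU-side ceiling.

[code]
# Illustrative only: the per-call costs and draw-call count below are assumed
# placeholder values, not measurements of any real driver or game.
def cpu_limited_fps(draw_calls, cost_per_call_us, other_cpu_work_ms=5.0):
    """Frame-rate ceiling imposed purely by CPU-side draw-call submission cost."""
    frame_ms = draw_calls * cost_per_call_us / 1000.0 + other_cpu_work_ms
    return 1000.0 / frame_ms

for cost_us in (40.0, 20.0, 10.0):  # hypothetical cost per draw call
    print(f"{cost_us:4.0f} us/call -> {cpu_limited_fps(5000, cost_us):5.1f} fps ceiling")
[/code]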
 

Do you think Star Swarm is similar to a PhysX benchmark?
 

colinp



Isn't it funny how you always take Nvidia's claims at face value while you are consistently cynical about AMD...
 
Absence of 20nm Radeon in 2014; AMD's 14nm and 20nm plans and speculations:
http://www.fudzilla.com/home/item/34562-no-amd-20nm-graphics-this-year
http://www.fudzilla.com/home/item/34557-amd-talks-20nm-finfet-16nm-14nm-future

GloFo 14nm in 2015:
http://www.fudzilla.com/home/item/34560-first-globalfoundries-14nm-commercial-chips-in-2015

Keep in mind that the AMD executives left a big enough gap in their statements that trolls and the general c.a.l.f. crowd can twist their words to fit their ramblings.

Looks like Qualcomm and Apple have priority on TSMC's new 20nm process, and Nvidia, as I [strike]parroted S/A speculations :p[/strike] claimed(!), is being fab-blocked. Another prediction confirmed!

This 20nm delay makes me wonder about the future of the third-party foundry business. I mean, all the fabless vendors are s.o.l. if TSMC doesn't deliver on time. AFAIK, everyone was holding on to the hope that TSMC would deliver 20nm on time, then 16/14nm processes for mobile SoCs (ARM), so that they could go toe-to-toe with Intel's 14nm process and SoCs. What this delay did was give Intel time to optimize their 14nm process further. Samsung and GloFo have their own thing going, but neither is as third-party-oriented (pure-play, I think the term is) as TSMC.

Although, Nvidia was never in a competitive position since their Tegra 4 mess and project debacle (ahem). AMD's GPU designs aren't exactly portable like the console SoCs. AMD could have gotten a massive head start over Nvidia by sourcing GPU and tablet chips from GloFo.
 


No, but Mantle in Star Swarm is. The added benefit of using Mantle in Star Swarm is the same as the added benefit of any game that makes use of PhysX: extra stuff in the background.



Other games saw plenty of performance boosts (Thief being a primary example, with upwards of 30% gains). The point is, just as devs can add ~15% to performance by adding Mantle, NVIDIA can add a similar amount by virtue of driver-side tweaks.

I'd actually be very interested to test a range of games, comparing the driver version that was standard the DAY the game launched, versus the current driver, and see what percentage gains have occurred since the game launched. I'd expect 20-25% on average, minimum, comparable to what Mantle gives on some subset of configs.
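
To make the arithmetic concrete, this is roughly what that launch-day-driver vs. current-driver comparison would look like; the games and FPS figures below are purely hypothetical placeholders, just to show how the percentage gains would be computed and averaged.

[code]
# Hypothetical worked example of the launch-driver vs. current-driver test.
# The titles and FPS numbers are made up; only the arithmetic matters.
launch_vs_current = {
    "Game A": (52.0, 63.0),   # (fps on launch-day driver, fps on current driver)
    "Game B": (71.0, 84.0),
    "Game C": (45.0, 57.0),
}

gains = []
for game, (old_fps, new_fps) in launch_vs_current.items():
    gain = (new_fps - old_fps) / old_fps * 100.0
    gains.append(gain)
    print(f"{game}: {old_fps:.0f} -> {new_fps:.0f} fps (+{gain:.1f}%)")

print(f"Average driver-side gain: +{sum(gains) / len(gains):.1f}%")
[/code]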

Isn't it funny how you always take Nvidia's claims at face value while you are consistently cynical about AMD...

It's not NVIDIA doing the benchmarking. I'm simply commenting on what I see, and that's a ~20% performance improvement in DX via driver-side optimizations.
 
Well, this is interesting (and predicted):

http://wccftech.com/xbox-one-architecture-explained-runs-windows-8-virtually-indistinguishable/

Interesting that they're essentially using an RTOS layer to manage Windows. Inefficient as hell, though.
 

8350rocks



Contact WarSam71 on the forums... he is an AMD rep.
 

Cazalan



Fab-blocked or yield-blocked? Xilinx was smart and started making their chips 2.5D to combat the yield issues. Their 6.8B-transistor part was 4 x 1.7B-transistor dies. Nvidia still had the largest single die at 7.1B.

Apple could double their graphics cores and still be around 1.5B. Much easier to yield than where Nvidia wants to go (>10B).
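
To put rough numbers on why splitting one big die into four smaller ones helps, here is a minimal sketch using the simple Poisson yield model Y = exp(-D * A); the defect density and die areas are assumed placeholder values, not real foundry data.

[code]
import math

def poisson_yield(area_mm2, defects_per_mm2):
    """Simple Poisson die-yield model: Y = exp(-D * A)."""
    return math.exp(-defects_per_mm2 * area_mm2)

D = 0.002          # assumed defects per mm^2 on an immature process
big_die = 600.0    # hypothetical ~600 mm^2 monolithic die
small_die = 150.0  # four hypothetical ~150 mm^2 dies on an interposer

print(f"Monolithic {big_die:.0f} mm^2 die yield:   {poisson_yield(big_die, D):.1%}")
print(f"Single {small_die:.0f} mm^2 die yield:      {poisson_yield(small_die, D):.1%}")
print(f"Four untested small dies together: {poisson_yield(small_die, D) ** 4:.1%}")
# The 2.5D win comes from testing each small die before assembly (known good die),
# so a defective die is discarded alone instead of scrapping a whole big die.
[/code]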
 

juanrga



I began my post stating that I respect your opinion on this topic. I expect you to respect others' opinions. "Easy" and "difficult" are subjective terms. Some people think that OGL is more difficult than DX; others think the contrary. In the following links you can find several people who claim that modern OGL is simpler and easier to learn than DX11.

http://stackoverflow.com/questions/4171631/comparison-between-opengl-and-directx

http://www.gamedev.net/topic/624858-the-state-of-direct3d-and-opengl-in-2012/



No. Moreover, you are assuming that the statistical noise will close the gap, but it could just as well do the contrary and increase the gap. However, I am not considering individual behavior but the behavior of the ensemble, and Mantle provides nearly twice the performance of the driver update.

Also, your argument is based on the fallacy that AMD drivers are already optimized and that the current version of Mantle is already near its maximum performance, which is not true. I have seen a recent review where the AMD AM1 platform performed 100-400% faster with improved non-Mantle drivers. Nvidia's 50% increase is peanuts compared to that.



OK, but how do you know that AMD has not taped out on Samsung LPP/E?
 

juanrga



I read rumors that Nvidia is considering delaying 20nm Maxwell or backporting to 28nm.
 

jdwii



Wow, that's nuts. I hope we will see more PC games now. Never thought they would do such a thing.
 

Interestingly enough, I honestly think Microsoft is taking the best approach. The PS4 was designed with only games in mind, but the 360 outsold the PS3 by a lot and was used a lot for media like Netflix and such. The XB1 is in a great position to become the all-in-one media hub, much like an HTPC, while the PS4 will be considered just a console.

It is a lot like the Wii U. It is just a console and a lot of people don't care for it; its sales are doing pretty badly. The XB1 is actually selling well, better than most people think. A lot of people think the PS4 is doing better because it was sold out more often, when in reality it was sold out because they didn't produce as many units. It is a lot like the PS3: it sold out a lot at launch because there were few units produced compared to the Wii or 360.



The other issue is the support. While I have no issue with open source, there is the issue of support and even fragmentation, much like, say, Android. OGL is a good API, but it doesn't have the support, and while sometimes it will be equal to DX, there are times when it won't be if the people behind it do not push it.

If it had the backing that DX had I think it would be a different story.

The same can be said about Mantle. While it has a bit better support from AMD, I don't think AMD will be able to keep up with MS in terms of adding changes. It is a lot of money to continually push new software, and if you look at drivers, AMD has been pretty slow on new WHQL non-beta drivers.



This has actually been known for a while. Everyone knew it was a Windows 8 kernel running on top of a base OS, with a separate Xbox OS alongside.

That said, while it seems inefficient to us, it is probably better for the XB1 and how it is meant to run. As well, it is a single set of hardware, so the entire thing gets optimized well beyond what MS can do for a PC.

With the XB1 being so close to the PC and running DirectX, I think MS has the advantage over OGL or Mantle. We will see in the long run, but I do not see DirectX falling by the wayside for quite a while.

BTW, no more Linux talks.
 

Cazalan



Their R&D budget keeps going down.

If they had, I'm sure they would have found some way to spin it as positive news by now.
 

etayorius





I think the PS3 was complete crap... but in the end the PS3 actually outsold the Xbox 360 by a couple of million. If I were to choose between the PS3 and the Xbox 360 I'd take the 360 without thinking about it; this generation I would easily pick the PS4 over the Xbone.



 

juanrga



Currently Sony is selling about three PS4 units for every two Xbox One units sold. It seems the PS4 is not doing so badly for being "only a console".

http://www.dailyfinance.com/2014/04/23/ps4-versus-xbox-winner/



Current AMD is different from previous AMD.

I believe that AMD was already informed about GF's plans and that when Dr. Su announced taping out at 14nm, AMD was aware of the Samsung process. But even if this is not true and AMD taped out for the now-dead 14XM process, the Samsung/GF partnership includes a "copy-smart" approach that allows "designs built at one foundry to ramp up smoothly at another".
 
I think the PS3 was complete crap... but in the end the PS3 actually outsold the Xbox 360 by a couple of million. If I were to choose between the PS3 and the Xbox 360 I'd take the 360 without thinking about it; this generation I would easily pick the PS4 over the Xbone.

The PS3 had its good points, but it was, much like the N64, too unique for devs to really get used to.

Sales-wise right now though, it's not even close: 7 million PS4s to ~4.2 million Xbox Ones [5 million shipped, with a backlog of ~800,000 on store shelves]. The fact that Sony is winning in NA/UK, the only two markets where the 360 beat the PS3, is telling.
 
AMD says no to the budget tablet market:
http://vr-zone.com/articles/amd-says-budget-tablet-market/76441.html
AMD has massive untapped opportunities in tablets. For example, make A57 cores + GCN 1.1/2.0 SoCs, make them socket-compatible with the AM1 platform, and have them co-exist with Beema. AMD could also try to penetrate the growing Chromebook market with its APUs, SoCs and ARM SoCs.
In other news, ARM's revenue shrank a bit due to cheap smartphone sales. The ARM ecosystem has hit the "cheap is good enough, don't need high end" milestone at a lightning-fast pace.

AMD CSO John Byrne talks ARM:
http://www.fudzilla.com/home/item/34568-amd-cso-john-byrne-talks-arm
Also on Tilda Swinton playing Emma Frost in the next X-Men movie.... :pt1cable:

AMD Chief Sales Officer thinks GPU leadership is critical:
http://www.fudzilla.com/home/item/34567-amd-chief-sales-officer-thinks-gpu-leadership-is-critical
Or Lilandra, or Professor X himself..... :whistle:

Dragon Age: Inquisition trailer shows Frostbite-fueled gameplay:
http://techreport.com/news/26352/dragon-age-inquisition-trailer-shows-frostbite-fueled-gameplay
Frostbite 3... this could be the next Mantle-enabled game.
 

truegenius


This sentence of yours points out the failure of the Xbox One as a gaming console.

The first and most important role of any gaming console is to play games, at which the XB1 failed.
IMO, even current-gen tablets and phones are great as HTPCs: they have the tech to connect to your TV and other gadgets (like a storage drive full of movies) and thus serve as a better HTPC, cost less, and do the job of a phone/tablet too. And the XB1 may not support H.265, while tablets and phones are starting to support it.

When the specs of the XB1 and PS4 were revealed, I thought at the time that GTX 690-like GPUs would become the minimum to play games, but it looks like this gen's consoles (especially the XB1) are trying to catch up with current-gen PCs.
 

jdwii



Agreed, and as for a media PC, I could probably build one for $300 that can do that. Not to mention the PS4 could do the same stuff if it got a software update. Really, all people care about is online streaming, at least in the console market, not so much hooking a console up to a Comcast box or satellite box. And the camera was a horrible idea, really gimmicky; it was kind of like Microsoft wanted the casual crowd versus the hardcore gamer crowd (cough, Wii).
 


There's a saying in Spanish that goes: "no desvistes a un santo para vestir a otro". Meaning: "you don't undress a saint to dress another one".

IBM is another take on what Intel is today; or say, another type of monster, but a monster still. IBM is no superhero, but I'll grant that, in their desperation, they actually had to give in and ask for help through that consortium. I wonder, if this pans out, what they will do. They're IBM; they can go out and pay whatever legal fees they have and whatnot, so good on one hand, but I still won't give them any cookie, haha.

Cheers!
 