AMD CPU speculation... and expert conjecture


8350rocks

As long as the evolution from GCN is not too drastic, it will be easy to maintain.

Also, picture a world where every Steam Machine carries an AMD GPU to support Mantle on Linux better than Nvidia can support OpenGL.

If they play their cards right, and developers take the cues and run with them, AMD could easily gain 10-15% market share in a matter of 6 months... which would take the 35-40% market share they currently enjoy in GPUs and turn it into roughly half or better.

EDIT: You really, really....REALLY lowballed that one. AMD has much better GPU market share than they do CPU market share...
 


I didn't say dGPU, did I? You have to factor in Intel's share here, and that's about 40% of the total market.
 

you may have considered it, but it never seemed that way in your posts for the duration of this discussion/argument. actually, your posts from the beginning reflect that. check out portions of your own posts regarding the yields, where you drone on and on about yields being extremely good because there are no dual core parts, or because defects would show up in cpus first. that's why i ended up not understanding why you were completely ignoring the igpu and had to ask you for an explanation, because your "logic" did not make sense to me (it still doesn't). since it is about my not understanding your point of view, it is not lying. your accusation is utterly, entirely baseless and plain wrong. here i thought i could participate in a healthy argument; instead it turned into cliched, useless, baseless blamethrowing.

i went back and re-read up on chip manufacturing. there are several steps in manufacturing and defect-testing: binning (during wafer testing), die harvesting and speed binning. from wafer testing to the final test, there is no mention of yields being "extremely good" or even "good" just because a defect showed up in one part of the die (in your case, the cpu) earlier than anywhere else on the rest of the die. one testing procedure might miss one or several types of defect because of automated tools or economic or feasibility concerns, but the others catch them. fabricators know this, hence they employ exhaustive testing to ensure that no flaw, no matter where it appears, sooner or later, escapes quality analysis and makes it to the customer. and that pretty much breaks your "logic" about "extremely good yields", arguing about a "less modular part" (what the heck is that supposed to mean, anyway?) or a "2:8 ratio" (another wth is that, anyway). from the available information, including the links you posted, defects do seem to exist in glofo's kaveri yields, so your "if defects exist..." is moot.
i didn't ask why the igpu is being called "less modular" or what the 2:8 ratio means (i assumed it's 2 modules : 8 compute units... whatever) because the base part of your "logic" is wrong.
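(editor's sketch) the yield point above can be made concrete with the standard Poisson defect model used in back-of-envelope die-yield estimation: the chance a block is defect-free is roughly e^(-area x defect density). the die area, block fractions and defect density below are illustrative assumptions, not GloFo/Kaveri data; the point is only that a larger block (like the igpu) is more likely to catch a defect than the smaller cpu modules, so "cpu yields look fine" says little about whole-die yields.

```python
import math

def die_yield(area_mm2, defect_density_per_mm2):
    """Poisson model: probability a block of the given area has zero defects."""
    return math.exp(-area_mm2 * defect_density_per_mm2)

# Illustrative numbers only: a ~245 mm^2 die, with the iGPU assumed to be
# ~47% of the die and the CPU modules ~25% (rough figures, not measurements).
D = 0.002  # assumed defects per mm^2
die = 245.0
gpu_area, cpu_area = 0.47 * die, 0.25 * die

p_gpu_clean = die_yield(gpu_area, D)
p_cpu_clean = die_yield(cpu_area, D)
print(f"P(iGPU defect-free) = {p_gpu_clean:.3f}")
print(f"P(CPU defect-free)  = {p_cpu_clean:.3f}")
# The larger iGPU block is more likely to catch a defect than the CPU
# modules, so defect-free CPUs do not imply defect-free dies.
```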

edit: forum is acting up on my end, wonder if the forum is going through maintenance..
anywho,

i think the low end a4 apus will launch alongside the others in the second salvo. announcement and retail availability are different. i didn't really check when those trn/rcd apus came out. iirc, trinity and richland dual module apus were available quite soon after announcement while the athlons were m.i.a. for months.
i looked up this for reference,
http://www.cpu-world.com/CPUs/Bulldozer/AMD-A4-Series%20A4-5300.html
it shows the price history graph starting from late 2012, when the other desktop trinities launched (launch reviews came out). the richland a4 6300 came out later (after the richland apus launched), in october, judging by cpu-world dates.
http://www.cpu-world.com/CPUs/Bulldozer/AMD-A4-Series%20A4-6300.html
non-k apus seem to ship to oems first, then to retail shops. in the case of kaveri, i don't know.. i haven't seen the announced apus on shopping sites. afaik, amd didn't announce prices, only the specifications. amd dropped the a10 7850k's price very recently, so retail availability (of the new apus) should be imminent.
 

colinp

I'm guessing that if you're the sort of person who is happy with the performance of an Intel - ahem - gpu, then a) You don't know what a gpu is b) You don't know what a driver is and c) You don't play anything more complex than Angry Birds.
 

logainofhades



Intel's IGP can play League of Legends and a fair amount of older games like FFXI Online. Not all gamers are into the more hardcore games like BF4 or Crysis 3. Personally, none of the newer games out there really interest me at all anymore. Unless something manages to catch my eye, I will probably be done gaming once I quit WoW. The only exception would be emulators to run classic NES and SNES games. I am old school like that, because that is what I grew up with.
 

juanrga



AMD controls the main consoles, and porting from consoles to Mantle is easier than porting from consoles to DX11.

Moreover, several devs have explained why they will support both MANTLE and DX12.

And more devs are adopting and praising MANTLE:

http://community.amd.com/community/amd-blogs/amd-gaming/blog/2014/06/10/more-mantle-games-are-on-the-way

When AMD moves to another graphics architecture, e.g. the one currently being developed by Koduri for release in 2016, AMD will only need to change the MANTLE driver.



To the best of my knowledge, the linux version of CIV will not support MANTLE, basically because MANTLE for linux doesn't exist yet.

As mentioned before, AMD is now focusing on the MANTLE SDK; once it is in final form and Valve is ready, MANTLE will be ported to SteamOS/linux/Mac. Or that is how I would do it.



If you had stated that you don't understand my argument, or that you understood it but disagree, everything would be fine; but this is not what you did. You have been insisting that I ignored the GPU after I stated very clearly that I didn't. You lied. You even pretended that I don't know that Kaveri has a GPU inside, despite the fact that I have published an article about Kaveri where I predicted GPU performance. You trolled.

In any case, this is going nowhere, just like your discussion about a hypothetical Kaveri with system RAM + GDDR5 VRAM... which I stopped answering. I am not really interested.




This is clearly wrong. Moreover, one has to look forward, not backward:

1) Iris Pro is just behind Kaveri GPU performance (you can barely play BF4 on both).

2) Broadwell will bring between 40% and 80% better performance than Haswell, depending on the source of the leak. I think that a very good estimate, from considering the increase in EUs plus the update to the Iris Pro graphics level for the K-series, is 60% ~ 20% + 40%.

3) Desktop Broadwell K will include the Iris Pro graphics level. This means that an i5-5000K would have a GPU faster than the current Kaveri A10.

4) And Skylake will bring an improved graphics architecture:

http://www.kitguru.net/components/graphic-cards/anton-shilov/intel-to-boost-graphics-performance-of-skylake-gpus-by-50-per-cent/

(1-4) are the reasons why AMD is preparing a new graphics architecture, with stacked RAM, for its 2016 APUs.
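(editor's sketch) the "60% ~ 20% + 40%" shorthand in point (2) adds the two estimated gains; if the EU increase and the Iris Pro tier upgrade compound instead, the combined gain comes out a bit higher. The 20% and 40% figures are the leaked estimates quoted above, nothing more:

```python
# Rough arithmetic behind the Broadwell estimate in (2).
eu_gain = 0.20    # leaked estimate: more EUs
iris_gain = 0.40  # leaked estimate: Iris Pro graphics tier

additive = eu_gain + iris_gain                    # the "~60%" shorthand
compounded = (1 + eu_gain) * (1 + iris_gain) - 1  # if the gains multiply

print(f"additive:   +{additive:.0%}")    # +60%
print(f"compounded: +{compounded:.0%}")  # +68%
```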
 


The consoles do not support Mantle and if a game is a XB1 exclusive, it is actually easier to port to DX11 since it is using DX11 as its API and a Windows 8 kernel.

All of this Mantle praise is great if you have a weaker CPU. I tested it, and Mantle does not benefit me and my i5 at all. Then again, I don't test at low game settings but maxed out; why else would I have a top-end GPU? I mean, who is going to buy an R9 290X to play on low settings at 1080p?

As well, Mantle still needs to add support for Nvidia, or AMD needs to push it, because without that it is pointless. Game devs will write for it for a while, but when it starts becoming pointless they will stop and move back to just DX12 or whatever. Right now, Nvidia still has the majority of the discrete GPU market. Unless AMD magically changes that, Mantle still covers only a small part of the GPU market.

As for Intel, I said this a while ago. People laughed at Intel back when the Pentium 4 was out and they were talking about Conroe and how the performance was better, as was the power consumption. People said their IGP would never be that good, yet every version they release is decently close to what AMD has out at the time, and even Iris Pro is not that bad considering it is supported in newer games like Watch_Dogs and Wolfenstein TNO.

This is much like Terascale, the 80-core chip that had 1 TFLOP of performance in a 62W thermal envelope. Intel will do anything they want, as they have the funds and experience to do so.

AMD moving to stacked RAM for a GPU is the next logical step, though. It is the only way to actually get better IGP performance, as even just on-package RAM will give much better bandwidth and lower latency than using system RAM. Of course, I am willing to bet that by then Intel will move stacked RAM on-die, which will be even better.
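(editor's sketch) the bandwidth gap that motivates stacked RAM can be put in rough numbers. The figures below are ballpark assumptions for illustration (dual-channel DDR3-2133 versus a single first-generation HBM stack with a 1024-bit bus at 1 Gbps per pin), not vendor specs for any particular APU:

```python
# Back-of-envelope peak memory bandwidth, supporting the stacked-RAM point above.

def peak_bandwidth_gbs(mt_per_s, bus_bits, channels):
    """Peak bandwidth in GB/s: transfers/s * bytes per transfer * channels."""
    return mt_per_s * 1e6 * (bus_bits / 8) * channels / 1e9

ddr3 = peak_bandwidth_gbs(2133, 64, 2)    # dual-channel DDR3-2133, ~34 GB/s
hbm  = peak_bandwidth_gbs(1000, 1024, 1)  # one HBM gen-1 stack, ~128 GB/s

print(f"DDR3-2133 dual channel: {ddr3:.0f} GB/s")
print(f"HBM stack (1 Gbps/pin): {hbm:.0f} GB/s")
```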
 

Cazalan



[Image: JPR graphics chip market share, Q1 2014]


Intel's "good enough" graphics are severely putting the squeeze on.
 


Yeah, Maxwell is very efficient. Until the release of the 750 / 750 Ti, the Radeon HD 7750 was the most powerful bus-powered card available.

The 250X you linked is actually a re-badged HD 7770, which launched years ago, so I have no doubt AMD will be launching some more efficient cards soon that will compete with Maxwell.
 
sigh, this is no longer anywhere near a tech argument, just baseless blamethrowing. i'll still try, once again, to clarify my position. since this is now a straw man argument, as i will demonstrate below, i won't engage further.

oh, but i did state it. very clearly. more than once. here's the first time, in case your memory needs refreshing:
i have to stoop to underlining the part. it's become that sad.

i didn't understand, let alone agree or disagree. that's why i asked You for further clarification.

i never insisted. i said i didn't understand why you were stuck on the cpu side only, and i stated why i thought that. it was only after i repeatedly asked that you barely mentioned that you "considered" it, while your posts showed otherwise. i can re-post what you said if you keep saying you didn't ignore the igpu.

... no, i didn't lie, let alone troll. it got to the point where your ignoring of defects affecting the igpu made me wonder if you were ignoring the presence of the igpu on purpose, especially the fact that the igpu can be affected as much as the cpu.

at this point, it's become clear that you created a straw man argument right after i expressed my lack of understanding of your p.o.v., despite me explaining my side several times. that's why none of your accusations hold any credibility. i have always maintained my stance and i stand by my statements.

it was open for participation from everyone. your interest is irrelevant, but participation is appreciated nonetheless. :)
 
The consoles do not support Mantle and if a game is a XB1 exclusive, it is actually easier to port to DX11 since it is using DX11 as its API and a Windows 8 kernel.

And on the PS side, I know the libgcm library is still used for low-level HW access (and I'm assuming some form of OGL for higher-level control). And libgcm is a FAR lower-level library than Mantle, so porting it to anything is a headache necessitating a major re-write; libgcm is a true "to the metal" library.
 
two new entry-level microatx socket fm2+ motherboards piss off xbitlabs:
http://www.xbitlabs.com/articles/mainboards/display/asus-a88xm-plus-gigabyte-ga-f2a88xm-d3h.html
that's not the actual article headline.
i wonder why gigabyte's o.c. didn't work. maybe the bare-bones power delivery?

not really amd cpu related:
Intel loses bid to sidestep $1.4 billion EU fine
The EU's second-highest court says it agrees with the European Commission in deciding that Intel acted unfairly against AMD.
http://www.cnet.com/news/intel-loses-in-bid-to-sidestep-1-4-billion-eu-fine/

AMD Announces Latest Step in Multi-Year Strategic Transformation
http://www.techpowerup.com/202000/amd-announces-latest-step-in-multi-year-strategic-transformation.html

 

Cazalan




Suing your customers probably isn't a good idea. The question is how much of that fine will make it back to AMD, and with what kind of payment schedule. It would certainly help their bottom line but they really need to pour that into R&D.
 

juanrga



The consoles do not support Mantle, that is right, but Mantle imitates/reproduces part of their low-level APIs and allows for easier ports.

Microsoft has released DX12 for XB1. And porting from DX12 to Mantle is the simplest of all the ports due to close similarities.

Mantle improves performance in situations of CPU bottleneck. This bottleneck appears much more easily with weak CPUs, but it also exists for strong CPUs. For strong CPUs, the bottleneck appears when using two or three high-end GPUs in parallel. This is why boutique builders of gameboxes and similar high-end gaming PCs are overclocking $1000 CPUs to feed the multiple GPUs. I already shared here some gaming benchmarks showing how Mantle improves the performance of an extreme i7 CPU when gaming with two R9 290Xs in crossfire.
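(editor's sketch) the bottleneck argument can be illustrated with a toy frame-time model, not an actual Mantle benchmark: a frame takes roughly max(CPU submission time, GPU render time), so cutting API/driver overhead only helps when the CPU side dominates, e.g. when multiple GPUs share one CPU. All timings below are assumed for illustration:

```python
def frame_rate(cpu_ms, gpu_ms):
    """Toy model: the slower of CPU submission and GPU rendering limits fps."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# Assumed illustrative timings, not measurements.
single_gpu = frame_rate(cpu_ms=8.0, gpu_ms=12.0)       # GPU-bound: API overhead moot
crossfire = frame_rate(cpu_ms=8.0, gpu_ms=6.0)         # two GPUs halve render time
crossfire_mantle = frame_rate(cpu_ms=4.0, gpu_ms=6.0)  # lower-overhead API helps

print(f"single GPU:        {single_gpu:.0f} fps")
print(f"crossfire, DX11:   {crossfire:.0f} fps")   # now CPU-bound at 8 ms
print(f"crossfire, Mantle: {crossfire_mantle:.0f} fps")
```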

I am convinced that Nvidia will never support Mantle, but if Nvidia want prove me wrong they are welcome!

About Intel, I understand your sentiment perfectly. Some people here did laugh at some early claims that were later confirmed.



That is the quote where you lied and trolled: you said something that is untrue, and then you pretended to explain to me what an APU is, as if I didn't know.

I know my interest in your VRAM speculation is irrelevant; I can see the massive interest that your speculation generated and how everyone is actively participating in it. No, wait, everyone is ignoring it...



What sense does it make to compare a new architecture designed around efficiency against an old architecture that was not?

AMD's forthcoming Tonga is expected to be the 750 Ti competitor:

http://www.digitaltrends.com/computing/amds-upcoming-tonga-gpu-may-target-nvidias-geforce-750-ti/
 
They make back more money than the case would fine them, so they will keep doing it. All the tablets with Intel CPUs are being subsidized by Intel, like the PCs back then were; I wonder when the lawsuit over that will be resolved.
 