AMD CPU speculation... and expert conjecture

Page 680 - Seeking answers? Join the Tom's Hardware community: where nearly two million members share solutions and discuss the latest tech.
Status
Not open for further replies.

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


The situation is entirely different now: the market has changed, the tech has changed, the finances have changed, the players have changed, and even the antitrust agencies are no longer watching Intel closely...

AMD is not making a comeback anymore. I held out hope for a while when the relevant financial/technical data was unknown, but now the gap with competitors is too big, and the latest cuts to R&D and engineering staff will only widen it. AMD will split/reorganize/sell and will survive à la VIA in some niche market.
 

Rafael Luik

Reputable
Sep 18, 2014
86
0
4,660
If we count GPUs on the CPU die, then Intel owns like 60% of the GPU market.

Just saying, your metric makes AMD look even worse by comparison.

And NVIDIA too, but isn't that company invincible?
lol

...
Just because Intel has the majority doesn't mean AMD and NVIDIA can't have a niche market to survive in.
 

jdwii

Splendid
If that niche market weren't the hardest one to pull off, maybe. But who would buy a 290X-like product for the same price as a 970? Who would buy an FX CPU priced the same as an i5? It's no secret, and it's no threat to Intel. Intel's main threat in the market is ARM-based designs. Using this Snapdragon CPU, I can say it's powerful enough for most people when paired with the right software.

You can't keep making big mistakes all the time and expect your stockholders or your business partners to be happy. Jaguar was a success, and GCN I would definitely call a success, but those are small wins overall. Bulldozer really took them down in the server market and in the client market, and they had no clear strategy in mobile devices. I remember reading, when Bobcat came out, that AMD was claiming netbooks still mattered, like wtf. Honestly, if it weren't for winning the consoles, I'm not sure they would even be here two years from now.

I've been following AMD for quite some time myself, and they always market that next big thing that will save them; this time around we have the K12 and Zen. I'm pretty sure Zen will have many cores like Bulldozer, so I'm not sure about its power consumption or performance per core. It had better be good, though, or they've lost again. K12, being ARM-based, will probably do a bit better in terms of power consumption.

Did anyone here find it odd when Rory Read stepped down?

As for our little market, I'm not sure, but Nvidia is worth like 3-4 times more than AMD while only making GPUs and ARM designs, which they have experience with. Actually, I'm kind of mad I couldn't get the Nexus 10 with their new design, but then again it's a bit too big for me; I love my 7-inch tablet.
 

colinp

Honorable
Jun 27, 2012
217
0
10,680
The first two are pension funds, I think, typical institutional investors. Vanguard owns a chunk of Intel too.

Mubadala may not have bought the shares. It looks like they were part of some issuance, similar to an issuance that a company makes to its employees. I.e. AMD will have bought the shares on the open market to issue them for bonus reasons instead of cash. And 1.5 million shares is a drop in the ocean.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790
As some of you know, I am preparing an article about high-performance APUs that will be published very soon (I expect). I have already explained why AMD chose APUs instead of the slower dCPU+dGPU combination. AMD will also use APUs for in-memory processors (PIM). This is why:

- The CPU and GPU components support familiar programming models and lower the barrier to using in-memory processors.
- The programmability allows a broad range of applications to exploit PIM.
- The use of existing GPU and CPU designs lowers the investment and risk inherent in developing PIM.
- The architectural uniformity of the host and in-memory processors eases porting of code to exploit PIM.
 

blackkstar

Honorable
Sep 30, 2012
468
0
10,780


Throwing away features to reduce power consumption does not make something more efficient. It just makes it use less power. If you want an absurd example, consider a rock. As a GPU, it's missing a lot of features that GCN and Maxwell have. But it uses a lot less power.

But my point, as ridiculous as a rock GPU is, is that you can't just remove things and then declare you've improved efficiency. If Maxwell were a real winner, these cards would be GPGPU beasts as well as gaming beasts. Instead, Nvidia opted to remove features to increase performance as well as reduce power use. And you can see they are playing some sort of odd game: power consumption while gaming is fantastic, but switch to GPGPU and it's awful relative to the power used while gaming.

That's not increasing efficiency, because efficiency implies fewer resources used to complete the same tasks. It is blatantly removing things to decrease the resources used.
 

colinp

Honorable
Jun 27, 2012
217
0
10,680
It's true that your usage decides what will be most efficient. I remember watching Top Gear (the UK motoring programme), where they proved that a BMW M3 was more efficient than a Toyota Prius (*).

Most consumers will mainly be interested in gaming though, in which case it's hard to argue that Maxwell isn't a great arch.

(*) Here's how: you have to drive the Prius as fast as it'll go around a racing track and let the M3 keep up at the same speed.
 

jdwii

Splendid


Actually, I find this statement hilarious. Something can be efficient in one area while being pretty inefficient in another. For example, CPUs suck at parallel workloads; does that make them inefficient at serial workloads too? What about my headphones? They work great for listening to sound over my ears but suck when they're not on my ears. Does that make them inefficient? No, because why would I use them like that? You use products for specific workloads.

Now let's break this down (power figures from http://tpucdn.com/reviews/ASUS/GTX_980_STRIX_OC/images/power_maximum.gif):

980: 4.616 TFLOPS at 156 W
290X: 5.600 TFLOPS at 258 W

Performance per watt:
980: 29.59 GFLOPS/W
290X: 21.71 GFLOPS/W
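Those performance-per-watt figures are simple arithmetic; here is a minimal Python sketch reproducing them, using the peak SP throughput and board-power numbers quoted above:

```python
# Peak single-precision throughput (TFLOPS) and measured maximum
# board power (W), as quoted in the post above.
cards = {
    "GTX 980": (4.616, 156),
    "R9 290X": (5.600, 258),
}

# GFLOPS per watt = (TFLOPS * 1000) / watts
for name, (tflops, watts) in cards.items():
    print(f"{name}: {tflops * 1000 / watts:.2f} GFLOPS/W")
# GTX 980: 29.59 GFLOPS/W
# R9 290X: 21.71 GFLOPS/W
```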

LOL, keep thinking up more things; so far I'm not convinced, and neither are these guys:
http://www.fool.com/investing/general/2014/11/22/tech-news-ibm-and-nvidia-will-power-the-worlds-fas.aspx
 

colinp

Honorable
Jun 27, 2012
217
0
10,680
Some predictions for AMD in 2015:

1. Carrizo will be released on mobile only and will be to Kaveri what Richland was to Trinity, i.e. a relatively minor bump in arch. A few tech sites will review a reference laptop. The general consensus will be "it's OK, nothing special. Average CPU, decent iGPU." Very few real laptops will make it to market.

2. Nothing to excite desktop users on the CPU front. At all.

3. Next gen GPU will generate some excitement and we'll get a competitive desktop market again, but only if AMD are able to obtain enough working chips from whoever is fabbing them. They won't. Not for a while at least. On the mobile side, AMD will continue to struggle to win deals.

4. AMD will still be here in 12 months somehow

5. Mantle won't make it to Linux

Any others?
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


I love to see IBM joining our club:

Simply increasing the performance of the CPUs and GPUs isn't enough, because much of the work being done when Big Data is involved is simply moving data back and forth between storage and the processors. As IBM stated in its press release, "The popular practice of putting design emphasis solely on faster microprocessors becomes progressively more untenable because the computing infrastructure is dominated by data movement and data management."

I love to see my thoughts about efficiency applied to get 5--10x faster supercomputers:

Titan, the system Oak Ridge is replacing, uses 8.2 MW of power. Summit, the larger of the two new supercomputers, will use about 10 MW of power, but it will provide five to 10 times the performance of Titan. This increased energy efficiency is due to efficiency gains in NVIDIA's GPUs and the minimization of moving data around the system.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


Do you know anything about Maxwell?

Nvidia has increased efficiency by adding stuff. They added more control logic to improve power management; they added more FP32 CUDA cores, special function units, and load/store units; they added more L2 cache, from 256KB to 2MB... More cache, for example, improves performance while reducing power consumption, by cutting accesses to GDDR5.

Moreover, the graphics cards have received new rendering features to support DX12 and OpenGL 4.5. The new architecture also delivers real-time global illumination through one-bounce path tracing.

On the GPGPU side, everyone knows that Nvidia owns the market with 80--90% of the share. Nvidia has the fastest GPGPU cards (I provided the link to Nvidia owning the benchmark record), and its efficient design has been chosen to power future supercomputers that will be 5--10x faster than Titan (jdwii gave the link).
 


That is why speed can matter. It is great if something is efficient, but sometimes getting the job done much faster outweighs the other side's efficiency.

That is why, when Nvidia added components, it got more efficient overall. If one part takes 25% less time to complete a task and the opponent draws 25% less power, more than likely it will even out, or even favor the faster component.

Of course you can also have both: a new product that combines faster performance and higher efficiency. It is rare, though, as those two do not always coincide.
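To see why a speed gap can offset a power gap, consider the energy to finish a fixed task (energy = power × time). A sketch with made-up numbers, assuming one card draws 25% less power while the other finishes 25% sooner:

```python
def task_energy(power_watts, seconds):
    """Joules consumed to finish the task: energy = power * time."""
    return power_watts * seconds

# Hypothetical cards: A finishes 25% sooner; B draws 25% less power.
e_fast = task_energy(200, 75)        # card A: 200 W for 75 s
e_efficient = task_energy(150, 100)  # card B: 150 W for 100 s
print(e_fast, e_efficient)  # 15000 15000 -- the two effects cancel out
```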
 

truegenius

Distinguished
BANNED
*me talking to a noob user who wants to upgrade his graphics card*
"Hey, buy the 980, because despite the same TDP and slower performance than the 390X, it is more efficient."

Seems like that doesn't make any sense to me :p
I don't care where they get the efficiency from; all I care about (or an end user will) is the result.
If we buy these two cards now, and assuming drivers don't change their performance over time, then the 390 will remain the winner, regardless of whether Nvidia switches to a new fabrication process or not.
And if we assume Nvidia will gain a performance-per-watt advantage by switching fabrication, then what if AMD releases a new series or a new GPU with a refined arch at around the same time? They will have experience with the new fabrication too, and can modify their arch accordingly to take advantage of it.


BTW, about that virtual super resolution thing :p
I think I was among the first few to use it :D
I have been using it with my HD 6770 for 3 years :cheese:
My monitor doesn't support resolutions above 1360x768 at 60 Hz, but in CCC I manually set the EDID to 1920x1200 at 75 Hz (I can set it higher, but nothing works above that), and this lets me use 1080p resolution on the desktop and in games (the desktop looks horrible at that resolution, but games look better).
But AMD didn't enable VSR for the HD 6770; I was hoping for 4K on my 768p monitor :whistle:
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


The 980 is not the competitor of the 390X; the Titan 2 is :p Thanks to having a more efficient architecture, Nvidia can increase clocks and/or add moar cores on top of the 980 and remain competitive with the 390X without requiring expensive HBM, liquid cooling, and so on. And when Nvidia does the die shrink and adds HBM to its next cards, it will again be ahead of AMD, because it has an architectural advantage. It is not an assumption: power is proportional to area, and a die shrink of the same architecture reduces area, aka power consumption.
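The area argument is easy to quantify. A rough sketch, assuming an ideal shrink where linear dimensions track the node name (real processes scale less cleanly than this):

```python
def area_scale(old_nm, new_nm):
    """Relative die area after an ideal shrink: both dimensions scale
    by new/old, so area scales by the square of that ratio."""
    return (new_nm / old_nm) ** 2

# Same architecture moved from 28nm to 20nm: roughly half the area,
# and, per the power-proportional-to-area claim, roughly half the power.
print(f"28nm -> 20nm: area x{area_scale(28, 20):.2f}")  # area x0.51
```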

Of course, for the average joe it doesn't matter whether performance was obtained through a more efficient architecture or through brute force. But some of us understand that AMD's brute-force approach is useless in phones, tablets, laptops, servers, supercomputers... :p

Some of us understand that Nvidia will get design wins in all those markets, from phones to supercomputers, whereas AMD will be stuck in a niche market (HEDT), which implies it will be a step closer to bankruptcy or to splitting up and selling...

We also discussed the possibility of AMD releasing a new efficient architecture for 2016. Why do you think some of us claimed that AMD needs a new efficient architecture? It seems you missed those posts as well. :p

The problem is that AMD has cut R&D. The problem is that AMD is firing engineers and closing centers. The problem is that AMD switched from VLIW to RISC-SIMD, adding compute features that are useless to the average joe, who only wants to play games. The problem is...

As someone mentioned, AMD has been making the wrong decisions for years and has cornered itself into a pair of markets. It is barely afloat thanks to the pair of console wins, but this will not continue forever.
 

blackkstar

Honorable
Sep 30, 2012
468
0
10,780


I'm more referring to the fact that if you want to do something like render with CUDA in Blender, you're still better off buying a GTX 580 than a GTX 970. Yes, they added some transistors, but they're clearly cutting corners elsewhere to focus on specific tasks where the GPUs shine.

How is it that sites that posted Maxwell GPGPU numbers suddenly decided to make large changes to how they measure power consumption right when Maxwell released, or even went as far as removing graphs and information from reviews?

No one brings this up enough, but review sites are at Nvidia's, AMD's, and Intel's mercy. If they represent a product in a way one of those companies doesn't like, the review site stops getting its free $1000 graphics cards. No one ever factors this into their reviews, and it's a large part of why Nvidia gets much better press than AMD.

Consider what happened when TweakTown decided not to obey Nvidia's every command: they lost their free review samples and threw a giant fit over it.

AMD's position in all of this is that they don't have the products in highest demand, so review sites don't care about getting the free AMD swag; AMD, as the underdog, is going to send it to them anyway. And Intel and Nvidia know people will buy their stuff, even when it's inferior, because of brand recognition.

I'm quite sure you'll show up with single-precision benchmarks to try to refute me, but Anandtech has the GTX 980 losing to the GTX 780 Ti, 290X, 7970, 290, 780, and GTX 580 (by large margins) in F@H double precision. That is the sort of thing I'm talking about: Nvidia gutting GPGPU to improve efficiency.

And yes, I know they're gimping it for desktop parts. But enabling those units is going to increase power consumption, which shoots your whole efficiency argument out the window. In fact, we haven't seen any evidence that opening up the DP parts of Maxwell on professional graphics cards will maintain Maxwell's efficiency.

And I think it's rather foolish to extrapolate that a part that's cut down so far in DP will still maintain the same levels of efficiency when you enable full DP throughput. Because that's basically what you're doing when you jump from GTX 980 efficiency benchmarks to stating Maxwell will be great in the professional and HPC markets, where DP is used and not artificially gimped.

I also think that if AMD had a part that did so horribly relative to previous AMD parts in a benchmark like F@H double precision, AMD would be on a spit roast in the forums. But it gets written off because it's Nvidia, and apparently there's nothing wrong with Nvidia releasing a new part that the one from three generations ago beats by almost 50%. Praise Nvidia for increasing efficiency, actually! They're miracle workers!
 

szatkus

Honorable
Jul 9, 2013
382
0
10,780


Carrizo has an Excavator core and newer GCN. That's definitely more than Richland's level of improvement (where they only changed the frontend a little).
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


Carrizo-L will come to the desktop as a replacement for the AM1 platform. It remains to be seen whether Excavator D/E has been canceled or is still coming to FM2+.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


And once again you avoid mentioning how the 7970 GHz gives more DP TFLOPS than the newer 290X, because AMD castrated the 290X from 1/4 DP to 1/8 DP to force non-gamer users to purchase the expensive FirePro cards. Nvidia isn't evil, nor is AMD a charity NGO; both are companies, and their goal is to make money.
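The peak-rate arithmetic behind that claim can be sketched as follows (shader counts and clocks are approximate public specs, used here only for illustration):

```python
def peak_tflops(shaders, clock_ghz, dp_ratio):
    """Peak rates: SP = shaders * clock * 2 (an FMA counts as 2 FLOPs),
    DP = SP * the card's DP:SP ratio."""
    sp = shaders * clock_ghz * 2 / 1000  # TFLOPS
    return sp, sp * dp_ratio

sp_7970, dp_7970 = peak_tflops(2048, 1.05, 1 / 4)  # 7970 GHz: 1/4 DP
sp_290x, dp_290x = peak_tflops(2816, 1.00, 1 / 8)  # 290X: 1/8 DP
print(f"7970 GHz: {sp_7970:.2f} SP, {dp_7970:.2f} DP TFLOPS")
print(f"290X:     {sp_290x:.2f} SP, {dp_290x:.2f} DP TFLOPS")
# The older card wins in DP despite losing in SP.
```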

The reason why AMD lost the server market is that the latest Opterons are both slow and inefficient compared to Intel's Xeons; not because some tech writer wrote a biased review of an FX CPU.

The reason why AMD is nowhere in the tablet/phones market whereas Google and others chose Nvidia products is because AMD doesn't have anything to compete; not because some tech writer wrote a biased review about Beema/Mullins.

The reason why Nvidia owns most of the GPGPU market and has again been selected to power the faster supercomputers is that its architecture is more efficient than anything AMD has; not because some tech writer wrote a biased review of the 290/290X.

The reason why Cray has a partnership with Cavium to build ARM-based supercomputers is that Cavium has very competitive 95W SoCs (fast and efficient); not because some tech writer wrote a biased review of the forthcoming AMD K12.

And so on...

AMD's current situation is the result of bad decisions taken since Sanders' epoch; it was one technical/financial/strategic mistake after another after another, and in recent years I see the same tendency (*). It is not the result of conspiracies by malign tech writers.

Of course, AMD has also done some things right, but as a former AMD president wrote... the shrinking AMD ship has a hole too big.

(*) Dresden, ATI acquisition, FSA, K9, Barcelona, Torrenza, HTT, Globalfoundries, WSA, SOI, CMT, Bulldozer, Imageon, HSA, Linux, GCN, FF, Seattle, Skybridge, modular-dies...
 

colinp

Honorable
Jun 27, 2012
217
0
10,680
Anyone else here count AM1 as a desktop platform? No, thought not.

And you are now rating HSA as a mistake after all these months (years, even) of thinking it was the best thing since sliced bread.
 


If HSA delivers, it will turn the "AMD is not efficient" narrative around, I guess. APUs will shine and blah blah blah.

So far, the light at the end of the tunnel is dim, to put it optimistically.

Cheers!
 


If only we knew how the 390X will perform. So far it looks good on the rumor mill, but then again that can always be wrong. Remember XDR2 being the memory of choice for the 290X?

Right now Nvidia has a very efficient design. That will translate very well to 20nm.



"If" is the key word. If only we knew how it will turn out. So far we have no way of knowing.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


You must be using a different definition of desktop than everyone else then...

http://www.amd.com/en-us/products/processors/desktop/athlon
http://www.zdnet.com/article/amd-debuts-am1-platform-using-low-cost-kabini-desktop-processors/
www.pcworld.com/article/2141383/amds-brings-kabini-apus-to-the-desktop-with-affordable-upgradeable-am1-platform.html
www.pcmag.com/article2/0,2817,2456239,00.asp
http://www.anandtech.com/show/7933/the-desktop-kabini-review-part-1-athlon-5350-am1/7
http://www.newegg.com/Product/Product.aspx?Item=N82E16819113364

I think you are exaggerating a bit regarding my former position on HSA, but over the last few months I have learned a lot about microarchitectures (including future ones being designed now); I have met some engineers and graphics gurus; and now I consider that HSA is reinventing the wheel, and doing it square.
 