AMD CPU speculation... and expert conjecture

Status
Not open for further replies.

Cazalan

Distinguished
Sep 4, 2011
2,672
0
20,810


The initial game lineup wasn't that impressive to me, but then I haven't bought a console since the N64, so no surprise there. I only hope they sell well enough to keep AMD afloat.
 

Sigmanick

Honorable
Sep 28, 2013
26
0
10,530
OEMs that sell laptops with the OS installed on 5400 rpm drives should be shot. The reason I bring this up is that AMD systems are almost always relegated to the cheapest-looking, single-RAM-stick, bottom-dollar, bottlenecked builds.

I'm sure this has been discussed before in this thread, but AMD doesn't help themselves when this is the representative product, or more importantly, what people think is the representative product that AMD builds.

It furthermore hurts when review sites test "as is" (single channel memory) and the product is given an artificially low score.

If manufacturers are going to save some money on the processor, they should upgrade the storage, or at least install the dual-channel memory it deserves.

My laptop is a Llano series; it came with one 4 GB stick and a 5400 rpm HDD. Memory is cheap, so I added a second stick, which helped; but replacing the HDD is like having a new laptop.

//end rant//
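To put a number on the dual-channel point: an APU's iGPU lives off main memory bandwidth, and the peak figures are simple arithmetic. A rough Python sketch (theoretical peaks only; 64-bit channel width assumed):

```python
# Rough peak-bandwidth arithmetic for why a second RAM stick matters.
def ddr_bandwidth_gbs(mt_per_s, channels, bus_bits=64):
    """Peak theoretical bandwidth in GB/s for DDR memory."""
    return mt_per_s * 1e6 * channels * (bus_bits / 8) / 1e9

single_1333 = ddr_bandwidth_gbs(1333, channels=1)  # ~10.7 GB/s
dual_1600 = ddr_bandwidth_gbs(1600, channels=2)    # ~25.6 GB/s
print(f"single-channel DDR3-1333: {single_1333:.1f} GB/s")
print(f"dual-channel  DDR3-1600: {dual_1600:.1f} GB/s")
```

Going from one DDR3-1333 stick to dual-channel DDR3-1600 is roughly a 2.4x jump in peak bandwidth, which is a big part of why those single-stick review configs score so badly.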
 


HP pissed me off a year or two ago. I bought an HP DV6z with the A8-3550MX CPU with the goal of putting 8 GB of DDR3-1600 memory inside it. HP only ships DDR3-1333 memory, which we all know is horrible. After installing my memory kit I noticed it was still running at DDR3-1333 in the BIOS. Come to find out that HP's BIOSes never recognized more than that on the DV6 line, because they originally used the older APUs. To add insult to injury, the model I had bought came out after they had started using RSA-signed BIOSes, so I couldn't upload the modded BIOS that was available with the setting for 1600 memory. It was a super easy fix, but HP refused to support it on the DV6z's. Funny thing is, the DV6t's and DV7t's (Intel versions) both had DDR3-1600 memory support.
 

jdwii

Splendid

rmm03

Honorable
Apr 21, 2013
72
0
10,660
Also notice the RAM speed on the AMD side... 1 MHz? What is that? A lil skewed, it seems. Just sayin'. If it's gonna be fair, make it fair...
 

ColinAP

Honorable
Jan 7, 2014
18
0
10,510


I had this exact same frustration with my DV6. I always felt like the Llano was being held back by slow memory. On the bright side, it ran great under Linux, albeit with poor battery life. My biggest problem with the laptop was the poor cooling, though, especially when the dGPU was being taxed.

If HP do a DV6 version with a decent Kaveri chip and proper memory support, then I may well check it out. I hope AMD have a 45w mobile part, as that would probably be a better solution than a Llano + dGPU, at least under Linux where switchable graphics are a pain.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


That is like pretending that a 300hp car is 3x faster than a 100hp motorbike. No.

Oles Shishkovstov: No, you just cannot compare consoles to PC directly. Consoles could do at least 2x what a comparable PC can due to the fixed platform and low-level access to hardware.

http://www.eurogamer.net/articles/digitalfoundry-inside-metro-last-light

In raw compute the PS4's custom GPU will destroy a 6950. In graphical power the 2 TFLOP PS4 will behave more or less like a PC with a 4-6 TFLOP GPU.
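For context on where TFLOP figures like these come from, peak single-precision throughput is just shader count x clock x 2 (an FMA counts as two ops). A quick sketch using the widely reported PS4 GPU figures (18 CUs x 64 = 1152 shaders at 0.8 GHz):

```python
def tflops(shaders, clock_ghz, ops_per_cycle=2):
    """Peak single-precision TFLOPS: shaders x clock x ops/cycle (FMA = 2)."""
    return shaders * clock_ghz * ops_per_cycle / 1000

# Widely reported PS4 GPU configuration: 1152 shaders at 0.8 GHz
print(f"PS4 GPU: {tflops(1152, 0.8):.2f} TFLOPS")  # ~1.84
```

This is a theoretical peak only; sustained throughput in real workloads is always lower.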



This was one of the few leaked benchmarks of an engineering sample. It is a good point of comparison for what final silicon can do.

About benchmarks, I did my early predictions about a set of concrete benchmarks.

I agree that other benchmarks can disagree. If you use software optimized for Intel, or software compiled with ICC, then Intel will look better. If you use W7+CB then Intel looks better. If you use an x264 binary optimized for Haswell then an i3 looks better. If you use the OS and the x264 binary (optimized for both AMD and Intel) that I used, then Kaveri is at the i5 SB level.
 

blackkstar

Honorable
Sep 30, 2012
468
0
10,780


I wish people would give Lenovo a little more credit with the x140e. Yeah, it's still a 768p resolution, but the build quality is great. It comes with a memory card reader, 1600 MHz RAM, and a 7200 rpm HDD. Everyone complains about the price (about $550 with the A4-5000), but picking it up, it doesn't feel cheap at all. It feels so solid that the first thing my mom said when I got her one was "wow, this thing is amazing, is it waterproof?"

People want to see AMD jaguar laptops with small form factors selling for $350. It's not going to happen. AMD APUs are cheap and the rest of the system is going to cost a lot anyways if you want something of decent quality.

In the off chances where we do get good AMD APU laptops, people whine and say they cost way too much money, nearly every single time.

I love the x140e so much that as soon as I saw the ones my parents got, I turned around and ordered one just for me.

As far as other Jaguar based systems go you're looking at bad build quality like Acer V5 or MSIs or whatever. My x140e feels just as well put together as this Asus I have here that I spent $1500+ on several years ago.

I have hopes for what Lenovo is going to do with 35W Kaveri. I'm expecting them to at least give us a 13in model, maybe even 12in. Yeah, it'll cost more and people will whine, but it's going to be a heck of a lot cheaper than the Intel option.

My other options are an outdated x131e with Core i3 starting at $549 or x240 starting at almost $1000.
 

truegenius

Distinguished
BANNED

:ouch::lol:
this looks interesting
AMD gave us stale food for dinner and then gave us chocolate for dessert, while nvidia gave us fresh food but only toffees for dessert


now i am totally confused, should i suggest amd or nvidia
looks like everything is messed up :pfff: mantle made us mantel :pt1cable:

i think i will eat fresh food :p but i like chocolates too :??: why not both :fou:
 

amd has launched a lot of stuff in quick succession for multiple markets. wait for a couple of months at least, before making any decisions. because people who wanted to buy kaveri, r9 290/x, try out mantle are already doing so. the budget conscious need more time.

one thing is sure, as long as cryptocurrency fad continues, high end radeon prices will stay high. as a result, nvidia won't have any reason to lower prices. both amd and nvidia win, customers lose.
 

8350rocks

Distinguished
Anyone mining cryptocurrency these days using GPUs has mostly missed the boat. FPGAs and ASICs are where the big money in that is, and it is one of the few ways to still turn a profit doing it. However, as I understand it, there are other less in demand currencies than bitcoin, so that may be what people are using those radeons for at this point. However, I know little of the newer currency and bitcoin is pretty tapped out at this point.
 

Embra

Distinguished


I think Litecoin is the biggest demon:

http://www.techpowerup.com/196320/why-the-litecoin-craze-hurts-more-than-helps-brand-amd-radeon.html
 


There is no more money to be made in mining BTC. Speculation has already taken over and hyper-inflated the price. The moment people started making cheap custom ASICs it was over, as nothing anyone else can do comes close to how fast those things pump out SHA-256 hashes. LTC, IMO, has a much better hashing algorithm, one that is resistant to such mass mining methods.
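The SHA-256 vs. scrypt contrast is easy to demonstrate with Python's `hashlib`. A sketch only: the header bytes below are a placeholder, not a real block header, and n=1024, r=1, p=1 are the scrypt parameters commonly cited for Litecoin:

```python
import hashlib

header = b"example block header bytes"  # placeholder, not a real header

# Bitcoin-style proof of work: double SHA-256. Cheap to compute and
# trivially parallel, which is why ASICs dominate it.
btc_hash = hashlib.sha256(hashlib.sha256(header).digest()).hexdigest()

# Litecoin-style proof of work: scrypt, which forces each hash to walk
# a memory scratchpad, making naive ASIC parallelism far more costly.
# (Requires Python 3.6+ built against OpenSSL 1.1+.)
ltc_hash = hashlib.scrypt(header, salt=header, n=1024, r=1, p=1, dklen=32).hex()

print("double SHA-256:", btc_hash)
print("scrypt:        ", ltc_hash)
```

The memory-hardness is the whole point of the "resistant to mass mining" argument: SHA-256 cost is almost pure logic, while scrypt cost is dominated by RAM access.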

 

ShadowofGhosts

Honorable
Feb 6, 2014
55
0
10,660
I don't see any reason why AMD doesn't make an FX Steamroller CPU, or any Steamroller CPU running on AM3+. The FX processors were quite successful after their second generation came out, and were a cheaper alternative to Intel. It would make sense for AMD to make higher-transistor-count CPUs to compete with Intel, which is moving towards a 14nm process. Unless AMD moves beyond releasing new 32nm CPUs, the high-end market will be dominated by 14nm Intel CPUs.
 

ShadowofGhosts

Honorable
Feb 6, 2014
55
0
10,660


But wouldn't scrypt in Litecoin make it even less profitable in the future, given that difficulty will increase, and the difficulty increase will outpace the release of stronger mining GPUs? ASICs will have to exist for the sake of the longevity of the currency.

Honestly, I think that AMD can make big $$ if they make cards specialized for cryptocoin mining, which will also make the prices of high-end AMD Cards go back to what AMD intended them to be (lower than Nvidia, which will give AMD more customers), and also get cryptocoin miners to buy their ASICs, which equates to more $$
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


Plenty of reasons were given in this thread.

1) Change of strategy in servers towards ARM: first, because AMD's current x86 architectures cannot compete with Intel in efficiency, and second, because a new, more powerful player enters the scene this year: ARM. AMD already admits that "ARM will win". The cancellation of the Steamroller/Excavator Opterons implies AMD will not release Steamroller/Excavator FX chips derived from them.
[image: Screen-Shot-2014-01-28-at-5.58.23-PM-640x358.png]


2) The future is APUs for everyone (Intel, AMD, Nvidia...). AMD and Nvidia plan only APUs for 2018-2020, because a CPU alone is not enough for exascale compute. Intel starts migrating from discrete cards to APUs (note Intel calls its APUs "CPUs") next year. AMD is starting to migrate towards an all-APU strategy. CPUs and GPUs will disappear.
[image: Xeon-Phi-Knights-Landing-GPU-CPU-Form-Factor-635x358.png]


3) APUs represent most of AMD's revenue. The FX brand represents a small percentage. The FX-8000 series accounted for something like 1% of AMD sales. In Steam, the FX-8000 series accounts for about 0.3% of gamers. And predictions are for APUs increasing in importance.
[image: APU-AMD-Kaveri-02.jpg]

[image: APU-AMD-Kaveri-03.jpg]


Note that the weight of APUs in sales/revenue has already increased with the console contracts. AMD climbed out of the red thanks to the Xbox One and PS4.
 
Going to comment specifically on this:

[image: APU-AMD-Kaveri-02.jpg]


What, exactly, does that mean? An i7-2600K has a GPU built into the silicon, but it's not used for anything in my system. Is that counted? What about Intel chips that have an iGPU that is disabled for one reason or another?

So yeah: that chart is a meaningless statistic, since most of those 9/10 PCs aren't using the iGPU they have access to.
 


The answer is in the graph itself: "Already shipping with CPU and GPU on a same piece of silicon".

Cheap Intel lappies do have just the iGPU for use, and that's like the BIG majority of lappies out there. I don't know about desktops though, but at least office ones, I'm pretty sure it is the same deal.

Cheers!
 

juggernautxtr

Honorable
Dec 21, 2013
101
0
10,680
"What, exactly, does that mean? An i7-2600K has a GPU built into the silicon, but it's not used for anything in my system. Is that counted? What about Intel chips that have an iGPU that is disabled for one reason or another?

So yeah: that chart is a meaningless statistic, since most of those 9/10 PCs aren't using the iGPU they have access to."

correction: 9/10 processors made are APUs, but the iGPU is rarely utilized without a dGPU.

there would be no point in continuing to make dGPUs if there were no further FX or other CPU line. AMD's APUs are nowhere near strong enough to run a dGPU full out. and it's already been hinted at by others that the FX line will be continued, for desktop use at least.
 

blackkstar

Honorable
Sep 30, 2012
468
0
10,780
Juan, for what you are saying to happen, we'd basically end up seeing Intel and AMD walk away from gaming PCs entirely and just cancel the platform.

I don't see that happening. Gaming PCs are the fastest-growing segment in desktop computing, and most surveys only include sales data from OEMs selling pre-built desktops. They don't count the DIY market at all. And a quick survey of anyone you know who built their own gaming rig compared to who bought a pre-built one would (and I say this with absolute confidence) show DIY absolutely destroying pre-built purchases.

To put all of this into perspective, the reports that only count OEM sales are saying that gaming PC growth is just a little smaller than shrinkage of overall OEM desktop sales.

The model of AMD making server chips and then selling them as desktop chips is over. It doesn't work anymore, because x86 Opteron is dead. Obviously ARM Opteron will win in the long run; x86 Opteron is under 5% market share. Beating a product with less than 5% market share and calling it a win is basically AMD's ARM plan.

You keep confusing that model dying with AMD x86 dying completely. If AMD were going to abandon the x86 market entirely, they wouldn't care about releasing Mantle, because Intel CPUs are not massively bottlenecking GPUs. AMD CPUs are. x86 AMD CPUs, to be exact.

So I don't get what you're playing at by saying that AMD will go ARM only, or at least significantly ARM. AMD is going to spend lots of resources making APIs efficient on lower end CPUs to play games that only exist as x86 (it's hard enough to get Linux ports of games, let alone ARM Linux ports) and then abandon x86 computing completely?

Mantle seems like a lot of work to just sell GPUs for a few years and then give up, don't you think so?

If you ask me, AMD plans on keeping x86 around for a while, but keeping it completely out of HPC and servers. Meaning that AMD continuing to sell mobile x86 and desktop x86 CPUs and APUs lines up very well with their Mantle plans.

I keep thinking back to that guy who claimed to work for IBM and the things we discussed when he was drunk. 22nm SOI being a disaster at IBM was one of the claims he made, and now IBM is selling their fabs. Another one was that there's going to be a platform where the problem of making HSA work across multiple devices is solved. There is also HPC discussion of HSA working across APUs in add-in boards in official AMD presentations for HPC.

So while APUs are being pushed right now, they only solve the problem of getting the GPU and CPU to cooperate well. Once that problem is solved (which, I admit, is really difficult) there will no longer be a need for APUs. In fact, I think that when this problem is solved, we will see separation of the two, as it makes things easier to cool and improves yields by reducing the die sizes of both products. Kind of like how Intel moves some things from the CPU to the NB because it reduces CPU power consumption, heat, etc.
 

Master-flaw

Honorable
Dec 15, 2013
297
0
10,860

From the Mantle benchmarks...I'm more inclined to think this, rather than anything else.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790
I have given you the strategies and also official data from AMD. The pair of slides above, with the evolution of the market towards APUs and the units sold of the A10, is data given by AMD. If any of you disagree with that data, please direct your complaints to AMD or to JPR. If you don't like AMD's strategy regarding APUs, go to the link that I gave above and sign the Steamroller FX petition. I know that the petition will change nothing (AMD will not release a Steamroller FX CPU), but you can sign it anyway.

It is worth mentioning that Intel and Nvidia are following the same strategy as AMD. I gave above the slide from Intel where Intel clearly says that the discrete-card version of the Phi is the present, whereas the APU version is the future. Nvidia has similar plans to replace their line of compute dGPUs with APUs, because the APUs will be more powerful. I have already mentioned plenty of times that the most powerful designs in AMD's and Nvidia's labs are not dGPUs but 8-core APUs of about 20 TFLOPS. Those APUs are planned for 2018-2020.

If one pays attention to Intel, the latest improvements on the CPU side are small, whereas all the emphasis is on the integrated graphics: SB-->IB-->HW. Broadwell will introduce another big update (40%? 80%?) on the iGPU, whereas the CPU will remain almost the same as Haswell. Intel has even modified its tick-tock strategy to account for the new emphasis on the graphics side of its APUs. It is also not a secret that Intel plans to kill discrete graphics cards by about 2015 or so.

http://www.techradar.com/news/computing-components/graphics-cards/intel-could-kill-performance-pc-graphics-in-2015-1089339

I remark again that Nvidia and Intel don't call their APUs "APUs", but the name is irrelevant.

I am just reporting what has been made official by AMD, Nvidia, and Intel, and what I know behind the scenes. I recall being one of the first here to say there would be no Steamroller FX. Nobody trusted me then, but my claim was broadly confirmed, to the point that some people are now signing a petition.

You can deny the sales data and pretend the FX line is a great success, when it is not. You can deny the claims made in public by AMD's head and pretend that AMD has plans other than the ones they have. You can also deny the laws of physics and pretend that APUs will be replaced by CPUs and GPUs and that "there will no longer be a need for APU", when the laws of physics say otherwise. Engineers at AMD/Nvidia/Intel labs know those laws very well.

[image: Xeon-Phi-Knights-Landing-GPU-CPU-Form-Factor-635x358.png]
 