AMD CPU speculation... and expert conjecture

Status
Not open for further replies.

blackkstar

Honorable
Sep 30, 2012
468
0
10,780
@juan, you do realize that a mid-range dGPU is just as efficient as an APU, right? If APU was on the scale of Hawaii or GK110 it would be just as (in)efficient.

Gaming PCs are a 21.5 billion dollar market. You are telling me that Intel and AMD are just going to abandon it? Consoles are losing steam and the gaming PC market is actually growing, and you're proposing we go consoles and mobile only?

But you always take things out of context. You see marketing slides for an HPC presentation that says APUs are more efficient and it's the future and then you turn around and go "HEDT is dead, x86 servers are dead, everything that's not an APU is dead! HOORAYY!" You see ARM marketing slides that say ARM will win, and they're surrounded by microserver slides, and then you go "HEDT IS DEAD X86 SERVERS ARE DEAD! Here is research from HPC paper that shows APUs are better so that means APUs will win everywhere!"

You make hasty conclusions like freaking crazy and it irks me to no end, especially because you defend them so vigorously.
 

8350rocks

Distinguished


YES!!!!!!!!!!!!!!!! +111111111111111111111111111111111111111111111111111111111111111111111
 


Then the cloud fails for a few hours right before you need to deliver a document, and you later find out your critical company-specific knowledge was lost when the datacenter got hacked.

Nevermind the physical limitations on data sizes to be processed due to poor upload/download speeds throughout most of the US.
 

blackkstar

Honorable
Sep 30, 2012
468
0
10,780


This is already sort of happening with the Xbox One and their whole "power of teh cloud!!!" thing. Making consumer devices dumb terminals is very advantageous to corporations, because it allows them not only to charge you for the device, but also a monthly fee to use it. And when they want you to upgrade, they simply stop supporting the old devices remotely.

We already see something similar with Android devices where older hardware is flat out abandoned and not updated in the hopes that it forces someone into a new phone.

And what is terrifying is that consumers take this as a good thing that they are buying an updated phone with better features.

The kind of future Juan is pushing for also works out well for corporations. It would more than likely leave us with dumb ARM devices, and you'd have to pay per use for anything that required advanced calculations the little ARM device couldn't do.

 


It does sound very unlikely but then again we're talking from the standpoint of *current* tech.

I mean, go back to 1980 and show them a modern laptop with a 3-D screen running a modern game... or try explaining the internet to a normal user back then, for that matter.

There is a significant chance that something else could spring up in a year from now and totally displace all the current tech / direction / wisdom.

If not however I think further integration is inevitable- but this is an ongoing process. I also don't really think it means the end of gaming hardware- just a rebalancing. I mean what difference does it make if you buy a CPU and a dGPU, or just buy the latest combined APU to get better games performance?

Also, perhaps the two can coexist long term. I still think the APU has the potential to be *most efficient overall*, which means it will displace dGPUs as accelerators for HPC, since in that context power is the limiting factor.

On the client side it's unlikely that systems will go multi-socket (it's expensive), so the dGPU will probably have a place for a long time, though the level at which you need one will keep moving further up-market. In another few years an up-to-date APU should be sufficient to play *any game* at a reasonable level, and the dGPU market will be for super high resolutions, multiple screens, and additional special effects, rather than the current position of APU = just about works most of the time, dGPU = actually can play games properly at good settings.
 

Cazalan

Distinguished
Sep 4, 2011
2,672
0
20,810


Yep, future kids will be saying the Cloud ate my homework. :D
 


The Xbox cloud thing was ALWAYS a joke. Remember: one frame every 16 ms. Think about it: do you really think you can upload all the data your title needs across the internet to a MSFT server halfway across the country, have it processed, have it sent back to you, and integrate the data back into the game, all within 16 ms?

So yeah, anyone who understands technology knows that was always marketing at its finest.
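To put numbers on that frame-budget argument, here's a back-of-the-envelope sketch in Python. The round-trip and server-compute times are assumed illustrative values, not measurements:

```python
# At 60 fps each frame has ~16.7 ms of budget. A cross-country round
# trip alone typically costs far more than that, so per-frame cloud
# offload cannot fit inside a single frame.

FRAME_BUDGET_MS = 1000 / 60      # ~16.7 ms per frame at 60 fps
RTT_CROSS_COUNTRY_MS = 70        # assumed typical round-trip time
SERVER_COMPUTE_MS = 5            # assumed server-side processing time

total_ms = RTT_CROSS_COUNTRY_MS + SERVER_COMPUTE_MS
print(f"frame budget: {FRAME_BUDGET_MS:.1f} ms")
print(f"cloud round trip + compute: {total_ms} ms")
print(f"fits in one frame: {total_ms <= FRAME_BUDGET_MS}")
```

Even with generous assumptions, the network round trip alone blows the budget several times over.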
 
That's like saying electricity can fail, so buying into the power grid is bad and we should all just generate our own.

 

8350rocks

Distinguished


More like saying if you put 50 tractor trailer trucks on a bridge designed to carry 25 tractor trailer trucks, the infrastructure will fail.
 

szatkus

Honorable
Jul 9, 2013
382
0
10,780


Just send and receive the data some time before you need it. It's not a solution for computation you do every frame, though: even if the game server is a few kilometers away, it's usually hard to get under 50 ms.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


I think it is not a mere disagreement on technical matters.

AMD and Nvidia have made their future projects official. AMD, in the slides reproduced by WCCFTECH, makes it clear that it will use HSA APUs as the base for its exascale supercomputer project. Neither Nvidia nor AMD will use dGPUs.

Then some people here are strongly denying what AMD and Nvidia are announcing!

I am not really surprised, because this has happened before in this thread: when AMD announced bulk, some people denied that Kaveri was bulk; when AMD announced the ARM license, some people denied AMD would do ARM APUs...
 

Cazalan

Distinguished
Sep 4, 2011
2,672
0
20,810


Dumb terminals have been around for ages. Even traditional HPC vendors like Sun Microsystems were trying to flog them at one point. They never seem to take off for very long.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


I hope you are joking. Several ARM products for HPC were presented in recent weeks. Take my favorite ARMv8-A design:

* Each ARM core supports SMT4, which means it can run up to four threads at once; each Piledriver core in your x86 CPU can only run one thread at a time.

* Each ARM core can execute three-operand instructions; each Piledriver core in your x86 CPU can only manage two operands.

* Each ARM core has 3 ALUs; each Piledriver core in your x86 CPU has only 2.

* The ARM SoC can issue six instructions per cycle; your x86 CPU can only issue four.

* Each ARM core has 2x 128-bit SIMD/FP units and can deliver 16 FLOPs per cycle; each Piledriver core in your x86 CPU can only deliver 8.

* And so on.

Those ARM SoCs offer more throughput performance than a 140W Xeon from Intel but consume about one third of the power used by your slow FX CPU @ 5GHz.
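For what it's worth, here is one way the 16 FLOPs-per-cycle figure could be derived from the stated SIMD configuration. The lane width and FMA accounting below are my assumptions; the post doesn't state precision or whether fused multiply-add is counted as two operations:

```python
# Sketch of a peak FLOPs-per-cycle calculation from the quoted specs.

SIMD_UNITS = 2        # per the post: 2x 128-bit SIMD/FP units per core
LANES_PER_UNIT = 4    # assumed: 128-bit vector / 32-bit single precision
FLOPS_PER_LANE = 2    # assumed: fused multiply-add counts as 2 FLOPs

flops_per_cycle = SIMD_UNITS * LANES_PER_UNIT * FLOPS_PER_LANE
print(flops_per_cycle)  # 16
```

Note this is a *peak* figure; sustained throughput depends on the frontend keeping those units fed.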
 

blackkstar

Honorable
Sep 30, 2012
468
0
10,780


You might find this difficult to comprehend, but *gasp*, some people are not right 100% of the time. That's just an attack on the person making a claim and it does nothing to refute the actual claims being made.

If I am wrong about something in the past, and then I go "2 + 2 = 4", does that mean 2 + 2 = 4 is not true because I have been wrong in the past?

People are not 100% accurate. Even large tech sites like S|A blow it sometimes. That doesn't mean they're trash that should be discarded.
 


Nobody is saying that tho.
 

8350rocks

Distinguished


Ahh, but what operands? 256 bit FMAC? 512 bit FMAC? Simple logic?

You know so very little about what actually comprises ARM instruction sets that you cannot even say with any certainty whether that ARMv8-A will even outperform a Pentium...much less an HEDT CPU.

You make assumptions...but you compare apples and oranges and assume equality. This is called a false assumption, or false equality...
 

szatkus

Honorable
Jul 9, 2013
382
0
10,780


All right, I know that the A57 is much better than the current generation of ARM cores, but these metrics don't show everything. By this logic the P4 was better than the Athlon XP. You can't just throw as many resources as possible at a core; the frontend must be able to feed them, or else it's just a waste of transistors.
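That bottleneck argument can be put as simple arithmetic: sustained throughput is capped by the narrowest pipeline stage, so a wide backend buys nothing without a matching frontend. The stage widths below are illustrative, not real core specs:

```python
# szatkus's point as a one-liner: the narrowest stage wins.

def sustained_ipc(fetch_width, decode_width, issue_width):
    """Upper bound on sustained instructions per cycle."""
    return min(fetch_width, decode_width, issue_width)

# A "P4-style" imbalance: big backend, starved frontend.
print(sustained_ipc(fetch_width=2, decode_width=2, issue_width=6))  # 2
# A balanced design extracts the backend's full width.
print(sustained_ipc(fetch_width=6, decode_width=6, issue_width=6))  # 6
```

So a "six-issue" figure on a slide tells you nothing until you know what the fetch and decode stages can deliver.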
 
fresh new amd cpu/apu/soc/stuff rumors
http://wccftech.com/amd-desktop-processors-ddr4-support-post-bulldozer-architecture-2016-claims-italian-report/
wow carrizo will be on mobile! what an epic revelation! color me impressed! oh wait...

http://wccftech.com/amd-12-core-aseries-apu-inbound/
12 cores! could be a10 7800's u.s. introduction or something like richland.

+NaCl
 

Fidgetmaster

Reputable
May 28, 2014
548
0
5,010
That is ridiculous and I wouldn't get your hopes up or get too excited about it.....like you even really need or could make use of 12 cores for anything yet....

And with Broadwell coming, that also is ridiculous....virtually negligible performance increase over Haswell....but a decent power consumption improvement....
 


It's an A series part- it's going to be 12 'compute cores' i.e. quad core + 8 CU.

It wouldn't surprise me if it was yet *another* pr stunt relating to the A10 - 7800, just like they 'relaunched' the FX 9590 :S
 

jdwii

Splendid
AH, he is back on ARM again. Last time I checked, the best ARM device is around half the performance per clock of Haswell. Heck, Intel is actually making some nice products; the industry will probably ignore them, however. Never thought I would say it, but Atom > ARM in performance per watt.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


Yes, the A57 is a nice improvement. However, above I am discussing a big server-class ARM core. The frontend of this server-class core was seriously designed, and there is no bottleneck feeding the cores.

In practical terms each ARM core offers about 90% of Haswell single-thread performance, but consumes about half the power. The core is also smaller, so more cores can be packed on a die to provide more total throughput. At the socket level the SoC is about 80% faster than IB Xeons.

I expect AMD K12 to be an even bigger and faster core.
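Taking the post's own numbers at face value (they are claims, not independent measurements), the implied per-core efficiency advantage works out like this:

```python
# Rough perf/W ratio implied by the claims above:
# ~90% of Haswell single-thread performance at ~half the power.

arm_perf = 0.90    # claimed single-thread perf relative to Haswell
arm_power = 0.50   # claimed power relative to Haswell

perf_per_watt_ratio = arm_perf / arm_power
print(f"implied single-thread perf/W vs Haswell: {perf_per_watt_ratio:.1f}x")
```

That is, if both inputs held up, the core would deliver roughly 1.8x the performance per watt; the whole argument stands or falls on those two inputs.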
 

8350rocks

Distinguished


You might need less salt than you think...
 