AMD CPU speculation... and expert conjecture


mayankleoboy1

Distinguished


Straight quoting from a company's product page is ... naive.



You're talking about a supercomputer measured in gigaflops... and bragging about 44 kW power consumption for one benchmark...?

Titan runs at 17.59 petaflops...

Green HPC w/Intel Xeons = 112,900 GFlops (consumed ~45 kW, or 45,000 W)
TITAN w/AMD Opterons = 17,590,000 GFlops (consumes 8 MW, or 8,000,000 W)

Now, TITAN is a GPGPU HPC (HSA based); your "green" HPC is not... want to bet the GPGPUs affect power consumption more than the Opterons that run alongside them?

Not really sure what you are saying. Could you explain?
Also, is that "HSA" or just CUDA/OpenCL?
 


So you can write your thesis statements on a cellphone? Really, do tell us all about that...

Sorry, not buying it for one moment. I happen to live in the cellphone/tablet capital of the world, and they do not replace home personal computers.

Phablets do not compete in the same market space that desktop PCs do, any more than iPods compete with home theater systems. What we are seeing now is not phablets taking over the desktop, which is pretty much impossible due to form factor limitations, but the desktop market having reached its full capacity while the mobile market hasn't done that yet. I say yet because in about five to ten years some new device will be all the rage, and you people will be using the stagnation of phablet sales as evidence that the new device is "taking over" the then-established phablet space, all while writing it on your home desktop PCs.

When the analysts talk about markets they're talking about market growth, not total volume. As a market reaches a saturation point, growth slows down; you're still selling product, but everyone already has one and now people are only buying when their current one no longer does the job. That market is not dead: there are customers who have a demand for a product, and that demand will be met. The market has merely reached its capacity and you won't see record-breaking sales anymore. So let's stop pretending that phablets compete for market space with desktop PCs, commodity servers or specialized servers. They compete with ultrabooks and other extremely portable yet highly unproductive devices (small screens / cramped keyboards / limited processing and storage capacity).

Right now Intel has a near monopoly on the desktop and commodity server markets. Desktop means Microsoft Windows and x86 CPUs; though MacOS has a small segment, they're even more restrictive on their CPU choices. Commodity servers are almost universally MS Windows, especially with ESXi; webservers are the exception here, as Apache + RHEL is magnitudes more secure than Server 2008 + IIS.

People can try to play semantics all they want, but hard market numbers speak for themselves.
 


By that you mean you can google something for basic research, but then let's also get into the myth that is "I can do everything on a cell phone".

1) Not all phones are equal; this is the biggest hurdle. Some phones don't even have proper high-speed wireless internet, or have smaller screens that are completely impractical to work off. To get better and bigger you need to spend money, and by the time you are onto the Galaxy S3's ilk you are spending more than on a laptop or PC. So again, phones are like Justin Bieber: a fad for now, until they get boring. Kik, WhatsApp and chat apps really just save text costs but ramp up data costs; until the kids figure that out they'll keep spending more money.

2) Show me a phone that I can do a legal document on expediently, when time is of the essence, and I will sell every system I have and only use a smartphone.

3) When a phone is no longer a phone it dilutes the purpose; I don't believe in smartphones.

 
OK, now for the ARM discussion. No ARM-based design will ever have the computing capacity of an equivalent x86, SPARC or POWER design. ARM has severe limitations with scaling up, as manufacturers are starting to see. For low-power devices with limited user connectivity (user input/output) this isn't an issue: the user spends more time attempting to input their commands or understand the system's output than the system needs to process those commands and communicate the output.

The reason you get so much performance from x86 is the amount of hardware devoted to optimizing instruction flow and heavy amounts of caching. Those things are power- and space-intensive, and are anathema to a design specializing in "cheap low power". SPARC and POWER each have their own unique design attributes that lead them to being rather powerful in large deployments. For these reasons you'll never see ARM in the same performance league as the other three. It's absolutely amazing for low-power small devices, as the design scales downward very well. It can provide "good enough" performance while consuming relatively little power, which is what we want in mobile devices.
 

mayankleoboy1

Distinguished
@Sarinaide and Palladin:

Perhaps the more appropriate interpretation of the statement is "I can do all my college stuff on hardware as fast as what's in my smartphone". I agree completely; as you both point out, you can't write a thesis on a smartphone. By the same logic, you don't need a 3770K or even a Celeron G20 for that. A Core2Duo E7300 from 2006-ish is already overkill for this.
So: buy a smartphone for everything in your life, and to type large documents, use your 10-year-old PC. When the PC dies, just buy a keyboard dock for the phone and use the HDMI output to connect to a TV.

@Sarinaide:
3) When a phone is no longer a phone it dilutes the purpose

500 million Android users think otherwise, not to mention at least 100 million iOS users. Plus, smartphones outsold feature phones this year.


@Palladin:

I say yet because in about five to ten years some new device will be all the rage, and you people will be using the stagnation of phablet sales as evidence that the new device is "taking over" the then-established phablet space, all while writing it on your home desktop PCs.
Of course some new thing will come along in 5 years that will completely kill the smartphone. But my bet is that it will be smaller, more powerful, more mobile, and more integrated with the person than smartphones currently are. So in a way, they will still be mostly smartphones, except that instead of holding them in your hand, they will be wearable.
So instead of smartphones, they will be called smartGlasses :p
 

mayankleoboy1

Distinguished


The reverse may not be true. Specifically, Intel can probably scale x86 down, thanks to their node and FinFET advantage, and provide more performance than a comparable ARM design at lower power draw. This should be true for 22nm 3D vs 28nm planar, and maybe for 14nm as well.
Note that I am not saying Intel will set the mobile market on fire, just that they can probably make a better SoC than ARM.

 
Of course some new thing will come along in 5 years that will completely kill the smartphone. But my bet is that it will be smaller, more powerful, more mobile, and more integrated with the person than smartphones currently are. So in a way, they will still be mostly smartphones, except that instead of holding them in your hand, they will be wearable.
So instead of smartphones, they will be called smartGlasses :p

Except no. Smartphones are not killing or replacing desktops. Smart glasses will not replace smartphones. Each of these devices integrates with and complements the others. Smartphones let you communicate on the go while maintaining some connection to the internet with serviceable applications. The desktop is for when you actually need to get something done; they complement each other (smartphone docking stations SUCK HARD, I've used them first hand). In that same vein, wearable computing devices aka "smart glasses" will connect to your smartphone and provide a HUD to display information from that same device. Utilizing PAN technology, your glasses would be in constant communication with your communicator (the proper name for a smartphone) and could display incoming information or even augmented reality. Remember glasses must be as light and small as possible, which drastically limits the hardware that can actually fit in them, while your communicator has more physical space and isn't as limited.

And yes, x86 doesn't scale down very well. VIA's done their best, but there is a hard limit to how low you can go due to the complex front-end decoders, instruction schedulers and required caching system.
 
Smartphones are not killing or replacing the PC; they are two distinct markets operating separately from each other. The smartphone market, though, is bigger simply because more people need phones than need desktops or notebooks for functions such as email or chat applications.
 

How about this as an alternative analogy:
Put the Camaro's engine in a truck's chassis with appropriate modifications and see if it can work.
Connect a full-size keyboard to a smartphone and/or output to a larger display and see if it works as a general-purpose PC (sorta like the Asus PadFone design). :)
 

truegenius

Distinguished
BANNED
^ With some modification to the transmission I think we can do it.
Me too; I'm thinking of building a very lightweight 2-seater car with the engine of a typical motorcycle and a turbocharger to attain >50 km/litre mileage :p

(Speed won't matter much because the speed limit in my area is generally below 60 km/h, and by using the GPS of my smartphone (oh, looks like my smartphone is replacing my friends during a journey :whistle: ) and an app, I found that my average speed is between 20-30 km/h (no laughing please :D ).)
 
The Samsung Galaxy S4 is estimated at £650, that's about $1,100, and you can get a helluva stronger desktop or notebook/ultrabook that can do everything and some things the phone cannot remotely do and probably never will be able to do. I chose a phone like that as it has a big screen, big enough not to strain one's eyes trying to read a Google search or play on WhatsApp or whatever the craze is now. When you get down to more modest phones, sadly, the utility is rather less impressive and nobody will bother to use it for more than its purpose... to phone people :D

The desktop dying is purely Intel spin as to why the monopoly they created is no longer sustainable. It's simple really: Intel are not offering people a reason to buy desktops, and AMD is all but non-existent in the market, or hidden from plain sight. Imagine in the Athlon years if Intel had not resorted to the tactics they adopted and AMD had had 4-5 years of market dominance, outselling Intel 80/20, with a number of long-term deals with OEMs for cash flow; how different would the situation have been? I agree with Ruiz that the 1.5bn settlement was probably 10-12x lower than what AMD's expected earning potential would have been if they were not coerced out of the market. Maybe team blue should make better CPUs so that their market can buy something.

It would be fun to watch Intel get a spanking in a market they have no control over and are themselves the small fry.
 

8350rocks

Distinguished
Let's not delude ourselves, the smartphone will not replace the desktop anytime soon. The capability of a modern desktop is far too great and useful for work environments and home projects, in addition to running high end games. Additionally, as software becomes more and more parallel, there is no way a smartphone will be able to match the speed and computing power of modern desktop PCs.

Volcanic Islands looks extremely interesting... if they can get power consumption to fall in line with the HD 7790 card, then they're really on to something there.

http://www.fudzilla.com/home/item/31250-amd-slashes-desktop-cpu-prices

Thought that was an interesting article there.

FX8320 for $153 anyone? FX6300 for $112? FX4300 for $101? They cut APU prices pretty significantly as well...

Looks like the FX8320 and FX6300 are going to be tremendous values moving forward. I can't see anything that would hold a candle to the FX6300 @ $112.
 
I am wondering whether AMD have timed this to perfection and Nvidia have jumped the gun. I am also questioning the conventional line of thought, since the GTX Titan not ending up as the GTX 780 is not a very good indication either. Volcanic Islands represents a superior node advantage and a very formidable SPM/PCM-heavy GCN core, and will likely have impressive power gating like Tahiti had; not only will these be fast, they will be powerful. Some suggestions say between 2-3x the performance of the 7970 GE while reducing power and improving efficiency. Nvidia will have 20nm by Q3 2014, by which point TSMC will have 16nm ready for 2015 and AMD will likely again jump the node. While Nvidia will have 4 months over AMD with the GTX 700 series, which will likely only bring 15-25% gains on the same process, AMD will likely have the superior cards for the best part of 9-10 months and undercut Nvidia; source leaks on the 770 suggest around HD 7970 GE performance.
 

mayankleoboy1

Distinguished
The GTX 770 is a GTX 680 + 5% improvement, which puts it about equal to an HD 7970.

Has TSMC announced whether they will be able to mass-produce the 20nm node by the end of 2013? I don't remember.
Though I do think that even a paper launch, with some review samples going to the press, will help AMD tremendously.
 

8350rocks

Distinguished


GPUs consume more power than CPUs... it's a proven fact. Additionally, HSA is Heterogeneous System Architecture. OpenCL is one of the standards for HSA. But when you get into GPGPUs, it's a bit more than that. OpenCL is aimed at standard GPUs and making them more beneficial to the system... but most GPUs only have a few general compute pipelines. GPGPUs are going to be more like the GPU onboard the PS4, with 64 possible compute pipelines to run GPGPU functions under heavy parallel workloads. OpenCL is a part of this, but there's a lot more behind it.
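To make the OpenCL bit concrete, here's a minimal vector-add sketch in Python using the pyopencl bindings (my own illustrative example, not anything from AMD's stack), assuming pyopencl, numpy and a working OpenCL runtime are installed:

import numpy as np
import pyopencl as cl

# Pick any available OpenCL device (GPU, APU or CPU) and set up a command queue.
ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx)

a = np.random.rand(1000000).astype(np.float32)
b = np.random.rand(1000000).astype(np.float32)

mf = cl.mem_flags
a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

# Each work-item handles one element; the device runs huge numbers of them in parallel.
program = cl.Program(ctx, """
__kernel void vadd(__global const float *a,
                   __global const float *b,
                   __global float *out) {
    int gid = get_global_id(0);
    out[gid] = a[gid] + b[gid];
}
""").build()

program.vadd(queue, a.shape, None, a_buf, b_buf, out_buf)

result = np.empty_like(a)
cl.enqueue_copy(queue, result, out_buf)
print(np.allclose(result, a + b))  # True if the device computed the same sums

The same kernel source runs unchanged on a discrete GPU, an APU or even a CPU-only OpenCL driver; how well it runs is where the extra compute pipelines mentioned above come in.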

Also, quoting a company's homepage is not naive; they can't legally claim it if they can't back it up, now can they? In this day and age of frivolous lawsuits, you have to be 100% sure what you put on your homepage is the truth, verifiable by a source.
 

Blandge

Distinguished


Bragging... What? I just posted the link because I thought it was relevant. Titan is on the list.
 

8350rocks

Distinguished


I didn't mean that you were bragging, but that the article was... perhaps that was worded poorly.

Either way...the lack of GPGPUs is where the power consumption difference falls squarely.
 

Blandge

Distinguished


Would you not consider the Xeon Phi a GPGPU? If not, it's pretty damn close. Certainly close enough to have similar power and performance characteristics.

Either way, you are wrong. Yes, GPUs use more power than CPUs, but they increase efficiency on HPC workloads. This means that overall they use less power per FLOP on highly parallel workloads. That's why the top (x86) green HPC machines all have some GPGPUs (or Xeon Phi): because it is more efficient. Otherwise the whole top list would be nothing but Opterons and Xeons.
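For reference, the raw figures quoted earlier in the thread (Titan vs the Xeon-based green system) work out like this in GFLOPS per watt; a back-of-the-envelope sketch using the numbers as posted, nothing official:

# GFLOPS-per-watt from the figures quoted earlier in the thread (as posted, unverified).
systems = {
    "TITAN (Opteron + GPGPU)": (17590000, 8000000),  # GFLOPS, watts
    "'Green' HPC w/ Intel Xeons": (112900, 45000),
}

for name, (gflops, watts) in systems.items():
    print("%s: %.2f GFLOPS/W" % (name, gflops / watts))

By those particular figures the two land in roughly the same ballpark (about 2.2 vs 2.5 GFLOPS/W), which lines up with the Xeon Phi point above.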
 
^^ Xeon Phis are more performance-efficient than GPUs? I thought Kepler-based Teslas were more efficient and the Phi's main attraction was ease of programmability.
edit: disclaimer - I lack in-depth knowledge about HPC and supercomputers.
 


AMD has a special order for a second batch from TSMC, and this probably explains why Maxwell will only see the light of day closer to 2015. I do expect Volcanic Islands to be on average 2-3x faster than existing Southern Islands parts across the board, with a universal "double 'em up" through all SKUs.

 

juanrga

Distinguished
BANNED




By efficiency I did mean performance per watt. The #1 top supercomputer, using AMD x86 Opterons plus Nvidia cards, offers 20 petaflops using 9 MW of power. Assuming linear scaling, 200 petaflops would use 90 MW of power (the real power consumption would be higher because the scaling is nonlinear). The European project, using ARM CPUs + ARM GPUs, expects to obtain 200 petaflops using only 10 MW.

http://hexus.net/tech/news/cpu/48193-new-eu-based-supercomputer-arm-based/
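The scaling arithmetic above works out like this (a quick sketch that assumes perfectly linear scaling, which, as noted, is optimistic):

# Linear extrapolation of the quoted figures; real-world scaling would be worse.
titan_pflops, titan_mw = 20, 9      # Opteron + Nvidia system, as quoted
target_pflops = 200

titan_scaled_mw = titan_mw * (target_pflops / titan_pflops)
print("Scaled-up Titan-style machine: ~%.0f MW" % titan_scaled_mw)            # ~90 MW

arm_project_mw = 10                 # the EU ARM CPU + ARM GPU target
print("Claimed efficiency gap: ~%.0fx" % (titan_scaled_mw / arm_project_mw))  # ~9x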

Imagine what AMD could offer us with its future ARM-Opteron chips.
 