AMD CPU speculation... and expert conjecture

Page 202

Cazalan

Distinguished
Sep 4, 2011
2,672
0
20,810
Just trying to get a sense of how well Jaguar is selling. Still difficult to find much in the retail space. Hope they've had better luck in the embedded markets.

I figured 4th of July sales and such would bring up a few more hits.
 

8350rocks

Distinguished


I expected that...Qualcomm and Samsung have that space sewn up tight...AMD is going into ARM for server solutions, where Qualcomm and Samsung won't be competing. Neither will Intel, so AMD will really be the pioneer in that arena and will probably gain market share aggressively until a competitor shows up.
 

8350rocks

Distinguished


Where that's really going to make a difference is when the AIOs, tablets, etc. start coming out with AMD solutions in them. AMD has several manufacturers working on OEM products like that currently; they should be coming at the end of Q3 or thereabouts.
 

8350rocks

Distinguished


If Intel goes into smartphones and comes up with more than 5-10% market share, I will be shocked. All the top smartphone OSes run on ARM cores, not x86. Windows phones are not taking off at all, and that won't be changing anytime soon given the atmosphere around Windows 8 in the marketplace. Intel is running off on a fool's errand trying to get someone to buy something of theirs, and it won't work.

Mark my words, Intel will fail in mobile phones with x86.
 

8350rocks

Distinguished


Android and iOS do not run on x86 cores, and they are the two most popular smartphone OSes by a landslide (greater than 90% market share between them). That means Windows Phone, at less than 10% market share, would be Intel's market on smartphones. They are destined for failure at that rate. It won't be worth the R&D costs to do it.

EDIT: Power consumption has nothing to do with the price of goats in Africa for smartphones; it's all about what OS is compatible. People won't buy the Windows Phone OS on a smartphone, the general public is not happy with M$ right now, and Intel is going to be in bed with them in smartphones. It's not just poor timing, it's also a poor architecture for phones...no matter how "great" their power consumption numbers are. People will continue to buy car chargers for their Android/iOS phones and keep skipping Windows phones because of the OS.
 

Cazalan

Distinguished
Sep 4, 2011
2,672
0
20,810


Intel has always supported Linux pretty heavily. Android is built on Linux, so running it on x86 isn't a major hurdle; you can download an x86 Android image from Intel's website. iOS is of course out of the question.

http://software.intel.com/en-us/articles/intel-atom-x86-image-for-android-4-2-jelly-bean-installation-instructions-manually

Lenovo sells an Intel Clover Trail+ phone with Android 4.1.
http://reviews.cnet.com/cell-phones/lenovo-k900/4505-6454_7-35567342.html


There are other Linux-based phone OSes as well that haven't made a big splash yet, and not everyone wants to be under the thumb of Google. Samsung and Intel have partnered to make the Tizen mobile OS.

Intel's failures in the mobile space were primarily due to a lack of focus. It was a low-priority item for them. Now they have shifted focus entirely to low-power mobile SoCs. Even their desktop line is being given lower priority for the 14nm node. That's a first for Intel.

The work Intel has done on dynamic voltage/frequency scaling, low P-states, etc., has given them quite a competitive advantage in this space.
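
If anyone wants to poke at this themselves, below is a minimal C sketch (my own illustration, not anything from Intel or AMD) that reads the current clock and the active governor from the Linux cpufreq sysfs interface; the paths are the standard kernel locations, not vendor-specific. Watch scaling_cur_freq while the machine idles and you can see DVFS stepping the clock down through the P-states.

```c
/* Minimal sketch: peek at DVFS/P-state behaviour through the Linux
 * cpufreq sysfs interface. Compile with: gcc dvfs.c -o dvfs */
#include <stdio.h>
#include <string.h>

/* Read one cpufreq attribute for cpu0 into buf; returns 0 on success. */
static int read_cpufreq(const char *name, char *buf, size_t len)
{
    char path[128];
    snprintf(path, sizeof path,
             "/sys/devices/system/cpu/cpu0/cpufreq/%s", name);
    FILE *f = fopen(path, "r");
    if (!f)
        return -1;
    if (!fgets(buf, (int)len, f)) {
        fclose(f);
        return -1;
    }
    buf[strcspn(buf, "\n")] = '\0';  /* strip trailing newline */
    fclose(f);
    return 0;
}

int main(void)
{
    char cur[64], gov[64];

    /* scaling_cur_freq is the current clock in kHz; scaling_governor
     * is the policy that decides when to move between P-states. */
    if (read_cpufreq("scaling_cur_freq", cur, sizeof cur) == 0 &&
        read_cpufreq("scaling_governor", gov, sizeof gov) == 0)
        printf("cpu0: %s kHz under the '%s' governor\n", cur, gov);
    else
        fprintf(stderr, "cpufreq sysfs not available on this system\n");
    return 0;
}
```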
 

Cazalan

Distinguished
Sep 4, 2011
2,672
0
20,810


There should be no competition for AMD in the ARM server space, if you ignore the other seven players. Most are smaller players, until you get to the likes of Samsung and Qualcomm.

http://www.eetimes.com/document.asp?doc_id=1263010

"Qualcomm is at least the eighth company now known to be working on ARM server SoCs. Others with announced or known plans include AMD, Applied Micro, Cavium, Calxeda, Marvell, Nvidia and Samsung. "


Which ones will get preference on the TSMC/Samsung fab lines? Probably Samsung and Qualcomm.
 

8350rocks

Distinguished


I am not so sure that TSMC is locked down so tightly. AMD has been using them for GPUs for a while, and I would be surprised if TSMC ignored that fact, considering AMD GPUs sell in pretty large volumes.

However, I doubt Samsung will offer fab capacity to a competitor.

You also have to consider that many of those ARM developers don't have the server experience AMD does. Insight and experience can often make up for a lot when it comes to designs that will inherently be similar anyway.

EDIT: AMD can also offer integrated solutions through SeaMicro and other sources...while Qualcomm and Samsung will have to get someone to integrate their chips into an entire server package. I honestly think this leg up is the biggest advantage AMD has to leverage in that segment.
 

guskline

Distinguished
Aug 25, 2006
431
2
18,795
What's the latest news about Steamroller? I see that AMD, in a slide presentation, talks about an early release of a Steamroller core in a socket FM2+ version with 2 modules and 4 cores, but what is the likelihood of an FX socket 8-core in 2013? Or is it now mid-2014?
 

8350rocks

Distinguished


Last I saw, Q1 2014 was the estimate. The FX roadmap shows Vishera 2.0 through the end of 2013.
 

Cazalan

Distinguished
Sep 4, 2011
2,672
0
20,810


Apple just locked in the 20nm node at TSMC through 2014. That's why there will be no Nvidia/AMD 20nm GPUs this year or next.

ARM servers have been in the works for years. Dell and HP already have 32-bit systems shipping and have plenty of experience in the server space. MiTek just released their 64-bit server based on the 8-core X-Gene chip:

144 64-bit ARM cores in a 4U enclosure.
 

8350rocks

Distinguished


None for Nvidia; meanwhile, GloFo announced 20nm will be ready 1H 2014...

32-bit ARM servers...sure...64-bit? Not really. That's only just coming around now...and AMD is poised to do well, especially if their ARM performance/watt expectations are met...they might offer the best value per watt in ARM servers if they hit their goal.

Additionally, 1U, 1.5U, 2U and 3U are all server form factors that AMD is already using with their Open Compute hardware. SeaMicro also offers the densest server configurations with the most bandwidth and lowest power consumption out there in x86...if they apply that technology to ARM, the others may not even have a chance before they start.

On a side note: adjustments in the BIOS for AMD CPUs can increase single-core performance by 14-18%:

http://www.xbitlabs.com/news/cpu/display/20130622092423_Low_Performance_of_AMD_Microprocessors_May_Be_Conditioned_by_Poor_BIOS.html
 


I was never a big fan of how AMD's drivers handled ACF and switchable graphics. For notebooks they left too much in the hands of the OEMs, who tend to screw everything up. ACF works pretty well, assuming your dGPU isn't 2x the performance of your iGPU. Richland seems to be a bit better at scaling and balancing the power distribution between the CPU and graphics components of the APU. With the way boosting works, ACF almost seems counterproductive.
 


Getting an OS to run on any random CPU architecture is not terribly difficult. It requires a compiler for the target CPU in the languages that the kernel and userspace are written in. Android is a great example of this: Android is the Linux kernel with Google's userland on top. The kernel is released under the GPL, and Google's Android userland is licensed under Apache. Since you can see the code, it is pretty easy to take a good standards-compliant, multi-language compiler suite like GCC and build it all for an x86 target. And yes, people have done that with Android. It's made even easier by the Linux kernel having drivers for pretty well everything under the sun in it, so your self-compiled OS would likely run well on your hardware with most if not everything working properly.

iOS should be similar, as it is based on BSD UNIX, which also compiles easily for any CPU that has a decent compiler. Apple of course took advantage of the permissiveness of the BSD license to make pretty much all of iOS proprietary, so only somebody on the inside at Apple could try to recompile iOS for another architecture. You might also run into driver issues, since Apple doesn't put many drivers in the iOS kernel, unlike the Linux kernel, which contains pretty well any driver you'd need. They make what the iWhatever needs and that's it; external devices attach via Bluetooth, WiFi, USB, FireWire, etc. I don't know whether the iOS kernel driver ABI is anything like OS X's or any other BSD's, so you might end up with an OS that runs but with none of your devices working for lack of drivers.
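
To make that concrete, here is a trivial sketch (my own example, not anything from Google or Apple): the same C source builds unchanged for x86 or ARM, and all you need is a compiler targeting that CPU. The preprocessor macros are GCC's standard predefined architecture macros; the cross-compiler name in the comment is a common Linux toolchain name and may differ on your distro.

```c
/* arch.c - the same source compiles for any CPU that has a compiler.
 *   native x86 build: gcc arch.c -o arch
 *   ARM cross build:  arm-linux-gnueabihf-gcc arch.c -o arch
 *   (the cross-toolchain name varies by distro)                     */
#include <stdio.h>

int main(void)
{
    /* GCC predefines exactly one of these per target architecture. */
#if defined(__x86_64__)
    puts("Compiled for 64-bit x86");
#elif defined(__i386__)
    puts("Compiled for 32-bit x86");
#elif defined(__aarch64__)
    puts("Compiled for 64-bit ARM");
#elif defined(__arm__)
    puts("Compiled for 32-bit ARM");
#else
    puts("Compiled for some other architecture");
#endif
    return 0;
}
```

An OS is the same idea scaled up a few million lines: as long as the compiler can emit code for the target and the source is portable, the architecture is mostly a build flag.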

EDIT: Power consumption has nothing to do with the price of goats in Africa for smartphones; it's all about what OS is compatible. People won't buy the Windows Phone OS on a smartphone, the general public is not happy with M$ right now, and Intel is going to be in bed with them in smartphones. It's not just poor timing, it's also a poor architecture for phones...no matter how "great" their power consumption numbers are. People will continue to buy car chargers for their Android/iOS phones and keep skipping Windows phones because of the OS.

Call me dumb, but I doubt that many people know or care what OS is on their phone, since you generally can't specify or change the OS on a phone. They care more about the phone hardware itself and what the software can do. A good indicator is that no single phone OS has become THE phone OS the way Windows did on computers. People want their phone to send texts, surf the net, take pictures and videos of stupid stuff, play videos and music, and, last and least, make and receive phone calls.
 

noob2222

Distinguished
Nov 19, 2007
2,722
0
20,860


First off, drop the blue koolaid.

Battery life of the device is average. It lasts around 11 to 12 hours when used with custom settings in the battery saver app, which always runs in the background whether you require it or not. By default this battery saver app switches off the data connection and controls the brightness too aggressively when the battery gets low.

hmm, so less power by switching off all the devices on the phone ... marketing ploys win this one. How about the real world?

Also, we found that web browsing for some reason takes a heavy toll on K900 battery life compared to other Android devices. If you don't browse too much, you will get better battery life from the device compared to what we got on an average day. http://www.indiatimes.com/technology/mobile/review-lenovo-k900-86373-2.html

ya ... less power when it's in shut-down mode, which you can do with any Android phone, but for marketing you run one in minimal mode while maxing the others out. Apples to apples? Not a chance. Low price? Not a chance; it's priced right now in India at the cost of a Galaxy Note 2.


I have used the phone now for around 2 days. The battery life is not that impressive ... http://wandowski.com/blog/2013/06/04/lenovo-k900-review/
 


It makes a difference in theoretical FPS, but in terms of real performance it's very erratic and frustrating right now. That was the case with a 5800K; I am told the situation is better with a 6800K, but I cannot comment on that as I have only tested with the HD 6670.

At the same settings in BF3 (1366x768, Low preset) the APU alone scores around 60-65 FPS, smooth (DDR3-2400); Dual Graphics scores around 85 FPS but stutters a bit. I have tweaked around, updated the CAP profiles and tuned the settings in Catalyst Control Panel and in game, and managed to get 1680x1050, 75 FOV, Ultra mesh quality and Medium textures, 2xAA, HBAO playable at around 45-50 FPS with minimal microstutter, but it's very frustrating. I am just going to hold out for the frame pacing drivers, which may only release in September; that coincides with the rumour that AMD will release its Hawaii-based GPUs around the BF4 launch and may be holding the beta drivers until they have the situation perfected for the HD 9970/9950 release dates.




As above, I have heard Richland is better with DG support, but right now a 5800K with a Sapphire Low Profile HD 6670 1GB GDDR5 card has a bit of microstutter at higher resolutions. At low res it's fine, though to be honest a game maxed out at 1366x768 just looks blocky. However, at $180, getting that performance out of a $400 system is actually pretty good. It is definitely confirmed, though, that DG is very much a work in progress.

For benches and synthetics DG is very good. Unigine Valley, Extreme HD preset:

APU only: 269 pts
APU + HD 6670 DG: 597 pts
GTX 560 SOC: 780 pts
HD 7770: 735 pts

3DMark shows similar 2-3x gains, but it's easy to make drivers look good in synthetics. I will hold out for the new frame variance drivers before I form an opinion on DG; right now it's a nice feature, just a work in progress.
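
As an aside on why average FPS hides the stutter being described: two frame-time sequences can have the same average and feel completely different. The numbers in this little C sketch are made up for illustration, not measurements from the setup above; frame pacing drivers aim to shrink the variance, not raise the average.

```c
/* Same average frame time, very different experience: microstutter
 * lives in the frame-time variance, not the FPS counter.
 * Frame times below are invented for illustration.
 * Compile with: gcc pacing.c -o pacing -lm */
#include <stdio.h>
#include <math.h>

static void report(const char *label, const double *ms, int n)
{
    double sum = 0.0, var = 0.0;
    for (int i = 0; i < n; i++)
        sum += ms[i];
    double mean = sum / n;
    for (int i = 0; i < n; i++)
        var += (ms[i] - mean) * (ms[i] - mean);
    printf("%s: avg %.1f ms (~%.0f FPS), frame-time stddev %.1f ms\n",
           label, mean, 1000.0 / mean, sqrt(var / n));
}

int main(void)
{
    /* Both sequences average 11.8 ms per frame, i.e. ~85 FPS. */
    const double smooth[8]  = {11.8, 11.8, 11.8, 11.8, 11.8, 11.8, 11.8, 11.8};
    const double stutter[8] = { 6.0, 17.6,  6.0, 17.6,  6.0, 17.6,  6.0, 17.6};
    report("smooth ", smooth, 8);
    report("stutter", stutter, 8);  /* same FPS, visible judder */
    return 0;
}
```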

 

you might not have to wait that long. after promising frame pacing drivers in h2 2013, then june, then june/july, amd has set a specific date now. from the red horse's mouth -
https://twitter.com/AMDRadeon/statuses/347803712930070529
disregard the red/green banter below the tweet.

since there's no radeon 9000 discussion thread...
http://www.xbitlabs.com/news/graphics/display/20130701195901_AMD_May_Unleash_Next_Generation_Radeon_HD_9000_Series_in_October.html

kabini's here
http://acer.us/ac/en/US/content/model-datasheet/NX.M81AA.014
they made it to amd's website as well:
http://products.amd.com/en-us/NotebookAPUDetail.aspx?id=66&f1=&f2=&f3=&f4=4&f5=&f6=&
http://products.amd.com/en-us/NotebookAPUDetail.aspx?id=67&f1=&f2=&f3=&f4=4&f5=&f6=&
http://products.amd.com/en-us/NotebookAPUDetail.aspx?id=85&f1=&f2=&f3=&f4=4&f5=&f6=&

 

8350rocks

Distinguished


They are actually moving forward with the HD 8XXX series nomenclature instead of 9XXX. Also, the HD 8970 is purported to be a 35% improvement over the HD 7970 GHz Edition, which should bury Nvidia pretty soundly, yet again.

The HD 8950 is supposed to be more than a 20% improvement over the outgoing HD 7970 GHz as well. The HD 8870 is supposedly a staggering 43% better than the outgoing HD 7870 (not the XT model, mind you), and the HD 8850 is supposed to best the old HD 7870 by a good 25%.

That's a pretty dramatic improvement when Nvidia more or less slapped a new heatsink/fan and updated nomenclature onto their old GTX 6XX SKUs for everything but the 780.
 
Status
Not open for further replies.