News Nvidia and AMD to Develop Arm CPUs for Client PCs: Report


bit_user

Titan
Ambassador
If the ARM instruction set wasn't so annoying I'd root for them. I mean, come on. Who really needs a barrel shifter anyway?
That's actually really useful for address arithmetic. For instance, if you have an index into an array of ints, floats, doubles, NEON vectors, etc., then you need a fast, convenient way to do power-of-two multiplies (i.e., a left shift).
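
To make that concrete, here's a minimal sketch; the AArch64 instructions in the comments are just typical of what a compiler might emit, not guaranteed output:

```c
#include <stdint.h>

/* Indexing an array is a multiply by the element size; for power-of-two
 * sizes that multiply is just a left shift. On AArch64, the barrel shifter
 * lets a compiler fold the shift straight into the load or the add, e.g.
 * roughly:
 *     ldr w0, [x0, x1, lsl #2]   // 4-byte elements: addr = base + (i << 2)
 *     add x0, x0, x1, lsl #3     // 8-byte elements: addr = base + (i << 3)
 * Exact codegen varies by compiler and flags. */
int32_t get_int(const int32_t *arr, uint64_t i)
{
    return arr[i];
}

double get_dbl(const double *arr, uint64_t i)
{
    return arr[i];
}
```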


The fact that it can do rotations is a little less useful, but why not? I'm sure there are plenty of tricks you can do with them, and it's not like x86 doesn't have its own rotate instructions.
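
For example, rotates show up all over hashing and crypto; here's a hypothetical little mixing step in that spirit (loosely modeled on common non-cryptographic hashes, not any specific library):

```c
#include <stdint.h>

/* Portable rotate-left idiom; GCC and Clang recognize this pattern and
 * typically compile it down to a single rotate instruction. */
static inline uint32_t rotl32(uint32_t x, unsigned r)
{
    return (x << r) | (x >> ((32 - r) & 31));   /* r in 0..31 */
}

/* Toy hash-mixing step: multiply, rotate, multiply-add. The rotate spreads
 * the high bits of the product back into the low bits. */
static inline uint32_t mix(uint32_t h, uint32_t v)
{
    h ^= v * 0x9E3779B1u;       /* 32-bit golden-ratio constant */
    h  = rotl32(h, 13);
    return h * 5u + 0xE6546B64u;
}
```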

I still think Intel and AMD would be far better off making a completely stripped x86 instruction set that only had the instructions necessary for the 90%.
At that point, why not just adopt a standard ISA, like RISC-V?

Furthermore, you say "Intel and AMD", but what are the chances they would partner up on such a project? Even if they did, I think the computing world wants more than 2 choices (and sometimes in the past, only 1 choice) of CPU vendors. If people are going to move away from x86, it will be to a more vibrant ecosystem with more players. ARM and RISC-V currently embody the best such options.
 
  • Like
Reactions: NinoPino

bit_user

Titan
Ambassador
Tell that to the average customer who isn't well informed on what's available on the market or has a key piece of software for their workflow that they can't or won't replace for whatever reason.
You keep looking for exceptions, rather than at the majority of users. Nobody is saying x86 will go away any time soon. It will continue to be a viable option, for those who need it. However, most just don't - especially with decent x86 emulation.

Also, I don't buy the argument that the performance hit of that emulation is such a deal-breaker. If the software is modern, there will be an ARM-native version of it. So, most of what people will run under emulation are old apps that were designed for far slower PCs. In that case, running them under emulation will still be many times faster than the PCs being made when the app was published.

It's usually smaller, proprietary software that the devs don't support anymore, and they have expensive machinery that depends on that software to run.
And probably nothing exotic or performance-intensive, meaning it should run fine under emulation.

Now, where you talk about "machine control" is where I'd draw the line - you want to run that on the supported hardware, only. But, that's such a tiny niche of the market that it really isn't going to have much influence in what the mainstream does. Heck, I'll bet there are still a handful of old DEC VAX machines still running some ancient accounting software written in SNOBOL67 or APL, but that didn't stop x86 from dominating the computing market, for the past 2+ decades!
 
Last edited:
  • Like
Reactions: P.Amini

bit_user

Titan
Ambassador
Disagree that software optimization is the most important? Definitely not. Those new instructions won't be magically used by old software.
But they will be magically used by code that's either JIT-compiled (i.e. the universe of web apps, Java, C#, Python, etc.) or recompiled to run on APX-enabled processors.

Pretty much the only definite advantage of the ARM ISA is its simpler instruction decoding logic, and APX, sadly, has nothing to do with that. Sure, more registers are nice, but how much that affects performance in modern superscalar cores is hard to tell. The 2-op vs. 3-op arithmetic issue is already kind of solved with almost-free mov.
APX doesn't bring much to the table that ARM doesn't already have. Basically the main thing it does is to close the gap with the ARM ISA, which it doesn't do entirely.

And yet, Intel estimated it provides real benefits. So, that should tell you there are some real advantages to ARM's ISA. In Intel's own words:

"The extensions are designed to provide efficient performance gains across a variety of workloads – without significantly increasing silicon area or power consumption of the core."

"Intel® APX doubles the number of general-purpose registers (GPRs) from 16 to 32. This allows the compiler to keep more values in registers; as a result, APX-compiled code contains 10% fewer loads and more than 20% fewer stores"

"... three-operand instructions ... reducing the need for extra register move instructions. ... there are 10% fewer instructions in APX-compiled code"

Source: https://www.intel.com/content/www/u...ical/advanced-performance-extensions-apx.html
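
To make the "fewer register moves" point concrete, here's a minimal hedged sketch; the assembly in the comments is illustrative of the idea only, not actual Intel codegen or final APX syntax:

```c
/* Keeping both inputs live after a destructive two-operand op costs an
 * extra copy on classic x86-64; a non-destructive three-operand form (as
 * APX adds) does not. Registers and syntax below are purely illustrative. */
long diff_then_product(long a, long b, long *out_diff)
{
    long d = a - b;   /* classic x86-64 ('sub' clobbers its destination):
                       *     mov rax, rdi
                       *     sub rax, rsi
                       * hypothetical three-operand form:
                       *     sub rax, rdi, rsi
                       */
    *out_diff = d;
    return a * b;     /* a and b are still needed here, so they could not
                       * be clobbered above. Likewise, with only 16 GPRs a
                       * function with many live values starts spilling to
                       * the stack; 32 GPRs push that point out, which is
                       * what Intel's fewer-loads/stores estimate is about. */
}
```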
 

Kamen Rider Blade

Distinguished
Dec 2, 2013
1,420
944
20,060
You keep looking for exceptions, rather than at the majority of users. Nobody is saying x86 will go away any time soon. It will continue to be a viable option, for those who need it. However, most just don't - especially with decent x86 emulation.

Also, I don't buy the argument that the performance hit of that emulation is such a deal-breaker. If the software is modern, there will be an ARM-native version of it. So, most of what people will run under emulation are old apps that were designed for far slower PCs. In that case, running them under emulation will still be many times faster than the PCs being made when the app was published.


And probably nothing exotic or performance-intensive, meaning it should run fine under emulation.

Now, where you talk about "machine control" is where I'd draw the line - you want to run that on the supported hardware, only. But, that's such a tiny niche of the market that it really isn't going to have much influence in what the mainstream does. Heck, I'll bet there are still a handful of old DEC VAX machines still running some ancient accounting software written in SNOBOL67 or APL, but that didn't stop x86 from dominating the computing market, for the past 2+ decades!
We'll see soon enough if Windows on ARM goes anywhere.

I'm hoping it FAILS miserably and x86 comes in and makes them keep on looking stupid.

x86 is 4 Life!
 

setx

Distinguished
Dec 10, 2014
263
233
19,060
And yet, Intel estimated it provides real benefits. So, that should tell you there are some real advantages to ARM's ISA. In Intel's own words:
Uhm, how about reading my post before trying to tell me things that I've already addressed?

They want to compete in the laptop space with Apple, quite possibly.
You have absolutely no idea what you are writing. AMD is a hardware company; Apple is a full-stack religious sect that also produces hardware for itself. They have no direct competition at all. AMD isn't going to make laptops. And even if AMD decides to make ARM CPUs, they would compete with Qualcomm, MediaTek, etc., not Apple.
There is another case to be made for AI acceleration on ARM since it's so much less power consumptive than x86 at equivalent implementation scales.
AI acceleration on ARM also makes no sense at all. For such tasks it's far more efficient to use dedicated accelerators or special instruction extensions like Intel AMX.

Also, proof can't be empirical. What a joke.
 

bit_user

Titan
Ambassador
Uhm, how about reading my post before trying to tell me things that I've already addressed?
I did read your post. I was showing you where Intel directly contradicts you. You think I'm going to believe your gut feel over Intel's experimental results? Wow, such arrogance!

If Intel's researchers didn't have good evidence supporting APX's benefits, do you think they'd go to so much trouble to introduce it? I don't.

AMD is a hardware company; Apple is a full-stack religious sect that also produces hardware for itself. They have no direct competition at all. AMD isn't going to make laptops. And even if AMD decides to make ARM CPUs, they would compete with Qualcomm, MediaTek, etc., not Apple.
AMD's customers compete with Apple, and they have been losing market share to Apple in recent years. They do need something to hit back with.
 
The article gives a good reason. If Microsoft has already decided on using ARM for the next XBox, then maybe that was enough to push AMD into the game. Then, they figured they might just go ahead and also put it in some mini-PCs or something, because "why not?"

If the Windows-on-ARM market fails to materialize, they'll still have XBox as a way to recoup their design costs and can simply cancel the desktop version.
IF MS decided to go with ARM (for their main console and not for a play-anywhere reduced side console), they still didn't decide that they'll stick with AMD, so why should they?! Anybody would weigh their options and go with the best one. AMD's first attempt at an ARM CPU will most definitely not be the best option available to MS, probably not even Nvidia's first offering, unless the GPU plays a much bigger role than the CPU parts.
Also, the main reason the consoles switched to x86 in the first place was that devs didn't want to deal with porting anymore, so switching architectures again would antagonize all the people making them money.

But the most important part is that AMD barely makes any money from the console APUs as it is; now they would have to pay a license, pay for a new design, set up a new production line, and so on. It would definitely be a big loss for them.

ARM desktop doesn't need Windows to succeed; if it can run a browser, it can do everything 90% of that user base will ever need to do.
Hence why the Chromebook came about.
 
  • Like
Reactions: cyrusfox

NinoPino

Respectable
May 26, 2022
487
303
2,060
IF MS decided to go with ARM (for their main console and not for a play-anywhere reduced side console), they still didn't decide that they'll stick with AMD, so why should they?! Anybody would weigh their options and go with the best one,...
It seems that AMD has done a very good job keeping two little customers like Microsoft and Sony for 2+ generations of the top consoles on the market. I bet there will be a lot of good reasons to continue this relationship.

AMD's first attempt at an ARM CPU will most definitely not be the best option available to MS, probably not even Nvidia's
AMD is not a newcomer to CPU design; it is probably second only to Intel and IBM. Add that x86 CPUs are actually RISC cores with an additional layer of translation; they haven't been pure CISC for many years now. I think the first implementation could be very good.
I'm surprised AMD and Intel haven't made this move before.

first offering, unless the GPU plays a much bigger role than the CPU parts.
And in the current scenario, that is exactly the case.

Also, the main reason the consoles switched to x86 in the first place was that devs didn't want to deal with porting anymore, so switching architectures again would antagonize all the people making them money.
Not only that; at the time there was no performant alternative. The last one was IBM/Motorola with POWER/Cell, and then they too left that CPU business.
But the most important part is that AMD barely makes any money from the console APUs ...
Are you serious?
...ARM desktop doesn't need Windows to succeed; if it can run a browser, it can do everything 90% of that user base will ever need to do.
Hence why the Chromebook came about.
Windows desktop is also Adobe, Corel, AutoCAD, SolidWorks, Rhinoceros, Catia, an endless amount of custom software for companies all around the world, games, ERPs, audio workstations, and old software that needs particular drivers/hardware to work, just to give some examples.
 
  • Like
Reactions: bit_user

Bazzy 505

Reputable
Jul 17, 2021
344
124
4,940
The main reason behind ARM's power efficiency is that it doesn't need to carry 40 years' worth of legacy baggage that Intel never had the courage to drop, because doing so would break compatibility with a lot of obscure stone-age business applications still chugging along in dark corners with their customers.

I doubt Windows on ARM will make a noteworthy splash in the foreseeable future. But it may be just the gentle kick Intel needs to declutter its ISA. I mean, it's ridiculous that we still boot in 16-bit mode from the get-go across the entire 64-bit CPU product stack.
 
Last edited:

NinoPino

Respectable
May 26, 2022
487
303
2,060
The main reason behind ARM's power efficiency is that it doesn't need to carry 40 years' worth of legacy baggage that Intel never had the courage to drop, because doing so would break compatibility with a lot of obscure stone-age business applications still chugging along in dark corners with their customers.
You were too kind; the reason is not the legacy but, more simply, the fact that the x86 ISA is ugly rubbish. Consider that ARM is also going on 40: only 7 years separate the original ARM design from x86.

I doubt Windows on ARM will make a noteworthy splash in the foreseeable future. But it may be just the gentle kick Intel needs to declutter its ISA. I mean, it's ridiculous that we still boot in 16-bit mode from the get-go across the entire 64-bit CPU product stack.
Intel has only two possibilities, ARM or RISC-V, and the latter is a big risk at the moment.
 
  • Like
Reactions: bit_user

bit_user

Titan
Ambassador
IF MS decided to go with ARM (for their main console and not for a play-anywhere reduced side console), they still didn't decide that they'll stick with AMD, so why should they?!
Yes, and I suppose that would put Nvidia back in the running.

Also, the main reason the consoles switched to x86 in the first place was that devs didn't want to deal with porting anymore, so switching architectures again would antagonize all the people making them money.
PowerPC was at a dead end. The only reason you'd port to it was for the consoles. ARM, on the other hand, currently has a lock on the phone market and is used in the Nintendo Switch. So, a lot of game engines will already have support & optimizations for ARM. This puts it well ahead of RISC-V as an option for the console market.

Also, Microsoft is probably thinking that pushing console games to target ARM will translate over to a lot of games being available for Windows/ARM.

But the most important part is that AMD barely makes any money from the console APUs as it is; now they would have to pay a license, pay for a new design, set up a new production line, and so on. It would definitely be a big loss for them.
I think the fees for an architectural license are a lot lower than licensing entire core IP from ARM, and the former is the type of license AMD would use to design its own ARM cores. Obviously, any such costs will be passed through to Microsoft, if they're non-trivial. If Microsoft has decided to use ARM, then they will have run the numbers and made sure the math works out for them.

ARM desktop doesn't need Windows to succeed; if it can run a browser, it can do everything 90% of that user base will ever need to do.
Hence why the Chromebook came about.
Yeah, but I hear the gaming scene on Android isn't great. You have a chicken-and-egg problem, where nobody builds a true gaming-class ARM SoC (and no, I don't count the Switch - its SoC was never dGPU-caliber), because there are no games. XBox/ARM could change all of that.
 

pug_s

Distinguished
Mar 26, 2003
482
76
18,940
Well, Windows-on-ARM has been a thing for several years, at least. Did you know that Windows NT once ran on MIPS, DEC Alpha, PowerPC, IA64, and perhaps others? Some of those are big-endian, even! Plus, Windows Mobile/Phone obviously runs on plenty of ARM SoCs.

Microsoft does know a few things about porting to other ISAs!


Why is it less portable? Java was designed to be super-portable, by using an intermediate bytecode representation, instead of natively compiling programs for a single ISA. With C# and .Net, Microsoft copied that idea. So, I'll bet a lot of Windows apps just run on ARM with no porting necessary.


Uh... you lost me. Can you rephrase that, please?
I know that Windows NT 4 supported those 4 architectures, and that's not the difficult part. Beyond the base OS working, you can't get third-party software that isn't compiled for that architecture. I'm not talking about Chrome or Firefox, but there's bread-and-butter software that many companies use which isn't compiled or optimized for anything other than x86.
 

bit_user

Titan
Ambassador
Add that x86 CPUs are actually RISC cores with an additional layer of translation; they haven't been pure CISC for many years now. I think the first implementation could be very good.
I'm surprised AMD and Intel haven't made this move before.
They did, but cancelled it.

You have to keep in mind that AMD was on the verge of bankruptcy, at the time. They probably didn't have enough money to put a full effort behind both the K12 and Zen. Given the circumstances, I think going all-in on Zen was the right choice, since the ARM server market wasn't nearly as mature back then.

Anyway, this is why AMD already has an ARM architecture license. Nvidia has one from designing their Project Denver cores.

Edit: at the end of the above article, there's a link to this one, published 2 years ago, containing statements from their CFO about working with ARM:

“... we have a very good relationship with ARM. And we understand that our customers want to work with us with that particular product to deliver the solutions. We stand ready to go ahead and do that even though it's not x86, although we believe x86 is a dominant strength in that area."

 
Last edited:

bit_user

Titan
Ambassador
The main reason behind ARM's power efficiency is that it doesn't need to carry 40 years' worth of legacy baggage that Intel never had the courage to drop, because doing so would break compatibility with a lot of obscure stone-age business applications still chugging along in dark corners with their customers.
Hmm... there have been a few, but most were part of optional extensions.

I mean, it's ridiculous that we still boot in 16-bit mode from the get-go across the entire 64-bit CPU product stack.
It's almost odd that you would say that...

 

setx

Distinguished
Dec 10, 2014
263
233
19,060
You think I'm going to believe your gut feel over Intel's experimental results? Wow, such arrogance!
Feel free to believe all the marketing ads. I'd rather see the real independent tests first.
If Intel's researchers didn't have good evidence supporting APX's benefits, do you think they'd go to so much trouble to introduce it? I don't.
I don't doubt that APX is good. The questions are how good it is and how much impact it would have on real software.

Here is a real example of how much new instructions can help performance and how long it takes to actually get them into software, in exactly the Python ecosystem you mentioned: https://www.phoronix.com/news/Intel-AVX-512-Quicksort-Numpy That's just 7.5 years from the first hardware being available.

Also, no one can guarantee that Intel won't do something stupid again like limiting APX to Xeons only like they do with TSX now. And look at that TSX: it's a real groundbreaking technology (unlike just incremental benefits of APX) that almost no one had/has (sans POWER) and it's used how much? Sadly, almost nothing uses it to the point that almost no one noticed Intel removing it in client CPUs.
AMD's customers compete with Apple, and they have been losing market share to Apple in recent years. They do need something to hit back with.
Yes, AMD's customers do compete with Apple. But do they want to pay AMD a lot of money for an ARM chip some time in the future, instead of paying, say, Qualcomm? That's a good question. If they do, then AMD surely will make an ARM CPU.

Intel has only two possibilities, ARM or RISC-V, and the latter is a big risk at the moment.
The former is actually a far bigger risk nowadays: licensing risk. Just look at Qualcomm. Pretty much no one doubts the bright future of RISC-V; the only question is when.
 
PowerPC was at a dead end. The only reason you'd port to it was for the consoles. ARM, on the other hand, currently has a lock on the phone market and is used in the Nintendo Switch. So, a lot of game engines will already have support & optimizations for ARM. This puts it well ahead of RISC-V as an option for the console market.
How much worse could ARM, at that time, have been than the Jaguar cores?! Serious question, because I don't know, but my guess is it couldn't have been that far off, since the Jaguar cores were pretty bad.
You say it yourself later: gaming isn't that big on ARM, so will a current ARM CPU be able to come close to the Ryzen CPUs they use now?
Also, Microsoft is probably thinking that pushing console games to target ARM will translate over to a lot of games being available for Windows/ARM.
Cloud gaming will be a thing. As I already said, I think the new console MS is making, the one that will be using ARM, will be a streaming console, either using the actual Xbox to play the games or even streaming them over the net.
Just like the portable that Sony just introduced...
Obviously, any such costs will be passed through to Microsoft, if they're non-trivial. If Microsoft has decided to use ARM, then they will have run the numbers and made sure the math works out for them.
If MS needs to keep AMD for the GPU or whatever, then yes; but if they can go with any other supplier because they just need a normal ARM CPU, then yeah, they will crunch the numbers and AMD will most probably lose that comparison.
Yeah, but I hear the gaming scene on Android isn't great. You have a chicken-and-egg problem, where nobody builds a true gaming-class ARM SoC (and no, I don't count the Switch - its SoC was never dGPU-caliber), because there are no games. XBox/ARM could change all of that.
When I said "that user base" I was talking about the group of people who would buy such a super-low-cost desktop; they wouldn't care much about gaming, and as long as there are some simple time-wasters on there, they'd be happy.

In my opinion, nobody builds a true gaming-class ARM SoC because it would draw the same amount of power as an x86 one would, making the whole thing pointless.
 

NinoPino

Respectable
May 26, 2022
487
303
2,060
These same types of concerns were echoed for 64-bit only operating systems and it turned out to be a tiny fraction of the overall base (like 5 or 6% if I recall). If that software is new enough to run on 64 bit only then I don't think there will be any issues.
64-bit Windows can run 32-bit software.

... Those users can stick with x86 or use emulation/translation.
This is the solution, but it slows down the transition.
 
  • Like
Reactions: bit_user

JamesJones44

Reputable
Jan 22, 2021
856
790
5,760
LOL - nearly a Freudian slip, there!
Maybe it was subconscious! LOL

Didn't the M2 add some hardware optimizations to close the gap even further, or were those already present in the M1?
I've seen rumors of this, and benchmarks of Rosetta 2 are generally much improved on the M2, but I've never been able to confirm whether Apple made changes specifically for Rosetta or if it's just because the M2 has a larger L2 that is able to cache more translated instructions.
 

JamesJones44

Reputable
Jan 22, 2021
856
790
5,760
We'll see soon enough if Windows on ARM goes anywhere.

I'm hoping it FAILS miserably and x86 comes in and makes them keep on looking stupid.

x86 is 4 Life!
Why though? Competition is much better than having only two vendors.

This isn't 2010, when Microsoft was forcing you to choose one or the other because the tooling didn't exist to easily support two architectures. These days the tooling exists to support both ARM and x86 simultaneously. It helps with competition, which is sorely needed in the desktop/laptop/server spaces.
 
  • Like
Reactions: bit_user
Why though? Competition is much better than having only two vendors.

This isn't 2010, when Microsoft was forcing you to choose one or the other because the tooling didn't exist to easily support two architectures. These days the tooling exists to support both ARM and x86 simultaneously. It helps with competition, which is sorely needed in the desktop/laptop/server spaces.
Windows being available for smartphones did not change anything for competition and it won't in the future either...
ARM has a place in the market and always will, but it's not a competitor to x86 except at the very, very low end.
 

bit_user

Titan
Ambassador
Feel free to believe all the marketing ads. I'd rather see the real independent tests first.
Yes, I will believe Intel's claims, because these seem very credible:
  • "APX-compiled code contains 10% fewer loads and more than 20% fewer stores"
  • "there are 10% fewer instructions in APX-compiled code"

The thing you're missing about move-elimination is that mov instructions still have costs, which correlate to that 10% figure:
  • Wasting memory bandwidth, since they have to be fetched from DRAM.
  • Wasting space in the instruction cache.
  • Wasting instruction decoder bandwidth.

That's why it's more beneficial not to have them in the first place, even though you can mitigate their cost through register renaming. The further upstream you can eliminate an instruction, the better.

I don't doubt that APX is good. The questions are how good it is and how much impact it would have on real software.
That's tested easily enough. You can do a simple experiment by restricting a compiler from using certain features on an existing CPU which has them, but you can also model your hypothetical CPU in a simulator and update the compiler to match. I'm certain the latter is how they arrived at the above numbers, because it's standard practice to have a cycle-accurate software simulation of modern CPUs (usually written in a language like C++), before you build them. That should be easy enough to modify, as should a compiler like GCC or LLVM.
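
As a crude at-home version of the first experiment, you can forbid GCC from using a few GPRs and compare the generated assembly; this assumes GCC's generic -ffixed-<reg> option accepts the x86-64 register names, and it only approximates the effect in reverse (fewer registers rather than more):

```c
/* register_pressure_probe.c
 *
 * Suggested (hypothetical) comparison:
 *   gcc -O2 -S register_pressure_probe.c -o baseline.s
 *   gcc -O2 -S -ffixed-r12 -ffixed-r13 -ffixed-r14 -ffixed-r15 \
 *       register_pressure_probe.c -o fewer_regs.s
 * then count the stack loads/stores (spill code) in each .s file.
 */

/* Deliberately keeps many values live at once to create register pressure. */
long many_live_values(const long *v)
{
    long a = v[0], b = v[1], c = v[2], d = v[3];
    long e = v[4], f = v[5], g = v[6], h = v[7];
    long i = v[8], j = v[9], k = v[10], l = v[11];

    long x = (a + b) * (c + d) - (e + f) * (g + h);
    long y = (i + j) * (k + l) - (a + c) * (e + g);
    return x ^ y;
}
```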

Here is a real example of how much new instructions can help performance and how long it takes to actually get them into software, in exactly the Python ecosystem you mentioned: https://www.phoronix.com/news/Intel-AVX-512-Quicksort-Numpy That's just 7.5 years from the first hardware being available.
That's a terrible analogy, because they had to hand-code those optimized routines. For the kind of enhancements that Intel is adding via APX, all you have to do is flip a switch and the compiler automatically utilizes them.

Also, no one can guarantee that Intel won't do something stupid again like limiting APX to Xeons only like they do with TSX now.
I thought they cancelled TSX. Previously, it was available on Skylake-era client CPUs. They subsequently released microcode patches that disabled it for security reasons.

And look at that TSX: it's a real groundbreaking technology (unlike just incremental benefits of APX) that almost no one had/has
Heh, ARM has similar functionality in TME (Transactional Memory Extensions), which is included in ARMv8.5-A and ARMv9-A

So, if you like that, you should welcome our new ARM overlords!
; )

Sadly, almost nothing uses it to the point that almost no one noticed Intel removing it in client CPUs.
glibc used it to optimize pthreads mutexes and maybe some other stuff. Some databases used it. Maybe the kernel used it for futexes? I was sad to see it go.
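
For anyone curious what that lock elision looks like in practice, here's a rough, hypothetical sketch of the RTM pattern around a toy spinlock, using the standard <immintrin.h> intrinsics (glibc's real elision code is adaptive and far more careful):

```c
#include <immintrin.h>   /* RTM intrinsics: _xbegin, _xend, _xabort       */
#include <stdatomic.h>   /* plain C11 atomics for the fallback spinlock   */

/* Hypothetical, stripped-down TSX lock elision sketch.
 * Compile with something like: gcc -O2 -mrtm elision.c */

static atomic_int lock_word;   /* 0 = free, 1 = held */
static long counter;

static void lock_spin(void)   { while (atomic_exchange(&lock_word, 1)) ; }
static void unlock_spin(void) { atomic_store(&lock_word, 0); }

static void elided_increment(void)
{
    if (_xbegin() == _XBEGIN_STARTED) {
        /* Transactional path: only *read* the lock word. If another
         * thread holds (or later takes) the lock, the conflict aborts
         * the transaction and we fall through to the locked path. */
        if (atomic_load(&lock_word) != 0)
            _xabort(0xff);

        counter++;       /* critical section, committed atomically */
        _xend();
        return;
    }

    /* Abort, conflict, or no RTM support: take the lock for real. */
    lock_spin();
    counter++;
    unlock_spin();
}
```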

The former is actually a far bigger risk nowadays: licensing risk. Just look at Qualcomm. Pretty much no one doubts the bright future of RISC-V; the only question is when.
Again, you're assuming (incorrectly, I might add) that AMD and Nvidia didn't have architectural licenses, before ARM started trying to extort Qualcomm! AMD definitely had one, since it would've been needed for designing the K12, which they were rumored to have even considered releasing as late as 2020. Nvidia had one from its Denver cores, and I remember reading they renewed it in the past couple years. I've read those licenses are good for a decade or so.
 

bit_user

Titan
Ambassador
You say it yourself later: gaming isn't that big on ARM, so will a current ARM CPU be able to come close to the Ryzen CPUs they use now?
I said high-end gaming isn't big on Android. The problem there isn't ARM, but rather the GPUs included in ARM-based SoCs. No one makes one with a high-end GPU, and that's because there's no market for high-end games on Android. Classic chicken-and-egg problem.

In my opinion, nobody builds a true gaming-class ARM SoC because it would draw the same amount of power as an x86 one would, making the whole thing pointless.
The GPU portion would draw the same amount of power, but the ARM ISA enables the CPU cores to be more efficient. That could free up more power for the GPU portion or just provide cost savings due to lowering the cooling & power requirements.
 