News AMD: We Stand Ready To Make Arm Chips

AMD's CFO Devinder Kumar says the company stands ready to produce Arm processors for its customers.


Curious

ARM really needs to stand as an individual company. With cloud services moving to ARM for efficiency reasons, it's vitally important that NVIDIA doesn't control this market.

While NVIDIA claims they will be neutral, there's no mention of licensing fees for competitors.

This is akin to Facebook claiming they would not force Oculus customers into the Facebook ecosystem. Well, they outright lied: several years after the acquisition, they started forcing Facebook accounts to be linked.

Nvidia also completely dropped support for PhysX hardware after they acquired the company, leaving owners out in the cold.

You can't trust a company once they have the goods.
 

zodiacfml

Distinguished
Oct 2, 2008
This is more interesting than I thought. I was expecting Arm-based products to be in the works already, considering the Apple M1, but AMD is pretty much split, given that the large gaming consoles and some handhelds are x86. After giving this a few minutes of thought, I think AMD should go for it, because the future will be graphics, and Apple doesn't have that. Apple might want to outsource its Arm-based SoCs to AMD in the future to get better graphics. Once AMD gets Apple as a customer, consoles could be converted to Arm. All of this assumes Nvidia doesn't get Arm, which is probably the reason Nvidia wants Arm so badly.
 

spongiemaster

Admirable
Dec 12, 2019
Nvidia also completely dropped support for PhysX hardware after they acquired the company, leaving owners out in the cold.

You can't trust a company once they have the goods.
I'm sure that disappointed both people who owned one of those PPUs. Did you know anyone who had one? I didn't, which isn't something I can say often. They dropped support 3 years after the acquisition, which was 6 years after the hardware was initially released. Pretty much any CUDA-capable GPU outperformed the PPU at that point. Nvidia made significant improvements to PhysX, then released the source code in 2015 before fully open-sourcing it in 2018. They just gave it all away for free. Can't trust those bastards at Nvidia.
 

ddcservices

Honorable
Oct 12, 2017
What many don't realize is that with the whole "chiplet" design, AMD could throw some ARM cores into a processor and suddenly you'd have native phone/tablet app execution on an x86-64 computer. I know that being able to run some Android apps would be welcome for the many devices that require a phone/tablet app for setup and don't have a true desktop program to do the same thing.
 

ddcservices

Honorable
Oct 12, 2017
This is more interesting than I thought. I was expecting Arm-based products to be in the works already, considering the Apple M1, but AMD is pretty much split, given that the large gaming consoles and some handhelds are x86. After giving this a few minutes of thought, I think AMD should go for it, because the future will be graphics, and Apple doesn't have that. Apple might want to outsource its Arm-based SoCs to AMD in the future to get better graphics. Once AMD gets Apple as a customer, consoles could be converted to Arm. All of this assumes Nvidia doesn't get Arm, which is probably the reason Nvidia wants Arm so badly.
The power of the M1 is that it has silicon dedicated to specific tasks, and that is why it is excellent in many ways. Picture dedicated chips for this and that feature, merge them all into "the CPU", and there you go.
 

artk2219

Distinguished
I'm sure that disappointed both people who owned one of those PPUs. Did you know anyone who had one? I didn't, which isn't something I can say often. They dropped support 3 years after the acquisition, which was 6 years after the hardware was initially released. Pretty much any CUDA-capable GPU outperformed the PPU at that point. Nvidia made significant improvements to PhysX, then released the source code in 2015 before fully open-sourcing it in 2018. They just gave it all away for free. Can't trust those bastards at Nvidia.

You can't trust them. They "gave it all away for free" 10 years later, and they basically killed any future development by absorbing the company and keeping the code closed source for years when it could have been used by other companies. You're right, though: not many people had those cards, but I knew a few people who did pick them up. It was also stupidly annoying that even if you bought an Nvidia GPU just for the PhysX capabilities, they made you jump through hoops to use it as a coprocessor. I did that with a GTS 250 I picked up for 50 bucks alongside my Radeon 6950 for a couple of years, and just about every driver update broke it.
 
What many don't realize is that with the whole "chiplet" design, AMD could throw some ARM cores into a processor and suddenly you'd have native phone/tablet app execution on an x86-64 computer. I know that being able to run some Android apps would be welcome for the many devices that require a phone/tablet app for setup and don't have a true desktop program to do the same thing.
Android apps don't require an ARM processor to run. The real problem with running Android apps natively outside of Android is that ART/Dalvik hasn't been ported to any other OS, as far as I know.

Also, Android has run on x86 devices, such as the first Asus ZenFone and Lenovo's K80.
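As a small illustration of that point, an Android device advertises every ABI it can execute through the `ro.product.cpu.abilist` system property, and on an x86 device the x86 ABIs come first. Here's a minimal NDK-side sketch; the program itself is a hypothetical example, though `__system_property_get` and the property name are standard NDK/Android facilities:

```cpp
// Print the ABIs this Android device claims it can execute.
// Build with the Android NDK; runs in any native process on Android 5.0+.
#include <sys/system_properties.h>
#include <cstdio>

int main() {
    char abilist[PROP_VALUE_MAX] = {0};
    // Typical values: "arm64-v8a,armeabi-v7a,armeabi" on an ARM phone,
    // or "x86_64,x86,..." on an x86 device (possibly with a binary translator).
    if (__system_property_get("ro.product.cpu.abilist", abilist) > 0) {
        std::printf("Supported ABIs: %s\n", abilist);
    } else {
        std::printf("Property not set (pre-Lollipop device?)\n");
    }
    return 0;
}
```

An app with native libraries just ships one .so per ABI it supports; pure Java/Kotlin apps run on whatever ISA ART was built for, which is exactly why the runtime port, not the CPU, is the sticking point.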
 

spongiemaster

Admirable
Dec 12, 2019
You can't trust them. They "gave it all away for free" 10 years later, and they basically killed any future development by absorbing the company and keeping the code closed source for years when it could have been used by other companies.

They didn't close-source the code; it was closed source when they bought it. They were under no obligation to open source technologies they purchased. How often do companies do that? Why is Nvidia held to a different standard than other companies? It's not really debatable that PhysX was never popular with developers or gamers at any point in its history, so Nvidia didn't kill anything. Before Nvidia, PhysX required the PPU you mentioned to work. That, more than anything, killed any chance of it being popular. Nvidia removing that requirement increased adoption of the technology, not the other way around. PhysX is alive and well today; it and Havok are basically the only physics engines used in gaming.
 
A dedicated PPU card was going to be dead in the water the moment NVIDIA took over anyway. Why bother supporting another piece of hardware when the stuff you already make can handle it just fine? And history tells you that in most computer systems, if some whiz-bang feature is optional, can't be shoehorned into what you already have, or doesn't provide a dramatically better result, chances are it's not going to last long.

Also, most developers run PhysX on the CPU. The number of games that actually use the GPU or PPU to accelerate it is pretty small. Keep in mind that a lot of PC games also have a console version, so they can't make PhysX run on the GPU when there isn't a compatible one to begin with. If anything, the hardware-accelerated parts of PhysX mostly just handle cosmetic effects.
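That CPU-first split is visible in the PhysX SDK itself (source available since 2015, fully open since 2018): a scene is created with a CPU dispatcher, and GPU rigid-body simulation is a separate, CUDA-only opt-in. A minimal sketch against the PhysX 4.x C++ API, just to show the shape of it (error handling omitted; exact names vary a little between SDK versions):

```cpp
// Minimal PhysX scene setup: CPU simulation is the default path,
// GPU rigid bodies are an explicit opt-in (commented out below).
#include <PxPhysicsAPI.h>
using namespace physx;

static PxDefaultAllocator gAllocator;
static PxDefaultErrorCallback gErrorCallback;

int main() {
    PxFoundation* foundation =
        PxCreateFoundation(PX_PHYSICS_VERSION, gAllocator, gErrorCallback);
    PxPhysics* physics =
        PxCreatePhysics(PX_PHYSICS_VERSION, *foundation, PxTolerancesScale());

    PxSceneDesc sceneDesc(physics->getTolerancesScale());
    sceneDesc.gravity = PxVec3(0.0f, -9.81f, 0.0f);
    sceneDesc.filterShader = PxDefaultSimulationFilterShader;
    sceneDesc.cpuDispatcher = PxDefaultCpuDispatcherCreate(2); // CPU worker threads

    // GPU rigid-body dynamics would need a CUDA context plus these opt-ins:
    // sceneDesc.cudaContextManager = myCudaContextManager;
    // sceneDesc.flags |= PxSceneFlag::eENABLE_GPU_DYNAMICS;
    // sceneDesc.broadPhaseType = PxBroadPhaseType::eGPU;

    PxScene* scene = physics->createScene(sceneDesc);
    scene->simulate(1.0f / 60.0f); // step one 60 Hz frame
    scene->fetchResults(true);     // block until the step completes

    scene->release();
    physics->release();
    foundation->release();
    return 0;
}
```

Without the commented-out lines, everything runs on the CPU dispatcher, which is how most shipped games use it.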
 

salgado18

Distinguished
Feb 12, 2007
They were under no obligation to open source technologies they purchased. How often do companies do that? Why is Nvidia held to a different standard than other companies?
You do know that those other companies include AMD, which either open-sources many of its techs or sticks to already-open ones. You can use a FreeSync monitor with any GPU thanks to that; same thing for FSR. Vulkan is a byproduct of Mantle, an AMD tech. That's the standard I'm holding Nvidia to, and they fail miserably.
It's not really debatable that PhysX was never popular with developers or gamers at any point in its history, so Nvidia didn't kill anything. Before Nvidia, PhysX required the PPU you mentioned to work. That, more than anything, killed any chance of it being popular. Nvidia removing that requirement increased adoption of the technology, not the other way around. PhysX is alive and well today; it and Havok are basically the only physics engines used in gaming.
It's alive as a CPU tech, not a GPU tech, which is what it was in the beginning.

Also, most developers run PhysX on the CPU. The number of games that actually use the GPU or PPU to accelerate it is pretty small. Keep in mind that a lot of PC games also have a console version, so they can't make PhysX run on the GPU when there isn't a compatible one to begin with. If anything, the hardware-accelerated parts of PhysX mostly just handle cosmetic effects.
You mean they don't want to. Should I bring the DLSS vs FSR debate to the table? Yes, they ported the code from the PPU to Nvidia GPUs, but when only half of your users (or less, if you count Intel GPUs) can use a certain tech, why make a game depend on it?

Destructible environments, for example, are 10 years too late because of it: only now can CPUs handle them, when GPUs could have done it easily back then. Why make a game that only Nvidia users can play? That's what killed the tech, and it was Nvidia's decision to make it so.
 

Amdlova

Distinguished
Nah, AMD is working on this Arm thing because Intel has some advanced tech in RISC-V. It will be such a pain for AMD when the x86 market goes down. If I remember correctly, Lisa said something about x86 and Ryzen being the right path... and now Arm is the new thing... lol
 
You mean they don't want to. Should I bring the DLSS vs FSR debate to the table? Yes, they ported the code from the PPU to Nvidia GPUs, but when only half of your users (or less, if you count Intel GPUs) can use a certain tech, why make a game depend on it?
The "they" I'm referring to are game developers.

Also, again: PhysX can run on the CPU. All the GPU-accelerated stuff tends to be for cosmetic features.

Destructible environments, for example, are 10 years too late because of it: only now can CPUs handle them, when GPUs could have done it easily back then.
Uh, no, they're not. Go look up the game Red Faction sometime.

The only reason why it's not used more often is because it presents design problems that developers would rather not have to deal with.
 

spongiemaster

Admirable
Dec 12, 2019
You do know that those other companies include AMD, which either open-sources many of its techs or sticks to already-open ones. You can use a FreeSync monitor with any GPU thanks to that; same thing for FSR. Vulkan is a byproduct of Mantle, an AMD tech. That's the standard I'm holding Nvidia to, and they fail miserably.

Not what I said. Nvidia bought Ageia for a rumored $150 million in 2008 to acquire the closed-source PhysX. Name any closed-source technology that AMD (or any other tech company) spent millions to acquire and then immediately turned around and open-sourced.

It's alive as a CPU tech, not a GPU tech, which is what it was in the beginning.

No, it wasn't. There's a reason the original hardware was called a PPU (physics processing unit) and not a GPU: it wasn't GPU-powered. Nvidia turned it into a GPU technology. It couldn't be exclusively GPU tech because it had to work on consoles, which didn't have the hardware to support GPU acceleration.
 
Also, I don't know of any game that requires hardware-accelerated PhysX to run, period. Or at least not one that was remotely popular.

EDIT: Also, despite all the poo-poo that NVIDIA seems to get for being closed, everyone seems to miss that NVIDIA does a crap-ton of research and publishes freely available papers. You know that FXAA feature? It was developed by an NVIDIA researcher. There was also another feature used in Gears of War 5 that, per a presentation about the game's graphical features, was inspired by an NVIDIA paper, and that's a game developed primarily for the Xbox One, an AMD-based system.

Yes, NVIDIA pushes features that only work on its hardware, but that's also what gives its hardware an edge. Yes, AMD pushes features that are more open, but guess what? They're also helping NVIDIA. At the end of the day, I only care about what features my hardware has and what performance it delivers. AMD allowing FSR to work on NVIDIA GPUs just means NVIDIA GPUs get another checkbox on the feature list.

So thanks AMD, for making my NVIDIA GPU more valuable. I'll be sure to buy more NVIDIA GPUs.
 

artk2219

Distinguished
Also, I don't know of any game that requires hardware-accelerated PhysX to run, period. Or at least not one that was remotely popular.

EDIT: Also, despite all the poo-poo that NVIDIA seems to get for being closed, everyone seems to miss that NVIDIA does a crap-ton of research and publishes freely available papers. You know that FXAA feature? It was developed by an NVIDIA researcher. There was also another feature used in Gears of War 5 that, per a presentation about the game's graphical features, was inspired by an NVIDIA paper, and that's a game developed primarily for the Xbox One, an AMD-based system.

Yes, NVIDIA pushes features that only work on its hardware, but that's also what gives its hardware an edge. Yes, AMD pushes features that are more open, but guess what? They're also helping NVIDIA. At the end of the day, I only care about what features my hardware has and what performance it delivers. AMD allowing FSR to work on NVIDIA GPUs just means NVIDIA GPUs get another checkbox on the feature list.

So thanks AMD, for making my NVIDIA GPU more valuable. I'll be sure to buy more NVIDIA GPUs.

Yeesh, dude, chill. We get that you very much like Nvidia, but while you have no issue listing Nvidia's contributions and accomplishments, you can't seem to acknowledge that other companies have also made contributions to the same space. Nvidia is not always an anti-competitive, non-community-minded company, but they have a history of being one. You've got to take the good with the bad, and they're no saints; neither is AMD, for that matter. Honestly, AMD's complete drop of support for its TeraScale chips in 2015, less than five years after the release of the last card, is pretty unforgivable. It screwed over more than a few users and stands in stark contrast to how long Nvidia supported its cards from the same period and before. Let's not even talk about Intel, or especially Apple, and their histories of bad product and customer stewardship, anti-competitive actions, and issues with supporting the community.
 

Jim90

Distinguished
Yeesh, dude, chill. We get that you very much like Nvidia, but while you have no issue listing Nvidia's contributions and accomplishments, you can't seem to acknowledge that other companies have also made contributions to the same space. Nvidia is not always an anti-competitive, non-community-minded company, but they have a history of being one. You've got to take the good with the bad, and they're no saints; neither is AMD, for that matter. Honestly, AMD's complete drop of support for its TeraScale chips in 2015, less than five years after the release of the last card, is pretty unforgivable. It screwed over more than a few users and stands in stark contrast to how long Nvidia supported its cards from the same period and before. Let's not even talk about Intel, or especially Apple, and their histories of bad product and customer stewardship, anti-competitive actions, and issues with supporting the community.

Exactly!
A simple look at the message history of certain well-known entities here shows a recurring, and damning, Nvidia bias. For the cost of a few dollars, they can pollute any social media site with their multiple accounts.
 
You do know that those other companies include AMD, which either open-sources many of its techs or sticks to already-open ones. You can use a FreeSync monitor with any GPU thanks to that; same thing for FSR. Vulkan is a byproduct of Mantle, an AMD tech. That's the standard I'm holding Nvidia to, and they fail miserably.

It's alive as a CPU tech, not a GPU tech, which is what it was in the beginning.

You mean they don't want to. Should I bring the DLSS vs FSR debate to the table? Yes, they ported the code from the PPU to Nvidia GPUs, but when only half of your users (or less, if you count Intel GPUs) can use a certain tech, why make a game depend on it?

Destructible environments, for example, are 10 years too late because of it: only now can CPUs handle them, when GPUs could have done it easily back then. Why make a game that only Nvidia users can play? That's what killed the tech, and it was Nvidia's decision to make it so.

AMD most often chooses the open source route because it wants to save money. And some of its other tech, like Mantle, was actually quite controversial as well. Mantle never reached its 2.0 version, most likely because some developers who were part of AMD's Mantle program didn't really agree with how AMD wanted to handle it.

Nvidia killed GPU physics? Nvidia actually killed nothing, because game developers were never interested in the feature in the first place. Have you ever heard of Bullet Physics? It's pretty much the open source version of PhysX, because that engine is also capable of GPU-accelerated physics using OpenCL. It was even endorsed by AMD back in 2009. Back then, AMD said games using Bullet would be coming out in a year or so; they even called it a PhysX killer. But after more than a year we heard nothing, nor did we see any game hinting that it would use the engine. When AMD was asked about it, they said it plainly: game developers are not interested in the feature (even when an open source solution exists).

PhysX would be dead today if Nvidia hadn't acquired Ageia. The PPU idea? It was DOA from the very beginning: you needed to buy a PPU for the very few games that could take advantage of the hardware, and back then the majority of games were using Havok.
 
AWS is going all in on its Graviton processors, so at least on the (cloud) server side there is a significant ARM option available.

AFAIK, AMD was supposed to make the ARM processor Amazon was going to use, but I heard Amazon wasn't impressed with the performance, hence they did it themselves. The end result was Graviton.
 
Curious

ARM really needs to stand as an individual company. With cloud services moving to ARM for efficiency reasons, it's vitally important that NVIDIA doesn't control this market.

While NVIDIA claims they will be neutral, there's no mention of licensing fees for competitors.

This is akin to Facebook claiming they would not force Oculus customers into the Facebook ecosystem. Well, they outright lied: several years after the acquisition, they started forcing Facebook accounts to be linked.

Nvidia also completely dropped support for PhysX hardware after they acquired the company, leaving owners out in the cold.

You can't trust a company once they have the goods.

Licensing fees will probably be less of an issue. The way I heard it, ARM and SoftBank decided to part ways because ARM management did not agree with SoftBank about raising licensing fees; ARM knows that's a very good way to make people look for alternatives. Big players like Qualcomm are most likely more worried about Nvidia creating an ecosystem that makes it hard for them to compete in the same space. Qualcomm should understand this well, because that's how it dominates the smartphone market: by creating an ecosystem people are forced to use. And in Qualcomm's case, they're probably also worried that Nvidia wants some revenge for what happened back when Nvidia was still in the mobile market.
 

lastguytom

Commendable
Nov 23, 2018
It's fantastic that AMD is getting ready for when Arm replaces x86 as the main PC CPU. AMD needs to get some more fab companies on board, since rumors have Intel and Nvidia buying up 5 nm and 3 nm capacity, thus cutting off or slowing down AMD's roadmap of CPU and GPU products. The AMD Ryzen and Navi train is being cut off at the pass. See Tom's Hardware, where AMD promises more supply in 2022. Too bad it's not gamers they're promising it to, but server, laptop, and crypto mining customers.