Nvidia confirms the use of dozens of RISC-V cores in its GPUs, which have replaced the older Falcon microcontrollers.
Nvidia to ship a billion RISC-V cores in 2024 : Read more
Why would you want less competition in the market? A lack of competition leads to stagnation; imagine if Intel ruled x86 alone. You'd still be paying Nvidia-type prices for consumer-level CPUs, or rather old Intel-type prices. One ring to rule them all, and in the "darkness" bind them.

The sooner we throw ARM out and replace it with RISC-V, the better. Deny them any financial stream via royalties.
RISC-V isn't alone; there are literally thousands upon thousands of developers within RISC-V.
The more the merrier, and ultimately it may force ARM to change its approach; competition drives innovation, both cost-wise and in terms of derived solutions. I don't know about you, but the hegemony of CUDA dominating the AI arena hasn't been good for my wallet personally. Despite ARM's licensing practices, their push has driven key efficiency and innovation improvements that both Intel and AMD have had to address.
And we have x86 and OpenPOWER as competing ISAs.
Is there any way you can start using ROCm instead of CUDA?
Could people and companies? Probably. Will they? Almost certainly not. As expensive as Hopper and RTX cards are, they're still a lot cheaper than paying engineers to learn ROCm, translate all the old CUDA code, debug it all again, and make no progress on improving their software while doing that.
Is there any way for Intel to revive IA-64 and the Itanium?