News Nvidia's upcoming ARM-based N1X SoC leaks again, this time on FurMark — modest benchmark score indicates early engineering sample but confirms Wind...

The article said:
That's not even as good as some RTX 2060 scores, despite the N1X reportedly featuring 6,144 CUDA cores, more than the RTX 5070.
...
Running at a modest 120W power budget ...
There's your answer - or, at least a big part of it. Another big part might be the difference in memory bandwidth.
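The bandwidth gap can be sketched with rough arithmetic. The figures below are assumptions from public specs, not from the article: DGX Spark's LPDDR5X is reportedly a 256-bit bus at 8533 MT/s, while the RTX 5070 uses GDDR7 on a 192-bit bus at 28 Gbps.

```python
# Rough peak-bandwidth comparison (assumed figures, not from the article).
# Peak GB/s = (bus width in bytes) * (transfer rate in GT/s).

def peak_bandwidth_gbs(bus_width_bits: int, transfer_rate_gtps: float) -> float:
    """Peak memory bandwidth in GB/s for a given bus width and transfer rate."""
    return bus_width_bits / 8 * transfer_rate_gtps

# Assumed: N1X uses LPDDR5X like DGX Spark (256-bit @ 8.533 GT/s)
lpddr5x = peak_bandwidth_gbs(256, 8.533)
# Assumed: RTX 5070 GDDR7 (192-bit @ 28 GT/s)
gddr7 = peak_bandwidth_gbs(192, 28.0)

print(f"LPDDR5X ~{lpddr5x:.0f} GB/s vs GDDR7 ~{gddr7:.0f} GB/s "
      f"(~{gddr7 / lpddr5x:.1f}x difference)")
```

If those assumptions hold, the discrete card has roughly 2.5x the memory bandwidth, which alone could explain a big chunk of the benchmark gap.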

I found a claim that DGX Spark runs at 170 W, so that's probably the upper end of what the N1X would use.
Color me less than impressed... this was originally supposed to be a 2025 product. Now what, mid-to-late 2026?

Although, the Strix Halo successor will probably be a year later than this, so it will probably sell.
>...CUDA support for it and laptops running it with Linux.

N1X will run on Linux and WoA.

https://www.laptopmag.com/laptops/windows-laptops/nvidia-n1x-apu-benchmarks

"The Nvidia N1x APU was tested on an HP unit based on the motherboard specification for an HP 8EA3. The test unit was running Linux AArch64 and featured 128GB of system memory, confirming the N1x as a 20-thread, 2.81 GHz ARM processor."

CUDA is supported on Linux.



>Strix Halo successor will probably be a year later than this, so it will probably sell.

Medusa Halo is rumored to be cancelled. Judging from Strix Halo's lack of design wins, not a surprise.

https://www.techpowerup.com/339056/...resh-amd-cancels-medusa-halo-in-latest-rumors
CUDA requires drivers for each specific core architecture. Just because it has CUDA cores doesn't mean it will support the CUDA software stack out of the box. None of the news I've skimmed on the internet had any info about that, so we'll wait and see "soon" :)
>CUDA requires drivers for each specific core architecture. Just because it has CUDA cores doesn't mean it will support the CUDA software stack out of the box. None of the news I've skimmed on the internet had any info about that, so we'll wait and see "soon" :)
Everything Nvidia has shipped in the last 15+ years has supported CUDA, even the handful of graphics cards that didn't officially support it (the GT 1030 being one example, I think). That includes all of their Jetson boards.

So the chance that they won't support CUDA on these is basically nil. However, you're perhaps aware that a lot of the deep learning horsepower is probably contained in NVDLA blocks, and those aren't regular CUDA devices, as far as I know. They should still be supported via TensorRT, and probably some other deep learning frameworks, but they might not be open for you to program directly.