News UALink has Nvidia's NVLink in the crosshairs — final specs support up to 1,024 GPUs with 200 GT/s bandwidth
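For a rough sense of what the headline's 200 GT/s figure could mean per link, here's a back-of-the-envelope sketch. Only the 200 GT/s rate comes from the article; the bits-per-transfer, lane count, and the decision to ignore encoding/protocol overhead are illustrative assumptions, not spec values.

```python
# Back-of-the-envelope UALink link bandwidth.
# 200 GT/s is from the headline; everything else is an assumption.

GT_PER_S = 200          # transfers per second per lane (headline figure)
BITS_PER_TRANSFER = 1   # assumption: 1 bit per transfer on a serial lane
LANES_PER_LINK = 4      # assumption: x4 lane bundling, PCIe-style

# Raw unidirectional bandwidth per link, ignoring encoding overhead.
raw_gbps = GT_PER_S * BITS_PER_TRANSFER * LANES_PER_LINK  # gigabits/s
raw_gBps = raw_gbps / 8                                   # gigabytes/s

print(f"~{raw_gbps} Gb/s (~{raw_gBps:.0f} GB/s) per x{LANES_PER_LINK} link, each direction")
```

Under those assumptions a single x4 link would move ~800 Gb/s (~100 GB/s) each way; the real spec's usable throughput will differ once encoding and protocol overhead are counted.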

It'd be nice to see someone, anyone, humble Nvidia a little, as their greed knows no bounds. Hopefully, with more competition, Nvidia will need to re-evaluate its pricing in both AI and gaming workloads. The AI bubble won't last forever, and Nvidia has to know that.

It'll be interesting to see just how long Nvidia keeps its lead in AI. I think others don't need to beat them on performance, but just come close, offer similar scalability, and beat them on perf/$. With UALink, that's one less point on the board in Nvidia's favor.

This has been on my mind a lot as of late. Nvidia has had so much success over the years that I feel they've gotten overconfident; their prices seem to indicate as much. I suspect someone will soon challenge them in AI and cost them market share, but I could be wrong. Intel led the server/consumer CPU space for decades before their foundry issues tripped them up, right as AMD really began to challenge them again. After AMD's x64 coup d'état moment, which helped sink Itanium with the introduction of Opteron CPUs (Athlon 64 for consumers), the Bulldozer-through-Excavator era did a lot of damage to AMD's market share and mindshare. Point being, there's no telling how long Nvidia might hold their king-of-the-hill position. It could be a couple of years or a couple of decades. Either way, they need to be humbled in a big way soon, for the good of consumers. Their pricing is getting a *little* <cough cough> out of control.
 
Either way they need to be humbled in a big way soon for the good of consumers. Their pricing is getting a *little* <cough cough> out of control.

Yeah, I wouldn’t mind Nvidia having another DeepSeek moment. Consumers would definitely benefit from that.
 
Point being no telling how long Nvidia might hold their king of the hill position. It could be a couple years or a couple decades. Either way they need to be humbled in a big way soon for the good of consumers. Their pricing is getting a *little* <cough cough> out of control.
Nvidia needs to move beyond the general-purpose compute architectures they've been using for AI. How quickly and successfully they do so will help determine their ability to stay on top. They already have their NVDLA NPUs for inference workloads in their embedded SoCs, so they do "get it".

They're also becoming heavily power- and cooling-limited. That's adding to the cost of their solutions and could add more delays (there's some suggestion it was part of Blackwell's holdups). So, we might be nearing a point where they stumble in the face of a more efficient dataflow architecture, like those used by most of the other NPUs out there.

One thing I can say with a fair degree of certainty: it doesn't seem like AMD will be the one to usurp Nvidia's dominance in AI. AMD is making its usual mistake of trying to beat Nvidia at its own game and they're fumbling the ball quite badly.


Maybe UDNA will be a game changer. I wouldn't bet on it, but we'll see.
 