News Intel launches Gaudi 3 accelerator for AI: Slower than H100 but also cheaper

JayNor

hothardware says:
"According to Intel, Gaudi 3 delivers an average 50% improvement in inferencing performance with an approximate 40% improvement in power efficiency versus NVIDIA’s H100, but at a fraction of the cost."
 

cyrusfox

Not hard to beat Nvidia on cost, but in terms of software plug-and-play, performance, and scalability, good luck.

Intel was forecast to do only $500 million in Gaudi 3 revenue this year, a paltry amount compared to AMD ($4.5 billion) and minuscule compared to Nvidia during this AI gold rush ($100+ billion by my count). If Gaudi is good, it needs to show it on the market and not on a press deck. Go win some mind share and market share, especially outside of China, as China is largely blocked from many of the top AI solutions thanks to US government restrictions.
Mar 19, 2024
Good memory size, but no FP64 and not even FP32, and 10x the price of the Tenstorrent Wormhole GPU, which is probably only a bit slower. That makes it unlikely to find good applications in HPC.
Sep 25, 2024
I think your info is misinformation, because what I see from The Next Platform differs from your claims. In Intel's tests, too, Gaudi 3 is MUCH faster than the H100, and I doubt that's only down to software.
Peak FP16/BF16 is much faster than the H100 and even the B100: Nvidia's figures are 8 and 14, while Gaudi 3's is 14.68.
In peak FP8, the H100 is 16, the B100 is 28, and Gaudi 3 is 14.68 again.
In peak FP32, the H100 is 4 and Gaudi 3 is 2.39.
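Putting those quoted figures side by side as ratios (units as listed in the Next Platform table; I'm treating them as dimensionless here), a quick sketch:

```python
# Ratios of the peak-throughput figures quoted above (units as listed in the
# Next Platform table; treated as dimensionless here).
peak = {
    # precision: (H100, B100, Gaudi 3)
    "FP16/BF16": (8.0, 14.0, 14.68),
    "FP8":       (16.0, 28.0, 14.68),
    "FP32":      (4.0, None, 2.39),   # no B100 FP32 figure quoted above
}

for precision, (h100, b100, gaudi3) in peak.items():
    line = f"{precision}: Gaudi 3 is {gaudi3 / h100:.2f}x the H100"
    if b100 is not None:
        line += f", {gaudi3 / b100:.2f}x the B100"
    print(line)

# Output (rounded): FP16/BF16 is ~1.84x H100 and ~1.05x B100;
# FP8 is ~0.92x H100 and ~0.52x B100; FP32 is ~0.60x H100.
```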


These are the numbers I read on The Next Platform. I've added the link; please check it out below.

https://www.nextplatform.com/wp-con...ntel-computex-gaudi-vs-nvidia-tnp-table-2.jpg