Deep Learning On A Stick: Movidius' 'Fathom' Neural Compute Stick

Status
Not open for further replies.

jakjawagon

Distinguished
Aug 28, 2010
The Fathom’s performance ranges from 80 to 150 GFLOPS, depending on the neural network’s complexity and precision (8-bit and 16-bit precision is supported). That performance requires less than 1.2W of power, which is more than 12 times as efficient as, for example, an Nvidia Tegra X1 processor.
OK, let's maths this out. The linked article says the Tegra X1 peaks at 1024 GFLOPS (FP16). At 15 watts, that's ~68.3 GFLOPS per watt. The Fathom gets 150 GFLOPS at 1.2 watts, or 125 GFLOPS per watt. Quite a bit more efficient than the Tegra X1, yes, but that's only about 1.8 times, nowhere near the 12 times that the article says.
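The back-of-the-envelope check above can be scripted. A minimal sketch, using only the figures quoted in the post (the 1024 GFLOPS and 15 W Tegra numbers come from the linked article, not an official spec sheet):

```python
# Sanity-check the article's "more than 12x as efficient" claim
# using the figures quoted above.

tegra_x1_gflops = 1024.0  # peak FP16 GFLOPS (per the linked article)
tegra_x1_watts = 15.0     # power figure quoted in the post
fathom_gflops = 150.0     # Fathom's upper-bound performance
fathom_watts = 1.2        # Fathom's stated power budget

tegra_eff = tegra_x1_gflops / tegra_x1_watts  # ~68.3 GFLOPS/W
fathom_eff = fathom_gflops / fathom_watts     # 125.0 GFLOPS/W
ratio = fathom_eff / tegra_eff                # ~1.8x, not 12x

print(f"Tegra X1: {tegra_eff:.1f} GFLOPS/W")
print(f"Fathom:   {fathom_eff:.1f} GFLOPS/W")
print(f"Fathom is ~{ratio:.1f}x as efficient")
```

Even taking the Fathom's best-case 150 GFLOPS against the Tegra's peak, the efficiency gap is under 2x with these numbers.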
 

Co BIY

Honorable
Jun 18, 2015
Care to play a game of Tic-Tac-Toe?
The good outcome depicted in WarGames depended on the "game" being unwinnable and the computer being ruthlessly rational. Once we give the AI some ability to bluff for advantage and take even reasonable risks, along with a more normal scenario that includes some possibility of winning, the fictional future could be much more dangerous.
 