On Tensors, Tensorflow, And Nvidia's Latest 'Tensor Cores'


dstarr3

Distinguished
Why do I feel like one day, there's going to be a discrete AI card for gaming, like how there used to be discrete PhysX cards? And then eventually they'll vanish because only three games will ever utilize them properly, and the tech will be watered down and slowly absorbed into typical GPUs, because it's not altogether useless tech.
 

bit_user

Polypheme
Ambassador
According to https://devblogs.nvidia.com/parallelforall/cuda-9-features-revealed/:
During program execution, multiple Tensor Cores are used concurrently by a full warp of execution.
This sounds like tensor "cores" aren't really cores, but rather async compute units driven by SM threads. As I thought, more or less.
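
For reference, the CUDA 9 preview exposes exactly this warp-level model through the WMMA API: the whole warp cooperatively loads 16x16 tiles into "fragments" and issues one matrix multiply-accumulate together. A minimal sketch (the kernel and trivial tile setup are mine; the wmma calls are the documented CUDA 9 interface):

```cuda
#include <mma.h>
using namespace nvcuda;

// One warp computes a single 16x16 product D = A*B + C on the Tensor
// Cores: fp16 inputs, fp32 accumulation. All 32 threads of the warp
// must execute each wmma call together -- no single thread owns the tile.
__global__ void wmma_16x16x16(const half *a, const half *b, float *d) {
    wmma::fragment<wmma::matrix_a, 16, 16, 16, half, wmma::row_major> a_frag;
    wmma::fragment<wmma::matrix_b, 16, 16, 16, half, wmma::col_major> b_frag;
    wmma::fragment<wmma::accumulator, 16, 16, 16, float> acc_frag;

    wmma::fill_fragment(acc_frag, 0.0f);          // start with C = 0

    wmma::load_matrix_sync(a_frag, a, 16);        // warp-wide cooperative load
    wmma::load_matrix_sync(b_frag, b, 16);
    wmma::mma_sync(acc_frag, a_frag, b_frag, acc_frag);  // the Tensor Core op
    wmma::store_matrix_sync(d, acc_frag, 16, wmma::mem_row_major);
}
```

Notice there's no per-thread indexing into the tile. The warp owns the fragments collectively, which is why these behave more like shared execution units than independent "cores."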

If these tensor units trickle down into smaller Volta GPUs, I can foresee a wave of GPU-accelerated apps adding support for them. Not sure Nvidia will do it, though. They didn't give us double-rate fp16 in any of the smaller Pascal GPUs, and this is pretty much an evolution of that capability.
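
To be concrete about what that capability means in code: double-rate fp16 is just the hardware issuing packed two-wide half-precision ops at full speed. A minimal sketch using the half2 intrinsics from cuda_fp16.h (the kernel itself is mine, purely illustrative):

```cuda
#include <cuda_fp16.h>

// Each __half2 packs two fp16 values into 32 bits, and __hfma2 issues
// a single instruction doing two fused multiply-adds at once. On GP100
// (and now Volta) that instruction runs at full rate, which is where
// "double-rate fp16" comes from; on the smaller Pascal chips it was
// crippled instead of doubled.
__global__ void saxpy_fp16(int n, __half2 alpha, const __half2 *x, __half2 *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n / 2)                          // n/2 half2 elements cover n halves
        y[i] = __hfma2(alpha, x[i], y[i]);  // two y = a*x + y ops per thread
}
```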
 

bit_user

Polypheme
Ambassador
Why would this functionality even migrate out of GPUs into dedicated (consumer-oriented) cards in the first place? Remember that PhysX started out as dedicated hardware that got absorbed into GPUs, as GPUs became more general.

If I'm a game developer, why would I even use some dedicated deep learning hardware that nobody has, when I could just do the same thing on the GPU? Maybe it increases the GPU specs required by a game, but gamers would probably rather spend another $100 on their GPU than buy a $100 AI card that's supported by only three games.
 

dstarr3

Distinguished


*shrug* People bought the stuff before. Not a lot, obviously. But some. Just a stray thought, anyway.
 

nharon

Prominent
If precision is such a non-factor, why not just use analog calculation cores? They would take up a LOT less space. Power draw? Something to think about...
 

bit_user

Polypheme
Ambassador

Since I don't actually know what the deal-breaker for analog is (though I could speculate), I'll just point out that it's been tried before:

http://www.nytimes.com/1993/02/13/business/from-intel-the-thinking-machine-s-chip.html

I assume that if this approach continued to make sense, people would be pursuing it.
 