Nvidia announced two new inference-optimized GPUs for deep learning, the Tesla P4 and Tesla P40. Both bring support for lower-precision INT8 operations as well as Nvidia's new TensorRT inference engine, which together significantly improve the chips' inference performance.
Nvidia's Tesla P4 And P40 GPUs Boost Deep Learning Inference Performance With INT8, TensorRT Support : Read more
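For anyone wondering what the INT8 part actually buys you: the usual idea is to map FP32 weights and activations onto 8-bit integers with a per-tensor scale factor, run the matrix multiplies on the GPU's higher-throughput integer path with INT32 accumulation, and rescale the result back to FP32. This is just a minimal NumPy sketch of that concept, not Nvidia's TensorRT implementation; the function and variable names are illustrative.

```python
import numpy as np

def quantize_int8(x: np.ndarray):
    """Symmetric linear quantization of an FP32 tensor to INT8.

    Returns the INT8 tensor and the scale needed to approximately
    recover the original values (x ~= q * scale).
    """
    scale = np.abs(x).max() / 127.0
    q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
    return q, scale

# Toy example: an FP32 weight matrix and activation vector.
w = np.random.randn(4, 8).astype(np.float32)
a = np.random.randn(8).astype(np.float32)

qw, sw = quantize_int8(w)
qa, sa = quantize_int8(a)

# Accumulate in INT32 (as INT8 hardware paths do), then rescale to FP32.
y_int8 = (qw.astype(np.int32) @ qa.astype(np.int32)) * (sw * sa)
y_fp32 = w @ a

print(np.max(np.abs(y_int8 - y_fp32)))  # small quantization error
```

The accuracy cost of the quantization is typically small for inference, which is why it is attractive for deployment rather than training.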