November 27, 2019


NVIDIA/tensorrt-inference-server
The TensorRT Inference Server provides a cloud inferencing solution optimized for NVIDIA GPUs.
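To give a flavor of how the server is used: it serves models from a model repository, where each model directory contains a `config.pbtxt` describing its platform, inputs, and outputs. The sketch below is a hypothetical configuration (model name, tensor names, and dimensions are illustrative, not taken from the repo):

```
# Hypothetical model configuration (config.pbtxt) for a
# TensorRT engine served by the inference server.
name: "example_classifier"        # illustrative model name
platform: "tensorrt_plan"         # a serialized TensorRT engine
max_batch_size: 8
input [
  {
    name: "input"                 # illustrative tensor name
    data_type: TYPE_FP32
    dims: [ 3, 224, 224 ]
  }
]
output [
  {
    name: "probabilities"         # illustrative tensor name
    data_type: TYPE_FP32
    dims: [ 1000 ]
  }
]
```

The server watches the repository, loads each configured model onto the available GPUs, and exposes them for inference over HTTP/REST and GRPC endpoints.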

repo name: NVIDIA/tensorrt-inference-server
repo link: https://github.com/NVIDIA/tensorrt-inference-server
language: C++
size (curr.): 8120 kB
stars (curr.): 1016
created: 2018-10-04
license: BSD 3-Clause “New” or “Revised” License