NVIDIA/tensorrt-inference-server
The TensorRT Inference Server provides a cloud inferencing solution optimized for NVIDIA GPUs.
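Since the server exposes an HTTP endpoint, a quick way to verify a deployment is to probe its health routes. Below is a minimal sketch, assuming the server is running locally on its default HTTP port (8000) and serves the v1-style `/api/health/live` and `/api/health/ready` paths; adjust the host, port, and paths to match your actual deployment.

```python
# Minimal health-check sketch for a locally running TensorRT Inference Server.
# Assumptions: default HTTP port 8000 and v1-style /api/health/* routes.
import requests

BASE_URL = "http://localhost:8000"  # assumed default HTTP endpoint


def server_is_live(base_url: str = BASE_URL) -> bool:
    """Return True if the server process is up (HTTP 200 from the live probe)."""
    resp = requests.get(f"{base_url}/api/health/live", timeout=5.0)
    return resp.status_code == 200


def server_is_ready(base_url: str = BASE_URL) -> bool:
    """Return True if the server reports it is ready to serve inference requests."""
    resp = requests.get(f"{base_url}/api/health/ready", timeout=5.0)
    return resp.status_code == 200


if __name__ == "__main__":
    print("live:", server_is_live())
    print("ready:", server_is_ready())
```

Separating liveness from readiness mirrors the server's own distinction: a live server process may still be loading models, so orchestrators should gate traffic on the readiness probe rather than the liveness probe.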
| repo name | NVIDIA/tensorrt-inference-server |
| --- | --- |
| repo link | https://github.com/NVIDIA/tensorrt-inference-server |
| homepage | |
| language | C++ |
| size (curr.) | 8120 kB |
| stars (curr.) | 1016 |
| created | 2018-10-04 |
| license | BSD 3-Clause "New" or "Revised" License |