NVIDIA/tensorrt-inference-server
![NVIDIA/tensorrt-inference-server](/images/project/repo-4_hu26043946899fa7e66e0980bc333aad54_47802_900x500_fit_q75_box.jpg)
The TensorRT Inference Server provides a cloud inferencing solution optimized for NVIDIA GPUs.
| field | value |
| --- | --- |
| repo name | NVIDIA/tensorrt-inference-server |
| repo link | https://github.com/NVIDIA/tensorrt-inference-server |
| homepage | |
| language | C++ |
| size (curr.) | 8120 kB |
| stars (curr.) | 1016 |
| created | 2018-10-04 |
| license | BSD 3-Clause "New" or "Revised" License |