Deploying a model with ONNX Runtime and FastAPI
Published on Aug. 22, 2023, 12:10 p.m.
fastapi
The overall deployment flow is:
- Convert the model from PyTorch to ONNX (torch -> onnx)
- Serve the API with FastAPI, running inference with ONNX Runtime on the backend
- Package and deploy with Docker
FastAPI official site:
https://fastapi.tiangolo.com/
Here is an example project for reference:
https://github.com/naxty/scikit-onnx-fastapi-example
This article gives a fairly detailed walkthrough:
https://medium.com/@nico.axtmann95/deploying-a-scikit-learn-model-with-onnx-und-fastapi-1af398268915
Other optimizations
Official ONNX Runtime Docker examples:
https://github.com/microsoft/onnxruntime/tree/master/dockerfiles
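For the Docker packaging step, a rough sketch of a minimal image (not taken from the official dockerfiles linked above) could look like the following; `app.py`, `requirements.txt`, and `tiny_net.onnx` are assumed file names:

```dockerfile
FROM python:3.10-slim

WORKDIR /app

# requirements.txt is assumed to pin fastapi, uvicorn, onnxruntime, and numpy
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the service code and the exported ONNX model into the image
COPY app.py tiny_net.onnx ./

EXPOSE 8000
CMD ["uvicorn", "app:app", "--host", "0.0.0.0", "--port", "8000"]
```

A slim base image keeps the final image small; for GPU inference you would instead start from one of the official ONNX Runtime images linked above.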