Supported Deep Learning Frameworks

| Framework | Checkpoint file requirements | Framework type |
| --- | --- | --- |
| ONNX | Save your ONNX model as a .onnx file; it must support inference with a dynamic batch size. | "onnx" |
| TensorFlow | Infery supports the TF2 SavedModel format. Zip the model's directory before uploading (a zipped directory). | "tf2" |
| TensorFlow Lite | Infery can load .tflite models using TensorFlow's API. | "tflite" |
| CoreML | Infery can load .coreml models using the coremltools package. | "coreml" |
| TorchScript | Save your traced TorchScript module as a .pth file. | "torchscript" |
| Keras | Save your Keras model in .h5 format, including both the architecture and the weights. | "keras" |
| OpenVino | A CPU-optimized checkpoint produced by the Deci Platform. | "openvino" |
| Nvidia TensorRT | A GPU-optimized checkpoint produced by the Deci Platform. Infery can also load TensorRT checkpoints that were saved as described here. Note that TensorRT is version sensitive; the currently supported version is 8.0.1.6. | "trt" |

Export and loading sketches for several of these formats follow below.

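The dynamic batch size requirement for ONNX is typically handled at export time. A minimal sketch, assuming the model originates in PyTorch and is exported with torch.onnx.export (the resnet18 model and the tensor names are placeholders):

```python
import torch
import torchvision

# Placeholder model used only for illustration; any traceable nn.Module works the same way.
model = torchvision.models.resnet18().eval()
dummy_input = torch.randn(1, 3, 224, 224)

# Marking axis 0 as dynamic lets the exported .onnx file accept any batch size,
# which satisfies the dynamic-batch requirement in the table above.
torch.onnx.export(
    model,
    dummy_input,
    "resnet18.onnx",
    input_names=["input"],
    output_names=["output"],
    dynamic_axes={"input": {0: "batch_size"}, "output": {0: "batch_size"}},
)
```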
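For TensorFlow, exporting in the TF2 SavedModel format produces a directory, which is then zipped before upload. A minimal sketch, assuming a Keras model and Python's shutil for the archive; the toy model and paths are placeholders, and the exact archive layout expected by the platform should be verified:

```python
import shutil
import tensorflow as tf

# Placeholder model; replace with your own network.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(10),
])

# Export in the TF2 SavedModel format (a directory on disk).
tf.saved_model.save(model, "my_tf2_model")

# Zip the SavedModel directory before uploading; this creates my_tf2_model.zip
# containing the my_tf2_model/ directory.
shutil.make_archive("my_tf2_model", "zip", root_dir=".", base_dir="my_tf2_model")
```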
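For TorchScript, tracing the module and saving it produces the required .pth file. A minimal sketch, assuming a traceable PyTorch model (the mobilenet_v2 model and the example input are placeholders):

```python
import torch
import torchvision

# Placeholder model; any traceable nn.Module works the same way.
model = torchvision.models.mobilenet_v2().eval()
example_input = torch.randn(1, 3, 224, 224)

# Trace the module with a representative input and save the traced
# TorchScript module as a .pth file.
traced = torch.jit.trace(model, example_input)
traced.save("mobilenet_v2_traced.pth")
```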
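For Keras, saving to an .h5 path stores the architecture together with the weights in a single HDF5 file. A minimal sketch (the toy model is a placeholder):

```python
import tensorflow as tf

# Placeholder model; replace with your own network.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(16,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(10),
])

# Saving to an .h5 path uses the HDF5 format, which keeps the architecture
# and the weights (and optimizer state, if any) in one file.
model.save("my_keras_model.h5")
```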
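The strings in the Framework type column are the values Infery expects when loading a checkpoint. A minimal sketch, assuming Infery's load/predict entry points and an ONNX checkpoint; the paths, hardware value, and exact argument names should be checked against the Infery documentation:

```python
import numpy as np
import infery

# Load a checkpoint; framework_type takes one of the strings from the table
# above ("onnx", "tf2", "tflite", "coreml", "torchscript", "keras",
# "openvino", "trt"). The model path and hardware value are placeholders.
model = infery.load(
    model_path="/models/resnet18.onnx",
    framework_type="onnx",
    inference_hardware="gpu",
)

# Run a batch through the loaded model.
batch = np.random.rand(8, 3, 224, 224).astype(np.float32)
predictions = model.predict(batch)
```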