Supported Deep Learning Frameworks
Framework | Checkpoint File Requirements | Framework Type |
---|---|---|
ONNX | Save your ONNX model as a .onnx file; it must support inference with a dynamic batch size (see the export sketch after the table). | "onnx" |
TensorFlow | Infery supports the TF2 SavedModel format. Zip the model's directory before uploading (see the packaging sketch after the table). | "tf2" |
TensorFlow Lite | Infery can load .tflite models using TensorFlow's API. | "tflite" |
CoreML | Infery can load .coreml models using the coremltools package. | "coreml" |
TorchScript | Save your traced TorchScript module as a .pth file (see the saving sketch after the table). | "torchscript" |
Keras | Save your Keras model in .h5 format, which stores both the architecture and the weights (see the saving sketch after the table). | "keras" |
OpenVino | A CPU-optimized checkpoint produced by the Deci Platform. | "openvino" |
Nvidia TensorRT | A GPU-optimized checkpoint produced by the Deci Platform. Infery can also load TensorRT checkpoints saved as described here. Note that TensorRT is version-sensitive; the currently supported version is 8.0.1.6. | "trt" |
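
For the ONNX row, here is a minimal sketch of exporting a PyTorch model with a dynamic batch dimension. The toy network, input shape, and file name are placeholder assumptions; substitute your own model.

```python
import torch
import torch.nn as nn

# Placeholder network and input shape; replace with your own model.
model = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=3),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(8, 10),
).eval()
dummy_input = torch.randn(1, 3, 224, 224)

torch.onnx.export(
    model,
    dummy_input,
    "model.onnx",
    input_names=["input"],
    output_names=["output"],
    # Mark the batch dimension as dynamic so the .onnx file supports any batch size.
    dynamic_axes={"input": {0: "batch_size"}, "output": {0: "batch_size"}},
)
```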
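For the TensorFlow row, a sketch of saving a TF2 model in SavedModel format and zipping the directory for upload. The toy module and the directory/archive names are assumptions.

```python
import shutil
import tensorflow as tf

# Placeholder module; replace with your own TF2 model.
class Doubler(tf.Module):
    @tf.function(input_signature=[tf.TensorSpec(shape=[None, 4], dtype=tf.float32)])
    def __call__(self, x):
        return 2.0 * x

# Save in TF2 SavedModel format, then zip the directory before uploading.
tf.saved_model.save(Doubler(), "my_savedmodel")
shutil.make_archive("my_savedmodel", "zip", "my_savedmodel")  # -> my_savedmodel.zip
```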
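For the TorchScript row, a sketch of tracing a module and saving it as a .pth file. The toy module and file name are placeholders.

```python
import torch
import torch.nn as nn

# Placeholder module; replace with your own network.
torch_model = nn.Sequential(nn.Linear(4, 2)).eval()

# Trace the module with an example input and save the result as a .pth file.
traced = torch.jit.trace(torch_model, torch.randn(1, 4))
torch.jit.save(traced, "model.pth")
```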
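For the Keras row, a sketch of saving a model in .h5 (HDF5) format, which stores both the architecture and the weights in a single file. The toy model and file name are placeholders.

```python
import tensorflow as tf

# Placeholder model; replace with your own Keras model.
keras_model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(2),
])

# Saving with a .h5 extension writes architecture and weights to one HDF5 file.
keras_model.save("model.h5")
```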