
Installing INFERY

Install INFERY

Install the INFERY Python package with pip. Run the appropriate command below in a CLI (terminal) on the machine on which to deploy the model.

For CPU –

python3 -m pip install -U pip
python3 -m pip install infery

For GPU –

python3 -m pip install -U pip
python3 -m pip install -U --extra-index-url https://pypi.ngc.nvidia.com infery-gpu


Installation On Existing Environments

Sometimes Infery must be installed into an existing environment that already has specific versions of numpy, TensorRT, Torch, etc. pre-installed.

To install Infery without replacing the versions of these packages, pass the --no-deps flag to the pip install command, which tells pip to skip installing dependencies. These packages will not be re-installed, and Infery will use the versions already present.

Keep in mind that in this configuration, some frameworks might not function as expected.

Binary packages such as PyTorch, TensorFlow, and ONNX Runtime are a common source of problems: they are usually built against a specific numpy version, and a mismatched numpy version can prevent these libraries from loading.
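When installing with --no-deps, it can help to record which dependency versions the environment already has, so you can confirm afterwards that the install left them untouched. A minimal sketch using the standard library (the package list below is just an example; adjust it to your environment):

```python
from importlib import metadata


def snapshot_versions(names):
    """Return {distribution name: installed version, or None if absent}."""
    versions = {}
    for name in names:
        try:
            versions[name] = metadata.version(name)
        except metadata.PackageNotFoundError:
            versions[name] = None
    return versions


# Record the versions before running `pip install --no-deps infery`,
# then take a second snapshot and compare the two dictionaries.
before = snapshot_versions(["numpy", "tensorrt", "torch"])
print(before)
```

Any package whose version changed between the two snapshots was touched by the install.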

Infery automatically manages these dependencies for you, making sure all the packages work together inside your interpreter.

We recommend NOT using the --no-deps flag whenever possible.

Verify INFERY Installation

To verify your installation, import INFERY and expect output similar to the following (the CPU and GPU counts will reflect your machine) –

$ python3
>>> import infery
-INFO- Infery was successfully imported with 2 CPUS and 1 GPUS.

INFERY should explicitly declare the visible hardware upon successful import.
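For scripted environments (e.g. CI), the same check can be wrapped in a small helper that reports whether the import succeeded and, if not, why. This is a generic sketch, not part of the INFERY API:

```python
import importlib


def check_import(module_name):
    """Try to import a module and report success or the failure reason."""
    try:
        importlib.import_module(module_name)
        return True, f"{module_name} imported successfully"
    except ImportError as exc:
        return False, f"failed to import {module_name}: {exc}"


ok, message = check_import("infery")
print(message)
```

A False result with the captured error message (e.g. a numpy ABI mismatch) is usually enough to diagnose which dependency needs attention.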
