Installing Infery

Environment Requirements

infery-cpu:

  • Python 3.7, 3.8 or 3.9 (Python 3.9 is currently not supported on Windows).
  • Other Python versions are supported, but they are not pre-built and compiled for optimization.
  • Please contact us if infery is not supported in your environment.
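As a sketch, the supported-version rule above can be checked programmatically. The function name and structure here are ours for illustration, not part of infery:

```python
import sys

# Versions with pre-built, optimized infery-cpu wheels (per the list above)
SUPPORTED = {(3, 7), (3, 8), (3, 9)}

def has_prebuilt_wheel(version=sys.version_info[:2], platform=sys.platform):
    """Return True if infery-cpu ships a pre-built wheel for this interpreter."""
    # Python 3.9 is currently not supported on Windows
    if platform == "win32" and version == (3, 9):
        return False
    return version in SUPPORTED

print(has_prebuilt_wheel())
```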

infery-gpu:

Installing Infery Inside Virtual Environments

Container technologies like Docker are not virtual environments; containers of any type are outside the scope of this section.

  • Some use cases require installing infery inside a virtual environment, such as VirtualEnv, PyEnv or Conda.

    • Please make sure to include the operating system's original site packages in your environment by passing the --system-site-packages flag when creating the environment, as explained in https://pypi.org/project/openvino/.
      This flag is crucial for using OpenVINO models with infery on Linux hosts inside virtual environments.

    • When the system site packages are not visible to the environment, for example on a pre-existing environment, some vital libraries will be missing. This prevents OpenVINO from loading properly on Linux hosts, and infery will fail to load OpenVINO models.
      This can be solved by installing https://packages.ubuntu.com/bionic/python3-dev or the equivalent package for your Linux distribution.
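A minimal sketch of creating such an environment with Python's built-in venv module; the directory name `infery-env` is just an example, and this is equivalent to running `python3 -m venv --system-site-packages infery-env`:

```python
import pathlib
import venv

# Create a venv that inherits the OS site packages (needed for OpenVINO on Linux).
# with_pip=False skips bootstrapping pip, which keeps creation fast for this demo.
env_dir = pathlib.Path("infery-env")  # example directory name
venv.EnvBuilder(system_site_packages=True, with_pip=False).create(env_dir)

# The flag is recorded in the environment's pyvenv.cfg
print((env_dir / "pyvenv.cfg").read_text())
```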

Installation On Existing Environments

Sometimes we need to install Infery into an existing environment that has pre-installed versions of numpy, TensorRT, Torch, etc.

To install Infery without replacing those package versions, pass the --no-deps flag to the pip install command, telling pip to skip these dependencies. They will not be re-installed, and Infery will use the currently installed versions.
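Before deciding on --no-deps, it can help to list which of these packages your environment already pins. A small sketch; the package list is illustrative, not infery's actual dependency spec:

```python
import importlib.metadata

def installed_versions(packages):
    """Map each package name to its installed version, or None if absent."""
    versions = {}
    for pkg in packages:
        try:
            versions[pkg] = importlib.metadata.version(pkg)
        except importlib.metadata.PackageNotFoundError:
            versions[pkg] = None
    return versions

# Illustrative package list -- adjust to your environment
print(installed_versions(["numpy", "tensorrt", "torch"]))
```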

Please keep in mind that in this configuration, some frameworks might not function as expected.

One problematic example is binary packages such as PyTorch, TensorFlow and ONNX Runtime. These are usually coupled to an explicit numpy version at build time, and other numpy versions might prevent these libraries from loading.
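A mismatched numpy ABI usually surfaces as an ImportError the first time the binary package is loaded, so a quick smoke test is simply to try importing it. A hedged sketch (the helper name is ours):

```python
import importlib

def loads_cleanly(module_name):
    """Return True if the module imports without an ImportError (e.g. no ABI mismatch)."""
    try:
        importlib.import_module(module_name)
        return True
    except ImportError:
        return False

# Example: check whichever binary packages your environment relies on
for name in ("numpy", "onnxruntime"):
    print(name, loads_cleanly(name))
```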

Infery automatically manages these dependencies for you, making sure all the packages work together inside your interpreter.

We recommend NOT using the --no-deps flag whenever possible.

Installing Infery

Infery is installed using pip, Python's package manager.
Click the Copy icon to copy the command, then run it in a CLI (terminal) on the machine where the model will be deployed.
Please make sure the machine and environment match the prerequisites.

CPU Installation

The "infery" artifact on PyPI installs all CPU-supported frameworks.
Sub-packages and plugins for Infery will be available soon.

python3 -m pip install -U pip
python3 -m pip install infery

GPU Installation

"infery-gpu" is an artifact that installs the GPU versions of all the original "infery" frameworks, including PyCUDA, TensorRT, etc.

We recommend installing pycuda first and THEN installing infery-gpu (this usually works better).
The command below uses an extra index URL for NVIDIA NGC to fetch nvidia-tensorrt.

python3 -m pip install -U pip

# Compile pycuda for the local CUDA. The example uses CUDA 11.2, change it to your version.
export PATH=/usr/local/cuda-11.2/bin${PATH:+:${PATH}}
export LD_LIBRARY_PATH=/usr/local/cuda-11.2/lib64${LD_LIBRARY_PATH:+:${LD_LIBRARY_PATH}}
python3 -m pip install -U pycuda

# Install infery-gpu from PyPi and TensorRT from nvidia's pip repository
python3 -m pip install -U --extra-index-url https://pypi.ngc.nvidia.com infery-gpu

Jetson Installation

"infery-jetson" is the infery artifact for Jetson devices (Nano, Xavier and Orin).

The Jetson artifacts are not available on PyPI because of the proprietary Jetson ARM wheels.
The package installs all supported frameworks using pre-built wheels for Jetson.
Some of the frameworks don't release artifacts for ARM, so we built those wheels for you; they are included in the dependencies.

Infery-Jetson currently supports Python 3.6 and 3.8, for NVIDIA JetPack SDK versions 4 and 5 respectively.

python3 -m pip install -U pip

# For Python 3.6 
python3 -m pip install https://deci-packages-public.s3.amazonaws.com/infery_jetson-3.4.0-cp36-cp36m-linux_aarch64.whl

# For Python 3.8
python3 -m pip install https://deci-packages-public.s3.amazonaws.com/infery_jetson-3.4.0-cp38-cp38-linux_aarch64.whl

Verify The Installation

To verify your installation, simply import infery and expect output like the following –

$ python3
>>> import infery
-INFO- Infery was successfully imported with 2 CPUS and 1 GPUS.

infery should explicitly declare the visible hardware upon successful import.