How to Containerize GPU Applications


By providing self-contained execution environments without the overhead of a full virtual machine, containers have become an appealing option for deploying applications at scale. Much of the credit goes to Docker for making containers easy to use and, as a result, popular. From letting multiple engineering teams experiment with their own configurations during development, to benchmarking, to deploying a scalable microservices architecture, containers are finding uses everywhere.

GPU-based applications, especially in the deep learning field, are rapidly becoming part of the standard workflow, and deploying, testing, and benchmarking these applications in containers has quickly become the accepted convention. But the native implementation of Docker containers does not yet support NVIDIA GPUs; that's why we developed the nvidia-docker plugin. Here I'll walk you through how to use it.
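As a quick taste of what the plugin enables, a common smoke test is to run `nvidia-smi` inside a CUDA base image. This sketch assumes nvidia-docker and a working NVIDIA driver are already installed on the host, and the image tag shown is illustrative:

```shell
# Pull an official CUDA base image (choose a tag compatible with
# your host's driver version).
docker pull nvidia/cuda

# nvidia-docker wraps `docker run`, mounting the NVIDIA driver
# libraries and GPU device files into the container automatically.
nvidia-docker run --rm nvidia/cuda nvidia-smi
```

If everything is wired up correctly, `nvidia-smi` reports the same GPUs inside the container as it does on the host.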

Read more at SuperUser