This simple Docker image is the basis for using GPU-based Python solutions
(such as TensorFlow) in environments where the GPU may or may not be available,
without the need for nvidia-docker.
There are a few pieces that are needed to make everything work as expected.
- The NVIDIA driver, which allows the system to run without a separate volume being mounted
- The CUDA and cuDNN installations
- Python
Once these pieces are all in place, installing a library such as TensorFlow will work using
pip install tensorflow-gpu
If you wish to enable GPU support, you will need to make sure the device nodes are added to the container when it is started (in privileged mode).
docker run --device /dev/nvidiactl --device /dev/nvidia-uvm --device /dev/nvidia0 mikewright/python-cuda
Add one --device /dev/nvidiaN flag for each GPU in the machine.
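The exact set of per-GPU device nodes varies from machine to machine, so a small launcher script can discover them at run time. This is only a sketch, not something shipped in the image; the image name is taken from the command above.

```shell
#!/bin/sh
# Collect every NVIDIA device node present on the host and turn each
# one into a --device flag for docker run.
DEVICE_ARGS=""
for dev in /dev/nvidiactl /dev/nvidia-uvm /dev/nvidia[0-9]*; do
    # Skip entries that do not exist (e.g. an unmatched glob on a GPU-less host)
    [ -e "$dev" ] && DEVICE_ARGS="$DEVICE_ARGS --device $dev"
done

# On a GPU-less host DEVICE_ARGS stays empty and the container still starts,
# which is exactly the "GPU may or may not be available" case this image targets.
echo docker run $DEVICE_ARGS mikewright/python-cuda
```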
If you run without the devices, the libraries will all still be there, but looking up devices will fail.
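Because the libraries are always installed, code inside the container can decide at startup whether GPU mode is usable simply by checking for the device nodes. A minimal sketch (the BACKEND variable name is purely illustrative):

```shell
#!/bin/sh
# If the container was started with the --device flags above,
# /dev/nvidiactl will be visible inside it; otherwise fall back
# to CPU-only mode instead of letting device lookups fail later.
if [ -e /dev/nvidiactl ]; then
    BACKEND=gpu
else
    BACKEND=cpu
fi
echo "running with $BACKEND backend"
```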
This image was created by referencing a number of other Dockerfiles, so credit is given where credit is due:
Sourced install of the NVIDIA driver (overall idea)
NVIDIA cuDNN 5 Dockerfile
NVIDIA CUDA 8 Dockerfile
Python 3.6.1 Dockerfile
A pre-built version of this image can be found here