Docker uses containers to create virtual environments that isolate a TensorFlow installation from the rest of the system. TensorFlow programs are run within this virtual environment that can share resources with its host machine (access directories, use the GPU, connect to the Internet, etc.). The TensorFlow Docker images are tested for each release.
Docker is the easiest way to enable TensorFlow GPU support on Linux since only the NVIDIA® GPU driver is required on the host machine (the NVIDIA® CUDA® Toolkit does not need to be installed).
TensorFlow Docker requirements
- Install Docker on your local host machine.
- For GPU support on Linux, install NVIDIA Docker support.
- Take note of your Docker version with `docker -v`. Versions earlier than 19.03 require nvidia-docker2 and the `--runtime=nvidia` flag. On versions including and after 19.03, you will use the `nvidia-container-toolkit` package and the `--gpus all` flag. Both options are documented on the page linked above.
Note: To run the `docker` command without sudo, create the `docker` group and add your user. For details, see the post-installation steps for Linux.
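A minimal sketch of those post-installation steps on a typical Linux setup (log out and back in, or use newgrp, for the group change to take effect):

```
sudo groupadd docker           # create the docker group (it may already exist)
sudo usermod -aG docker $USER  # add your user to the docker group
newgrp docker                  # pick up the new group in the current shell
```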
Download a TensorFlow Docker image
The official TensorFlow Docker images are located in the tensorflow/tensorflow Docker Hub repository. Image releases are tagged using the following format:
| Tag | Description |
| --- | --- |
| latest | The latest release of TensorFlow CPU binary image. Default. |
| nightly | Nightly builds of the TensorFlow image. (Unstable.) |
| version | Specify the version of the TensorFlow binary image, for example: 2.1.0 |
| devel | Nightly builds of a TensorFlow master development environment. Includes TensorFlow source code. |
| custom-op | Special experimental image for developing TF custom ops. More info here. |
Each base tag has variants that add or change functionality:
| Tag Variants | Description |
| --- | --- |
| tag-gpu | The specified tag release with GPU support. (See below) |
| tag-jupyter | The specified tag release with Jupyter (includes TensorFlow tutorial notebooks) |
You can use multiple variants at once. For example, the following downloads TensorFlow release images to your machine:
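A sketch of such pulls (the tag names here are illustrative; confirm the tags that exist for your release in the tensorflow/tensorflow repository on Docker Hub):

```
docker pull tensorflow/tensorflow                     # latest stable release
docker pull tensorflow/tensorflow:devel-gpu           # nightly dev release with GPU support
docker pull tensorflow/tensorflow:latest-gpu-jupyter  # latest release with GPU support and Jupyter
```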
Start a TensorFlow Docker container
To start a TensorFlow-configured container, use the following command form:
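A sketch of that form, where hostPort, containerPort, tag, and command are placeholders and the bracketed pieces are optional:

```
docker run [-it] [--rm] [-p hostPort:containerPort] tensorflow/tensorflow[:tag] [command]
```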
For details, see the docker run reference.
Examples using CPU-only images
Let's verify the TensorFlow installation using the `latest` tagged image. Docker downloads a new TensorFlow image the first time it is run:
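For example, a minimal smoke test that imports TensorFlow and sums a random tensor (the printed value will vary from run to run):

```
docker run -it --rm tensorflow/tensorflow \
    python -c "import tensorflow as tf; print(tf.reduce_sum(tf.random.normal([1000, 1000])))"
```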
Let's demonstrate some more TensorFlow Docker recipes. Start a `bash` shell session within a TensorFlow-configured container:
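One way to do that, using the default CPU-only image:

```
docker run -it tensorflow/tensorflow bash
```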
Within the container, you can start a `python` session and import TensorFlow.
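For example, from the shell prompt inside the container (just a quick version check):

```
python -c "import tensorflow as tf; print(tf.__version__)"
```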
To run a TensorFlow program developed on the host machine within a container, mount the host directory and change the container's working directory (`-v hostDir:containerDir -w workDir`):
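A sketch of that pattern, assuming a script.py sitting in the current host directory:

```
docker run -it --rm -v $PWD:/tmp -w /tmp tensorflow/tensorflow python ./script.py
```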
Permission issues can arise when files created within a container are exposed to the host. It's usually best to edit files on the host system.
Start a Jupyter Notebook server using TensorFlow's nightly build:
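For example, publishing the notebook server's port to the host (the nightly-jupyter tag is used here; adjust if you prefer a different variant):

```
docker run -it -p 8888:8888 tensorflow/tensorflow:nightly-jupyter
```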
Follow the instructions and open the URL in your host web browser: http://127.0.0.1:8888/?token=...
GPU support
Docker is the easiest way to run TensorFlow on a GPU since the host machine only requires the NVIDIA® driver (the NVIDIA® CUDA® Toolkit is not required).
Install the NVIDIA Container Toolkit to add NVIDIA® GPU support to Docker. `nvidia-container-runtime` is only available for Linux. See the nvidia-container-runtime platform support FAQ for details.
Check if a GPU is available:
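On Linux, one quick check is to look for an NVIDIA device on the PCI bus:

```
lspci | grep -i nvidia
```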
Verify your `nvidia-docker` installation:
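A minimal check, assuming the nvidia/cuda image (or a tagged variant of it) is available to pull from Docker Hub:

```
docker run --gpus all --rm nvidia/cuda nvidia-smi
```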
Note: `nvidia-docker` v2 uses `--runtime=nvidia` instead of `--gpus all`. `nvidia-docker` v1 uses the `nvidia-docker` alias, rather than the `--runtime=nvidia` or `--gpus all` command-line flags.
Examples using GPU-enabled images
Download and run a GPU-enabled TensorFlow image (may take a few minutes):
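For example, the same smoke test as before, this time on the latest-gpu image:

```
docker run --gpus all -it --rm tensorflow/tensorflow:latest-gpu \
    python -c "import tensorflow as tf; print(tf.reduce_sum(tf.random.normal([1000, 1000])))"
```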
It can take a while to set up the GPU-enabled image. If repeatedly running GPU-based scripts, you can use `docker exec` to reuse a container.
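A sketch of that reuse pattern; the container name tf is only an example, and it assumes such a container is already running with your code mounted inside it:

```
docker exec -it tf python ./script.py
```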
Use the latest TensorFlow GPU image to start a `bash` shell session in the container:
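One way to do that:

```
docker run --gpus all -it tensorflow/tensorflow:latest-gpu bash
```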