Introduction
Using ML-Cloud is very convenient for most tasks, but for some tasks
you need to know every detail of the libraries installed on your
instance. For example, if you want to build a TensorFlow mobile
app for your phone, you need specific versions of several pieces of
software: a particular gcc, NDK, g++, and even a TensorFlow build
pinned to a specific commit ID on GitHub. In such cases, you may need
to build your own customized instance. This blog gives a simple way to
build your own development environment based on Docker.
About Docker
In some ways, a Docker container is similar to an instance on Google Cloud.
Developers use Docker to eliminate “works on my machine” problems; the idea is
build once, run anywhere. As on Google Cloud, you can export images with Docker,
which lets you share your own dev env with others. You can also preserve
it on Docker Hub. More details can be seen in What’s Docker.
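As a minimal sketch of that sharing workflow (myname/my-dev-env is just a placeholder image name, and pushing requires a Docker Hub account):
sudo docker ps                                              # find the ID of the container you customized
sudo docker commit <container-id> myname/my-dev-env:latest  # snapshot it as a reusable image
sudo docker login                                           # authenticate against Docker Hub
sudo docker push myname/my-dev-env:latest                   # publish it so others can pull it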
How to
Before starting, one note: you can finish most of this work through the gcloud
command-line tool.
Create an Instance with GPU support.
Check your GPU limits
gcloud beta compute regions describe asia-east1
Note that these zones support GPUs: us-west1-b, us-east1-d, europe-west1-b, asia-east1-a
If this command doesn’t work, you might try
gcloud components update && gcloud components install beta
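Once the describe command works, its output lists the region’s quotas; one rough way to see only the GPU-related lines (assuming the quota metric name contains GPUS, e.g. NVIDIA_K80_GPUS) is:
gcloud beta compute regions describe asia-east1 | grep -B 1 -A 1 GPUS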
Create an instance with GPUs
Run
gcloud beta compute instances create jeju-project-docker \
  --zone asia-east1-a \
  --accelerator type=nvidia-tesla-k80,count=1 \
  --image-family ubuntu-1604-lts \
  --image-project ubuntu-os-cloud \
  --boot-disk-size 200GB \
  --maintenance-policy TERMINATE \
  --restart-on-failure
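When the command returns, you can confirm the instance is up (and note its external IP) with:
gcloud compute instances list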
GPU driver
When the instance has been created, you can log in to it
through gcloud compute ssh jeju-project-docker --zone asia-east1-a
You can follow the official instructions from Nvidia. If you are using Ubuntu, I recommend
installing CUDA and its toolkit through the deb file; it’s the simplest way I have tried. If you have an Nvidia developer account, you can also download cuDNN
to accelerate your applications.
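As a rough sketch of the deb-based route on Ubuntu 16.04 (the exact .deb file name and version come from Nvidia’s download page, so treat cuda-repo-ubuntu1604_<version>_amd64.deb as a placeholder):
sudo dpkg -i cuda-repo-ubuntu1604_<version>_amd64.deb   # the repo package downloaded from Nvidia
sudo apt-get update
sudo apt-get install cuda
nvidia-smi                                              # verify the driver can see the Tesla K80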
Build Dev Env
First, you need to install Docker through sudo apt-get install docker.io (on Ubuntu, docker.io is the package that provides the Docker engine).
After installation, if you want to use TensorFlow,
run sudo docker search tensorflow
to search for TensorFlow images.
You will see that the most starred image is from Google. You can run sudo docker run -it tensorflow/tensorflow bash
to get a shell inside the container;
then you can test import tensorflow.
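Inside the container, a one-line check is enough to confirm the install, for example:
python -c "import tensorflow as tf; print(tf.__version__)"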
If you want to avoid using sudo, you can add your account to the docker
group.
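For example (you need to log out and back in for the new group membership to take effect):
sudo usermod -aG docker $USER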
Similarly, you can search for PyTorch images with the docker search command.
If you want to use Docker with the GPU, you can install nvidia-docker
from here, and replace
docker with nvidia-docker in the above commands.
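As a quick sanity check that the GPU is visible from inside a container (the latest-gpu tag is the GPU build commonly published for the TensorFlow image and may differ today):
sudo nvidia-docker run --rm nvidia/cuda nvidia-smi                # should list the Tesla K80
sudo nvidia-docker run -it tensorflow/tensorflow:latest-gpu bash  # GPU build of the TensorFlow image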