
Google Colab Setup Guide


Google Colab


Google Colaboratory offers free access to a single NVIDIA GPU (a Tesla K80 with about 12GB of usable memory, or a Tesla T4 with 16GB) for machine learning projects. There are some limitations, however; you can learn more about them here: https://research.google.com/colaboratory/faq.html

Setup

Go to https://colab.research.google.com and create a new Python 3 notebook.

Colab lets you run terminal commands by prefixing the command with the ! character.
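
For example, the following cell prints the current working directory and lists its contents:

!pwd

!ls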

Quick Start

Open this gist: https://gist.github.com/ProGamerGov/1749bac4f0c0efd24c1b4161cbfb30e3, then click the "Open in Colab" link at the very top. This will load Colab with a fully set up notebook.

GitHub

To enable GPU usage, navigate to "Edit > Notebook settings" or "Runtime > Change runtime type" and select GPU as your hardware accelerator. Note that you will have to reinstall neural-style-pt if you change your hardware accelerator later.
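
To confirm that the GPU runtime is active, you can run a quick check in a code cell (a minimal sketch; PyTorch comes preinstalled on Colab):

import torch
# Prints True when a GPU runtime is active, False on a CPU-only runtime
print(torch.cuda.is_available())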

Create a new code cell, add the following code to it, then click the play button on the left side of the code cell:

!git clone https://github.com/ProGamerGov/neural-style-pt

!mv neural-style-pt/* .

!rm -rf neural-style-pt

!python3 models/download_models.py

If successful, you should see output similar to this:

Cloning into 'neural-style-pt'...
remote: Enumerating objects: 428, done.
remote: Total 428 (delta 0), reused 0 (delta 0), pack-reused 428
Receiving objects: 100% (428/428), 36.21 MiB | 45.27 MiB/s, done.
Resolving deltas: 100% (222/222), done.
Downloading the VGG-19 model
Downloading: "https://s3-us-west-2.amazonaws.com/jcjohns-models/vgg19-d01eb7cb.pth" to /root/.cache/torch/checkpoints/vgg19-d01eb7cb.pth
100% 548M/548M [00:16<00:00, 35.4MB/s]
Downloading the VGG-16 model
Downloading: "https://s3-us-west-2.amazonaws.com/jcjohns-models/vgg16-00b39a1b.pth" to /root/.cache/torch/checkpoints/vgg16-00b39a1b.pth
100% 528M/528M [00:16<00:00, 33.0MB/s]
Downloading the NIN model
All models have been successfully downloaded

If successful, remove the commands from the code cell so that you don't accidentally run them again.
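
Alternatively, if you would rather leave the setup commands in place, you can guard them so that re-running the cell is harmless. This is only a sketch, assuming a successful install leaves neural_style.py in the working directory:

import os
# Skip the clone and model download if neural-style-pt is already installed
if not os.path.exists('neural_style.py'):
    !git clone https://github.com/ProGamerGov/neural-style-pt
    !mv neural-style-pt/* .
    !rm -rf neural-style-pt
    !python3 models/download_models.py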

After that's complete, test that your neural-style-pt installation works with the appropriate command based on your chosen hardware:

CPU:

!python3 neural_style.py -gpu c -backend mkl -image_size 64

GPU:

!python3 neural_style.py -gpu 0 -backend cudnn

To see what files exist on your Colab instance, click the arrow on the left side and select "Files". You can then choose to download, delete, or rename any of the files that you see.
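
You can also trigger a download programmatically with Colab's google.colab.files helper (a minimal sketch; assumes an output file named out.png exists):

from google.colab import files
# Sends the file through the browser to your local machine
files.download('out.png')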

Multiscale Generation (Multires)

Code Cell

Instead of using a bash script, you can place all of the commands in the same code cell, or in separate code cells. If you use multiple code cells, you will have to run each cell manually, one after the other, unless you use "Runtime > Run all":

!python3 neural_style.py -output_image out1.png -image_size 512

!python3 neural_style.py -output_image out2.png -init image -init_image out1.png -image_size 720

!python3 neural_style.py -output_image out3.png -init image -init_image out2.png -image_size 1024

!python3 neural_style.py -output_image out4.png -init image -init_image out3.png -image_size 1536
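
The same pass can also be written as a loop in a single code cell. This is only a sketch: the sizes and output names are illustrative, and it relies on IPython's {variable} interpolation inside ! commands:

sizes = [512, 720, 1024, 1536]
prev = None
for i, size in enumerate(sizes, start=1):
    out = 'out{}.png'.format(i)
    # Initialize each pass from the previous output, except for the first one
    init = '-init image -init_image {}'.format(prev) if prev else ''
    !python3 neural_style.py -output_image {out} {init} -image_size {size}
    prev = out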

Bash:

You can download scripts to your Colab instance with:

!wget <fileurl/script_name.sh>

Or you can simply upload them via the file browser, either with the upload option or by dragging the files onto it.

Then fix the permissions with:

!chmod u+x ./<script_name.sh>

And finally you can run the script with:

!./<script_name.sh>

You can mount your Google Drive to your Colab instance by adding the following to a code cell:

from google.colab import drive
drive.mount('/content/drive')
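
Once the drive is mounted, you can copy results into it so they survive the instance being recycled. This is a minimal sketch; the Drive folder name ("My Drive" here) and the destination path are assumptions:

import os, shutil
drive_dir = '/content/drive/My Drive/neural-style-output'  # illustrative destination folder
os.makedirs(drive_dir, exist_ok=True)
shutil.copy('out.png', drive_dir)  # copy a finished result into Google Drive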

To check whether your instance is using a Tesla K80 or a Tesla T4, add the following code to a cell and then run it:

import torch
print(torch.cuda.device_count())      # Number of visible GPUs (should be 1 on Colab)
print(torch.cuda.get_device_name(0))  # Name of the assigned GPU, e.g. "Tesla K80" or "Tesla T4"
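
You can also query the driver directly, which additionally reports memory usage (assumes a GPU runtime is active):

!nvidia-smi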

Screenshots: the Python 3 notebook before you start editing it, a zoomed-in view, the file browser, creating a new code cell, and running a code cell.

You can change a code cell's position with the arrows on the right side of the cell, or delete the cell with the delete option.


You can display an image in a code cell with the following code:

from IPython.display import Image
# Pass the filename of any image on the instance to Image() to display it
Image("out.png")

Other useful commands:

!ls   # List the items in the current directory

!rm -rf <filename> # Delete the specified file or directory

!wget <fileurl> # Download the specified file

!mv <oldpath> <newpath> # Move a file or folder from one location to another

!cp -r <oldpath> <newpath> # Copy a file or folder from one location to another
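
These can be combined as needed; for example, you can bundle all of the outputs into a single archive before downloading them (a sketch; assumes the out*.png naming used above):

!tar -czf results.tar.gz out*.png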

Speed


  • Colab instances will use either a Tesla K80 or a Tesla T4. You can find information about the speed of a Tesla K80 on the neural-style-pt README.

Here are the times for running 500 iterations with -image_size=512 on a Tesla T4 with different settings:

  • -backend nn -optimizer lbfgs: 72 seconds
  • -backend nn -optimizer adam: 66 seconds
  • -backend cudnn -optimizer lbfgs: 48 seconds
  • -backend cudnn -optimizer adam: 40 seconds
  • -backend cudnn -cudnn_autotune -optimizer lbfgs: 51 seconds
  • -backend cudnn -cudnn_autotune -optimizer adam: 43 seconds
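
If you want to reproduce a timing like the ones above, one option is to wrap the run with Python's time module (a sketch; the flags mirror the cudnn/adam benchmark and rely on neural-style-pt's default input images):

import subprocess, time
start = time.time()
subprocess.run(['python3', 'neural_style.py', '-gpu', '0', '-backend', 'cudnn',
                '-optimizer', 'adam', '-image_size', '512', '-num_iterations', '500'])
print('Elapsed: {:.1f} seconds'.format(time.time() - start))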
