
Getting started with CUDA on Ubuntu on WSL 2

haydenb

on 17 June 2020



At Build 2020 Microsoft announced support for GPU compute on Windows Subsystem for Linux 2. Ubuntu is the leading Linux distribution for WSL and a sponsor of WSLConf. Canonical, the publisher of Ubuntu, provides enterprise support for Ubuntu on WSL through Ubuntu Advantage.

This guide will walk early adopters through the steps to turn their Windows 10 device into a CUDA development workstation with Ubuntu on WSL.

For our purposes we will be setting up Jupyter Notebook in Docker with CUDA on WSL. These instructions can be adapted to set up other CUDA GPU compute workloads on WSL.

Install Windows 10 Insiders Dev Channel

To begin, we need the latest Windows 10 Insider build released today, June 17, 2020. You will need to register as a Windows Insider if you have not already, enroll your device in the Dev Channel (previously known as the ‘Fast Ring’), and then upgrade to Windows 10 build number 20150 or above.

Consult the Windows Insider documentation for more information on registering as an Insider, enrolling your device, and upgrading your machine to the Dev Channel.
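A quick way to confirm which build you are on is to query the OS version from PowerShell (winver works too); the Build value reported should be 20150 or higher:

# The third value (Build) should be 20150 or above
[System.Environment]::OSVersion.Version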

Enable WSL 2

In future updates to Windows you will simply need to use the following to enable WSL:

wsl --install

For now, open PowerShell as Administrator.

First enable WSL 1:

dism.exe /online /enable-feature /featurename:Microsoft-Windows-Subsystem-Linux /all /norestart

Then enable WSL 2:

dism.exe /online /enable-feature /featurename:VirtualMachinePlatform /all /norestart

Restart Windows 10:

Restart-Computer

This step will become redundant in the future after WSL 2 becomes the default, but for now return to PowerShell and make WSL 2 your default before installing Ubuntu:

wsl.exe --set-default-version 2
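On more recent versions of WSL you can confirm the default with wsl.exe --status, which reports the default distribution and default WSL version; note that this flag may not be available yet on the early Insider builds this guide targets:

# Reports the default distribution, default WSL version and kernel version
wsl.exe --status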

To read more about Ubuntu on WSL, visit myasnchisdf.eu.org/wsl. For a more detailed look at enabling WSL on Windows, check out our tutorial. To convert existing WSL 1 installs to WSL 2, see my blog on the general availability of WSL 2.
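As a sketch of that conversion, assuming your existing distribution is registered under the name 'Ubuntu', you can check its current version and move it to WSL 2 from PowerShell:

# List registered distributions and the WSL version each one uses
wsl.exe --list --verbose
# Convert the distribution named 'Ubuntu' to WSL 2 (this can take several minutes)
wsl.exe --set-version Ubuntu 2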

Install Ubuntu on WSL

Install Ubuntu from the Microsoft Store.

For other ways to install Ubuntu on WSL, see our WSL wiki.
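On newer builds of Windows that ship the wsl --install flow mentioned earlier, Ubuntu can also be installed straight from PowerShell; this is only a sketch, and these flags may not exist on the Insider build described above:

# List the distributions available for command-line installation
wsl.exe --list --online
# Install Ubuntu without going through the Microsoft Store UI
wsl.exe --install -d Ubuntu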

Install Windows Terminal

Optionally, you may install the new Windows Terminal from the Microsoft Store. It has many features, such as GPU acceleration and customizability, that improve the Ubuntu experience on WSL over the traditional Windows console.

Windows Terminal can also be installed from the project’s GitHub page.

Set up Ubuntu on WSL

Open Ubuntu from the Windows Start Menu and configure your WSL user. This user is separate from your Windows user.

If you installed Windows Terminal, you can then close the old console and re-open Ubuntu from the drop-down menu in the new Terminal.

Now check to make sure you are running the correct WSL 2 Linux kernel.

In Ubuntu:

uname -r

You will need kernel 4.19.121 or higher. If you do not have kernel 4.19.121, first try:

wsl.exe --update
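If the kernel update succeeds, the new kernel is only picked up after WSL restarts, so shut WSL down from PowerShell and then re-check the version from a fresh Ubuntu terminal:

# From PowerShell: stop all running distributions and the WSL 2 VM
wsl.exe --shutdown
# Then, from a new Ubuntu terminal, confirm the kernel is 4.19.121 or higher
uname -r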

If wsl.exe --update does not find a newer kernel, make sure 'Receive updates for other Microsoft products when you update Windows' is enabled in Windows Update settings, and then re-run Windows Update.

Install Nvidia Drivers on Windows 10

Next, download the appropriate driver for your GeForce or Quadro Nvidia card.

In the next few months, the NVIDIA driver will be distributed via Windows Update, which will make manually downloading and installing the driver unnecessary.

For now, you will need to join the Nvidia Developer Program to get early access to the driver. You can read more about CUDA on WSL on the Nvidia Developer blog.
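Once the driver is installed, a simple sanity check is to run nvidia-smi from PowerShell; recent drivers typically install it under C:\Windows\System32, though the exact location can vary by driver version:

# Should list your GPU and the installed driver version
nvidia-smi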

Install Docker in WSL

sudo apt -y install docker.io
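Before adding GPU support, it is worth confirming that Docker itself works; the daemon usually has to be started manually inside WSL, and sudo is needed unless you add your user to the docker group:

# Start the Docker daemon inside Ubuntu on WSL
sudo service docker start
# Basic smoke test using the standard hello-world image
sudo docker run --rm hello-world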

Install Nvidia Container Toolkit

Set the distribution variable, import the Nvidia repository GPG key, and then add the Nvidia repositories to the Ubuntu apt package manager:

distribution=$(. /etc/os-release;echo $ID$VERSION_ID)

curl -s -L https://nvidia.github.io/nvidia-docker/gpgkey | sudo apt-key add -

curl -s -L https://nvidia.github.io/nvidia-docker/$distribution/nvidia-docker.list | sudo tee /etc/apt/sources.list.d/nvidia-docker.list

curl -s -L https://nvidia.github.io/libnvidia-container/experimental/$distribution/libnvidia-container-experimental.list | sudo tee /etc/apt/sources.list.d/libnvidia-container-experimental.list

Refresh the Ubuntu apt repositories and then install the Nvidia runtime:

sudo apt update && sudo apt install -y nvidia-docker2
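The nvidia-docker2 package normally registers the Nvidia runtime with Docker through /etc/docker/daemon.json; if you want to double-check the installation before restarting, inspect that file:

# Should contain an entry registering the 'nvidia' runtime
cat /etc/docker/daemon.json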

Close all open Ubuntu terminals, open a PowerShell terminal, and shut WSL down so that Docker picks up the new runtime on the next start:

wsl.exe --shutdown
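You can confirm the shutdown from the same PowerShell window before relaunching Ubuntu:

# Every distribution should now show a state of 'Stopped'
wsl.exe --list --verbose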

Test GPU Compute

Open a new Ubuntu terminal and start Docker:

sudo service docker start

And then run:

docker run --gpus all nvcr.io/nvidia/k8s/cuda-sample:nbody nbody -gpu -benchmark

If everything is configured correctly, the benchmark will run on the GPU and print its performance results.

Start a TensorFlow Container

Open a new Ubuntu terminal and run:

docker run -u $(id -u):$(id -g) -it --gpus all -p 8888:8888 tensorflow/tensorflow:latest-gpu-py3-jupyter

Open a second Ubuntu terminal and use wslview to open the notebook URL printed in the first terminal, changing the host from 127.0.0.1 to localhost:

wslview http://localhost:8888/?token=a83a1ad288a7bf1bd1deb694c8a7f19223c8d0baa7d5fb3c

Your default browser on Windows will launch with a GPU-accelerated Jupyter notebook.

You are now all set to begin using TensorFlow with CUDA on Ubuntu WSL.
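As a final check that TensorFlow inside the container can actually see the GPU, you can run a one-line verification against the same image; this assumes a TensorFlow 2.x image, where tf.config.list_physical_devices is available, and you may need to prepend sudo depending on your Docker setup:

# Prints the list of GPU devices visible to TensorFlow (empty if GPU passthrough failed)
docker run --rm --gpus all tensorflow/tensorflow:latest-gpu-py3-jupyter \
  python -c "import tensorflow as tf; print(tf.config.list_physical_devices('GPU'))"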
