Develop Interactive ML GPU Workflows with Visual Studio Code, Docker and Docker Hub
Offered By: Docker via YouTube
Course Description
Overview
Explore a 45-minute conference talk on developing interactive ML GPU workflows using Visual Studio Code, Docker, and Docker Hub. Learn how to overcome common challenges such as CUDA errors, GPU detection issues, and problems loading custom C++ code. Discover the NVIDIA container toolkit and how Docker can be leveraged to manage multiple CUDA and NVCC versions while developing inside GPU-enabled containers. Gain insights into the history of GPU development, driver installation, and how the tooling has adapted over time. Understand the anatomy of base images, guidelines for building GPU images, and considerations for running GPU containers. Follow along as the speaker demonstrates how to configure a project and streamline the development process for machine learning workflows.
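The image-management approach described above, pinning CUDA and NVCC versions inside a container instead of on the host, can be sketched as a minimal Dockerfile built on an NVIDIA CUDA base image. The specific tag and packages below are illustrative assumptions, not taken from the talk:

```dockerfile
# Sketch: pin a specific CUDA base image so the container's CUDA/NVCC
# versions are independent of whatever is installed on the host.
# The tag is an illustrative assumption; choose one compatible with your driver.
FROM nvidia/cuda:12.2.0-devel-ubuntu22.04

# Minimal Python toolchain for ML work (illustrative)
RUN apt-get update && apt-get install -y --no-install-recommends \
        python3 python3-pip && \
    rm -rf /var/lib/apt/lists/*

WORKDIR /workspace
```

Running such an image with GPU access requires the NVIDIA Container Toolkit on the host, e.g. `docker run --gpus all <image>`.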
Syllabus
Intro
A history perspective
Install the drivers
Let us summarize the experience
Did you know...
Assume you have a GPU
Driver installation
Technology adapts
Solving the issue of injecting GPU Devices
External Requirements
Reduced complexity
Anatomy and Structure of Base Images
Common guidelines
Considerations while building GPU Images
Pinpoint your dependencies
Considerations while running GPU Containers
Configuring a project
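For the project-configuration step, developing inside a GPU-enabled container from Visual Studio Code is typically done with a Dev Container definition. A minimal sketch follows; the file layout and field names follow the VS Code Dev Containers convention, while the image tag and extension list are illustrative assumptions:

```jsonc
// .devcontainer/devcontainer.json (sketch; values are illustrative)
{
  // CUDA development image; tag is an assumption, match it to your driver
  "image": "nvidia/cuda:12.2.0-devel-ubuntu22.04",
  // Expose host GPUs to the container (requires the NVIDIA Container Toolkit)
  "runArgs": ["--gpus", "all"],
  "customizations": {
    "vscode": {
      "extensions": ["ms-python.python"]
    }
  }
}
```

With this file in place, VS Code's "Reopen in Container" command builds and attaches to the GPU-enabled environment.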
Taught by
Docker
Related Courses
Deep Learning with Python and PyTorch (IBM via edX)
Introduction to Machine Learning (Duke University via Coursera)
How Google does Machine Learning em Português Brasileiro (Google Cloud via Coursera)
Intro to Deep Learning with PyTorch (Facebook via Udacity)
Secure and Private AI (Facebook via Udacity)