Develop ML Interactive GPU-Workflows with Visual Studio Code, Docker and Dockerhub
Offered By: Docker via YouTube
Course Description
Overview
Explore a 45-minute conference talk on developing interactive ML GPU workflows with Visual Studio Code, Docker, and Docker Hub. Learn how to overcome common challenges such as CUDA errors, GPU detection issues, and problems loading custom C++ code. Discover the power of nvidia-containers and how Docker can be leveraged to manage multiple CUDA and NVCC versions while developing inside GPU-enabled containers. Gain insights into the history of GPU development, driver installation, and how the tooling has adapted over time. Understand the anatomy of base images, guidelines for building GPU images, and considerations for running GPU containers. Follow along as the speaker demonstrates how to configure a project and streamline the development process for machine learning workflows.
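The idea of managing multiple CUDA and NVCC versions through containers can be sketched roughly as follows. This is a minimal illustration, not material from the talk itself: the base-image tag is an assumption, and you would pick whichever tag matches the CUDA version your project needs.

```dockerfile
# Hypothetical sketch: pin a specific CUDA + nvcc version via the base image tag.
# The -devel variants of nvidia/cuda ship the full toolkit, including nvcc,
# so the compiler version is fixed entirely by the tag you choose here.
FROM nvidia/cuda:12.2.0-devel-ubuntu22.04

# Sanity check at build time: confirm the pinned nvcc version is present.
RUN nvcc --version
```

To run with the GPUs exposed (this assumes the NVIDIA Container Toolkit is installed on the host): `docker run --rm --gpus all <image> nvidia-smi`. Switching CUDA versions then amounts to changing the base-image tag rather than reinstalling toolkits on the host.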
Syllabus
Intro
A history perspective
Install the drivers
Let us summarize the experience
Did you know....
Assume you have GPU
Driver installation
Technology adapts
Solving the issue of injecting GPU Devices
External Requirements
Reduced complexity
Anatomy and Structure of Base Images
Common guidelines
Considerations while building GPU Images
Pinpoint your dependencies
Considerations while running GPU Containers
Configuring a project
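For the "Configuring a project" step, developing inside a GPU-enabled container from Visual Studio Code is typically driven by a `.devcontainer/devcontainer.json` file. The sketch below is an assumption about what such a configuration might look like, not a file shown in the talk; the image tag and name are illustrative.

```jsonc
{
  // Hypothetical devcontainer.json: VS Code's Dev Containers extension
  // starts this image and attaches the editor inside it.
  "name": "cuda-dev",
  "image": "nvidia/cuda:12.2.0-devel-ubuntu22.04",
  // Expose all host GPUs to the container
  // (requires the NVIDIA Container Toolkit on the host).
  "runArgs": ["--gpus", "all"]
}
```

With this in place, "Reopen in Container" in VS Code gives a terminal and language tooling running against the pinned CUDA toolchain rather than whatever is installed on the host.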
Taught by
Docker
Related Courses
High Performance Computing
Georgia Institute of Technology via Udacity
Fundamentals of Accelerated Computing with CUDA C/C++
Nvidia via Independent
High Performance Computing for Scientists and Engineers
Indian Institute of Technology, Kharagpur via Swayam
CUDA programming Masterclass with C++
Udemy
Neural Network Programming - Deep Learning with PyTorch
YouTube