
Efficient Distributed Deep Learning Using MXNet

Offered By: Simons Institute via YouTube

Tags

Machine Learning, Amazon Rekognition, Parallel Programming, Tensors, Declarative Programming, Distributed Deep Learning

Course Description

Overview

Explore efficient distributed deep learning techniques using MXNet in this 45-minute lecture by Anima Anandkumar from UC Irvine. Delve into practical considerations for machine learning, challenges in deploying large-scale learning, and declarative programming. Discover MXNet's mixed programming paradigm and hierarchical parameter server. Examine tensor contraction as a layer and learn about Amazon AI services like Rekognition for object, scene, and facial analysis, as well as Polly for voice quality and pronunciation. Gain insights into computational challenges in machine learning and strategies for writing parallel programs in this comprehensive talk from the Simons Institute's Computational Challenges in Machine Learning series.
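
The "mixed programming paradigm" mentioned above refers to MXNet letting the same model code run either imperatively (easy to write and debug) or as a compiled symbolic graph (easier to optimize and distribute across machines). As a minimal sketch only, using the standard MXNet Gluon API rather than anything taken from the lecture slides (the network and its layer sizes here are illustrative), a HybridBlock switches between the two modes with hybridize():

    from mxnet import nd
    from mxnet.gluon import nn

    # Illustrative two-layer network: a HybridBlock runs imperatively by
    # default and can be compiled into a symbolic graph via hybridize().
    class MLP(nn.HybridBlock):
        def __init__(self, **kwargs):
            super(MLP, self).__init__(**kwargs)
            self.fc1 = nn.Dense(128, activation='relu')
            self.fc2 = nn.Dense(10)

        def hybrid_forward(self, F, x):
            # F is mxnet.nd in imperative mode and mxnet.sym once hybridized
            return self.fc2(self.fc1(x))

    net = MLP()
    net.initialize()
    x = nd.random.uniform(shape=(4, 784))

    y_eager = net(x)    # eager, line-by-line execution
    net.hybridize()     # the same forward code is now traced into a graph
    y_graph = net(x)

The same user-facing code therefore gets declarative-style graph optimizations without giving up an imperative development workflow.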

Syllabus

Intro
Practical Considerations for Machine Learning
Challenges in Deploying Large-Scale Learning
Declarative Programming
MXNet: Mixed Programming Paradigm
Writing Parallel Programs Is Hard
Hierarchical Parameter Server in MXNet (see the sketch after this syllabus)
Tensors, Deep Learning & MXNet
Tensor Contraction as a Layer
Introducing Amazon AI
Rekognition: Object & Scene Detection
Rekognition: Facial Analysis
Polly: A Focus On Voice Quality & Pronunciation
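
For the "Writing Parallel Programs Is Hard" and "Hierarchical Parameter Server in MXNet" items, the underlying mechanism is a key-value store: workers push gradients to server processes and pull back aggregated parameters, so data-parallel training needs little extra user code. The snippet below is a minimal sketch of that interface only, using MXNet's documented KVStore API with a 'local' store; the distributed, hierarchical setup discussed in the talk would instead be launched with a 'dist_sync' store and multiple worker and server processes, which is not shown here.

    import mxnet as mx

    # Parameter-server-style exchange through MXNet's KVStore.
    kv = mx.kvstore.create('local')     # 'dist_sync' when run distributed

    shape = (2, 3)
    kv.init(3, mx.nd.ones(shape))       # key 3 holds one parameter tensor
    kv.push(3, mx.nd.ones(shape) * 8)   # a worker pushes its update
    out = mx.nd.zeros(shape)
    kv.pull(3, out=out)                 # pull the stored value back
    print(out.asnumpy())                # reflects the pushed update

In a real training loop an optimizer is registered with kv.set_optimizer so that pushed gradients update the stored weights on the servers, but the push/pull pattern stays the same.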


Taught by

Simons Institute

Related Courses

Functional Programming For Beginners With JavaScript (Udemy)
Master Java Reactive Programming with RxJava 2 (Udemy)
[NEW] Functional programming for javascript developers (Udemy)
Functional Programming in Java - Full Course (freeCodeCamp)
Functional Programming with PHP (LinkedIn Learning)