Deep Learning Fundamentals - Full Stack Deep Learning
Offered By: The Full Stack via YouTube
Course Description
Overview
Dive into the fundamentals of deep learning in this 30-minute lecture from the Full Stack Deep Learning Spring 2021 course. Explore artificial neural networks, the universal approximation theorem, and three major types of learning problems. Understand the empirical risk minimization problem, grasp the concept behind gradient descent, and learn how backpropagation works in practice. Examine core neural architectures and the rise of GPUs in deep learning. Cover topics including neural networks, universality, learning problems, loss functions, gradient descent, backpropagation, automatic differentiation, architectural considerations, and CUDA cores. For those needing a refresher, consult the recommended online book at neuralnetworksanddeeplearning.com before watching.
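As a quick primer on two of the ideas the lecture covers, the sketch below (not part of the course materials; all names and values are illustrative) shows empirical risk minimization by gradient descent on a toy linear-regression problem, with the gradients written out by hand rather than obtained via automatic differentiation.

```python
# Illustrative sketch only: empirical risk minimization by gradient descent
# on a toy linear-regression problem, using NumPy.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y = 3x + 1 plus a little noise.
X = rng.uniform(-1, 1, size=(100, 1))
y = 3 * X[:, 0] + 1 + 0.1 * rng.normal(size=100)

# Parameters of a one-input linear model (weight and bias).
w, b = 0.0, 0.0
lr = 0.1  # learning rate

for step in range(200):
    pred = w * X[:, 0] + b
    # Empirical risk: mean squared error over the training set.
    loss = np.mean((pred - y) ** 2)
    # Gradients of the loss with respect to w and b (derived by hand here;
    # deep learning frameworks compute these via backpropagation /
    # automatic differentiation).
    grad_w = np.mean(2 * (pred - y) * X[:, 0])
    grad_b = np.mean(2 * (pred - y))
    # Gradient descent update: step against the gradient.
    w -= lr * grad_w
    b -= lr * grad_b

print(f"learned w={w:.2f}, b={b:.2f}, final loss={loss:.4f}")
```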
Syllabus
- Intro
- Neural Networks
- Universality
- Learning Problems
- Empirical Risk Minimization / Loss Functions
- Gradient Descent
- Backpropagation / Automatic Differentiation
- Architectural Considerations
- CUDA / Cores of Compute
Taught by
The Full Stack
Related Courses
- Practical Predictive Analytics: Models and Methods (University of Washington via Coursera)
- Deep Learning Fundamentals with Keras (IBM via edX)
- Introduction to Machine Learning (Duke University via Coursera)
- Intro to Deep Learning with PyTorch (Facebook via Udacity)
- Introduction to Machine Learning for Coders! (fast.ai via Independent)