MIT: Introduction to Deep Learning
Offered By: Alexander Amini via YouTube
Course Description
Overview
Syllabus
Intro
The Rise of Deep Learning
What is Deep Learning?
Lecture Schedule
Final Class Project
Class Support
Course Staff
Why Deep Learning
The Perceptron: Forward Propagation
Common Activation Functions
Importance of Activation Functions
The Perceptron: Example
The Perceptron: Simplified
Multi-Output Perceptron
Single Layer Neural Network
Deep Neural Network
Quantifying Loss
Empirical Loss
Binary Cross Entropy Loss
Mean Squared Error Loss
Loss Optimization
Computing Gradients: Backpropagation
Training Neural Networks is Difficult
Setting the Learning Rate
Adaptive Learning Rates
Adaptive Learning Rate Algorithms
Stochastic Gradient Descent
Mini-batches while training
The Problem of Overfitting
Regularization 1: Dropout
Regularization 2: Early Stopping
Core Foundation Review
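The perceptron entries in the syllabus above (forward propagation, activation functions, and the worked example) all reduce to a weighted sum of inputs passed through a nonlinearity. Below is a minimal NumPy sketch of that forward pass; the input, weight, and bias values are made up for illustration and are not taken from the lecture.

```python
# Minimal sketch of perceptron forward propagation: y_hat = g(w . x + b).
# All numbers here are illustrative, not from the course.
import numpy as np

def sigmoid(z):
    """A common activation function: squashes any real value into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def perceptron_forward(x, w, b):
    """Single perceptron: weighted sum of inputs plus bias, then a nonlinearity."""
    z = np.dot(w, x) + b        # linear combination of inputs
    return sigmoid(z)           # nonlinear activation

x = np.array([2.0, -1.0])       # example inputs
w = np.array([3.0, -2.0])       # example weights
b = 1.0                         # example bias
print(perceptron_forward(x, w, b))
```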
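For the loss entries (quantifying loss, empirical loss, binary cross entropy, mean squared error), the key idea is that the empirical loss is an average of a per-example loss over the dataset. The sketch below computes both losses named in the syllabus; the prediction and label arrays are illustrative assumptions.

```python
# Hedged sketch of the two losses listed in the syllabus, each computed as an
# empirical (mean-over-examples) loss. The arrays are toy values.
import numpy as np

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    """Loss for binary classification with probabilistic outputs."""
    y_pred = np.clip(y_pred, eps, 1.0 - eps)   # avoid log(0)
    return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

def mean_squared_error(y_true, y_pred):
    """Loss for regression with continuous outputs."""
    return np.mean((y_true - y_pred) ** 2)

y_true = np.array([1.0, 0.0, 1.0])
y_pred = np.array([0.9, 0.2, 0.6])
print(binary_cross_entropy(y_true, y_pred), mean_squared_error(y_true, y_pred))
```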
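The optimization entries (loss optimization, computing gradients, setting the learning rate, stochastic gradient descent, and mini-batches) come together in a single training loop: shuffle the data, take a small batch, compute the gradient of the loss on that batch, and step the parameters against it. The sketch below runs mini-batch SGD on a toy linear regression with an MSE loss; the data, learning rate, and batch size are assumptions for illustration, not settings from the course.

```python
# Minimal sketch of mini-batch stochastic gradient descent on a toy linear
# model with MSE loss. Data, batch size, and learning rate are illustrative.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(256, 3))                 # toy inputs
w_true = np.array([2.0, -1.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=256)   # toy targets with noise

w = np.zeros(3)        # parameters to learn
lr = 0.1               # learning rate (step size along the negative gradient)
batch_size = 32

for epoch in range(20):
    order = rng.permutation(len(X))           # reshuffle once per epoch
    for start in range(0, len(X), batch_size):
        idx = order[start:start + batch_size]
        xb, yb = X[idx], y[idx]
        grad = 2.0 * xb.T @ (xb @ w - yb) / len(idx)   # dL/dw for MSE on the batch
        w -= lr * grad                                  # gradient step

print(w)   # should end up close to w_true
```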
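Of the two regularizers listed above, dropout is the easier one to show in a few lines: during training each activation is zeroed with some probability and the survivors are rescaled ("inverted dropout"), while early stopping simply halts training once validation loss stops improving. The dropout rate and activation array below are illustrative assumptions.

```python
# Illustrative sketch of dropout regularization (inverted dropout).
import numpy as np

def dropout(activations, rate, rng, training=True):
    """Zero each activation with probability `rate`; scale survivors by
    1/(1 - rate) so expected activations are unchanged at test time."""
    if not training or rate == 0.0:
        return activations
    mask = rng.random(activations.shape) >= rate
    return activations * mask / (1.0 - rate)

rng = np.random.default_rng(0)
h = np.ones((2, 5))                     # toy layer activations
print(dropout(h, rate=0.5, rng=rng))    # roughly half zeroed, rest scaled by 2
```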
Taught by
Alexander Amini (https://www.youtube.com/@AAmini/videos)
Related Courses
Data Analysis and Visualization (Georgia Institute of Technology via Udacity)
Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization (DeepLearning.AI via Coursera)
Machine Learning Foundations: Algorithmic Foundations (機器學習基石下) (National Taiwan University via Coursera)
Data Science: Machine Learning (Harvard University via edX)
Art and Science of Machine Learning, in German (Google Cloud via Coursera)