
Introduction to Optimization

Offered By: MITCBMM via YouTube

Tags

Machine Learning Courses, Gradient Descent Courses, Maximum Likelihood Estimation Courses, Regularization Courses, Stochastic Gradient Descent Courses

Course Description

Overview

Dive into the fundamentals of optimization in this 1-hour 12-minute tutorial led by Kevin Smith of MIT. Explore key concepts such as maximum likelihood estimation, cost functions, and gradient descent methods. Learn about convex and non-convex functions, local and global minima, and their implications for optimization problems. See these ideas applied in worked examples such as drawing balls from urns and coin flips. Advance to multi-dimensional gradients and their role in machine learning. Gain insight into stochastic gradient descent, regularization techniques, and sparse coding. Ideal for anyone looking to deepen their understanding of optimization principles and their applications in fields such as machine learning.
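
To make the link between likelihood, cost, and gradient descent concrete, here is a minimal Python sketch of the coin-flip idea mentioned above: estimating a coin's bias by gradient descent on the negative log-likelihood. It is not code from the lecture; the simulated data, step size, and iteration count are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch (not from the lecture): maximum likelihood estimation of a
# coin's bias p from observed flips, by gradient descent on the negative
# log-likelihood (the "cost" view of the likelihood).

rng = np.random.default_rng(0)
flips = rng.binomial(1, 0.7, size=200)   # simulated data: 1 = heads, assumed true bias 0.7

def neg_log_likelihood(p, data):
    # cost(p) = -sum_i [ x_i * log(p) + (1 - x_i) * log(1 - p) ]
    return -np.sum(data * np.log(p) + (1 - data) * np.log(1 - p))

def gradient(p, data):
    # derivative of the negative log-likelihood with respect to p
    return -np.sum(data / p - (1 - data) / (1 - p))

p = 0.5              # initial guess
step = 1e-3          # illustrative learning rate
for _ in range(500):
    p -= step * gradient(p, flips)
    p = np.clip(p, 1e-6, 1 - 1e-6)   # keep p inside (0, 1)

print(f"estimate {p:.3f}, sample mean {flips.mean():.3f}, "
      f"cost {neg_log_likelihood(p, flips):.1f}")
```

For this particular problem the maximum likelihood estimate also has the closed-form answer heads/flips (the sample mean), which the gradient-descent estimate should approach.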

Syllabus

Intro
What you will learn
Before we start
What is the likelihood?
Example: Balls in urns
Maximum likelihood estimator
Example: Coin flips
Likelihood - Cost
Back to the urn problem...
Grid search (brute force)
Local vs. global minima
Convex vs. non-convex functions
Implementation
Lecture attendance problem
Multi-dimensional gradients
Multi-dimensional gradient descent
Differentiable functions
Optimization for machine learning
Stochastic gradient descent
Regularization
Sparse coding
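
The later syllabus items on stochastic gradient descent and regularization can be illustrated with a short sketch. The example below is not from the lecture; the synthetic data, learning rate, and regularization strength lam are assumptions chosen for demonstration.

```python
import numpy as np

# Illustrative sketch (not from the lecture): stochastic gradient descent for
# L2-regularized (ridge) linear regression on synthetic data.

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 3))
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=500)

w = np.zeros(3)
lam = 0.01        # assumed L2 regularization strength
step = 0.05       # assumed learning rate

for epoch in range(20):
    for i in rng.permutation(len(y)):       # visit examples in random order
        err = X[i] @ w - y[i]               # prediction error on a single example
        grad = err * X[i] + lam * w         # gradient of 0.5*err**2 + 0.5*lam*||w||^2
        w -= step * grad

print(w)   # should land close to true_w, shrunk slightly toward zero by the penalty
```

Updating on one example at a time rather than on the full dataset is what makes the method "stochastic", and the L2 penalty is the regularization term that shrinks the weights toward zero.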


Taught by

MITCBMM

Related Courses

Data Analysis and Visualization
Georgia Institute of Technology via Udacity
Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization
DeepLearning.AI via Coursera
機器學習基石下 (Machine Learning Foundations)---Algorithmic Foundations
National Taiwan University via Coursera
Data Science: Machine Learning
Harvard University via edX
Art and Science of Machine Learning auf Deutsch
Google Cloud via Coursera