
Optimization

Offered By: MITCBMM via YouTube

Tags

Gradient Descent Courses, Machine Learning Courses

Course Description

Overview

Explore optimization techniques in this 56-minute tutorial from the MIT BMM Summer Course 2018, presented by Kevin Smith. Dive into key concepts such as maximum likelihood estimation, cost functions, and gradient descent. Learn about grid search, local vs. global minima, and the differences between convex and non-convex functions. Examine practical applications through examples like the balls in urns problem and the lecture attendance problem. Discover how optimization applies to machine learning, including stochastic gradient descent, regularization, and sparse coding. Gain insights into multi-dimensional gradients, differentiable functions, and the importance of momentum in optimization algorithms. This comprehensive tutorial provides a solid foundation for understanding and implementing optimization techniques in various computational and analytical contexts.
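To give a flavor of the gradient descent technique the lecture centers on, here is a minimal sketch (illustrative only, not code from the course): minimizing the convex function f(x) = (x - 3)^2 by repeatedly stepping against its gradient.

```python
# Minimal gradient descent sketch (illustrative, not taken from the lecture):
# minimize f(x) = (x - 3)^2, whose gradient is f'(x) = 2 * (x - 3).

def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Follow the negative gradient from x0 with a fixed learning rate."""
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(round(x_min, 4))  # converges toward the global minimum at x = 3
```

Because this function is convex, the single local minimum is also the global minimum; the lecture's discussion of local vs. global minima concerns what can go wrong when that assumption fails.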

Syllabus

What you will learn
Materials and notes
What is the likelihood?
Example: Balls in urns
Maximum likelihood estimator
Cost functions
Likelihood - Cost
Grid search (brute force)
Local vs. global minima
Convex vs. non-convex functions
Implementation
Lecture attendance problem
Multi-dimensional gradients
Multi-dimensional gradient descent
Differentiable functions
Optimization for machine learning
Stochastic gradient descent
Regularization
Sparse coding
Momentum
Important terms
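The syllabus pairs maximum likelihood estimation with a balls-in-urns example and with grid search. As a hedged sketch of how those pieces fit together (the specific numbers here are assumptions for illustration, not the lecture's): given 7 red balls in 10 draws with replacement, brute-force search over candidate red-fractions p recovers the sample proportion as the maximum likelihood estimate.

```python
# Illustrative MLE sketch for a balls-in-urns style problem (numbers are
# hypothetical, chosen for illustration; the lecture's example may differ).
import math

def log_likelihood(p, n_red, n_total):
    """Binomial log-likelihood of red-fraction p given n_red reds in n_total draws."""
    return n_red * math.log(p) + (n_total - n_red) * math.log(1 - p)

# Grid search (brute force), as in the syllabus: scan candidate values of p
# and keep the one with the highest log-likelihood.
candidates = [i / 1000 for i in range(1, 1000)]
p_hat = max(candidates, key=lambda p: log_likelihood(p, n_red=7, n_total=10))
print(p_hat)  # the MLE is the sample proportion, 0.7
```

Grid search works here because p is one-dimensional; the lecture's later material on multi-dimensional gradient descent addresses why brute force becomes infeasible as dimensions grow.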


Taught by

MITCBMM

Related Courses

Practical Predictive Analytics: Models and Methods
University of Washington via Coursera
Deep Learning Fundamentals with Keras
IBM via edX
Introduction to Machine Learning
Duke University via Coursera
Intro to Deep Learning with PyTorch
Facebook via Udacity
Introduction to Machine Learning for Coders!
fast.ai via Independent