Optimisation
Offered By: Alfredo Canziani via YouTube
Course Description
Overview
Explore a comprehensive lecture on optimization techniques in deep learning, covering gradient descent, stochastic gradient descent (SGD), and momentum updates. Delve into adaptive methods such as RMSprop and Adam, and examine how normalization layers affect neural network training. Learn the intuition behind these methods, how their performance compares, and how they influence convergence. Finally, see a real-world application of neural networks in accelerating MRI scans, illustrating the practical impact of optimization in industry.
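
For orientation, here is a minimal NumPy sketch of the update rules the lecture compares: plain SGD, SGD with momentum, and an Adam-style adaptive step. The toy quadratic objective, learning rates, and decay constants below are illustrative assumptions, not values taken from the lecture.

```python
import numpy as np

def sgd_step(w, grad, lr=0.01):
    """Plain SGD: move against the (mini-batch) gradient."""
    return w - lr * grad

def momentum_step(w, grad, v, lr=0.01, beta=0.9):
    """SGD with momentum: accumulate a velocity, then step along it."""
    v = beta * v + grad
    return w - lr * v, v

def adam_step(w, grad, m, s, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """Adam: momentum on the gradient plus RMSprop-style per-coordinate scaling."""
    m = beta1 * m + (1 - beta1) * grad        # first moment estimate
    s = beta2 * s + (1 - beta2) * grad ** 2   # second moment estimate
    m_hat = m / (1 - beta1 ** t)              # bias corrections for the
    s_hat = s / (1 - beta2 ** t)              # zero-initialized moments
    return w - lr * m_hat / (np.sqrt(s_hat) + eps), m, s

# Toy objective f(w) = ||w||^2 / 2, whose gradient is simply w.
w, v = np.ones(3), np.zeros(3)
w2, m, s = np.ones(3), np.zeros(3), np.zeros(3)
for t in range(1, 201):
    w, v = momentum_step(w, grad=w, v=v)
    w2, m, s = adam_step(w2, grad=w2, m=m, s=s, t=t)
print(w, w2)  # both close to the minimizer at the origin
```

Note that Adam's bias-correction terms matter mostly early in training, when the zero-initialized moment estimates would otherwise shrink the effective step.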
Syllabus
– Week 5 – Lecture
– Gradient Descent
– Stochastic Gradient Descent
– Momentum
– Adaptive Methods
– Normalization Layers
– The Death of Optimization
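
The "Normalization Layers" segment is easiest to picture with a concrete forward pass. Below is a minimal batch-normalization sketch, assuming a 2-D (batch, features) activation matrix; the shapes, epsilon, and the gamma/beta initializations are illustrative assumptions, not taken from the lecture.

```python
import numpy as np

def batch_norm_forward(x, gamma, beta, eps=1e-5):
    """Normalize each feature over the batch, then rescale and shift.

    x: (batch, features) activations; gamma, beta: learned (features,) params.
    """
    mean = x.mean(axis=0)                   # per-feature mean over the batch
    var = x.var(axis=0)                     # per-feature variance over the batch
    x_hat = (x - mean) / np.sqrt(var + eps) # standardized activations
    return gamma * x_hat + beta

x = np.random.randn(32, 4) * 5.0 + 3.0      # badly scaled activations
y = batch_norm_forward(x, gamma=np.ones(4), beta=np.zeros(4))
print(y.mean(axis=0), y.std(axis=0))        # ~0 mean, ~1 std per feature
```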
Taught by
Alfredo Canziani