
Non-convex SGD and Lojasiewicz-type Conditions for Deep Learning

Offered By: Centre International de Rencontres Mathématiques via YouTube

Tags

Deep Learning Courses
Machine Learning Courses
Neural Networks Courses
Mathematical Modeling Courses
Stochastic Gradient Descent Courses

Course Description

Overview

Explore a conference talk on non-convex stochastic gradient descent (SGD) and Lojasiewicz-type conditions for deep learning, presented by Kevin Scaman at the Centre International de Rencontres Mathématiques (CIRM) in Marseille, France. Recorded as part of the "Learning and Optimization in Luminy" thematic meeting, this 47-minute presentation examines how Lojasiewicz-type conditions can be used to analyze SGD on the non-convex objectives that arise in deep learning. Access this talk and other presentations by renowned mathematicians through CIRM's Audiovisual Mathematics Library, which features chapter markers, keywords, enriched content with abstracts and bibliographies, and a multi-criteria search function for easy navigation and in-depth exploration of mathematical topics.
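As background (the listing does not reproduce the talk's abstract, so the following is standard context rather than material taken from the talk itself), a Lojasiewicz-type condition relaxes convexity by tying the gradient norm to the suboptimality gap. A minimal sketch in LaTeX, using the Polyak-Lojasiewicz (PL) inequality as the simplest instance:

    % Polyak-Lojasiewicz (PL) inequality: for some \mu > 0 and all x,
    \[
      \tfrac{1}{2}\,\|\nabla f(x)\|^2 \;\ge\; \mu\,\bigl(f(x) - f^*\bigr),
      \qquad f^* = \inf_x f(x).
    \]
    % No convexity is assumed. For an L-smooth f, gradient descent with
    % step size \eta \le 1/L then converges linearly:
    \[
      f(x_t) - f^* \;\le\; (1 - \eta\mu)^t\,\bigl(f(x_0) - f^*\bigr).
    \]
    % For SGD, gradient noise rules out an exact linear rate; under PL one
    % typically obtains convergence to a noise ball with a constant step
    % size, or an O(1/t) rate with suitably decreasing step sizes.

Deep learning losses generally satisfy such conditions only locally or in generalized forms, which is why Lojasiewicz-type variants, rather than the plain PL inequality, are the natural setting for results like those in this talk.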

Syllabus

Kevin Scaman: Non-convex SGD and Lojasiewicz-type conditions for deep learning


Taught by

Centre International de Rencontres Mathématiques

Related Courses

Neural Networks for Machine Learning
University of Toronto via Coursera
Good Brain, Bad Brain: Basics
University of Birmingham via FutureLearn
Statistical Learning with R
Stanford University via edX
Machine Learning 1—Supervised Learning
Brown University via Udacity
Fundamentals of Neuroscience, Part 2: Neurons and Networks
Harvard University via edX