Non-convex SGD and Łojasiewicz-type Conditions for Deep Learning

Offered By: Centre International de Rencontres Mathématiques via YouTube

Tags

Deep Learning Courses Machine Learning Courses Neural Networks Courses Mathematical Modeling Courses Stochastic Gradient Descent Courses

Course Description

Overview

Explore a conference talk on non-convex stochastic gradient descent (SGD) and Łojasiewicz-type conditions for deep learning, presented by Kevin Scaman at the Centre International de Rencontres Mathématiques in Marseille, France. This 47-minute presentation, recorded as part of the "Learning and Optimization in Luminy" thematic meeting, applies advanced mathematical tools to the analysis of machine learning optimization. The talk, along with other presentations by renowned mathematicians, is available through CIRM's Audiovisual Mathematics Library, which offers chapter markers, keywords, enriched content with abstracts and bibliographies, and a multi-criteria search function for easy navigation and in-depth exploration of mathematical topics.
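For readers unfamiliar with the talk's topic: a Łojasiewicz-type condition (such as the Polyak-Łojasiewicz inequality, ½‖∇f(x)‖² ≥ μ(f(x) − f*)) lets SGD converge on non-convex objectives despite the absence of convexity. The sketch below is purely illustrative and not taken from the talk; it runs plain SGD with additive gradient noise on f(x) = x² + 3 sin²(x), a standard non-convex function satisfying such an inequality, with all parameter choices (learning rate, noise level, step count) chosen arbitrarily for the demo.

```python
import math
import random


def f(x):
    # Non-convex objective satisfying a Polyak-Lojasiewicz inequality,
    # with global minimum f(0) = 0.
    return x ** 2 + 3 * math.sin(x) ** 2


def grad(x):
    # Exact gradient: d/dx [x^2 + 3 sin^2(x)] = 2x + 3 sin(2x).
    return 2 * x + 3 * math.sin(2 * x)


def sgd(x0, lr=0.05, noise_std=0.1, steps=2000, seed=0):
    # SGD with synthetic Gaussian gradient noise standing in for
    # minibatch stochasticity.
    rng = random.Random(seed)
    x = x0
    for _ in range(steps):
        g = grad(x) + rng.gauss(0.0, noise_std)  # stochastic gradient
        x -= lr * g
    return x


x_final = sgd(3.0)
f_final = f(x_final)
```

Despite the objective being non-convex, the iterate settles near the global minimizer x = 0, with a residual error floor governed by the learning rate and the gradient-noise variance, which is the qualitative behavior that Łojasiewicz-type analyses of SGD make precise.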

Syllabus

Kevin Scaman: Non-convex SGD and Łojasiewicz-type conditions for deep learning


Taught by

Centre International de Rencontres Mathématiques

Related Courses

Neural Networks for Machine Learning
University of Toronto via Coursera
機器學習技法 (Machine Learning Techniques)
National Taiwan University via Coursera
Machine Learning Capstone: An Intelligent Application with Deep Learning
University of Washington via Coursera
Прикладные задачи анализа данных (Applied Problems of Data Analysis)
Moscow Institute of Physics and Technology via Coursera
Leading Ambitious Teaching and Learning
Microsoft via edX