Non-convex SGD and Łojasiewicz-type Conditions for Deep Learning
Offered By: Centre International de Rencontres Mathématiques via YouTube
Course Description
Overview
Explore a conference talk on non-convex stochastic gradient descent (SGD) and Łojasiewicz-type conditions for deep learning, presented by Kevin Scaman at the Centre International de Rencontres Mathématiques (CIRM) in Marseille, France. This 47-minute presentation, recorded as part of the "Learning and Optimization in Luminy" thematic meeting, applies advanced mathematical tools to optimization problems in machine learning. The talk, along with other presentations by renowned mathematicians, is available through CIRM's Audiovisual Mathematics Library, which offers chapter markers, keywords, enriched content with abstracts and bibliographies, and a multi-criteria search function for easy navigation and in-depth exploration of mathematical topics.
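For context on the terminology, a standard example of a Łojasiewicz-type condition in the optimization literature is the Polyak-Łojasiewicz (PL) inequality. The sketch below is generic background rather than a statement of the talk's results; the objective f, minimum value f^*, smoothness constant L, PL constant mu, and step size eta are placeholder symbols, not notation taken from the presentation.

% Polyak-Lojasiewicz (PL) inequality, a standard Lojasiewicz-type condition:
% f is a differentiable objective with minimum value f^*, and mu > 0 is the PL constant.
\[
  \tfrac{1}{2}\,\lVert \nabla f(\theta) \rVert^{2} \;\ge\; \mu \bigl( f(\theta) - f^{*} \bigr)
  \quad \text{for all } \theta .
\]
% No convexity is assumed. If f is additionally L-smooth and the step size
% satisfies eta <= 1/L, the gradient step
% theta_{t+1} = theta_t - eta * grad f(theta_t)
% contracts the suboptimality gap at a linear rate:
\[
  f(\theta_{t+1}) - f^{*} \;\le\; (1 - \eta \mu) \bigl( f(\theta_{t}) - f^{*} \bigr).
\]

Conditions of this kind are one way to explain why gradient methods can converge on non-convex objectives such as deep network losses, which is the setting the talk's title refers to.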
Syllabus
Kevin Scaman: Non-convex SGD and Łojasiewicz-type conditions for deep learning
Taught by
Centre International de Rencontres Mathématiques
Related Courses
Game Theory (Stanford University via Coursera)
Network Analysis in Systems Biology (Icahn School of Medicine at Mount Sinai via Coursera)
Visualizing Algebra (San Jose State University via Udacity)
Conceptos y Herramientas para la Física Universitaria (Tecnológico de Monterrey via Coursera)
Aplicaciones de la Teoría de Grafos a la vida real (Universitat Politècnica de València via UPV [X])