Guaranteed Training of Neural Networks Using Tensor Methods - 2015
Offered By: Center for Language & Speech Processing (CLSP), JHU via YouTube
Course Description
Overview
Explore a lecture on guaranteed training methods for neural networks using tensor decomposition techniques. Delve into a novel approach that provides risk bounds for training two-layer neural networks with polynomial sample and computational complexity. Learn how unsupervised learning can enhance supervised tasks through the estimation of probabilistic score functions. Discover insights from Anima Anandkumar, a faculty member at U.C. Irvine's EECS Department, as she discusses her research in large-scale machine learning and high-dimensional statistics. Gain valuable knowledge about overcoming the non-convex optimization challenges in neural network training and avoiding local optima, especially in high-dimensional spaces.
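At the core of the approach described above is the decomposition of cross-moment tensors (estimated with the help of score functions) to recover the hidden-layer weights of a two-layer network. The snippet below is a minimal illustrative sketch, not the lecture's actual algorithm: it builds a synthetic orthogonal third-order tensor from known weight directions and recovers them with tensor power iteration plus deflation. All function names are hypothetical, and in the real method the tensor would be estimated from data rather than constructed directly.

import numpy as np

def tensor_apply(T, u):
    # Contract a symmetric 3rd-order tensor T with u along two modes: T(I, u, u).
    return np.einsum('ijk,j,k->i', T, u, u)

def tensor_power_method(T, n_iter=100, seed=0):
    # Recover one rank-1 component of an orthogonally decomposable
    # symmetric tensor via repeated power iteration and normalization.
    rng = np.random.default_rng(seed)
    u = rng.standard_normal(T.shape[0])
    u /= np.linalg.norm(u)
    for _ in range(n_iter):
        u = tensor_apply(T, u)
        u /= np.linalg.norm(u)
    lam = np.einsum('ijk,i,j,k->', T, u, u, u)  # recovered component weight
    return lam, u

def deflate(T, lam, u):
    # Remove the recovered rank-1 term lam * (u outer u outer u).
    return T - lam * np.einsum('i,j,k->ijk', u, u, u)

if __name__ == "__main__":
    # Synthetic example: orthonormal "hidden-layer weight" directions.
    d = 5
    A = np.linalg.qr(np.random.default_rng(1).standard_normal((d, 2)))[0]
    weights = [2.0, 1.0]
    T = sum(w * np.einsum('i,j,k->ijk', a, a, a) for w, a in zip(weights, A.T))
    for _ in range(2):
        lam, u = tensor_power_method(T)
        print(f"recovered component weight: {lam:.3f}")
        T = deflate(T, lam, u)

In the guaranteed-training setting, the decomposed components correspond (up to scaling and permutation) to the first-layer weight vectors, which is what sidesteps the local optima of direct non-convex optimization.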
Syllabus
Guaranteed Training of Neural Networks Using Tensor Methods - Anima Anandkumar - 2015
Taught by
Center for Language & Speech Processing (CLSP), JHU
Related Courses
Neural Networks for Machine Learning
University of Toronto via Coursera
Good Brain, Bad Brain: Basics
University of Birmingham via FutureLearn
Statistical Learning with R
Stanford University via edX
Machine Learning 1—Supervised Learning
Brown University via Udacity
Fundamentals of Neuroscience, Part 2: Neurons and Networks
Harvard University via edX