YoVDO

Recent Progress on Grokking and Probabilistic Federated Learning

Offered By: VinAI via YouTube

Tags

Machine Learning Courses, Neural Networks Courses, Federated Learning Courses, Classification Courses, Bayesian Inference Courses, Gaussian Processes Courses, Probabilistic Models Courses

Course Description

Overview

Explore recent advances in machine learning in this one-hour seminar presented by Thang Bui, a lecturer in Machine Learning at the Australian National University. The talk covers two topics: the grokking phenomenon and probabilistic federated learning.

First, discover how grokking, where models achieve near-perfect accuracy on a validation set long after reaching similar performance on the training set, extends beyond neural networks to Gaussian process classification, Gaussian process regression, and linear regression. Examine the hypothesis that the phenomenon is governed by the accessibility of specific regions in the error and complexity landscapes. Then, delve into federated training of probabilistic models, focusing on Bayesian neural networks and Gaussian processes trained with partitioned variational inference. Gain insight into the limitations of current techniques and potential future directions for the field.
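The grokking effect described above can be quantified as the gap between when training accuracy saturates and when validation accuracy catches up. The sketch below is a hypothetical illustration (not from the talk): the helper names `first_epoch_above` and `grokking_delay` and the toy accuracy curves are assumptions for demonstration.

```python
# Hypothetical sketch: measure the "grokking delay" -- the number of epochs
# between training accuracy first reaching a threshold and validation
# accuracy doing the same. A large positive delay indicates grokking.

def first_epoch_above(accuracies, threshold=0.99):
    """Return the first epoch index at which accuracy meets the threshold, or None."""
    for epoch, acc in enumerate(accuracies):
        if acc >= threshold:
            return epoch
    return None

def grokking_delay(train_acc, val_acc, threshold=0.99):
    """Epochs elapsed between near-perfect training and near-perfect
    validation accuracy; None if either never reaches the threshold."""
    t = first_epoch_above(train_acc, threshold)
    v = first_epoch_above(val_acc, threshold)
    if t is None or v is None:
        return None
    return v - t

# Toy curves: the training set is fit early, validation generalizes much later.
train = [0.5, 0.9, 0.99, 1.0, 1.0, 1.0, 1.0, 1.0]
val   = [0.3, 0.4, 0.45, 0.5, 0.6, 0.8, 0.95, 0.995]
print(grokking_delay(train, val))  # 5
```

In this toy example, training accuracy crosses 0.99 at epoch 2 while validation crosses it only at epoch 7, giving a delay of 5 epochs, the kind of separation the seminar's grokking discussion is about.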

Syllabus

[Seminar Series] Recent Progress on Grokking and Probabilistic Federated Learning


Taught by

VinAI

Related Courses

Neural Networks for Machine Learning
University of Toronto via Coursera
Good Brain, Bad Brain: Basics
University of Birmingham via FutureLearn
Statistical Learning with R
Stanford University via edX
Machine Learning 1—Supervised Learning
Brown University via Udacity
Fundamentals of Neuroscience, Part 2: Neurons and Networks
Harvard University via edX