YoVDO

Recent Progress on Grokking and Probabilistic Federated Learning

Offered By: VinAI via YouTube

Tags

Machine Learning Courses Neural Networks Courses Federated Learning Courses Classification Courses Bayesian Inference Courses Gaussian Processes Courses Probabilistic Models Courses

Course Description

Overview

Explore recent advancements in machine learning in this one-hour seminar presented by Thang Bui, a lecturer in Machine Learning at the Australian National University. The talk covers two topics: the grokking phenomenon and probabilistic federated learning. Discover how grokking, in which models reach near-perfect validation accuracy long after attaining the same accuracy on the training set, extends beyond neural networks to Gaussian process classification, Gaussian process regression, and linear regression. Examine the hypothesis that the phenomenon is governed by the accessibility of certain regions in the error and complexity landscapes. Then delve into federated training of probabilistic models, focusing on Bayesian neural networks and Gaussian processes trained with partitioned variational inference. Gain insights into the limitations of current techniques and potential future directions in this field.

Syllabus

[Seminar Series] Recent Progress on Grokking and Probabilistic Federated Learning


Taught by

VinAI

Related Courses

Secure and Private AI
Facebook via Udacity
Advanced Deployment Scenarios with TensorFlow
DeepLearning.AI via Coursera
Big Data for Reliability and Security
Purdue University via edX
MLOps for Scaling TinyML
Harvard University via edX
Edge Analytics: IoT and Data Science
LinkedIn Learning