YoVDO

Recent Progress on Grokking and Probabilistic Federated Learning

Offered By: VinAI via YouTube

Tags

Machine Learning Courses
Neural Networks Courses
Federated Learning Courses
Classification Courses
Bayesian Inference Courses
Gaussian Processes Courses
Probabilistic Models Courses

Course Description

Overview

Explore recent advances in machine learning in this one-hour seminar presented by Thang Bui, a lecturer in Machine Learning at the Australian National University. The talk covers two topics: the grokking phenomenon and probabilistic federated learning. Discover how grokking, in which models reach near-perfect accuracy on the validation set long after reaching the same accuracy on the training set, extends beyond neural networks to Gaussian process classification, Gaussian process regression, and linear regression. Examine the hypothesis that the phenomenon is governed by the accessibility of specific regions in error and complexity landscapes. Then delve into federated training of probabilistic models, focusing on Bayesian neural networks and Gaussian processes trained with partitioned variational inference, and gain insight into the limitations of current techniques and potential directions for future work.
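The delayed generalization at the heart of grokking can be quantified as the gap between the epoch at which training accuracy saturates and the epoch at which validation accuracy catches up. Below is a minimal, hypothetical sketch (not code from the seminar) of how one might measure that delay from recorded accuracy curves:

```python
def first_crossing(accs, threshold=0.99):
    """Return the first epoch index where accuracy reaches the threshold, or None."""
    for epoch, acc in enumerate(accs):
        if acc >= threshold:
            return epoch
    return None

def grokking_delay(train_acc, val_acc, threshold=0.99):
    """Epochs between near-perfect training accuracy and near-perfect
    validation accuracy. A large positive delay is the signature of
    grokking: the model fits the training set long before it generalizes."""
    t = first_crossing(train_acc, threshold)
    v = first_crossing(val_acc, threshold)
    if t is None or v is None:
        return None  # one of the curves never crossed the threshold
    return v - t

# Toy curves: training accuracy saturates at epoch 2,
# validation accuracy only catches up at epoch 8.
train = [0.5, 0.9, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0]
val   = [0.4, 0.5, 0.5, 0.55, 0.6, 0.7, 0.8, 0.9, 1.0, 1.0]
print(grokking_delay(train, val))  # 6
```

On real grokking runs this delay can span thousands of epochs; the seminar's point is that the same signature appears in models well beyond neural networks.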

Syllabus

[Seminar Series] Recent Progress on Grokking and Probabilistic Federated Learning


Taught by

VinAI

Related Courses

Introduction to Artificial Intelligence
Stanford University via Udacity
Natural Language Processing
Columbia University via Coursera
Probabilistic Graphical Models 1: Representation
Stanford University via Coursera
Computer Vision: The Fundamentals
University of California, Berkeley via Coursera
Learning from Data (Introductory Machine Learning course)
California Institute of Technology via Independent