The Lottery Ticket Hypothesis for Gigantic Pre-Trained Models
Offered By: VinAI via YouTube
Course Description
Overview
Explore the lottery ticket hypothesis for large pre-trained models in this seminar presented by Professor Atlas Wang from UT Austin. Delve into the fascinating world of machine learning, computer vision, and optimization as Wang discusses his research on finding smaller, trainable subnetworks within enormous pre-trained models. Learn about the application of this hypothesis to both NLP and computer vision domains, with specific examples from BERT and ImageNet pre-trained models. Discover how these subnetworks can achieve high levels of sparsity while maintaining full accuracy and transferability to downstream tasks. Gain insights into the implications of this research for the future of deep learning and large-scale pre-training paradigms. Access additional resources and project details through the provided webpage link.
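The subnetworks described above are typically found with iterative magnitude pruning and weight rewinding, the standard procedure in the lottery ticket literature. Below is a minimal, illustrative PyTorch-style sketch of that idea; the function names, the per-layer sparsity level, and the rewinding helper are assumptions for illustration, not the exact code from the research discussed in the seminar.

import torch

def magnitude_prune(model, sparsity):
    # Zero out the smallest-magnitude weights in each weight matrix,
    # recording a binary mask per layer (hypothetical helper).
    masks = {}
    for name, param in model.named_parameters():
        if param.dim() < 2:                    # skip biases / norm parameters
            continue
        k = int(param.numel() * sparsity)      # number of weights to remove
        if k == 0:
            continue
        threshold = param.detach().abs().flatten().kthvalue(k).values
        mask = (param.detach().abs() > threshold).float()
        param.data.mul_(mask)                  # apply the mask in place
        masks[name] = mask
    return masks

def rewind_to_pretrained(model, pretrained_state, masks):
    # Rewind surviving weights to their original pre-trained values,
    # leaving pruned positions at zero: the candidate "winning ticket".
    model.load_state_dict(pretrained_state)
    for name, param in model.named_parameters():
        if name in masks:
            param.data.mul_(masks[name])

In practice the prune-and-rewind cycle is interleaved with training and repeated until a target sparsity is reached; the resulting sparse subnetwork is then fine-tuned on a downstream task to check that it matches the accuracy of the full model.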
Syllabus
Seminar Series: The lottery ticket hypothesis for gigantic pre-trained models
Taught by
VinAI
Related Courses
Structuring Machine Learning Projects - DeepLearning.AI via Coursera
Natural Language Processing on Google Cloud - Google Cloud via Coursera
Introduction to Learning Transfer and Life Long Learning (3L) - University of California, Irvine via Coursera
Advanced Deployment Scenarios with TensorFlow - DeepLearning.AI via Coursera
Neural Style Transfer with TensorFlow - Coursera Project Network via Coursera