YoVDO

Self-Improving Teacher Cultivates Better Student: Distillation Calibration for Multimodal Large Language Models - Lecture 3.3

Offered By: Association for Computing Machinery (ACM) via YouTube

Tags

Machine Learning Courses
Self Improvement Courses
Computer Vision Courses
Neural Networks Courses
Transfer Learning Courses
Model Compression Courses

Course Description

Overview

Explore a cutting-edge approach to improving multimodal large language models in this 14-minute conference talk from SIGIR 2024, presented by Xinwei Li, Li Lin, Shuai Wang, and Chen Qian. Learn how calibrating the distillation process lets a self-improving teacher model transfer knowledge more effectively to a student model, enhancing the performance of multimodal AI systems. Gain insights into recent advances in multimodal large language models and their potential applications.
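The talk's specific calibration technique is not detailed in this listing, but the teacher-student setup it builds on is standard knowledge distillation. Below is a minimal, dependency-free sketch of the classic distillation loss (temperature-softened KL divergence between teacher and student outputs, scaled by T², following Hinton et al.); the function names and logit values are illustrative, not from the paper.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL(teacher || student) on softened distributions, scaled by T^2.

    Higher temperatures flatten both distributions, so the student
    also learns from the teacher's relative rankings of wrong classes.
    """
    p = softmax(teacher_logits, temperature)  # teacher "soft targets"
    q = softmax(student_logits, temperature)  # student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return (temperature ** 2) * kl

# A student that matches the teacher exactly incurs zero loss;
# a mismatched student is penalized.
matched = distillation_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1])
mismatched = distillation_loss([2.0, 1.0, 0.1], [0.1, 1.0, 2.0])
```

Distillation calibration methods like the one in the talk refine how these teacher soft targets are produced before the student is trained on them.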

Syllabus

SIGIR 2024 M3.3 [fp] Self-Imp Teacher Cultivates Better Student: Distillation Calibration For MM LLM


Taught by

Association for Computing Machinery (ACM)

Related Courses

Neural Networks for Machine Learning
University of Toronto via Coursera
Good Brain, Bad Brain: Basics
University of Birmingham via FutureLearn
Statistical Learning with R
Stanford University via edX
Machine Learning 1—Supervised Learning
Brown University via Udacity
Fundamentals of Neuroscience, Part 2: Neurons and Networks
Harvard University via edX