Designing Losses for Data-Free Training of Normalizing Flows on Boltzmann Distributions
Offered By: Valence Labs via YouTube
Course Description
Overview
Explore a comprehensive lecture on designing losses for data-free training of normalizing flows on Boltzmann distributions. Delve into the challenges of sampling high-dimensional Boltzmann distributions with normalizing flows, and discover strategies for training models when data are incomplete or absent. Learn about the limitations of standard losses based on Kullback-Leibler divergences, in particular their tendency toward mode collapse on high-dimensional distributions. Examine a new loss function that is grounded in theory and tailored to high-dimensional tasks. See these ideas applied to 3D molecular configuration generation, where imperfect pre-trained models are improved without any training data. Gain insights from speakers Jérôme Hénin and Guillaume Charpiat as they cover Boltzmann generators, the collapse problem, experimental results on molecules, L2 losses, and L2+ on dialanine. Conclude with a summary of contributions and a Q&A session to deepen your understanding of this cutting-edge research in AI for drug discovery.
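For context, the standard data-free objective for Boltzmann generators is the reverse (energy-based) Kullback-Leibler divergence: it needs only the energy function, not samples, which is what makes training data-free, but it is exactly the loss the lecture identifies as prone to mode collapse in high dimensions. Below is a minimal sketch of that loss, assuming a hypothetical toy double-well energy and a single affine flow layer in PyTorch; it is an illustration of the standard objective, not the speakers' code or their proposed loss.

```python
# Minimal sketch of the reverse-KL / energy-based loss used to train
# normalizing flows on a Boltzmann distribution p(x) ∝ exp(-u(x)).
# Toy energy and single affine layer are assumptions for illustration.
import torch

def energy(x):
    # Hypothetical 2D double-well energy u(x); stands in for a molecular force field.
    return (x[:, 0] ** 2 - 1.0) ** 2 + 0.5 * x[:, 1] ** 2

# One diagonal affine flow layer: x = z * exp(log_scale) + shift
log_scale = torch.zeros(2, requires_grad=True)
shift = torch.zeros(2, requires_grad=True)
opt = torch.optim.Adam([log_scale, shift], lr=1e-2)

for step in range(1000):
    z = torch.randn(256, 2)            # base samples z ~ N(0, I)
    x = z * log_scale.exp() + shift    # push forward through the flow
    log_det = log_scale.sum()          # log|det Jacobian| of the affine map
    # Reverse KL up to an additive constant: E_z[u(f(z))] - E_z[log|det J_f(z)|]
    loss = energy(x).mean() - log_det
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Because the expectation is taken over the model's own samples, the loss can be made small by covering only one mode of the target, which is the collapse problem the lecture addresses.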
Syllabus
- Intro
- Boltzmann generators
- The collapse problem + experimental results on molecules
- L2 losses
- L2+ on dialanine
- Summary of contributions
- Q+A
Taught by
Valence Labs
Related Courses
- Visual Recognition & Understanding (University at Buffalo via Coursera)
- Deep Learning for Computer Vision (IIT Hyderabad via Swayam)
- Deep Learning in Life Sciences - Spring 2021 (Massachusetts Institute of Technology via YouTube)
- Advanced Deep Learning Methods for Healthcare (University of Illinois at Urbana-Champaign via Coursera)
- Generative Models (Serrano.Academy via YouTube)