Why Detecting Distributional Shift is So Hard and So Important
Offered By: Snorkel AI via YouTube
Course Description
Overview
Explore the challenges and significance of detecting distributional shift in data in this 27-minute conference talk by Sharon Li, assistant professor at the University of Wisconsin-Madison, presented at Snorkel AI's The Future of Data-Centric AI Summit in 2022. Delve into why identifying changes in data distributions is both crucial and difficult in artificial intelligence, and gain insight into opportunities and advances for addressing the problem. Related playlists on data distribution shift, Snorkel AI, and data-centric AI are provided to further expand your knowledge.
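As a point of orientation before watching the talk, the sketch below shows one very simple way a distributional shift might be flagged in practice: comparing a reference sample of a model input feature against a new production batch with a two-sample Kolmogorov-Smirnov test. This example is illustrative only and is not taken from the talk; the data, thresholds, and feature choice are assumptions, and it relies on NumPy and SciPy.

```python
# Illustrative sketch (not from the talk): flag a possible distributional
# shift by comparing a reference feature sample against a new batch with a
# two-sample Kolmogorov-Smirnov test.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)

# Hypothetical data: training-time feature values vs. a drifted production batch.
reference = rng.normal(loc=0.0, scale=1.0, size=5_000)
production = rng.normal(loc=0.4, scale=1.2, size=1_000)  # mean/variance drift

statistic, p_value = ks_2samp(reference, production)
if p_value < 0.01:  # assumed significance threshold
    print(f"Possible distributional shift (KS={statistic:.3f}, p={p_value:.4f})")
else:
    print(f"No significant shift detected (KS={statistic:.3f}, p={p_value:.4f})")
```

Simple univariate tests like this only scratch the surface; the talk addresses why detecting shift reliably, especially for high-dimensional inputs to deep models, is much harder than this kind of check suggests.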
Syllabus
Why Detecting Distributional Shift is So Hard (And So Important)
Taught by
Snorkel AI
Related Courses
Data Science: Inferential Thinking through Simulations (University of California, Berkeley via edX)
Decision Making Under Uncertainty: Introduction to Structured Expert Judgment (Delft University of Technology via edX)
Probabilistic Deep Learning with TensorFlow 2 (Imperial College London via Coursera)
Agent Based Modeling (The National Centre for Research Methods via YouTube)
Sampling in Python (DataCamp)