Inconsistency in Conference Peer Review: Revisiting the 2014 NeurIPS Experiment
Offered By: Yannic Kilcher via YouTube
Course Description
Overview
Explore a detailed analysis of the 2014 NeurIPS peer review experiment in this informative video. Delve into the subjective nature of conference paper reviews, examining how well reviewers can predict future impact and the fate of rejected papers. Learn about the experiment's findings, including the lack of correlation between quality scores and citation counts for accepted papers, and the implications for assessing researcher quality. Gain insights into potential improvements for the reviewing process and understand the broader context of peer review in machine learning conferences.
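The correlation question mentioned above can be made concrete with a tiny, purely illustrative check. The sketch below uses synthetic data (not the experiment's actual scores or citations) to show how one might test for a rank correlation between mean review score and later citation count among accepted papers; the distributions and sample size are assumptions chosen only for illustration.
```python
# Minimal sketch on synthetic data: is there any rank correlation between
# review scores and later citation counts for accepted papers?
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(42)
n_accepted = 400

# Illustrative, assumed distributions -- not data from the 2014 experiment.
review_score = rng.normal(loc=6.5, scale=1.0, size=n_accepted)   # mean reviewer score
citations = rng.negative_binomial(n=2, p=0.05, size=n_accepted)  # heavy-tailed counts,
                                                                 # generated independently of score

rho, p_value = spearmanr(review_score, citations)
print(f"Spearman rho = {rho:.3f}, p = {p_value:.3f}")
# With independent synthetic data, rho hovers near zero -- the null pattern the
# follow-up analysis reportedly found for accepted papers.
```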
Syllabus
- Intro & Overview
- Recap: The 2014 NeurIPS Experiment
- How much of reviewing is subjective?
- Validation via simulation (see the sketch after this syllabus)
- Can reviewers predict future impact?
- Discussion & Comments
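For the "Validation via simulation" segment, a rough sketch of the underlying idea is shown below, assuming a simple noise model: each paper has a latent quality, each committee adds its own subjective noise, and both committees independently accept the top fraction of papers. The noise-to-quality ratio and acceptance rate here are illustrative assumptions, not values fitted to the 2014 data.
```python
# Minimal two-committee simulation sketch: how often would a second,
# independent committee reject a paper the first committee accepted?
import numpy as np

rng = np.random.default_rng(0)
n_papers = 10_000
accept_rate = 0.225          # assumed acceptance rate for illustration

quality = rng.normal(size=n_papers)   # latent "objective" merit
subjective_share = 0.5                # assumed share of score variance from reviewer noise
noise_scale = np.sqrt(subjective_share / (1 - subjective_share))

def committee_decisions(quality, rng):
    """Score = quality + committee-specific noise; accept the top fraction."""
    scores = quality + noise_scale * rng.normal(size=quality.shape)
    cutoff = np.quantile(scores, 1 - accept_rate)
    return scores >= cutoff

accept_a = committee_decisions(quality, rng)
accept_b = committee_decisions(quality, rng)

# Of the papers committee A accepted, what fraction would committee B reject?
disagreement = 1 - accept_b[accept_a].mean()
print(f"Rejected by B given accepted by A: {disagreement:.1%}")
```
Varying `subjective_share` shows how the disagreement rate between committees grows as the subjective component of reviewing grows, which is the comparison the video's simulation discussion turns on.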
Taught by
Yannic Kilcher
Related Courses
- Introduction to Statistics: Descriptive Statistics (University of California, Berkeley via edX)
- Analytical Chemistry / Instrumental Analysis (Rice University via Coursera)
- Estadística para investigadores: Todo lo que siempre quiso saber (Universidad de Salamanca via Miríadax)
- Valoración de futbolistas (Universitat Politècnica de València via UPV [X])
- Configuring the World, part 1: Comparative Political Economy (Leiden University via Coursera)