Natural Language Supervision for Deep Learning - Toward Language-Guided Training
Offered By: Neurosymbolic Programming for Science via YouTube
Course Description
Overview
Explore the potential of natural language supervision in machine learning through Jacob Andreas's talk at the Neurosymbolic Programming for Science conference. Delve into the contrast between traditional example-based learning and language-guided training of deep networks. Discover how human learners acquire concepts and skills from richer, language-based supervision. Examine recent successes in natural language processing and the challenges of extending language-based training to broader learning problems. Learn about recent results on using natural language to guide search and library learning in inductive program synthesis. Investigate the connections between these approaches and human concept learning. The talk covers psychology experiments, simple tasks, program synthesis, primitives, training-time regularization, and the integration of natural language into the training process.
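To give a flavor of the core idea, here is a minimal sketch (not taken from the talk) of how a natural-language task description might guide search in an enumerative program synthesizer. The primitive set, the toy word-overlap scorer, and the example task are all illustrative assumptions rather than Andreas's actual method, which uses learned language models rather than word matching.

```python
# Minimal sketch (not from the talk): using a natural-language task
# description to rank candidates in an enumerative program synthesizer.
# The primitives, glosses, and word-overlap scorer are illustrative
# assumptions, not the method presented in the talk.
from itertools import product

# Each primitive pairs an executable function with an English gloss.
PRIMITIVES = {
    "double":    (lambda x: x * 2, "multiply the number by two"),
    "increment": (lambda x: x + 1, "add one to the number"),
    "square":    (lambda x: x * x, "multiply the number by itself"),
    "negate":    (lambda x: -x,    "flip the sign of the number"),
}

def language_score(program, description):
    """Crude relevance score: word overlap between the task description
    and the glosses of the primitives the program uses."""
    desc_words = set(description.lower().split())
    gloss_words = set()
    for name in program:
        gloss_words |= set(PRIMITIVES[name][1].split())
    return len(desc_words & gloss_words)

def synthesize(examples, description, max_depth=2):
    """Enumerate primitive compositions up to max_depth, checking the
    candidates whose glosses best match the description first."""
    candidates = []
    for depth in range(1, max_depth + 1):
        candidates.extend(product(PRIMITIVES, repeat=depth))
    # Language guidance: high-scoring candidates are tried first.
    candidates.sort(key=lambda p: -language_score(p, description))
    for program in candidates:
        def run(x, program=program):
            for name in program:
                x = PRIMITIVES[name][0](x)
            return x
        if all(run(x) == y for x, y in examples):
            return program
    return None

# The description steers search toward (increment, double) before other
# depth-2 compositions are checked against the input-output examples.
print(synthesize([(1, 4), (3, 8)], "add one then multiply the number by two"))
```

The same input-output examples alone would eventually yield the answer by brute force; the language score only reorders the search so that plausible programs are tested sooner, which is the intuition behind language-guided search.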
Syllabus
Intro
How people learn
Psychology experiment
Learning from language
Simple tasks
Training
Program Synthesis
Summary
Audience Questions
Primitives
Under the Hood
Training Time Regularization
Training Time Natural Language
Taught by
Neurosymbolic Programming for Science
Related Courses
Stanford Seminar - Concepts and Questions as Programs
Stanford University via YouTube
DreamCoder - Growing Generalizable, Interpretable Knowledge With Wake-Sleep Bayesian Program Learning
Yannic Kilcher via YouTube
A Neural Network Solves and Generates Mathematics Problems by Program Synthesis - Paper Explained
Aleksa Gordić - The AI Epiphany via YouTube
EI Seminar - Recent Papers in Embodied Intelligence
Massachusetts Institute of Technology via YouTube
Using Program Synthesis to Build Compilers
Simons Institute via YouTube