Natural Language Supervision for Deep Learning - Toward Language-Guided Training
Offered By: Neurosymbolic Programming for Science via YouTube
Course Description
Overview
Explore the potential of natural language supervision in machine learning through Jacob Andreas's talk at the Neurosymbolic Programming for Science conference. Delve into the contrast between traditional example-based learning and language-guided approaches in deep networks. Discover how human learners acquire concepts and skills through richer, language-based supervision. Examine recent successes in natural language processing and the challenges of extending language-based training to broader learning problems. Learn about cutting-edge results on using natural language to guide search and library learning in inductive program synthesis. Investigate the connections between these approaches and human concept learning. The talk covers topics such as psychology experiments, simple tasks, program synthesis, primitives, training-time regularization, and the integration of natural language into machine-learning pipelines.
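To give a concrete feel for the language-guided program synthesis idea mentioned above, here is a minimal, hypothetical Python sketch: candidate programs over a tiny DSL are scored by consistency with input-output examples, plus a soft bonus when their primitives match words in a natural-language hint. The DSL, the keyword tags, and the scoring weights are illustrative assumptions for this listing, not the speaker's actual method.

```python
# Hypothetical sketch: natural language as a prior over program search.
from itertools import product

# Tiny DSL of primitives, each tagged with descriptive keywords (assumed).
PRIMITIVES = {
    "double":    (lambda x: 2 * x, {"double", "twice", "multiply"}),
    "increment": (lambda x: x + 1, {"add", "increment", "one"}),
    "square":    (lambda x: x * x, {"square", "multiply", "itself"}),
}

def run(program, x):
    """Apply a sequence of primitives to an input."""
    for name in program:
        x = PRIMITIVES[name][0](x)
    return x

def score(program, examples, hint_words):
    """Fit to examples, plus a soft bonus for matching the hint."""
    fit = sum(run(program, x) == y for x, y in examples)
    lang = sum(len(PRIMITIVES[n][1] & hint_words) for n in program)
    return fit + 0.1 * lang  # language acts as a tie-breaking prior

def synthesize(examples, hint, max_len=2):
    """Enumerate short programs and return the highest-scoring one."""
    hint_words = set(hint.lower().split())
    candidates = [p for k in range(1, max_len + 1)
                  for p in product(PRIMITIVES, repeat=k)]
    return max(candidates, key=lambda p: score(p, examples, hint_words))

# The single example (2, 5) is ambiguous: both "double then increment"
# and "square then increment" fit it. The hint breaks the tie.
print(synthesize([(2, 5)], "double the number then add one"))
# -> ('double', 'increment')
```

The design point the sketch illustrates: examples alone often underdetermine the intended program, and a language-derived prior can steer search toward the program the instruction describes without any additional examples.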
Syllabus
Intro
How people learn
Psychology experiment
Learning from language
Simple tasks
Training
Program Synthesis
Summary
Audience Questions
Primitives
Under the Hood
Training Time Regularization
Training Time Natural Language
Taught by
Neurosymbolic Programming for Science
Related Courses
AI for Scientists: Accelerating Discovery through Knowledge, Data and Learning (Neurosymbolic Programming for Science via YouTube)
Probabilistic Programming Tutorial - Part 1 (Neurosymbolic Programming for Science via YouTube)
Model-Based Reasoning in Neurosymbolic Programming for Science (Neurosymbolic Programming for Science via YouTube)
Moving Beyond the First Portrait of Our Milky Way's Black Hole by Leveraging Underlying Structure (Neurosymbolic Programming for Science via YouTube)
Neurosymbolic Program Architecture Search Methods - Session 3 (Neurosymbolic Programming for Science via YouTube)