YoVDO

Languages with Decidable Learning: A Meta-theorem

Offered By: ACM SIGPLAN via YouTube

Tags

Program Synthesis Courses
Computational Learning Theory Courses
Semantics Courses

Course Description

Overview

Explore a meta-theorem on decidable learning in symbolic languages, presented at OOPSLA 2023. Delve into the concept of finite-aspect checkable languages and their role in characterizing symbolic languages with decidable learning. Discover how the semantics of these languages can be defined using bounded auxiliary information that is independent of expression size but depends on a fixed evaluation structure. Learn about a generic programming language for evaluating expression syntax trees and its connection to finite tree automata. Examine how the meta-theorem yields new decidable learning results and decision procedures for a range of expression learning problems. Gain insights into exact learning, symbolic language learning, tree automata, version space algebra, program synthesis, and interpretable learning through this 18-minute video presentation by researchers from the University of Illinois at Urbana-Champaign.
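To make the "bounded auxiliary information" idea concrete, here is a minimal, hypothetical sketch (not the paper's formalism): a bottom-up evaluator over expression syntax trees in which each subtree is summarized by a single bounded value (here, its parity), the way a finite tree automaton assigns one of finitely many states to each node. The `Leaf`/`Plus` types and the `aspect` function are illustrative names invented for this example.

```python
# Illustrative sketch: summarize each subtree of an expression by a
# bounded "aspect" (here, the value mod 2). The aspect of a node is
# computed only from the aspects of its children, never from subtree
# size -- mirroring how a finite tree automaton labels nodes with
# finitely many states.

from dataclasses import dataclass
from typing import Union

@dataclass
class Leaf:
    bit: int            # a constant, 0 or 1

@dataclass
class Plus:
    left: "Expr"        # forward reference to the Expr union below
    right: "Expr"

Expr = Union[Leaf, Plus]

def aspect(e: Expr) -> int:
    """Return the bounded summary (parity) of an expression tree."""
    if isinstance(e, Leaf):
        return e.bit % 2
    # The parent's aspect depends only on the children's aspects.
    return (aspect(e.left) + aspect(e.right)) % 2

# (1 + 1) + 1 evaluates to 3, whose parity is 1.
tree = Plus(Plus(Leaf(1), Leaf(1)), Leaf(1))
print(aspect(tree))  # 1
```

Because the summary comes from a finite set and is computed compositionally, questions about all expressions with a given aspect reduce to questions about finite tree automata, which is the kind of reduction the talk's decidability results rely on.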

Syllabus

[OOPSLA23] Languages with Decidable Learning: A Meta-theorem


Taught by

ACM SIGPLAN

Related Courses

Stanford Seminar - Concepts and Questions as Programs
Stanford University via YouTube
DreamCoder - Growing Generalizable, Interpretable Knowledge With Wake-Sleep Bayesian Program Learning
Yannic Kilcher via YouTube
A Neural Network Solves and Generates Mathematics Problems by Program Synthesis - Paper Explained
Aleksa Gordić - The AI Epiphany via YouTube
EI Seminar - Recent Papers in Embodied Intelligence
Massachusetts Institute of Technology via YouTube
Using Program Synthesis to Build Compilers
Simons Institute via YouTube