YoVDO

Languages with Decidable Learning: A Meta-theorem

Offered By: ACM SIGPLAN via YouTube

Tags

Program Synthesis Courses Computational Learning Theory Courses Semantics Courses

Course Description

Overview

Explore a meta-theorem on decidable learning in symbolic languages, presented at OOPSLA 2023. Delve into the concept of finite-aspect checkable languages and their role in characterizing symbolic languages with decidable learning. Discover how the semantics of these languages can be defined using bounded auxiliary information whose size is independent of the expression but depends on a fixed evaluation structure. Learn about a generic programming language for evaluating expression syntax trees and its connection to finite tree automata. Examine how the meta-theorem yields new decidable learning results and decision procedures for a range of expression learning problems. Gain insights into exact learning, symbolic language learning, tree automata, version space algebra, program synthesis, and interpretable learning in this 18-minute video presentation by researchers from the University of Illinois at Urbana-Champaign.
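To make the idea of bounded auxiliary information concrete, here is a minimal, hypothetical sketch (not taken from the talk): boolean expression trees over a fixed evaluation structure, where each subtree is summarized by an "aspect" of fixed size. Because a node's aspect depends only on its operator and its children's aspects, the evaluator behaves like a run of a finite tree automaton over the syntax tree.

```python
from itertools import product

# Fixed evaluation structure (an assumption for this sketch): two variables,
# so each subtree is summarized by its truth values under all 4 valuations.
VARS = ["x", "y"]
VALUATIONS = list(product([False, True], repeat=len(VARS)))

def aspect(tree):
    """Return the tuple of truth values of `tree` under each valuation.

    `tree` is a nested tuple: ("var", name), ("not", t), ("and", l, r),
    or ("or", l, r). The result is the finite 'state' assigned to the
    subtree; its size is 2**len(VARS), independent of the tree's size.
    """
    op = tree[0]
    if op == "var":
        i = VARS.index(tree[1])
        return tuple(v[i] for v in VALUATIONS)
    if op == "not":
        return tuple(not a for a in aspect(tree[1]))
    left, right = aspect(tree[1]), aspect(tree[2])
    if op == "and":
        return tuple(a and b for a, b in zip(left, right))
    if op == "or":
        return tuple(a or b for a, b in zip(left, right))
    raise ValueError(f"unknown operator: {op}")

# Syntactically different expressions with equal aspects are semantically
# interchangeable, which is what makes finite summaries useful for learning.
e1 = ("or", ("var", "x"), ("and", ("var", "x"), ("var", "y")))  # x or (x and y)
e2 = ("var", "x")                                               # absorption: same semantics
```

Here `aspect(e1) == aspect(e2)`, so a learner can quotient the (infinite) space of expressions by a finite set of aspects — the rough intuition behind the decidability results discussed in the talk.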

Syllabus

[OOPSLA23] Languages with Decidable Learning: A Meta-theorem


Taught by

ACM SIGPLAN

Related Courses

Machine Learning 1—Supervised Learning
Brown University via Udacity
Computational Learning Theory and Beyond
openHPI
Leslie G. Valiant - Turing Award Lecture 2010
Association for Computing Machinery (ACM) via YouTube
Learning of Neural Networks with Quantum Computers and Learning of Quantum States with Graphical Models
Institute for Pure & Applied Mathematics (IPAM) via YouTube
Smoothed Analysis for Learning Concepts with Low Intrinsic Dimension
Institute for Pure & Applied Mathematics (IPAM) via YouTube