Innovations in Theoretical Computer Science 2020 - Session 9

Offered By: Paul G. Allen School via YouTube

Tags

Theoretical Computer Science Courses
Machine Learning Courses
Neural Networks Courses
Algorithms Courses
Data Structures Courses
Computational Complexity Courses
Probability Distributions Courses

Course Description

Overview

Explore cutting-edge research in theoretical computer science through this conference session from the Innovations in Theoretical Computer Science (ITCS) 2020 conference. The session comprises six presentations covering testing properties of multiple distributions, local access to huge random objects, learning monotone probability distributions, agreement expansion, spiking neural networks, and influence maximization. Chaired by Paul Beame, it showcases research with strong conceptual messages, introducing new models, techniques, and applications in both traditional and interdisciplinary areas. Speakers include Maryam Aliakbarpour, Sandeep Silwal, Ronitt Rubinfeld, and others presenting their recent work. Recorded on January 14, 2020, this closed-captioned video offers a comprehensive look at current advances in theoretical computer science.

Syllabus

Innovations in Theoretical Computer Science 2020 - Session 9


Taught by

Paul G. Allen School

Related Courses

Automata Theory
Stanford University via edX
Intro to Theoretical Computer Science
Udacity
Computing: Art, Magic, Science
ETH Zurich via edX
理论计算机科学基础 | Introduction to Theoretical Computer Science
Peking University via edX
Quantitative Formal Modeling and Worst-Case Performance Analysis
EIT Digital via Coursera