YoVDO

Innovations in Theoretical Computer Science 2020 - Session 9

Offered By: Paul G. Allen School via YouTube

Tags

Theoretical Computer Science Courses, Machine Learning Courses, Neural Networks Courses, Algorithms Courses, Data Structures Courses, Computational Complexity Courses, Probability Distributions Courses

Course Description

Overview

Explore cutting-edge research in theoretical computer science through this session from the Innovations in Theoretical Computer Science (ITCS) 2020 conference. Delve into six presentations covering diverse topics: testing properties of multiple distributions, local access to huge random objects, learning monotone probability distributions, agreement expansion, spiking neural networks, and influence maximization. Chaired by Paul Beame, the session showcases research with strong conceptual messages, introducing new models, techniques, and applications in both traditional and interdisciplinary areas. Speakers include Maryam Aliakbarpour, Sandeep Silwal, Ronitt Rubinfeld, and others presenting their recent work. Recorded on January 14, 2020, this closed-captioned video offers a comprehensive look at current advances in theoretical computer science.

Syllabus

Innovations in Theoretical Computer Science 2020 Session 9


Taught by

Paul G. Allen School

Related Courses

Neural Networks for Machine Learning
University of Toronto via Coursera
Good Brain, Bad Brain: Basics
University of Birmingham via FutureLearn
Statistical Learning with R
Stanford University via edX
Machine Learning 1—Supervised Learning
Brown University via Udacity
Fundamentals of Neuroscience, Part 2: Neurons and Networks
Harvard University via edX