Neural Architecture Search Without Training - Paper Explained

Offered By: Yannic Kilcher via YouTube

Tags

Neural Architecture Search Courses, Machine Learning Courses

Course Description

Overview

Explore a training-free approach to Neural Architecture Search (NAS) that eliminates the time-consuming and resource-intensive training of large numbers of candidate models. Learn how statistics of the network's Jacobian around datapoints can be used to estimate the performance of a proposed architecture at initialization, dramatically speeding up the search. Dive into the concepts of linearization around datapoints and linearization statistics, and see how they are applied on the NAS-201 benchmark. Examine the experimental results demonstrating that this method finds powerful network architectures without any training, in seconds on a single GPU.
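The core idea described above can be illustrated with a toy sketch: compute the Jacobian of a randomly initialized network's output with respect to each input in a small batch, then measure how correlated those local linear maps are across inputs. The heuristic is that architectures whose linearizations differ more between datapoints tend to train to higher accuracy. This is a simplified stand-in for the paper's actual scoring function; the network, the finite-difference Jacobian, and the off-diagonal-correlation score below are all illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def init_mlp(sizes, rng):
    # Random He-initialized weights for a small ReLU MLP (toy architecture).
    return [(rng.standard_normal((m, n)) * np.sqrt(2.0 / n), np.zeros(m))
            for n, m in zip(sizes[:-1], sizes[1:])]

def forward(params, x):
    h = x
    for i, (W, b) in enumerate(params):
        h = W @ h + b
        if i < len(params) - 1:
            h = np.maximum(h, 0.0)  # ReLU on hidden layers only
    return h

def input_jacobian(params, x, eps=1e-4):
    # Finite-difference Jacobian of the output w.r.t. the input
    # (a stand-in for autograd; fine for a tiny sketch).
    y0 = forward(params, x)
    J = np.zeros((y0.size, x.size))
    for j in range(x.size):
        xp = x.copy()
        xp[j] += eps
        J[:, j] = (forward(params, xp) - y0) / eps
    return J

def jacobian_score(params, batch):
    # Flatten each datapoint's Jacobian and correlate across the batch.
    # Toy score: penalize high off-diagonal correlation, so less-correlated
    # linearizations (more expressive architectures) score higher.
    Js = np.stack([input_jacobian(params, x).ravel() for x in batch])
    C = np.corrcoef(Js)
    off_diag = C[~np.eye(len(batch), dtype=bool)]
    return -np.mean(np.abs(off_diag))

rng = np.random.default_rng(0)
params = init_mlp([8, 16, 16, 4], rng)       # hypothetical candidate architecture
batch = rng.standard_normal((5, 8))          # mini-batch of random "datapoints"
print(jacobian_score(params, batch))
```

In an actual search loop, this score would be computed once per candidate architecture at initialization, and the highest-scoring candidate kept, with no gradient updates anywhere in the process.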

Syllabus

- Intro & Overview
- Neural Architecture Search
- Controller-based NAS
- Architecture Search Without Training
- Linearization Around Datapoints
- Linearization Statistics
- NAS-201 Benchmark
- Experiments
- Conclusion & Comments


Taught by

Yannic Kilcher

Related Courses

- Introduction to Artificial Intelligence (Stanford University via Udacity)
- Natural Language Processing (Columbia University via Coursera)
- Probabilistic Graphical Models 1: Representation (Stanford University via Coursera)
- Computer Vision: The Fundamentals (University of California, Berkeley via Coursera)
- Learning from Data (Introductory Machine Learning course) (California Institute of Technology via Independent)