How Far Can We Scale Up? Deep Learning's Diminishing Returns

Offered By: Yannic Kilcher via YouTube

Tags

Deep Learning Courses, CO2 Emissions Courses

Course Description

Overview

Explore the limits of exponential scaling in AI, and potential solutions, in this 20-minute video review of an article on deep learning's diminishing returns. Examine the impressive results achieved through massive increases in computational power and data, while weighing the costs of overparameterization, power usage, and CO2 emissions. Delve into current attempts to address these scaling issues, including a discussion of ImageNet V2 and the potential of symbolic methods. Gain insight into the future of AI development and the need for more efficient approaches to keep advancing the field.
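
As a rough intuition for the diminishing-returns argument (a minimal sketch, not taken from the video itself): if test error falls only as a power law in training compute, each further halving of error multiplies the compute required. The exponent `alpha` and constant `c` below are hypothetical values chosen purely for illustration.

```python
# Hypothetical power-law scaling: error = c * compute**(-alpha).
# With a small alpha, each halving of error costs ~2**(1/alpha) times
# more compute -- the "diminishing returns" the video discusses.

alpha = 0.1  # assumed scaling exponent (illustrative, not measured)
c = 1.0      # assumed constant (illustrative)

def compute_needed(error: float) -> float:
    """Compute (arbitrary units) required to reach `error`
    under the assumed law error = c * compute**(-alpha)."""
    return (c / error) ** (1.0 / alpha)

for err in (0.4, 0.2, 0.1, 0.05):
    print(f"error={err:.2f} -> compute={compute_needed(err):.3g}")
```

With these assumed numbers, every halving of error multiplies compute by 2**10 = 1024, which is the kind of unsustainable growth the article extrapolates into power usage and CO2 emissions.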

Syllabus

- Intro & Overview
- Deep Learning at its limits
- The cost of overparameterization
- Extrapolating power usage and CO2 emissions
- We cannot just continue scaling up
- Current solution attempts
- Aside: ImageNet V2
- Are symbolic methods the way out?


Taught by

Yannic Kilcher

Related Courses

Energy in buildings
The Open University via OpenLearn
Advancing Sustainability: A New Environmental Agenda for a Changing World
North Carolina School of Science and Mathematics via YouTube
HPC Green Computing - Relationships Among Throughput and Latency
Devoxx via YouTube
The Global Carbon Cycle and Human Impact - AGU Fall Meeting 2001
AGU via YouTube
Deep Crustal Metamorphic Carbon Cycling in Collisional Orogens - Daly Lecture
AGU via YouTube