
Lessons from Scale for Large Language Models and Quantitative Reasoning

Offered By: Stanford Physics via YouTube

Tags

Artificial Intelligence Courses, Machine Learning Courses, Quantitative Reasoning Courses, Computational Linguistics Courses

Course Description

Overview

Explore the latest advances in large language models and their application to quantitative reasoning in this Stanford Physics colloquium talk. Delve into the impressive capabilities of these models on natural language tasks, where they often rival or surpass human performance. Examine the robust power-law improvements in performance observed as datasets, model sizes, and computational budgets grow. Investigate why certain capabilities, particularly multi-step quantitative reasoning in mathematics and science, have proven difficult to extrapolate from smaller scales. Learn about recent progress in understanding and predicting model capabilities as they scale, with a focus on Minerva, a large language model designed for multi-step STEM problem solving. Gain insight into potential future developments and applications of these models in scientific and mathematical fields.
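As a rough illustration of the power-law scaling the talk refers to (a minimal sketch in the style of the scaling-law literature, not figures from the talk; the data points below are invented), a loss curve of the form L(N) = (N_c / N)^alpha is a straight line in log-log space, so the exponent alpha can be recovered with a simple linear fit:

import numpy as np

# Hypothetical (model size, validation loss) points, invented for illustration.
n_params = np.array([1e7, 1e8, 1e9, 1e10])
loss = np.array([4.2, 3.3, 2.6, 2.05])

# A power law L(N) = (N_c / N)**alpha is linear in log-log coordinates:
# log L = -alpha * log N + alpha * log N_c, so the slope of a linear
# fit to (log N, log L) gives -alpha.
slope, intercept = np.polyfit(np.log(n_params), np.log(loss), 1)
alpha = -slope
print(f"estimated scaling exponent alpha ~ {alpha:.3f}")

The same fit applies with dataset size or training compute in place of parameter count; part of the talk's subject is where such smooth fits succeed and where (as with multi-step reasoning) naive extrapolation breaks down.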

Syllabus

Ethan Dyer - “Lessons from scale for large language models and quantitative reasoning”


Taught by

Stanford Physics

Related Courses

Introduction to Systems Biology
Icahn School of Medicine at Mount Sinai via Coursera
Basic Science: Understanding Numbers
The Open University via FutureLearn
Precalculus
Arizona State University via edX
University Chemistry: Molecular Foundations and Global Frontiers Part 1
Harvard University via edX
Preparing for the GMAT
LinkedIn Learning