
A Retrieval-based Language Model at Scale - Remote Talk

Offered By: Simons Institute via YouTube

Tags

Scaling Courses
Transformers Courses

Course Description

Overview

Explore a remote talk on retrieval-based language models at scale, presented by Sewon Min of UC Berkeley and AI2. Delve into the advantages of retrieval-based LMs as an alternative to dense models, focusing on their ability to combine learned parameters with large datastores. Discover two recent works aimed at improving these models in the context of Large Language Models (LLMs). Learn about a novel pre-training approach that teaches LMs to condition on retrieved documents, and examine the scaling properties of retrieval-based LMs using a massive 1.4-trillion-token datastore. Investigate the potential for compute-optimal setups across various downstream tasks, and consider open questions regarding the impact of retrieval on training data, the handling of data restrictions, and the possibilities for modular LMs. This 39-minute presentation, part of the "Transformers as a Computational Model" series at the Simons Institute, offers valuable insights into cutting-edge developments in language model technology.
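
To make the "learned parameters plus a datastore" idea concrete, here is a minimal, self-contained Python sketch of retrieval-augmented prompting. It is an illustration only: the toy bag-of-words retriever and the names `Datastore` and `retrieval_augmented_prompt` are assumptions for this sketch, not the pre-training method or the retriever presented in the talk.

```python
# Sketch: retrieve the documents most similar to a query from a small
# datastore, then condition a language model on them by prepending the
# retrieved text to the prompt.
from collections import Counter
import math


class Datastore:
    """Toy datastore: documents indexed by bag-of-words vectors."""

    def __init__(self, docs):
        self.docs = docs
        self.vecs = [Counter(d.lower().split()) for d in docs]

    @staticmethod
    def _cosine(a, b):
        # Cosine similarity between two sparse word-count vectors.
        shared = set(a) & set(b)
        dot = sum(a[w] * b[w] for w in shared)
        na = math.sqrt(sum(v * v for v in a.values()))
        nb = math.sqrt(sum(v * v for v in b.values()))
        return dot / (na * nb) if na and nb else 0.0

    def retrieve(self, query, k=2):
        # Score every document against the query and return the top k.
        q = Counter(query.lower().split())
        scored = sorted(
            ((self._cosine(q, v), d) for v, d in zip(self.vecs, self.docs)),
            reverse=True,
        )
        return [d for _, d in scored[:k]]


def retrieval_augmented_prompt(store, query, k=2):
    """Condition the LM on retrieved documents by prepending them."""
    context = "\n".join(store.retrieve(query, k))
    return f"{context}\n\nQuestion: {query}\nAnswer:"


store = Datastore([
    "Retrieval-based LMs pair a parametric model with a datastore.",
    "Dense LMs store all knowledge in their parameters.",
    "A datastore can hold trillions of tokens of raw text.",
])
print(retrieval_augmented_prompt(store, "What do retrieval-based LMs pair with a datastore?"))
```

In a real system the bag-of-words scorer would be replaced by a learned dense retriever over a large index, and, as the talk discusses, the LM itself can be pre-trained to make better use of the retrieved context.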

Syllabus

A Retrieval-based Language Model at Scale (Remote Talk)


Taught by

Simons Institute
