
Why You Can't Use Llama 2 (70B) Yet - Hardware Requirements and Democratization Efforts

Offered By: Data Centric via YouTube

Tags

Distributed Computing Courses, Fine-Tuning Courses

Course Description

Overview

Explore the hardware limitations and challenges involved in running the largest large language models, focusing on the 70-billion-parameter variant of Llama 2. Learn about the specific hardware required to run a model of this size, and get a brief look at an open-source project that aims to democratize access to these advanced AI systems. Gain insight into the complexities of deploying cutting-edge language models and the current barriers to widespread adoption of the most sophisticated AI technologies.
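
To give a rough sense of scale for the hardware discussion, the sketch below estimates how much memory the weights of a 70-billion-parameter model occupy at common precisions. The 80 GB figure is only an illustrative reference for a single high-end accelerator, and these numbers cover weights alone; activations and the KV cache add further overhead on top.

# Back-of-envelope memory estimate for a 70B-parameter model's weights
# at different precisions (weights only; runtime overhead not included).
PARAMS = 70e9  # 70 billion parameters

BYTES_PER_PARAM = {
    "fp32": 4,
    "fp16/bf16": 2,
    "int8": 1,
    "int4": 0.5,
}

GPU_MEMORY_GB = 80  # illustrative single-GPU capacity (e.g. an 80 GB card)

for precision, nbytes in BYTES_PER_PARAM.items():
    total_gb = PARAMS * nbytes / 1e9
    gpus_needed = -(-total_gb // GPU_MEMORY_GB)  # ceiling division
    print(f"{precision:>10}: ~{total_gb:,.0f} GB of weights "
          f"(~{gpus_needed:.0f} x {GPU_MEMORY_GB} GB GPUs, weights only)")

At 16-bit precision the weights alone come to roughly 140 GB, which already exceeds any single consumer or workstation GPU; this is the core reason distributed or heavily quantized approaches are needed to run the 70B model outside large data centers.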

Syllabus

Why You Can't Use Llama 2 (70B) Yet


Taught by

Data Centric

Related Courses

Cloud Computing Concepts, Part 1
University of Illinois at Urbana-Champaign via Coursera
Cloud Computing Concepts: Part 2
University of Illinois at Urbana-Champaign via Coursera
Reliable Distributed Algorithms - Part 1
KTH Royal Institute of Technology via edX
Introduction to Apache Spark and AWS
University of London International Programmes via Coursera
Réalisez des calculs distribués sur des données massives
CentraleSupélec via OpenClassrooms