
Why You Can't Use Llama 2 (70B) Yet - Hardware Requirements and Democratization Efforts

Offered By: Data Centric via YouTube

Tags

Distributed Computing Courses Fine-Tuning Courses

Course Description

Overview

Explore the hardware limitations and challenges of running today's largest large language models, focusing on the 70-billion-parameter Llama 2. Examine the specific hardware requirements needed to run a model of this size, and briefly look at an open-source project aimed at democratizing access to these advanced AI systems. Gain insight into the complexities of deploying cutting-edge language models and the current barriers to widespread adoption of the most capable AI technologies.
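To give a sense of why a 70-billion-parameter model is out of reach for most consumer hardware, the rough sketch below estimates how much GPU memory the weights alone would occupy at common precisions. The figures are illustrative back-of-the-envelope numbers (weights only, ignoring activations, KV cache, and framework overhead), not values taken from the video.

```python
# Rough estimate of the memory needed just to hold a ~70B-parameter
# model's weights in GPU memory, at several common precisions.
# Illustrative only: real deployments also need room for activations,
# the KV cache, and runtime overhead.

NUM_PARAMS = 70_000_000_000  # ~70 billion parameters

BYTES_PER_PARAM = {
    "fp32": 4,        # full precision
    "fp16/bf16": 2,   # half precision, the usual inference format
    "int8": 1,        # 8-bit quantization
    "int4": 0.5,      # 4-bit quantization
}

GIB = 1024 ** 3

for precision, bytes_per_param in BYTES_PER_PARAM.items():
    weights_gib = NUM_PARAMS * bytes_per_param / GIB
    print(f"{precision:>9}: ~{weights_gib:,.0f} GiB for weights alone")

# Approximate output:
#      fp32: ~261 GiB for weights alone
# fp16/bf16: ~130 GiB for weights alone
#      int8: ~65 GiB for weights alone
#      int4: ~33 GiB for weights alone
```

Even the 4-bit-quantized figure exceeds the memory of a single consumer GPU, which is why running the 70B model typically requires multiple high-end accelerators or distributed approaches like the open-source project mentioned in the video.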

Syllabus

Why You Can't Use Llama 2 (70B) Yet


Taught by

Data Centric

Related Courses

TensorFlow: Working with NLP
LinkedIn Learning
Introduction to Video Editing - Video Editing Tutorials
Great Learning via YouTube
HuggingFace Crash Course - Sentiment Analysis, Model Hub, Fine Tuning
Python Engineer via YouTube
GPT3 and Finetuning the Core Objective Functions - A Deep Dive
David Shapiro ~ AI via YouTube
How to Build a Q&A AI in Python - Open-Domain Question-Answering
James Briggs via YouTube