YoVDO

Locally-Hosted Offline LLM with LlamaIndex and OPT - Implementing Open-Source Instruction-Tuned Language Models

Offered By: Samuel Chan via YouTube

Tags

LlamaIndex Courses

Course Description

Overview

Learn how to implement an open-source Large Language Model (LLM) that runs locally on your machine, even in offline mode. Explore Meta's OPT family of models, whose largest 175-billion-parameter variant rivals GPT-3 in performance, with a focus on its instruction-tuned version, OPT-IML. Discover the process of setting up and using this offline LLM with LlamaIndex. Gain insights into the OPT architecture, its capabilities, and the advantages of instruction-tuned models. Dive into practical implementation steps, code examples, and best practices for applying this technology in your projects. Understand the implications of locally-hosted LLMs for privacy, customization, and offline accessibility. This video is part of a comprehensive series on LangChain and LLMs, offering valuable resources and references for further exploration in natural language processing and AI.
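As a minimal sketch of the idea the course covers, the snippet below loads an instruction-tuned OPT checkpoint with Hugging Face `transformers` and runs generation entirely on the local machine (once the weights are cached, no network access is needed). The checkpoint name `facebook/opt-iml-1.3b` and the prompt format are assumptions for illustration; the video may use a different OPT-IML size or wrap the model through LlamaIndex instead.

```python
def build_prompt(instruction: str) -> str:
    """Format a plain instruction for an instruction-tuned model.
    The 'Instruction:/Answer:' layout is an assumed convention, not
    a format mandated by OPT-IML."""
    return f"Instruction: {instruction}\nAnswer:"


def generate(instruction: str, max_new_tokens: int = 64) -> str:
    """Run a locally-cached OPT-IML checkpoint offline.
    transformers is imported here so the prompt helper above stays
    dependency-free."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_name = "facebook/opt-iml-1.3b"  # assumed size; a 30B variant also exists
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForCausalLM.from_pretrained(model_name)

    inputs = tokenizer(build_prompt(instruction), return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)
```

After the first download, setting `HF_HUB_OFFLINE=1` forces `transformers` to read only from the local cache, which is what makes fully offline operation possible.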

Syllabus

Locally-hosted, offline LLM w/LlamaIndex + OPT (open source, instruction-tuning LLM)


Taught by

Samuel Chan

Related Courses

Building a Queryable Journal with OpenAI, Markdown, and LlamaIndex
Samuel Chan via YouTube
Building an AI Language Tutor with Pinecone, LlamaIndex, GPT-3, and BeautifulSoup
Samuel Chan via YouTube
Understanding Embeddings in Large Language Models - LlamaIndex and Chroma DB
Samuel Chan via YouTube
A Deep Dive Into Retrieval-Augmented Generation with LlamaIndex
Linux Foundation via YouTube
Bring Cassandra to the GenAI Crowd - Meet the Low-Friction CassIO Library
Linux Foundation via YouTube