YoVDO

Inside GPT - Large Language Models Demystified

Offered By: GOTO Conferences via YouTube

Tags

GPT-2 Courses ChatGPT Courses GPT-4 Courses Word Embeddings Courses Transformer Architecture Courses

Course Description

Overview

Dive deep into the architecture and inner workings of GPT algorithms and ChatGPT in this comprehensive conference talk from GOTO Amsterdam 2024. Explore fundamental concepts of natural language processing, including word embeddings, vectorization, and tokenization. Follow along with hands-on demonstrations of training a GPT-2 model to generate song lyrics, showcasing the internals of word-sequence prediction. Examine larger language models such as ChatGPT and GPT-4, and understand their capabilities and limitations. Learn about hyperparameters such as temperature and frequency penalty, and see their effects on generated output. Gain practical insights into harnessing GPT algorithms in your own solutions through multiple demos covering ChatGPT2, Word2Vec dimensionality reduction, GPT-2 input embedding, multi-head attention, and next-token prediction. Discover how to leverage these tools to create engaging and useful applications for your business.
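The temperature hyperparameter mentioned above can be illustrated with a few lines of NumPy. This is a minimal sketch, not the talk's own code: temperature divides the model's logits before the softmax, so values below 1 sharpen the distribution toward the most likely token, while values above 1 flatten it and make sampling more diverse.

```python
import numpy as np

def softmax(logits):
    # Subtract the max for numerical stability before exponentiating.
    z = logits - np.max(logits)
    e = np.exp(z)
    return e / e.sum()

def apply_temperature(logits, temperature):
    # Temperature rescales logits before the softmax:
    # T < 1 sharpens the distribution, T > 1 flattens it.
    return softmax(np.asarray(logits, dtype=float) / temperature)

logits = [2.0, 1.0, 0.1]          # hypothetical next-token logits
cold = apply_temperature(logits, 0.5)  # peaked: top token dominates
hot = apply_temperature(logits, 2.0)   # flat: more diverse sampling
```

A frequency penalty works on the same principle, subtracting a penalty proportional to how often a token has already appeared from that token's logit before sampling.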

Syllabus

Intro
GPT sequence prediction
Prompt engineering
Demo: ChatGPT2
Processing text
Demo: Word2Vec dimensionality reduction
Transformer architecture
Demo: GPT2 input embedding
Self attention
Demo: GPT2 multi-head attention
Attention example
Demo: GPT2 next token prediction
Parameters
Thanks for explanations & inspiration
Outro
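The self-attention and multi-head attention demos in the syllabus build on scaled dot-product attention. As a rough orientation for what those demos cover, here is a minimal NumPy sketch of a single attention head (random matrices stand in for learned projections; this is an illustration, not the speaker's code):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    # Row-wise softmax turns scores into attention weights.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

rng = np.random.default_rng(0)
seq_len, d_k = 4, 8  # toy sizes; GPT-2 uses 12 heads of width 64
Q = rng.standard_normal((seq_len, d_k))
K = rng.standard_normal((seq_len, d_k))
V = rng.standard_normal((seq_len, d_k))
out = scaled_dot_product_attention(Q, K, V)  # shape (seq_len, d_k)
```

Multi-head attention runs several such heads in parallel on separately projected Q, K, and V, then concatenates their outputs.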


Taught by

GOTO Conferences

Related Courses

Artificial Intelligence Foundations: Neural Networks
LinkedIn Learning
Transformers: Text Classification for NLP Using BERT
LinkedIn Learning
TensorFlow: Working with NLP
LinkedIn Learning
Learn Natural Language Processing with BERT! - NLP Techniques Connecting Attention and Transformers to BERT -
Udemy
Complete Natural Language Processing Tutorial in Python
Keith Galli via YouTube