YoVDO

Fine-Tune GPT-3 to Write an Entire Coherent Novel - Part 2

Offered By: David Shapiro ~ AI via YouTube

Tags

GPT-3 Courses, ChatGPT Courses, Novel Writing Courses

Course Description

Overview

Explore advanced techniques for fine-tuning GPT-3 to generate entire novels in this comprehensive video tutorial. Learn about Auto Muse for generating books through successive chunks, creating book summaries, and iteratively condensing text. Discover strategies for dealing with information decay, shortening outlines, and reducing word counts using GPT-3. Delve into practical examples involving classic literature like Pride and Prejudice, The Great Gatsby, and Sherlock Holmes. Master the challenges of working within character limits and fine-tuning models for specific genres like fan fiction.

Syllabus

- Auto Muse: Generating a book through successive chunks
- Writing the next chunk of a novel
- Generating book summaries
- Summarizing the summaries
- The time decay of information
- Shortening the outlines
- Using GPT-3 to reduce the word count of a passage
- Iteratively making a text more concise
- The futility of the human condition
- Summarizing Pride and Prejudice
- Solving the case of the extra spaces
- Using GPT-3 to generate summaries
- Training GPT-3 to write a novel
- The limit of 6,000 characters for a GPT-3 prompt
- Trying to make a super concise summary
- The Great Gatsby
- The Adventures of Sherlock Holmes
- Fine-tuning a GPT-3 model for Sherlock Holmes fan fiction
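The "summarizing the summaries" and word-count-reduction steps in the syllabus amount to a recursive condensation loop: split the text into prompt-sized chunks, summarize each, then summarize the concatenated summaries until the result fits in one prompt. This is a minimal sketch of that control flow; `summarize` is a hypothetical stand-in for the GPT-3 completion call used in the video, replaced here with a naive truncation stub so the loop is runnable.

```python
def condense(text: str, summarize, max_chars: int = 6000) -> str:
    """Iteratively condense text until it fits within a single
    GPT-3 prompt (roughly 6,000 characters, per the video)."""
    while len(text) > max_chars:
        # Each chunk must itself fit in a prompt before summarizing.
        chunks = [text[i:i + max_chars] for i in range(0, len(text), max_chars)]
        # "Summarizing the summaries": join the outputs and loop again.
        text = "\n".join(summarize(chunk) for chunk in chunks)
    return text

# Stub standing in for a real "reduce the word count" GPT-3 call;
# it simply halves each chunk so the loop demonstrably terminates.
def stub_summarize(chunk: str) -> str:
    return chunk[: len(chunk) // 2]

novel = "x" * 50_000
outline = condense(novel, stub_summarize)  # now fits in one prompt
```

Note the assumption that `summarize` strictly shrinks its input; a real completion call can occasionally return text as long as the chunk, so a production version would cap the number of passes.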


Taught by

David Shapiro ~ AI

Related Courses

How to Build Codex Solutions
Microsoft via YouTube
Unlocking the Power of OpenAI for Startups - Microsoft for Startups
Microsoft via YouTube
Building Intelligent Applications with World-Class AI
Microsoft via YouTube
Stanford Seminar - Transformers in Language: The Development of GPT Models Including GPT-3
Stanford University via YouTube
ChatGPT: GPT-3, GPT-4 Turbo: Unleash the Power of LLM's
Udemy