Text Generation
Offered By: Codecademy
Course Description
Overview
Learn about seq2seq and LSTM neural networks commonly used in NLP work and how to implement them with TensorFlow for machine translation.
Ever wanted to build your own AI? Text generation is the process of training a computer to create language. This course introduces language generation and machine translation using long short-term memory (LSTM) networks, a type of recurrent neural network (RNN).
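As a rough illustration only (not course material), a minimal LSTM language model for text generation in TensorFlow/Keras might look like the sketch below; the vocabulary size, sequence length, layer widths, and toy random data are all placeholder assumptions.

```python
# Minimal sketch (assumptions: vocab_size, seq_length, layer widths, toy data)
# of an LSTM language model that predicts the next token in a sequence.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

vocab_size = 50   # assumed number of distinct tokens
seq_length = 40   # assumed context window

model = tf.keras.Sequential([
    layers.Embedding(vocab_size, 64),                # token ids -> dense vectors
    layers.LSTM(128),                                # LSTM summarizes the context
    layers.Dense(vocab_size, activation="softmax"),  # distribution over next token
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# Toy integer-encoded sequences stand in for real, tokenized text.
x = np.random.randint(0, vocab_size, size=(256, seq_length))
y = np.random.randint(0, vocab_size, size=(256,))
model.fit(x, y, epochs=1, verbose=0)

# Generation step: sample the next token from the predicted distribution.
probs = model.predict(x[:1], verbose=0)[0].astype("float64")
probs /= probs.sum()  # renormalize before sampling
next_token = np.random.choice(vocab_size, p=probs)
```

In practice the sampled token is appended to the context and the loop repeats, which is how an LSTM "generates" text one token at a time.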
Syllabus
- Text Generation: Learn about seq2seq and LSTM neural networks commonly used in NLP work and how to implement them with TensorFlow for machine translation (a minimal seq2seq sketch follows this syllabus).
- Article: Long Short-Term Memory Networks
- Lesson: Generating Text with Deep Learning
- ExternalResource: Deep Learning with Python
- Informational: Off-Platform Project: Machine Translation
- Quiz: Generating Text with Deep Learning
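To give a sense of what the seq2seq machine-translation work involves, here is a minimal, hypothetical encoder-decoder sketch in TensorFlow/Keras; the vocabulary sizes, embedding width, latent dimension, and random toy batches are assumptions, not the course's actual project code.

```python
# Minimal sketch (assumed sizes and toy data) of a seq2seq LSTM
# encoder-decoder for translation: the encoder's final state conditions
# the decoder that emits the target sentence.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

src_vocab, tgt_vocab, latent_dim = 100, 120, 256  # assumed sizes

# Encoder: read the source sentence, keep only the final LSTM state.
encoder_inputs = layers.Input(shape=(None,))
enc_emb = layers.Embedding(src_vocab, 64)(encoder_inputs)
_, state_h, state_c = layers.LSTM(latent_dim, return_state=True)(enc_emb)

# Decoder: generate the target sentence conditioned on the encoder state.
decoder_inputs = layers.Input(shape=(None,))
dec_emb = layers.Embedding(tgt_vocab, 64)(decoder_inputs)
dec_out, _, _ = layers.LSTM(latent_dim, return_sequences=True,
                            return_state=True)(dec_emb,
                                               initial_state=[state_h, state_c])
decoder_outputs = layers.Dense(tgt_vocab, activation="softmax")(dec_out)

model = tf.keras.Model([encoder_inputs, decoder_inputs], decoder_outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# Toy integer-encoded batches stand in for tokenized sentence pairs.
src = np.random.randint(1, src_vocab, size=(64, 12))
tgt_in = np.random.randint(1, tgt_vocab, size=(64, 14))   # decoder input (shifted)
tgt_out = np.random.randint(1, tgt_vocab, size=(64, 14))  # decoder targets
model.fit([src, tgt_in], tgt_out, epochs=1, verbose=0)
```

A real translation model would train on tokenized sentence pairs with start/end tokens and decode step by step at inference time, feeding each predicted token back into the decoder.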
Taught by
Jace van Auken
Related Courses
- Introduction to Artificial Intelligence (Stanford University via Udacity)
- Probabilistic Graphical Models 1: Representation (Stanford University via Coursera)
- Artificial Intelligence for Robotics (Stanford University via Udacity)
- Computer Vision: The Fundamentals (University of California, Berkeley via Coursera)
- Learning from Data (Introductory Machine Learning course) (California Institute of Technology via Independent)