Text Generation
Offered By: Codecademy
Course Description
Overview
Learn about seq2seq and LSTM neural networks, which are commonly used in NLP, and how to implement them with TensorFlow for machine translation.
Ever wanted to build your own AI? Text generation is the process of training a computer to create language. This course introduces language generation and machine translation using long short-term memory (LSTM) networks, a type of recurrent neural network (RNN).
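The sketch below illustrates the kind of encoder-decoder (seq2seq) LSTM model described above, built with TensorFlow's Keras API. It is a minimal illustration only, not code from the course: the vocabulary sizes and hidden dimension are placeholder assumptions.

```python
# Minimal seq2seq (encoder-decoder) LSTM sketch in TensorFlow/Keras.
# Vocabulary sizes and latent_dim are illustrative placeholders.
import tensorflow as tf
from tensorflow.keras import layers, Model

num_encoder_tokens = 70   # assumed source-language character vocabulary size
num_decoder_tokens = 90   # assumed target-language character vocabulary size
latent_dim = 256          # assumed size of the LSTM hidden state

# Encoder: reads the source sequence and keeps only its final hidden/cell states.
encoder_inputs = layers.Input(shape=(None, num_encoder_tokens))
_, state_h, state_c = layers.LSTM(latent_dim, return_state=True)(encoder_inputs)
encoder_states = [state_h, state_c]

# Decoder: generates the target sequence, initialized with the encoder's states.
decoder_inputs = layers.Input(shape=(None, num_decoder_tokens))
decoder_lstm = layers.LSTM(latent_dim, return_sequences=True, return_state=True)
decoder_outputs, _, _ = decoder_lstm(decoder_inputs, initial_state=encoder_states)
decoder_outputs = layers.Dense(num_decoder_tokens, activation="softmax")(decoder_outputs)

# Training model: maps (source sequence, shifted target sequence) -> next-token probabilities.
model = Model([encoder_inputs, decoder_inputs], decoder_outputs)
model.compile(optimizer="rmsprop", loss="categorical_crossentropy")
model.summary()
```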
Syllabus
- Text Generation: Learn about seq2seq and LSTM neural networks, which are commonly used in NLP, and how to implement them with TensorFlow for machine translation.
- Article: Long Short Term Memory Networks
- Lesson: Generating Text with Deep Learning
- ExternalResource: Deep Learning with Python
- Informational: Off-Platform Project: Machine Translation
- Quiz: Generating Text with Deep Learning
Taught by
Jace van Auken
Related Courses
- Natural Language Processing (Columbia University via Coursera)
- Natural Language Processing (Stanford University via Coursera)
- Introduction to Natural Language Processing (University of Michigan via Coursera)
- moocTLH: Nuevos retos en las tecnologías del lenguaje humano (Universidad de Alicante via Miríadax)
- Natural Language Processing (Indian Institute of Technology, Kharagpur via Swayam)