Text Generation
Offered By: Codecademy
Course Description
Overview
Learn about seq2seq and LSTM neural networks commonly used in NLP work and how to implement them with TensorFlow for machine translation.
Ever wanted to build your own AI? Text generation is the process of training a computer to create language. This course introduces language generation and machine translation using long short-term memory (LSTM) networks, a type of recurrent neural network (RNN).
Syllabus
- Text Generation: Learn about seq2seq and LSTM neural networks commonly used in NLP work and how to implement them with TensorFlow for machine translation.
- Article: Long Short Term Memory Networks
- Lesson: Generating Text with Deep Learning
- External Resource: Deep Learning with Python
- Informational: Off-Platform Project: Machine Translation
- Quiz: Generating Text with Deep Learning
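To give a flavor of what the syllabus covers, below is a minimal sketch of an encoder-decoder (seq2seq) LSTM built with the TensorFlow Keras functional API, the kind of architecture used for machine translation. The vocabulary sizes and hidden dimension are illustrative assumptions, not values taken from the course materials.

```python
# Minimal seq2seq LSTM sketch (encoder-decoder) in TensorFlow Keras.
# Vocabulary sizes and latent_dim below are assumptions for illustration.
import tensorflow as tf
from tensorflow.keras.layers import Input, LSTM, Dense
from tensorflow.keras.models import Model

num_encoder_tokens = 70   # assumed source-language character vocabulary size
num_decoder_tokens = 90   # assumed target-language character vocabulary size
latent_dim = 256          # size of the LSTM hidden state

# Encoder: reads the source sequence and keeps only its final hidden/cell states.
encoder_inputs = Input(shape=(None, num_encoder_tokens))
_, state_h, state_c = LSTM(latent_dim, return_state=True)(encoder_inputs)
encoder_states = [state_h, state_c]

# Decoder: generates the target sequence, initialized with the encoder states.
decoder_inputs = Input(shape=(None, num_decoder_tokens))
decoder_lstm = LSTM(latent_dim, return_sequences=True, return_state=True)
decoder_outputs, _, _ = decoder_lstm(decoder_inputs, initial_state=encoder_states)
decoder_outputs = Dense(num_decoder_tokens, activation="softmax")(decoder_outputs)

# Training model: maps [source sequence, shifted target sequence] to the
# next-character distribution at each target position.
model = Model([encoder_inputs, decoder_inputs], decoder_outputs)
model.compile(optimizer="rmsprop", loss="categorical_crossentropy")
model.summary()
```

In practice, training such a model also requires one-hot (or embedded) input tensors and a separate inference loop that feeds the decoder its own predictions one step at a time; the course's lessons and off-platform project walk through those steps.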
Taught by
Jace van Auken
Related Courses
- Neural Networks for Machine Learning, University of Toronto via Coursera
- Machine Learning Techniques (機器學習技法), National Taiwan University via Coursera
- Machine Learning Capstone: An Intelligent Application with Deep Learning, University of Washington via Coursera
- Applied Data Analysis Problems (Прикладные задачи анализа данных), Moscow Institute of Physics and Technology via Coursera
- Leading Ambitious Teaching and Learning, Microsoft via edX