Natural Language Generation in Python
Offered By: DataCamp
Course Description
Overview
Imitate Shakespeare, translate languages, and autocomplete sentences using Deep Learning in Python.
Have you ever wondered how Gmail autocompletes your sentences, or what powers the WhatsApp suggestions when you’re typing a message? The technology behind these helpful writing hints is machine learning. In this course, you'll build and train machine learning models for different natural language generation tasks. For example, you'll train a model on the literary works of Shakespeare and generate text in the style of his writing. You'll also learn how to create a neural translation model to translate English sentences into French. Finally, you'll train a seq2seq model to generate your own natural language autocomplete sentences, just like Gmail!
Syllabus
Introduction to Sequential Data
-The order of words in sentences is important (unless Yoda you are called). That’s why in this chapter, you’ll learn how to represent your data sequentially and use neural network architectures to model your text data. You'll learn how to create and train a recurrent neural network to generate new text, character by character, as sketched below. You'll also use the names dataset to build your own baby name generator, using a very simple recurrent neural network and the Keras package.
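To make the idea concrete, here is a minimal sketch (not the course's exact code) of a character-level recurrent model in Keras; the vocabulary size and sequence length are illustrative assumptions.

```python
# A minimal character-level recurrent model in Keras. Assumes names have
# been mapped to integer index sequences and padded to a fixed length;
# vocab_size and seq_length are illustrative placeholders.
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, Embedding, SimpleRNN, Dense

vocab_size = 30   # e.g. 26 letters plus padding/start/end tokens (assumed)
seq_length = 10   # maximum name length after padding (assumed)

model = Sequential([
    Input(shape=(seq_length,)),
    Embedding(vocab_size, 16),               # learn a vector per character
    SimpleRNN(64),                           # plain recurrent layer
    Dense(vocab_size, activation="softmax")  # distribution over the next character
])
model.compile(loss="sparse_categorical_crossentropy", optimizer="adam")
# Train with X = index sequences and y = next-character indices, then
# sample from the softmax one character at a time to spell out new names.
```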
Write Like Shakespeare
-In this chapter, you’ll find out how to overcome the limitations of recurrent neural networks when input sequences span long intervals. To avoid the vanishing and exploding gradient problems, you'll be introduced to long short-term memory (LSTM) networks, which are more effective when working with long-term dependencies. You'll work on a fun project where you build and train a simple LSTM model on selected literary works of Shakespeare to generate new text in his unique writing style.
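A hedged sketch of what such a generator can look like in Keras, assuming the corpus has been sliced into fixed-length windows of one-hot encoded characters (the sizes below are assumptions, not the course's actual values):

```python
# Character-level LSTM text generator: the gated LSTM cells replace the
# plain recurrent layer so gradients survive across longer contexts.
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, LSTM, Dense

vocab_size = 50   # distinct characters in the corpus (assumed)
seq_length = 40   # characters of context per training example (assumed)

model = Sequential([
    Input(shape=(seq_length, vocab_size)),
    LSTM(128),                               # gated cells carry long-range context
    Dense(vocab_size, activation="softmax")  # probability of the next character
])
model.compile(loss="categorical_crossentropy", optimizer="adam")
# Generation: seed with a 40-character excerpt, predict one character,
# append it, slide the window forward, and repeat.
```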
Translate Words into a Different Language
-In this chapter, you'll learn about the encoder-decoder architecture and how it can be used to model sequence-to-sequence datasets, converting information from one domain to another. You'll use this knowledge to build a model for neural machine translation, training your model to translate English sentences into French.
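A rough sketch of the encoder-decoder pattern in Keras's functional API, in the style of the classic Keras seq2seq example; the vocabulary sizes and latent dimension are assumed, not taken from the course:

```python
from tensorflow.keras.models import Model
from tensorflow.keras.layers import Input, LSTM, Dense

src_vocab, tgt_vocab, latent_dim = 5000, 6000, 256  # illustrative sizes

# Encoder: read the English sentence; keep only its final hidden state.
encoder_inputs = Input(shape=(None, src_vocab))
_, state_h, state_c = LSTM(latent_dim, return_state=True)(encoder_inputs)

# Decoder: produce the French sentence, starting from the encoder's state.
decoder_inputs = Input(shape=(None, tgt_vocab))
decoder_seq = LSTM(latent_dim, return_sequences=True)(
    decoder_inputs, initial_state=[state_h, state_c])
decoder_outputs = Dense(tgt_vocab, activation="softmax")(decoder_seq)

model = Model([encoder_inputs, decoder_inputs], decoder_outputs)
model.compile(loss="categorical_crossentropy", optimizer="adam")
# Trained with teacher forcing: the decoder sees the target sentence
# shifted by one step and learns to predict the next word.
```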
Autocomplete Your Sentences
-In this chapter, you'll build your very own machine learning seq2seq model. You'll use real-world messages from the Enron email dataset to train an encoder-decoder model, then use it to predict the correct ending for an incomplete input sentence.
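At inference time, the trained network completes a sentence one token at a time. Below is a hedged sketch of greedy decoding, assuming separate inference-time encoder and decoder models have been split out of the trained seq2seq network (all names here are hypothetical):

```python
import numpy as np

def complete_sentence(prefix_seq, encoder_model, decoder_model,
                      start_idx, end_idx, vocab_size, max_len=20):
    """Greedily predict the ending of an encoded sentence prefix."""
    state_h, state_c = encoder_model.predict(prefix_seq)  # summarize the prefix
    target = np.zeros((1, 1, vocab_size))
    target[0, 0, start_idx] = 1.0                         # start-of-sequence token
    ending = []
    for _ in range(max_len):
        probs, state_h, state_c = decoder_model.predict(
            [target, state_h, state_c])
        idx = int(np.argmax(probs[0, -1]))                # most likely next token
        if idx == end_idx:                                # stop at the end token
            break
        ending.append(idx)
        target = np.zeros((1, 1, vocab_size))             # feed prediction back in
        target[0, 0, idx] = 1.0
    return ending
```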
Taught by
Biswanath Halder
Related Courses
Machine Translation with Keras (DataCamp)