Artificial Neural Networks
Offered By: Brilliant
Course Description
Overview
This course was written in collaboration with machine learning researchers and lecturers from MIT, Princeton, and Stanford.
This interactive course dives into the fundamentals of artificial neural networks, from the basic frameworks to more modern techniques like adversarial models.
You’ll answer questions such as how a computer can distinguish between pictures of dogs and cats, and how it can learn to play great chess.
Using inspiration from the human brain and some linear algebra, you’ll gain an intuition for why these models work – not just a collection of formulas.
This course is ideal for students and professionals seeking a fundamental understanding of neural networks, or a refresher on the basics.
Syllabus
- Learning and the Brain: To build an artificial learning algorithm, start with the human brain.
- Learning Problems for Neural Networks: Get a big-picture sense of what learning problems are all about.
- Computationally Modeling The Brain: Learn how the human brain inspires the mechanisms within ANNs.
- Computational Models of The Neuron: Build a computational model of a neuron and explore why it is so powerful.
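To make the neuron model concrete, here is a minimal Python/NumPy sketch of a single artificial neuron: a weighted sum of inputs plus a bias, passed through an activation. The sigmoid activation and all numeric values are illustrative choices, not taken from the course.

```python
import numpy as np

def sigmoid(z):
    """Squash any real number into (0, 1), a common neuron activation."""
    return 1.0 / (1.0 + np.exp(-z))

def neuron(x, w, b):
    """One artificial neuron: weighted sum of inputs plus bias, then activation."""
    return sigmoid(np.dot(w, x) + b)

# Illustrative values: three inputs with arbitrary weights and bias.
x = np.array([0.5, -1.0, 2.0])
w = np.array([0.8, 0.2, -0.4])
b = 0.1
print(neuron(x, w, b))  # a single activation value in (0, 1)
```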
- Math for Neural Networks: A refresher on vectors, matrices, and optimization.
- Vectors for Neural Networks: A quick overview of vectors, which are used to represent inputs and weights in an ANN.
- Matrices for Neural Networks: Matrices simplify how the algorithm is written, and in practice can speed up computation.
- Optimization for Neural Networks: Derivatives play a key role in optimizing model parameters, such as weights and biases.
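The three refreshers above meet in code like the following NumPy sketch (all values illustrative): a dot product computes one neuron's pre-activation, a matrix product computes a whole layer's at once, and a derivative drives a gradient-descent step.

```python
import numpy as np

# Vectors: a neuron's inputs and weights are vectors, and their dot
# product is the neuron's pre-activation.
x = np.array([1.0, 2.0])
w = np.array([0.5, -0.3])
print(w @ x)  # about -0.1

# Matrices: stacking each neuron's weight vector as a row of a matrix
# computes every pre-activation in a layer with one product.
W = np.array([[0.5, -0.3],
              [0.1,  0.8]])
print(W @ x)  # both neurons' pre-activations at once

# Derivatives: the slope of the loss says which way to nudge a
# parameter. For the toy loss L(w) = (w - 3)^2, dL/dw = 2(w - 3).
w_scalar = 0.0
for _ in range(50):
    grad = 2 * (w_scalar - 3)  # derivative of the loss
    w_scalar -= 0.1 * grad     # one gradient-descent step
print(w_scalar)  # approaches 3, the minimizer
```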
- Perceptrons: The building block of many neural networks.
- Perceptrons as Linear Classifiers: Get a sense of why the perceptron is a linear classifier, and explore its strengths and limitations.
- Perceptron Learning Algorithm: Build up the learning algorithm for the perceptron, and learn how to optimize it (a code sketch follows this module).
- Dealing with Perceptron Limitations: Dive deeper into the perceptron's limitations, and explore how to overcome some of them.
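Here is a minimal sketch of the classic perceptron learning rule (the Rosenblatt update with a step activation); the AND data, learning rate, and epoch count are illustrative. It also hints at the limitation above: the same loop never settles on XOR, which is not linearly separable.

```python
import numpy as np

def perceptron_train(X, y, epochs=10, lr=0.1):
    """Perceptron rule: nudge the weights toward each misclassified point."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            y_hat = 1 if (w @ xi + b) > 0 else 0  # step activation
            w += lr * (yi - y_hat) * xi           # zero update when correct
            b += lr * (yi - y_hat)
    return w, b

# AND is linearly separable, so the perceptron converges on it.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
w, b = perceptron_train(X, y)
print([1 if w @ xi + b > 0 else 0 for xi in X])  # [0, 0, 0, 1]
```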
- Multilayer Perceptrons: Stringing it all together.
- Basics and Motivation: Learn how to transform data so that it becomes linearly separable (illustrated in the sketch after this module).
- Practical Example: Behold the power of multilayer perceptrons, applied to a sportswear marketing problem.
- Multilayer Perceptron - Model Complexity: How can you measure how complex a model is, and avoid unnecessary complexity?
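To see why a hidden layer makes XOR tractable, consider this sketch: two hidden perceptrons remap the four XOR points so that a single output perceptron can separate them. The weights here are hand-picked for illustration; a real multilayer perceptron would learn them.

```python
import numpy as np

def step(z):
    return (z > 0).astype(float)

# XOR's four points are not linearly separable in the raw inputs...
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])

# ...but two hidden perceptrons, h1 = OR(x1, x2) and h2 = AND(x1, x2),
# remap the points into a space where one line suffices.
H = step(X @ np.array([[1.0, 1.0],
                       [1.0, 1.0]]) + np.array([-0.5, -1.5]))

# The output perceptron computes h1 AND NOT h2, which is exactly XOR.
y = step(H @ np.array([1.0, -2.0]) - 0.5)
print(y)  # [0. 1. 1. 0.]
```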
- Backpropagation: Using a model's outputs to train it to do even better.
- Gradient Descent: Master this powerful tool for optimization problems, such as minimizing loss.
- Backpropagation - Updating Parameters: Learn how we can update parameters — even those in hidden layers (see the sketch after this module)!
- Backpropagation: Gradient descent needs this tool to efficiently compute the gradient of an error function.
- Vanishing and Exploding Gradient: If you’re not careful, an activation function can squash or amplify gradients.
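The pieces of this module fit together in a short NumPy sketch: gradient descent with manual backpropagation through one hidden layer, trained on XOR. The network size, learning rate, and iteration count are illustrative choices; note the sigmoid derivative s(1 - s), at most 0.25, in the backward pass, the squashing factor behind vanishing gradients.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# A tiny 2-4-1 network trained on XOR with manual backpropagation.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)
lr = 1.0

for _ in range(5000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)    # hidden activations
    out = sigmoid(h @ W2 + b2)  # network outputs

    # Backward pass (chain rule) for a mean-squared-error loss.
    # Each sigmoid contributes a factor s(1 - s) <= 0.25, which is
    # what shrinks gradients in deep stacks of such layers.
    d_out = (out - y) * out * (1 - out)  # gradient at the output layer
    d_h = (d_out @ W2.T) * h * (1 - h)   # gradient reaching the hidden layer

    # Gradient-descent updates, including the hidden layer's parameters.
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

print(out.round(2).ravel())  # typically close to [0, 1, 1, 0]
```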
- Convolutional Neural Networks: Models to capture structural information within data.
- Convolutional Neural Networks - Overview: These networks excel in image classification problems, even achieving better-than-human performance!
- Convolutions and Striding: Explore convolutions, padding, and striding — the mathematical nuts and bolts behind feature maps (sketched in code after this module).
- Convolutional Neural Networks - Pooling: Learn how to downsample an image, while retaining enough information to recognize rich objects.
- Applications and Performance: The incredible things real CNNs can accomplish.
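A minimal NumPy sketch of the mechanics above: a strided 2D convolution (cross-correlation, as CNN libraries actually compute it) followed by max pooling. The image, kernel, and sizes are illustrative toys.

```python
import numpy as np

def conv2d(image, kernel, stride=1):
    """Slide the kernel across the image, summing elementwise products."""
    kh, kw = kernel.shape
    oh = (image.shape[0] - kh) // stride + 1
    ow = (image.shape[1] - kw) // stride + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            patch = image[i*stride:i*stride+kh, j*stride:j*stride+kw]
            out[i, j] = np.sum(patch * kernel)
    return out

def max_pool(fmap, size=2):
    """Downsample a feature map, keeping the strongest response per window."""
    oh, ow = fmap.shape[0] // size, fmap.shape[1] // size
    return fmap[:oh*size, :ow*size].reshape(oh, size, ow, size).max(axis=(1, 3))

# A 6x6 image with a vertical edge, and a kernel that responds to
# left-to-right jumps in intensity. Zero-padding would be np.pad(image, 1)
# before convolving; it is omitted here ("valid" convolution).
image = np.zeros((6, 6)); image[:, 3:] = 1.0
kernel = np.array([[-1.0, 1.0],
                   [-1.0, 1.0]])

fmap = conv2d(image, kernel)  # the feature map lights up along the edge
print(max_pool(fmap))         # the pooled map still locates the edge
```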
- Recurrent Neural Networks: Models to process sequential data by remembering what we already know.
- Recurrent Neural Networks: Context and variable-length inputs are no problem for these networks (see the sketch after this module).
- Training Recurrent Neural Networks: Learn how to backpropagate through time to train a simple recurrent neural network.
- Long Short-Term Memory: Learn how to add memory to a simple RNN to track long-term dependencies from inputs.
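A sketch of a vanilla RNN cell in NumPy: the hidden state is fed back in at every step, so the network carries context across a sequence of any length. The sizes and random weights are illustrative; training (backpropagation through time) and the LSTM's gating are beyond this snippet.

```python
import numpy as np

rng = np.random.default_rng(1)

# A vanilla RNN cell. Re-feeding the hidden state is how the network
# "remembers" earlier inputs in a sequence.
input_size, hidden_size = 3, 4
W_xh = rng.normal(scale=0.5, size=(hidden_size, input_size))
W_hh = rng.normal(scale=0.5, size=(hidden_size, hidden_size))
b_h = np.zeros(hidden_size)

def rnn_step(x, h):
    return np.tanh(W_xh @ x + W_hh @ h + b_h)

# Variable-length input is no problem: just keep stepping.
sequence = [rng.normal(size=input_size) for _ in range(5)]
h = np.zeros(hidden_size)
for x in sequence:
    h = rnn_step(x, h)  # the same weights are reused at every time step
print(h)  # the final hidden state summarizes the whole sequence
```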
- Advanced Architectures: A look into stochastic ANNs, adversarial techniques, vectorization, and other advanced topics.
- Stochastic Neural Networks: Use randomness to bring your neural networks to the next level.
- Generative Adversarial Networks: Learn how even deterministic networks can be used to generate realistic images.
- Variational Autoencoders: Use an encoder-decoder structure to build upon the success of GANs.
- Word2Vec: Neural networks aren't just for images — see how they can be used on words, too (sketched in code after this module)!
- Reinforcement Learning: Take a look at the state-of-the-art work behind AlphaGo, a Go-playing AI.
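As a taste of the Word2Vec item above, here is a minimal skip-gram sketch in NumPy with a naive softmax; real implementations use negative sampling for speed, and the toy corpus, window size, and hyperparameters are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

# Skip-gram setup: each word learns to predict its neighbors.
corpus = "the cat sat on the mat the dog sat on the rug".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}
V, D = len(vocab), 8  # vocabulary size, embedding dimension

# (center, context) training pairs within a window of 2.
pairs = [(idx[corpus[i]], idx[corpus[j]])
         for i in range(len(corpus))
         for j in range(max(0, i - 2), min(len(corpus), i + 3))
         if j != i]

W_in = rng.normal(scale=0.1, size=(V, D))   # word embeddings
W_out = rng.normal(scale=0.1, size=(D, V))  # output projection
lr = 0.05

for _ in range(200):
    for c, o in pairs:
        v = W_in[c].copy()                    # center-word embedding
        p = np.exp(v @ W_out); p /= p.sum()   # softmax over the vocabulary
        grad = p; grad[o] -= 1.0              # cross-entropy gradient w.r.t. scores
        W_in[c] -= lr * W_out @ grad          # backprop into the embedding...
        W_out -= lr * np.outer(v, grad)       # ...and into the output weights

# Words that appear in similar contexts ("cat"/"dog") drift toward
# similar embeddings, measured here with cosine similarity.
def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

print(cosine(W_in[idx["cat"]], W_in[idx["dog"]]))  # typically high
```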
Related Courses
- Deep Learning Fundamentals with Keras (IBM via edX)
- Deep Learning Essentials (Université de Montréal via edX)
- Neural Networks and Convolutional Neural Networks Essential Training (LinkedIn Learning)
- Neuronale Netze und Deep Learning (DeepLearning.AI via Coursera)
- Réseaux neuronaux et Deep Learning (DeepLearning.AI via Coursera)