How Convolutional Neural Networks Work, in Depth
Offered By: Brandon Rohrer via YouTube
Course Description
Overview
Dive deep into the inner workings of convolutional neural networks in this comprehensive one-hour lecture. Explore the fundamental concepts, including filtering, convolution, pooling, and rectified linear units (ReLUs). Learn how ConvNets match pieces of an image and the mathematics behind the matching process. Discover the role of fully connected layers, input vectors, and individual neurons. Examine how receptive fields grow more complex in deeper layers and how an output layer completes the network. Gain insight into training techniques such as gradient descent, backpropagation, and training from scratch, and see how running examples like customer data analysis and tea-drinking temperature prediction ground these ideas.
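As a rough orientation to the operations the lecture covers, the sketch below (NumPy, not taken from the lecture; the function names, the toy image, and the edge-detecting filter are illustrative assumptions) shows filtering a small image with a kernel at every possible position (convolution), rectifying the result with a ReLU, and downsampling with max pooling.

import numpy as np

def convolve2d(image, kernel):
    # Slide the filter over every possible position ("trying every possible match")
    # and record the match score at each position (valid padding, stride 1).
    ih, iw = image.shape
    kh, kw = kernel.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            patch = image[i:i + kh, j:j + kw]
            out[i, j] = np.sum(patch * kernel)  # filtering: multiply and add up
    return out

def relu(x):
    # Rectified linear unit: keep positive responses, set negatives to zero.
    return np.maximum(x, 0.0)

def max_pool(x, size=2):
    # Shrink the feature map by keeping the strongest response in each window.
    h, w = x.shape
    h, w = h - h % size, w - w % size          # trim so windows tile evenly
    windows = x[:h, :w].reshape(h // size, size, w // size, size)
    return windows.max(axis=(1, 3))

rng = np.random.default_rng(0)
image = rng.standard_normal((8, 8))            # toy 8x8 "image" (assumed input)
kernel = np.array([[1.0, 0.0, -1.0],           # a simple vertical-edge filter
                   [1.0, 0.0, -1.0],
                   [1.0, 0.0, -1.0]])
feature_map = max_pool(relu(convolve2d(image, kernel)))
print(feature_map.shape)                       # (3, 3)

A full ConvNet stacks many such filter/ReLU/pool layers, with the filter values learned by gradient descent and backpropagation rather than chosen by hand, which is where the training portion of the lecture picks up.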
Syllabus
Intro
Trickier cases
ConvNets match pieces of the image
Filtering: The math behind the match
Convolution: Trying every possible match
Pooling
Rectified Linear Units (ReLUs)
Fully connected layer
Input vector
A neuron
Squash the result
Weighted sum-and-squash neuron
Receptive fields get more complex
Add an output layer
Exhaustive search
Gradient descent with curvature
Tea drinking temperature
Chaining
Backpropagation challenge: weights
Backpropagation challenge: sums
Backpropagation challenge: sigmoid
Backpropagation challenge: ReLU
Training from scratch
Customer data
Taught by
Brandon Rohrer
Related Courses
Practical Predictive Analytics: Models and Methods (University of Washington via Coursera)
Deep Learning Fundamentals with Keras (IBM via edX)
Introduction to Machine Learning (Duke University via Coursera)
Intro to Deep Learning with PyTorch (Facebook via Udacity)
Introduction to Machine Learning for Coders! (fast.ai via Independent)