How Convolutional Neural Networks Work, in Depth
Offered By: Brandon Rohrer via YouTube
Course Description
Overview
Dive deep into the inner workings of convolutional neural networks in this comprehensive one-hour lecture. Explore the fundamental concepts, including filtering, convolution, pooling, and rectified linear units (ReLUs). Learn how ConvNets match pieces of images and understand the mathematics behind the matching process. Discover the role of fully connected layers, input vectors, and neurons in neural networks. Examine the complexities of receptive fields and output layers. Gain insights into training techniques such as gradient descent, backpropagation, and training from scratch. Understand how these concepts apply to real-world scenarios like customer data analysis and tea drinking temperature prediction.
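To make the core operations concrete, here is a minimal NumPy sketch (not taken from the lecture itself) of the three building blocks described above: filtering/convolution by sliding a small filter over every position of the image, rectification (ReLU), and max pooling. The toy X-shaped image and diagonal filter are invented here purely to echo the style of example the lecture uses.

```python
import numpy as np

def convolve2d(image, kernel):
    """Slide the kernel over every position of the image ("trying every
    possible match") and record how well each patch lines up with it."""
    ih, iw = image.shape
    kh, kw = kernel.shape
    oh, ow = ih - kh + 1, iw - kw + 1
    out = np.zeros((oh, ow))
    for y in range(oh):
        for x in range(ow):
            patch = image[y:y + kh, x:x + kw]
            # Filtering: multiply element-wise and average, giving +1 for
            # a perfect match and -1 for a perfect mismatch.
            out[y, x] = np.mean(patch * kernel)
    return out

def relu(feature_map):
    """Rectified linear unit: replace every negative value with zero."""
    return np.maximum(feature_map, 0.0)

def max_pool(feature_map, size=2, stride=2):
    """Keep the largest value in each window, shrinking the map while
    preserving whether (and roughly where) a feature was found."""
    h, w = feature_map.shape
    oh, ow = (h - size) // stride + 1, (w - size) // stride + 1
    out = np.zeros((oh, ow))
    for y in range(oh):
        for x in range(ow):
            window = feature_map[y * stride:y * stride + size,
                                 x * stride:x * stride + size]
            out[y, x] = window.max()
    return out

# Toy 9x9 image of an X drawn with +1 pixels on a -1 background,
# and a 3x3 diagonal filter (both invented for this sketch).
image = -np.ones((9, 9))
np.fill_diagonal(image, 1.0)
image = np.maximum(image, np.fliplr(np.eye(9)) * 2 - 1)
kernel = np.eye(3) * 2 - 1  # +1 on the diagonal, -1 elsewhere

feature_map = max_pool(relu(convolve2d(image, kernel)))
print(feature_map)
```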
Syllabus
Intro
Trickier cases
ConvNets match pieces of the image
Filtering: The math behind the match
Convolution: Trying every possible match
Pooling
Rectified Linear Units (ReLUs)
Fully connected layer
Input vector
A neuron
Squash the result
Weighted sum-and-squash neuron
Receptive fields get more complex
Add an output layer
Exhaustive search
Gradient descent with curvature
Tea drinking temperature
Chaining
Backpropagation challenge: weights
Backpropagation challenge: sums
Backpropagation challenge: sigmoid
Backpropagation challenge: ReLU
Training from scratch
Customer data
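As a companion to the training topics listed above (the weighted sum-and-squash neuron, chaining, gradient descent, and the backpropagation derivatives for weights, sums, and the sigmoid), the following is a hypothetical sketch of gradient descent applied to a single sigmoid neuron. The data, learning rate, and parameter names are invented for illustration and are not taken from the lecture.

```python
import numpy as np

def sigmoid(z):
    """Squash the weighted sum into the range (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical toy data: a few (input, target) pairs made up for this example.
x = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
y = np.array([0.1, 0.3, 0.5, 0.7, 0.9])

w, b = 0.0, 0.0          # start the weight and bias somewhere arbitrary
learning_rate = 1.0

for step in range(2000):
    # Forward pass: weighted sum, then squash (a sum-and-squash neuron).
    z = w * x + b
    p = sigmoid(z)
    error = p - y
    loss = np.mean(error ** 2)

    # Backward pass: chain the derivatives of the squared error, the
    # sigmoid, and the weighted sum (the chain rule, i.e. "chaining").
    dloss_dp = 2.0 * error / len(x)   # d(loss)/d(prediction)
    dp_dz = p * (1.0 - p)             # derivative of the sigmoid
    dz_dw = x                         # d(weighted sum)/d(weight)
    grad_w = np.sum(dloss_dp * dp_dz * dz_dw)
    grad_b = np.sum(dloss_dp * dp_dz)

    # Gradient descent: nudge each parameter a small step downhill.
    w -= learning_rate * grad_w
    b -= learning_rate * grad_b

print(f"loss={loss:.4f}, w={w:.2f}, b={b:.2f}")
```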
Taught by
Brandon Rohrer
Related Courses
Computational Photography (Georgia Institute of Technology via Udacity)
Discrete Time Signals and Systems, Part 1: Time Domain (Rice University via edX)
Signals and Systems, Part 1 (Indian Institute of Technology Bombay via edX)
Discrete Time Signals and Systems, Part 2: Frequency Domain (Rice University via edX)
Introduction to Sound and Acoustic Sketching (University St. Joseph via Kadenze)