CAP6412 - SmoothGrad: Removing Noise by Adding Noise - Lecture

Offered By: University of Central Florida via YouTube

Tags

Deep Learning Courses, Machine Learning Courses, Computer Vision Courses, Neural Networks Courses, Image Processing Courses, Noise Reduction Courses

Course Description

Overview

Explore a comprehensive lecture on SmoothGrad, a technique for improving the quality of sensitivity maps in deep neural networks by averaging gradient-based maps computed over multiple noise-perturbed copies of the input. Delve into the paper's details, starting with an overview and a definition of sensitivity maps. Examine previous work in the field before focusing on the SmoothGrad proposal. Analyze the experiments, including the models used, visualization techniques, parameter choices, and comparisons to baseline methods. Investigate the combination of SmoothGrad with other techniques and the effect of adding noise during training. Conclude with a critical evaluation of the paper's strengths and weaknesses, providing a well-rounded understanding of this approach to reducing noise in neural network visualizations.
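For a concrete picture of the core idea before watching, the following is a minimal sketch of SmoothGrad in PyTorch. It assumes a classifier `model` and an input tensor `image` of shape (1, C, H, W); the function name, hyperparameter values, and channel-reduction step are illustrative assumptions, not code from the lecture or the paper.

```python
import torch

def smoothgrad(model, image, target_class, n_samples=50, noise_level=0.15):
    """Average vanilla gradients over noise-perturbed copies of the input."""
    model.eval()
    # Scale the noise standard deviation by the input's value range.
    sigma = noise_level * (image.max() - image.min())
    grad_sum = torch.zeros_like(image)
    for _ in range(n_samples):
        noisy = (image + sigma * torch.randn_like(image)).requires_grad_(True)
        score = model(noisy)[0, target_class]        # class score for the target
        grad = torch.autograd.grad(score, noisy)[0]  # sensitivity map: d(score)/d(input)
        grad_sum += grad
    saliency = grad_sum / n_samples
    # Reduce the channel dimension (here: max of absolute values) for display.
    return saliency.abs().max(dim=1)[0]
```

Increasing `n_samples` smooths the map further at extra compute cost, while `noise_level` (the paper's sigma relative to the input range) is the main parameter the lecture's experiments vary.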

Syllabus

Paper Details
Overview
Definition of Sensitivity Maps
Previous Work
SmoothGrad Proposal
Experiments - Models
Experiments - Visualization (Value of Gradients)
Experiments - Visualization (Capping Values)
Experiments - Visualization (Multiplying with Input)
Experiments - Parameters
Experiments - Comparison to Baseline Methods
Experiments - Combining SmoothGrad
Experiments - Adding Noise During Training
Conclusion
For Paper
Against Paper


Taught by

UCF CRCV

Related Courses

Introduction to Artificial Intelligence
Stanford University via Udacity
Natural Language Processing
Columbia University via Coursera
Probabilistic Graphical Models 1: Representation
Stanford University via Coursera
Computer Vision: The Fundamentals
University of California, Berkeley via Coursera
Learning from Data (Introductory Machine Learning course)
California Institute of Technology via Independent