Apply LIME to Explain, Trust, and Validate Your Predictions for Any ML Model
Offered By: Prodramp via YouTube
Course Description
Overview
Syllabus
- Tutorial Introduction
- Why is LIME needed?
- Need for a surrogate model
- LIME Properties
- LIME is not Feature Importance
- Explaining image classification
- Another LIME based explanation
- Tabular data classification explanation
- Two types of explanations
- What is in the notebook exercises?
- 1st: Original LIME explanation (a library-based code sketch follows the syllabus)
- Loading Inception V3 model
- LIME library Installation
- Lime Explainer Module
- LIME Explanation Model Creation
- Creating superpixel Image
- Showing Pros and Cons in image
- Showing Pros and Cons with weight higher than 0.1 in image
- Analyzing 2nd Prediction
- LIME Custom Implementation (a custom-implementation sketch follows the syllabus)
- Loading EfficientNet Model
- Loading LIME class from custom Implementation
- LIME Explanation Results
- Loading ResNet50 Model
- LIME Explanations
- Step by Step Custom Explanations
- Explanation Comparisons
- Saving Notebooks to GitHub
- Recap
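The notebook steps listed above (installing the LIME library, creating the explainer, building the explanation, and showing pros and cons over superpixels) follow the standard LIME image workflow. The library-based sketch below is a minimal illustration of that workflow, assuming the `lime` package, TensorFlow/Keras with a pre-trained Inception V3 model, and a placeholder image path `example.jpg`; it is not the tutorial's exact notebook code.

```python
# pip install lime   <- LIME library installation step from the syllabus
import numpy as np
import matplotlib.pyplot as plt
from tensorflow.keras.applications.inception_v3 import InceptionV3, preprocess_input
from tensorflow.keras.preprocessing import image as keras_image
from lime import lime_image
from skimage.segmentation import mark_boundaries

# Load the pre-trained Inception V3 model (ImageNet weights)
model = InceptionV3(weights="imagenet")

def predict_fn(images):
    # LIME passes a batch of perturbed images; preprocess and run the model
    return model.predict(preprocess_input(np.array(images)))

# "example.jpg" is a placeholder path; use any local image
img = keras_image.load_img("example.jpg", target_size=(299, 299))
img = np.array(img).astype("double")

# LIME explainer module and explanation creation
explainer = lime_image.LimeImageExplainer()
explanation = explainer.explain_instance(
    img, predict_fn, top_labels=3, hide_color=0, num_samples=1000
)

# Pros (supporting) and cons (opposing) superpixels for the top predicted label
temp, mask = explanation.get_image_and_mask(
    explanation.top_labels[0], positive_only=False,
    num_features=10, hide_rest=False
)
plt.imshow(mark_boundaries(temp / 255.0, mask))
plt.show()

# Same view, but keep only superpixels whose weight exceeds 0.1
temp, mask = explanation.get_image_and_mask(
    explanation.top_labels[0], positive_only=False,
    num_features=10, hide_rest=False, min_weight=0.1
)
plt.imshow(mark_boundaries(temp / 255.0, mask))
plt.show()
```

Setting `positive_only=False` overlays both pros and cons on the image, and `min_weight=0.1` mirrors the syllabus step that keeps only superpixels with weights higher than 0.1.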
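The custom-implementation part of the syllabus builds a LIME-style explainer from scratch and compares its results with the official library. The sketch below outlines the general LIME procedure for images (superpixel segmentation, random perturbations, black-box predictions, a distance-weighted linear surrogate); the function name `explain_image` and all parameter values are illustrative assumptions, not the tutorial's own class.

```python
import numpy as np
from skimage.segmentation import quickshift
from sklearn.linear_model import Ridge
from sklearn.metrics import pairwise_distances

def explain_image(image, predict_fn, class_idx,
                  num_samples=500, num_features=5, kernel_width=0.25):
    """Return the superpixel map and the labels of the most supportive superpixels."""
    # 1. Segment the image into superpixels (the interpretable units)
    segments = quickshift(image.astype("double"), kernel_size=4, max_dist=200, ratio=0.2)
    seg_ids = np.unique(segments)
    n_segments = seg_ids.shape[0]

    # 2. Randomly switch superpixels on/off to create perturbed samples
    rng = np.random.default_rng(0)
    masks = rng.integers(0, 2, size=(num_samples, n_segments))
    masks[0, :] = 1  # keep the unperturbed image as the first sample

    perturbed = []
    for mask in masks:
        img = image.copy()
        img[np.isin(segments, seg_ids[mask == 0])] = 0  # black out the "off" superpixels
        perturbed.append(img)

    # 3. Query the black-box model on every perturbed image
    preds = predict_fn(np.array(perturbed))[:, class_idx]

    # 4. Weight samples by similarity to the original (all-superpixels-on) sample
    distances = pairwise_distances(masks, masks[0].reshape(1, -1), metric="cosine").ravel()
    weights = np.exp(-(distances ** 2) / kernel_width ** 2)

    # 5. Fit a weighted linear surrogate; large positive coefficients mark "pro" superpixels
    surrogate = Ridge(alpha=1.0)
    surrogate.fit(masks, preds, sample_weight=weights)
    top = np.argsort(surrogate.coef_)[-num_features:]
    return segments, seg_ids[top]
```

Calling `segments, top_ids = explain_image(img, predict_fn, class_idx)` with a model prediction function like the `predict_fn` in the library-based sketch returns the superpixel map and the labels of the superpixels that most support `class_idx`; an overlay mask can then be built with `np.isin(segments, top_ids)`.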
Taught by
Prodramp
Related Courses
- Explainable Machine Learning with LIME and H2O in R (Coursera Project Network via Coursera)
- Machine Learning Interpretable: interpretML y LIME (Coursera Project Network via Coursera)
- Capstone Assignment - CDSS 5 (University of Glasgow via Coursera)
- Machine Learning and AI Foundations: Producing Explainable AI (XAI) and Interpretable Machine Learning Solutions (LinkedIn Learning)
- Guided Project: Predict World Cup Soccer Results with ML (IBM via edX)