
Learn TensorFlow and Deep Learning Fundamentals with Python - Code-First Introduction Part 2/2

Offered By: YouTube

Tags

TensorFlow Courses, Data Visualization Courses, Deep Learning Courses, Neural Networks Courses, Model Evaluation Courses, Model Optimization Courses

Course Description

Overview

Dive into the second part of an extensive video series on TensorFlow and deep learning fundamentals with Python. Continue with hands-on coding exercises in Google Colab, exploring non-linear functions, neural networks with non-linear activation functions, and multi-layer models. Learn to optimize models by tweaking learning rates, using callbacks, and analyzing loss curves (see the sketch below). Delve into multi-class classification: preparing and visualizing data, building and improving models, and evaluating their performance. Master essential techniques such as creating confusion matrices, normalizing data, and interpreting the patterns a model learns (see the sketch after the syllabus). Additional resources include the full course, a GitHub repository with code and materials, and community discussions. Perfect for those who have completed part one and are ready to deepen their understanding of TensorFlow and deep learning.
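As a taste of the learning-rate workflow described above (and covered in lessons 71-74 of the syllabus), here is a minimal, illustrative sketch, not the course's exact notebook code: a `LearningRateScheduler` callback ramps the learning rate each epoch, and loss is then plotted against learning rate using the `history` object. The toy dataset and layer sizes are arbitrary placeholders.

```python
import numpy as np
import matplotlib.pyplot as plt
import tensorflow as tf

# Toy binary-classification data (placeholder for the course's dataset).
X = np.random.rand(1000, 2)
y = (X[:, 0] + X[:, 1] > 1).astype(int)

model = tf.keras.Sequential([
    tf.keras.layers.Dense(4, activation="relu"),   # non-linear activation
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(loss="binary_crossentropy",
              optimizer=tf.keras.optimizers.Adam(),
              metrics=["accuracy"])

# Callback that increases the learning rate a little every epoch,
# so a single training run sweeps a whole range of learning rates.
lr_scheduler = tf.keras.callbacks.LearningRateScheduler(
    lambda epoch: 1e-4 * 10 ** (epoch / 20)
)
history = model.fit(X, y, epochs=40, callbacks=[lr_scheduler], verbose=0)

# Plot loss vs. learning rate; a good rate sits just before the loss
# curve stops decreasing and starts to blow up.
lrs = 1e-4 * 10 ** (np.arange(40) / 20)
plt.semilogx(lrs, history.history["loss"])
plt.xlabel("Learning rate")
plt.ylabel("Loss")
plt.show()
```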

Syllabus

- Intro/hello/have you watched part 1? If not, you should.
- 66. Non-linearity part 1 (straight lines and non-straight lines).
- 67. Non-linearity part 2 (building our first neural network with a non-linear activation function).
- 68. Non-linearity part 3 (upgrading our non-linear model with more layers).
- 69. Non-linearity part 4 (modelling our non-linear data).
- 70. Non-linearity part 5 (reproducing our non-linear functions from scratch).
- 71. Getting great results in less time by tweaking the learning rate.
- 72. Using the history object to plot a model’s loss curves.
- 73. Using callbacks to find a model’s ideal learning rate.
- 74. Training and evaluating a model with an ideal learning rate.
- [Keynote] 75. Introducing more classification methods.
- 76. Finding the accuracy of our model.
- 77. Creating our first confusion matrix.
- 78. Making our confusion matrix prettier.
- 79. Multi-class classification part 1 (preparing data).
- 80. Multi-class classification part 2 (becoming one with the data).
- 81. Multi-class classification part 3 (building a multi-class model).
- 82. Multi-class classification part 4 (improving our multi-class model).
- 83. Multi-class classification part 5 (normalised vs non-normalised).
- 84. Multi-class classification part 6 (finding the ideal learning rate).
- 85. Multi-class classification part 7 (evaluating our model).
- 86. Multi-class classification part 8 (creating a confusion matrix).
- 87. Multi-class classification part 9 (visualising random samples).
- 88. What patterns is our model learning?
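The multi-class classification lessons above (79-88) build toward training on normalised data and evaluating predictions with a confusion matrix. Below is a minimal, illustrative sketch of that workflow, not the course's notebook code; Fashion MNIST is used simply as a convenient built-in multi-class dataset, and the layer sizes and epoch count are arbitrary.

```python
import tensorflow as tf

# Load a built-in multi-class image dataset (10 clothing classes).
(X_train, y_train), (X_test, y_test) = tf.keras.datasets.fashion_mnist.load_data()

# Normalise pixel values from [0, 255] to [0, 1].
X_train, X_test = X_train / 255.0, X_test / 255.0

model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(4, activation="relu"),
    tf.keras.layers.Dense(4, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),  # one output per class
])
model.compile(loss="sparse_categorical_crossentropy",  # integer labels
              optimizer=tf.keras.optimizers.Adam(),
              metrics=["accuracy"])
model.fit(X_train, y_train, epochs=5, validation_data=(X_test, y_test))

# Confusion matrix: rows are true labels, columns are predicted labels.
y_pred = model.predict(X_test).argmax(axis=1)
cm = tf.math.confusion_matrix(y_test, y_pred)
print(cm)
```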


Taught by

Daniel Bourke

Related Courses

Neural Networks for Machine Learning
University of Toronto via Coursera
機器學習技法 (Machine Learning Techniques)
National Taiwan University via Coursera
Machine Learning Capstone: An Intelligent Application with Deep Learning
University of Washington via Coursera
Прикладные задачи анализа данных (Applied Problems in Data Analysis)
Moscow Institute of Physics and Technology via Coursera
Leading Ambitious Teaching and Learning
Microsoft via edX