YoVDO

Hyperparameter Tuning with Neural Network Intelligence

Offered By: Coursera Project Network via Coursera

Tags

Hyperparameter Optimization Courses, Python Courses, Neural Networks Courses, TensorFlow Courses, Keras Courses, Hyperparameter Tuning Courses, MNIST Dataset Courses

Course Description

Overview

In this 2-hour guided project, we will learn the basics of Microsoft's Neural Network Intelligence (NNI) toolkit and use it to run a hyperparameter tuning experiment on a neural network. NNI is an open-source AutoML toolkit created by Microsoft that helps machine learning practitioners automate feature engineering, hyperparameter tuning, neural architecture search, and model compression. This project focuses on hyperparameter tuning; note that we will learn to use the NNI toolkit for tuning, not implement the tuning algorithms ourselves.

We will use the popular MNIST dataset and train a simple neural network to classify images of hand-written digits. Once a basic training script is in place, we will use NNI to run a hyperparameter tuning experiment that searches for optimal values of the batch size, the learning rate, the choice of activation function for the hidden layer, the number of hidden units in the hidden layer, and the dropout rate for the dropout layer.

To complete this project successfully, you should be familiar with the Python programming language, as well as with neural networks, TensorFlow, and Keras.

Note: This course works best for learners who are based in the North America region. We're currently working on providing the same experience in other regions.
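To illustrate the experiment described above, the five tuned hyperparameters could be declared in an NNI search space file. The sketch below uses NNI's `_type`/`_value` search space format; the file name and the specific value ranges are assumptions for illustration, not values taken from the course.

```python
import json

# Hypothetical NNI search space covering the five hyperparameters
# named in the overview; the ranges are illustrative, not from the course.
search_space = {
    "batch_size": {"_type": "choice", "_value": [16, 32, 64, 128]},
    "learning_rate": {"_type": "loguniform", "_value": [1e-4, 1e-1]},
    "activation": {"_type": "choice", "_value": ["relu", "tanh", "sigmoid"]},
    "hidden_units": {"_type": "choice", "_value": [64, 128, 256]},
    "dropout_rate": {"_type": "uniform", "_value": [0.1, 0.5]},
}

# NNI reads the search space from a JSON file referenced in the
# experiment configuration.
with open("search_space.json", "w") as f:
    json.dump(search_space, f, indent=2)
```

During an experiment, the tuner samples one concrete value per key from this space for each trial.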

Syllabus

  • Hyperparameter Tuning with Microsoft NNI
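A trial script in an experiment like this typically asks NNI for the next set of hyperparameters, trains the model with them, and reports the result back to the tuner. The sketch below shows that pattern using NNI's real trial API (`nni.get_next_parameter`, `nni.report_final_result`); the default values, the `merge_params` helper, and the elided Keras training step are assumptions for illustration.

```python
# Sketch of the NNI trial pattern: receive parameters, train, report.
# The defaults and the merge helper are illustrative assumptions.
DEFAULTS = {
    "batch_size": 32,
    "learning_rate": 1e-3,
    "activation": "relu",
    "hidden_units": 128,
    "dropout_rate": 0.25,
}

def merge_params(defaults, tuned):
    """Overlay the tuner's suggested values on top of the defaults."""
    params = dict(defaults)
    params.update(tuned or {})
    return params

def run_trial():
    import nni  # available when the script runs inside an NNI experiment
    params = merge_params(DEFAULTS, nni.get_next_parameter())
    # ... build and train the Keras MNIST model with `params` here ...
    val_accuracy = 0.0  # placeholder for the real validation accuracy
    nni.report_final_result(val_accuracy)
```

Such a script is not run directly; NNI launches it once per trial after the experiment is started from its configuration file (for example with `nnictl create --config <file>`).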

Taught by

Amit Yadav

Related Courses

Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization
DeepLearning.AI via Coursera
How to Win a Data Science Competition: Learn from Top Kagglers
Higher School of Economics via Coursera
Predictive Modeling and Machine Learning with MATLAB
MathWorks via Coursera
Machine Learning Rapid Prototyping with IBM Watson Studio
IBM via Coursera
AutoML avec AutoKeras - Classification d'images
Coursera Project Network via Coursera