Machine Learning: Unsupervised Learning
Offered By: Brown University via Udacity
Course Description
Overview
This is the second course in the 3-course Machine Learning Series and is offered at Georgia Tech as CS7641. Taking this class here does not earn Georgia Tech credit.
Ever wonder how Netflix can predict what movies you'll like? Or how Amazon knows what you want to buy before you do? The answer can be found in Unsupervised Learning!
Closely related to pattern recognition, Unsupervised Learning is about analyzing data and looking for patterns. It is an extremely powerful tool for identifying structure in data. This course focuses on how you can use Unsupervised Learning approaches -- including randomized optimization, clustering, and feature selection and transformation -- to find structure in unlabeled data.
Series Information: Machine Learning is a graduate-level series of 3 courses, covering the area of Artificial Intelligence concerned with computer programs that modify and improve their performance through experiences.
The entire series is taught as an engaging dialogue between two eminent Machine Learning professors and friends: Professor Charles Isbell (Georgia Tech) and Professor Michael Littman (Brown University).
Syllabus
- Randomized optimization
- Optimization, randomized, Hill climbing, Random restart hill climbing, Simulated annealing, Annealing algorithm, Properties of simulated annealing, Genetic algorithms, GA skeleton, Crossover example, What have we learned, MIMIC, MIMIC: A probability model, MIMIC: Pseudo code, MIMIC: Estimating distributions, Finding dependency trees, Probability distribution (a minimal hill-climbing sketch follows this syllabus)
- Clustering
- Clustering and expectation maximization, Basic clustering problem, Single linkage clustering (SLC), Running time of SLC, Issues with SLC, K-means clustering, K-means in Euclidean space, K-means as optimization, Soft clustering, Maximum likelihood Gaussian, Expectation Maximization (EM), Impossibility theorem (a k-means sketch follows this syllabus)
- Feature Selection
- Algorithms, Filtering and Wrapping, Speed, Searching, Relevance, Relevance vs. Usefulness
- Feature Transformation
- Feature Transformation, Words like Tesla, Principal Components Analysis, Independent Components Analysis, Cocktail Party Problem, Matrix, Alternatives (a PCA sketch follows this syllabus)
- Information Theory
- History: Sending a Message, Expected size of the message, Information between two variables, Mutual information, Two Independent Coins, Two Dependent Coins, Kullback-Leibler Divergence (a mutual-information sketch follows this syllabus)
- Unsupervised Learning Project
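To give a flavor of the randomized optimization lessons, here is a minimal random-restart hill climbing sketch in Python. The bit-string representation, the "count the ones" fitness function, and all parameter names are illustrative assumptions for this listing, not material taken from the course.

```python
import random

def random_restart_hill_climbing(fitness, n_bits, restarts=20, max_steps=500):
    """Illustrative sketch (not course code): random-restart hill climbing
    over bit strings. Start from several random states and greedily flip
    single bits while fitness improves; keep the best local optimum found."""
    best_state, best_score = None, float("-inf")
    for _ in range(restarts):
        state = [random.randint(0, 1) for _ in range(n_bits)]
        score = fitness(state)
        for _ in range(max_steps):
            improved = False
            for i in range(n_bits):
                state[i] ^= 1                  # flip bit i and keep it if it helps
                new_score = fitness(state)
                if new_score > score:
                    score, improved = new_score, True
                else:
                    state[i] ^= 1              # undo the flip
            if not improved:                   # local optimum reached
                break
        if score > best_score:
            best_state, best_score = list(state), score
    return best_state, best_score

# Toy fitness: count of ones ("one-max"), which hill climbing solves easily.
if __name__ == "__main__":
    state, score = random_restart_hill_climbing(sum, n_bits=30)
    print(score, state)
```

Random restarts are the simplest defense against local optima; simulated annealing, genetic algorithms, and MIMIC, covered in the same lessons, attack the same problem with more structure.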
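The clustering lessons cover k-means among other algorithms. The sketch below is one plain-NumPy take on Lloyd's algorithm, alternating an assignment step and a center-update step; the toy two-blob data set is an assumption for demonstration only.

```python
import numpy as np

def k_means(X, k, n_iters=100, seed=0):
    """Illustrative sketch (not course code): Lloyd's algorithm. Alternate
    between assigning points to the nearest center and moving each center
    to the mean of its assigned points."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iters):
        # Assignment step: index of the closest center for every point.
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Update step: recompute each center; keep the old one if its cluster emptied.
        new_centers = np.array([
            X[labels == j].mean(axis=0) if np.any(labels == j) else centers[j]
            for j in range(k)
        ])
        if np.allclose(new_centers, centers):   # converged
            break
        centers = new_centers
    return centers, labels

# Toy data: two well-separated Gaussian blobs in the plane.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.5, (50, 2)), rng.normal(5, 0.5, (50, 2))])
centers, labels = k_means(X, k=2)
print(centers)
```

Soft clustering and Expectation Maximization, also in these lessons, generalize this hard assignment to probabilistic cluster memberships.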
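For the feature transformation lessons, this is a small Principal Components Analysis sketch that projects centered data onto the directions of maximum variance via the SVD. The synthetic 3-D data and the `pca` helper are assumptions, not course code.

```python
import numpy as np

def pca(X, n_components):
    """Illustrative sketch (not course code): PCA via SVD of the centered
    data matrix. Rows of Vt are the principal directions, ordered by
    singular value (i.e., by variance explained)."""
    X_centered = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)
    components = Vt[:n_components]
    projected = X_centered @ components.T
    explained_variance = (S[:n_components] ** 2) / (len(X) - 1)
    return projected, components, explained_variance

# Toy data: 3-D points that mostly vary along a single direction.
rng = np.random.default_rng(0)
t = rng.normal(size=(200, 1))
X = np.hstack([t, 2 * t, 0.5 * t]) + rng.normal(scale=0.05, size=(200, 3))
Z, components, var = pca(X, n_components=1)
print(components, var)
```

Independent Components Analysis, covered alongside PCA (the "Cocktail Party Problem"), instead seeks statistically independent rather than merely uncorrelated directions.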
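The information theory lessons relate mutual information to the Kullback-Leibler divergence. The sketch below computes I(X; Y) as the KL divergence between the joint distribution and the product of its marginals, reproducing the "two independent coins" (0 bits) and "two dependent coins" (1 bit) examples named in the syllabus; the function names are assumptions.

```python
import numpy as np

def kl_divergence(p, q):
    """Illustrative sketch (not course code):
    D_KL(p || q) = sum_i p_i * log2(p_i / q_i), skipping zero-probability terms."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

def mutual_information(joint):
    """I(X; Y) as the KL divergence between the joint distribution
    and the product of its marginals."""
    joint = np.asarray(joint, float)
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    return kl_divergence(joint.ravel(), (px * py).ravel())

# Two independent fair coins: mutual information is 0 bits.
independent = np.full((2, 2), 0.25)
# Two perfectly dependent coins (always showing the same face): 1 bit.
dependent = np.array([[0.5, 0.0], [0.0, 0.5]])
print(mutual_information(independent), mutual_information(dependent))
```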
Taught by
Charles Isbell and Michael Littman
Related Courses
- Advanced Machine Learning Methods (Продвинутые методы машинного обучения), Higher School of Economics via Coursera
- Advanced Machine Learning and Signal Processing, IBM via Coursera
- Applied Data Science for Data Analysts, Databricks via Coursera
- Machine Learning with Python (Aprendizaje Automático con Python), IBM via Coursera
- Machine Learning (Aprendizaje de máquinas), Universidad Nacional Autónoma de México via Coursera