Probabilistic Methods for Classification - 2009
Offered By: Center for Language & Speech Processing (CLSP), JHU via YouTube
Course Description
Overview
Explore probabilistic methods for classification in this comprehensive lecture by Gideon Mann from the Center for Language & Speech Processing at Johns Hopkins University. Delve into supervised machine learning techniques, covering topics such as information extraction, semi-supervised learning, and document classification. Learn about Naive Bayes, maximum likelihood estimation, and conditional log-linear models. Examine graphical models, including Maximum Entropy Models and Conditional Random Fields. Understand gradient-based optimization, hidden Markov models, and dependency parsing. Investigate advanced concepts such as Generalized Expectation criteria, KL divergence, and label regularization. Gain insight into the theoretical foundations and practical applications of probabilistic classification methods in natural language processing and machine learning.
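To make the first of these topics concrete: a Naive Bayes document classifier estimates class priors and per-class word likelihoods by maximum likelihood (counting), then scores a document by summing log probabilities. A minimal sketch, using a toy spam/ham dataset that is illustrative only and not from the lecture:

```python
from collections import Counter, defaultdict
import math

# Toy training data: (document, label) pairs. Illustrative only.
train = [
    ("buy cheap meds now", "spam"),
    ("cheap meds buy today", "spam"),
    ("meeting schedule for today", "ham"),
    ("project meeting notes", "ham"),
]

# Maximum likelihood estimation of P(y) and P(w|y) by counting.
class_counts = Counter(y for _, y in train)
word_counts = defaultdict(Counter)
for doc, y in train:
    word_counts[y].update(doc.split())
vocab = {w for counts in word_counts.values() for w in counts}

def predict(doc):
    scores = {}
    for y in class_counts:
        # log P(y) + sum over words of log P(w|y), with add-one smoothing
        score = math.log(class_counts[y] / len(train))
        total = sum(word_counts[y].values())
        for w in doc.split():
            score += math.log((word_counts[y][w] + 1) / (total + len(vocab)))
        scores[y] = score
    return max(scores, key=scores.get)

print(predict("cheap meds"))     # -> spam
print(predict("meeting today"))  # -> ham
```

The add-one (Laplace) smoothing keeps unseen words from zeroing out a class score; the lecture's maximum-entropy comparison concerns exactly this kind of generative model's independence assumptions.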
Syllabus
Introduction
Information Extraction
Semi-supervised Learning
Outline
Supervised Machine Learning
Estimation
Classification
Document Classification
Naive Bayes
Maximum likelihood estimation
Sum over data
Recap
Conditional Log Linear Models
Graphical Models
Maximum Entropy Models
Gradient-Based Optimization
Naive Bayes vs. Maximum Entropy
Conditional Random Field
Hidden Markov Model
Model Framework
Model Structure
Conditional Random Field Models
Dependency Parsing
Generalized Expectations Criteria
KL Divergence
GE Estimation
Label Regularization
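The closing syllabus items fit together: label regularization is a Generalized Expectation criterion that penalizes the KL divergence between a prior label distribution and the model's expected label distribution on unlabeled data. A minimal sketch of that penalty, with toy distributions chosen purely for illustration:

```python
import math

def kl_divergence(p, q):
    """KL(p || q) for discrete distributions given as probability lists."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Label-regularization penalty: divergence between a prior over labels
# and the model's average predicted label distribution. Toy numbers.
prior = [0.7, 0.3]              # e.g. 70% of documents believed to be class 0
model_expectation = [0.5, 0.5]  # model's expected label distribution

penalty = kl_divergence(prior, model_expectation)
print(round(penalty, 4))
```

During GE estimation this penalty (or its gradient) is added to the training objective, pushing the model's marginal predictions toward the prior without per-document labels.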
Taught by
Center for Language & Speech Processing (CLSP), JHU
Related Courses
Sociology of Microbes (translated from Arabic) - King Saud University via Rwaq (رواق)
Statistical Learning with R - Stanford University via edX
More Data Mining with Weka - University of Waikato via Independent
The Caltech-JPL Summer School on Big Data Analytics - California Institute of Technology via Coursera
Machine Learning for Musicians and Artists - Goldsmiths, University of London via Kadenze