Deterministic Annealing for Clustering, Classification and Speech Recognition
Offered By: Center for Language & Speech Processing (CLSP), JHU via YouTube
Course Description
Overview
Explore the deterministic annealing approach to clustering and its extensions in this comprehensive lecture from the Center for Language & Speech Processing at Johns Hopkins University. Delve into the method's three key features: avoiding poor local optima, applicability to various structures, and ability to minimize complex cost functions. Gain insights into the probabilistic framework and information theoretic principles underlying the approach, including maximum entropy and random coding. Discover the analogy to statistical physics and the connection to rate-distortion theory, providing new perspectives on both the method and theory. Learn how structural constraints are incorporated to optimize popular structures like vector quantizers, decision trees, multilayer perceptrons, radial basis functions, and mixtures of experts. Examine experimental results demonstrating significant performance improvements over standard training methods in applications such as compression, estimation, pattern recognition, classification, and statistical regression. Conclude with a brief overview of ongoing research and extensions to the deterministic annealing method.
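For a concrete feel for the core idea sketched in this overview, the following is a minimal, illustrative Python sketch of temperature-controlled soft clustering in the spirit of deterministic annealing: maximum-entropy (Gibbs) association probabilities at a temperature T, probability-weighted centroid updates, and a gradual cooling schedule. It is not the full method presented in the lecture (no explicit phase-transition analysis, cluster splitting, or rate-distortion machinery), and the names and parameters (da_cluster, t_init, cooling, and so on) are assumptions chosen for illustration.

```python
import numpy as np

def da_cluster(X, n_clusters=4, t_init=10.0, t_min=0.01, cooling=0.9, n_inner=50):
    """Illustrative deterministic-annealing-style clustering sketch.

    Fixed codebook size and a simple geometric cooling schedule; the full
    algorithm discussed in the lecture additionally tracks phase transitions
    and grows the codebook by splitting clusters.
    """
    rng = np.random.default_rng(0)
    # Start all centroids near the data mean; at high temperature they act
    # as a single effective cluster.
    centers = X.mean(axis=0) + 0.01 * rng.standard_normal((n_clusters, X.shape[1]))
    T = t_init
    while T > t_min:
        for _ in range(n_inner):
            # Squared distances, shape (n_samples, n_clusters).
            d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
            # Maximum-entropy (Gibbs) association probabilities at temperature T.
            logits = -d2 / T
            logits -= logits.max(axis=1, keepdims=True)  # numerical stability
            p = np.exp(logits)
            p /= p.sum(axis=1, keepdims=True)
            # Centroid update: probability-weighted means (soft k-means step).
            new_centers = (p.T @ X) / p.sum(axis=0)[:, None]
            if np.allclose(new_centers, centers, atol=1e-8):
                centers = new_centers
                break
            centers = new_centers
        T *= cooling  # cool down; as T -> 0 assignments harden toward k-means
        # Small perturbation after each cooling step so cluster splits can
        # emerge below critical temperatures (a practical stand-in for
        # explicit phase-transition detection).
        centers += 1e-4 * rng.standard_normal(centers.shape)
    return centers, p

if __name__ == "__main__":
    # Toy example: four Gaussian blobs at the corners of a square.
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(loc=m, scale=0.3, size=(100, 2))
                   for m in ([0, 0], [3, 3], [0, 3], [3, 0])])
    centers, p = da_cluster(X, n_clusters=4)
    print(centers)
```

At high temperature the entropy term dominates and all centroids coincide; as the temperature is lowered, clusters progressively separate, which is how the method avoids many of the poor local optima that trap standard k-means-style training.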
Syllabus
Deterministic Annealing for Clustering, Classification and Speech Recognition - Kenneth Rose - 2001
Taught by
Center for Language & Speech Processing (CLSP), JHU
Related Courses
Machine Learning Capstone: An Intelligent Application with Deep Learning - University of Washington via Coursera
Elaborazione del linguaggio naturale (Natural Language Processing) - University of Naples Federico II via Federica
Deep Learning for Natural Language Processing - University of Oxford via Independent
Deep Learning Summer School - Independent
Sequence Models - DeepLearning.AI via Coursera