
Sparse Representations in Signal and Image Processing: Fundamentals

Offered By: Technion - Israel Institute of Technology via edX

Tags

Digital Image Processing, Machine Learning, Linear Algebra, Signal Processing, Image Processing, Approximation Theory

Course Description

Overview

This course introduces the fundamentals of the field of sparse representations, starting with its theoretical concepts and systematically presenting its key achievements. We will touch on both theory and numerical algorithms.

Modeling data is the way we, as scientists, believe information should be explained and handled. Indeed, models play a central role in practically every task in signal and image processing. Sparse representation theory puts forward an emerging, highly effective, and universal model of this kind. Its core idea is to describe the data as a linear combination of a few building blocks, called atoms, taken from a pre-defined dictionary of such fundamental elements.
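To make this model concrete, here is a minimal synthesis sketch in Python/NumPy. It is an illustration only: the dictionary size, sparsity level, and variable names are assumptions and are not taken from the course materials (the course projects themselves use MATLAB). The signal is built as x = D·alpha, where alpha has only a few non-zero entries.

```python
import numpy as np

rng = np.random.default_rng(0)

# A pre-defined dictionary D whose columns are the atoms
# (dimensions are arbitrary, chosen only for illustration).
n, m = 64, 128
D = rng.standard_normal((n, m))
D /= np.linalg.norm(D, axis=0)        # L2-normalize each atom

# A sparse representation: only k of the m coefficients are non-zero.
k = 5
alpha = np.zeros(m)
support = rng.choice(m, size=k, replace=False)
alpha[support] = rng.standard_normal(k)

# The Sparseland model: the signal is a linear combination of a few atoms.
x = D @ alpha
print(f"signal dimension: {x.shape[0]}, non-zeros in alpha: {np.count_nonzero(alpha)}")
```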

A series of theoretical problems arises when deploying this seemingly simple model on real data sources, leading to fascinating new results in linear algebra, approximation theory, optimization, and machine learning. In this course you will learn about these achievements, which serve as the foundations for the revolution that has taken place in signal and image processing in recent years.


Syllabus

This program is composed of two separate parts:

1. Part 1: Sparse Representations in Signal and Image Processing: Fundamentals.

2. Part 2: Sparse Representations in Image Processing: From Theory to Practice.

While we recommend taking both courses, each can be taken independently of the other. The duration of each course is five weeks, and each part includes: (i) knowledge-check questions and discussions, (ii) a series of quizzes, and (iii) MATLAB programming projects. Each course is graded separately, using the average grades of the questions/discussions [K], quizzes [Q], and projects [P]: Final Grade = 0.1K + 0.5Q + 0.4P.
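As a worked example (with hypothetical scores), K = 90, Q = 80, and P = 85 would give a final grade of 0.1·90 + 0.5·80 + 0.4·85 = 9 + 40 + 34 = 83.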

The following list gives more detail on the topics we will cover in the first course:

  • Overview of Sparseland, including a mathematical warm-up and an introduction to L1-minimization.

  • Seeking sparse solutions: the L0 norm and P0 problem.

  • Theoretical analysis of the Two-Ortho case of P0, including the definitions of the Spark and the Mutual Coherence.

  • Theoretical analysis of the general case of the P0 problem.

  • Greedy pursuit algorithms, including Thresholding (THR) and Orthogonal Matching Pursuit (OMP) and its variants (a minimal OMP sketch appears after this list).

  • Relaxation pursuit algorithms including Basis Pursuit (BP).

  • Theoretical guarantees of pursuit algorithms: THR, OMP and BP.

  • Practical tools for solving approximate sparse recovery problems, including the exact solution of the unitary case, the Iteratively Reweighted Least Squares (IRLS) algorithm, and the Alternating Direction Method of Multipliers (ADMM).

  • Theoretical guarantees for approximate solutions, including the definition of the Restricted Isometry Property (RIP) and the stability of pursuit algorithms.
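To illustrate the greedy pursuit idea referenced above, here is a minimal Orthogonal Matching Pursuit (OMP) sketch in Python/NumPy. The function name omp, the dictionary D, and the sparsity level k are illustrative choices, not taken from the course materials, and the code is a sketch rather than the course's reference implementation.

```python
import numpy as np

def omp(D, x, k):
    """Minimal Orthogonal Matching Pursuit sketch.

    D : (n, m) dictionary with L2-normalized columns (atoms)
    x : (n,) signal to be approximated
    k : target number of non-zero coefficients
    Returns an (m,) coefficient vector alpha with at most k non-zeros.
    """
    n, m = D.shape
    alpha = np.zeros(m)
    residual = x.copy()
    support = []
    for _ in range(k):
        # Greedy step: pick the atom most correlated with the current residual.
        correlations = np.abs(D.T @ residual)
        correlations[support] = 0.0          # do not re-select chosen atoms
        support.append(int(np.argmax(correlations)))
        # Projection step: least-squares fit of x on the chosen atoms.
        coeffs, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
        residual = x - D[:, support] @ coeffs
    alpha[support] = coeffs
    return alpha
```

A toy usage would pair this with the synthesis sketch above: build a normalized random dictionary, synthesize x from k atoms, and check that omp(D, x, k) recovers a k-sparse approximation with a small residual.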


Taught by

Michael Elad and Yaniv Romano


Related Courses

Advanced Machine Learning
The Open University via FutureLearn
Advanced Statistics for Data Science
Johns Hopkins University via Coursera
Algebra & Algorithms
Moscow Institute of Physics and Technology via Coursera
Algèbre Linéaire (Partie 2)
École Polytechnique Fédérale de Lausanne via edX
Linear Algebra III: Determinants and Eigenvalues
Georgia Institute of Technology via edX