Introduction to Probability: Part 1 - The Fundamentals

Offered By: Massachusetts Institute of Technology via edX

Tags

Statistics & Probability Courses, Data Analysis Courses, Statistical Inference Courses, Probability Theory Courses, Discrete Random Variables Courses, Continuous Random Variables Courses

Course Description

Overview

The world is full of uncertainty: accidents, storms, unruly financial markets, and noisy communications. The world is also full of data. Probabilistic modeling and the related field of statistical inference are the keys to analyzing data and making scientifically sound predictions.

This is Part 1 of a 2-part sequence on the basic tools of probabilistic modeling. Part 1 introduces the general framework of probability models, multiple discrete or continuous random variables, expectations, conditional distributions, and various powerful tools of general applicability. Part 2 will then continue into further topics that include laws of large numbers, the main tools of Bayesian inference methods, and an introduction to random processes (Poisson processes and Markov chains).

The contents of the two parts of the course are essentially the same as those of the corresponding MIT class, which has been offered and continuously refined over more than 50 years. It is a challenging class, but will enable you to apply the tools of probability theory to real-world applications or your research.

Probabilistic models use the language of mathematics. But instead of relying on the traditional "theorem-proof" format, we develop the material in an intuitive, but still rigorous and mathematically precise, manner. Furthermore, while the applications are many and evident, we emphasize the basic concepts and methodologies that are universally applicable.



Syllabus

  • Probability models and axioms
  • Conditioning, Bayes’ rule, independence
  • Counting methods in discrete probability
  • Discrete random variables (distributions, mean, variance, conditioning, etc.)
  • Continuous random variables (including general forms of Bayes’ rule)
  • Further topics (derived distributions; covariance & correlation, etc.)
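
For readers who want a feel for this material before enrolling, the short Python sketch below works through two of the syllabus items numerically: Bayes' rule applied to made-up diagnostic-test numbers, and the mean and variance of a small discrete distribution. It is an illustrative sketch only, not part of the official course materials.

    # Illustrative sketch (not from the course): Bayes' rule and a discrete random variable.
    from fractions import Fraction

    # Bayes' rule: P(A|B) = P(B|A) * P(A) / P(B), with P(B) from the total probability theorem.
    # Hypothetical numbers: 99% sensitivity, 95% specificity, 1% prevalence.
    p_disease = Fraction(1, 100)
    p_pos_given_disease = Fraction(99, 100)
    p_pos_given_healthy = Fraction(5, 100)
    p_pos = p_pos_given_disease * p_disease + p_pos_given_healthy * (1 - p_disease)
    p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
    print("P(disease | positive test) =", p_disease_given_pos)   # 1/6, despite the positive test

    # Mean and variance of a discrete random variable X with PMF p(x):
    # E[X] = sum of x*p(x),  var(X) = E[X^2] - (E[X])^2
    pmf = {1: Fraction(1, 2), 2: Fraction(1, 3), 3: Fraction(1, 6)}
    mean = sum(x * p for x, p in pmf.items())
    var = sum(x**2 * p for x, p in pmf.items()) - mean**2
    print("E[X] =", mean, " var(X) =", var)                      # 5/3 and 5/9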

Taught by

John Tsitsiklis, Patrick Jaillet, Zied Ben Chaouch, Dimitri Bertsekas, Qing He, Jimmy Li, Jagdish Ramakrishnan, Katie Szeto, and Kuang Xu

Related Courses

Анализ данных (Data Analysis)
Novosibirsk State University via Coursera
Approximation Algorithms
EIT Digital via Coursera
Basic Statistics
University of Amsterdam via Coursera
What are the Chances? Probability and Uncertainty in Statistics
Johns Hopkins University via Coursera
Understanding Clinical Research: Behind the Statistics
University of Cape Town via Coursera