Introduction to Quantum Computing for Everyone
Offered By: The University of Chicago via edX
Course Description
Overview
Quantum computing is coming closer to reality, with machines of 80+ qubits in active use. This course provides an intuitive introduction to the impacts, underlying phenomena, and programming principles of quantum computing.
The course begins with an exploration of classes of computational problems that classical computers are not well-suited to solve. We then progress to an intuitive introduction to key QIS concepts that underlie quantum computing. Next, we introduce individual quantum operations, using both a symbolic representation and a mathematical representation. A limited set of linear algebra operations will be taught so that students can calculate operation results. Finally, we string these individual operations together to create the first algorithm that illustrates the performance advantage resulting from these unique operations.
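To give a flavor of the "calculate operation results" part of the course: a qubit state can be written as a length-2 vector, and applying a quantum operation is a matrix-vector multiplication. A minimal sketch in Python with NumPy (not course material, just an illustration of the idea) applies the Hadamard operation to the |0⟩ state:

```python
import numpy as np

# A single-qubit state is a length-2 complex vector; |0> = [1, 0].
ket0 = np.array([1, 0], dtype=complex)

# The Hadamard gate as a 2x2 matrix, one of the single-qubit operations
# typically covered in an introductory course like this one.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

# Applying an operation to a state is matrix multiplication.
superposition = H @ ket0

# Measurement probabilities are the squared magnitudes of the amplitudes:
# an equal superposition gives a 50/50 outcome.
probs = np.abs(superposition) ** 2
print(probs)  # [0.5 0.5]
```

This is exactly the kind of small linear-algebra calculation the overview says students will learn to do by hand.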
Syllabus
QIS Applications & Hardware
Quantum Operations
Qubit Representation
Measurement
Superposition
Matrix Multiplication
Multi-Qubit Operations
Quantum Circuits
Entanglement
Deutsch’s Algorithm
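The final syllabus item, Deutsch's algorithm, is the "first algorithm" the overview mentions: it decides whether a one-bit function f is constant or balanced with a single query to f, something a classical computer cannot do. As a rough sketch of how the pieces of the syllabus combine (a NumPy simulation, assuming the standard textbook circuit, not the course's own code):

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate
I = np.eye(2)

def oracle(f):
    """Build U_f |x>|y> = |x>|y XOR f(x)> as a 4x4 permutation matrix."""
    U = np.zeros((4, 4))
    for x in (0, 1):
        for y in (0, 1):
            U[2 * x + (y ^ f(x)), 2 * x + y] = 1
    return U

def deutsch(f):
    # Start in |0>|1>, apply H to both qubits, query the oracle once,
    # then apply H to the first qubit and measure it.
    state = np.kron([1.0, 0.0], [0.0, 1.0])
    state = np.kron(H, H) @ state
    state = oracle(f) @ state
    state = np.kron(H, I) @ state
    # Probability the first qubit reads 1 (amplitudes of |10> and |11>).
    p1 = state[2] ** 2 + state[3] ** 2
    return "balanced" if p1 > 0.5 else "constant"

print(deutsch(lambda x: 0))  # constant
print(deutsch(lambda x: x))  # balanced
```

The single call to `oracle(f)` is the performance advantage the overview refers to: classically, distinguishing the two cases requires evaluating f twice.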
Taught by
Diana Franklin
Related Courses
AWS Certified Machine Learning - Specialty (LA) (A Cloud Guru)
Blockchain Essentials (A Cloud Guru)
Algorithms for DNA Sequencing (Johns Hopkins University via Coursera)
Applied AI with DeepLearning (IBM via Coursera)
Artificial Intelligence Algorithms, Models and Limitations (LearnQuest via Coursera)