Applications of Linear Algebra
Offered By: Georgia Institute of Technology via edX
Course Description
Overview
This certificate program will take students through roughly seven weeks of MATH 1554, Linear Algebra, as taught in the School of Mathematics at The Georgia Institute of Technology.
In the first course, you will explore the determinant, which yields two important results. First, you will be able to apply an invertibility criterion for a square matrix that plays a pivotal role in, for example, the understanding of eigenvalues. Second, you will use the determinant to measure the amount by which a linear transformation changes the area of a region, an idea that plays a critical role in computer graphics and in other more advanced courses, such as multivariable calculus. The first course then moves on to eigenvalues and eigenvectors. The goal of this part of the course is to decompose the action of a linear transformation so that it may be visualized. The main applications described here are to discrete dynamical systems, including Markov chains. However, the basic concepts afforded by eigenvectors and eigenvalues are useful throughout industry, science, engineering and mathematics.
In the second course you will explore methods to compute an approximate solution to an inconsistent system of equations, that is, a system that has no solution. This has a central role in the understanding of current data science applications. The second course then turns to symmetric matrices. They arise often in applications, notably in the singular value decomposition, which is another tool often found in data science and machine learning.
Syllabus
Course 1: Linear Algebra III: Determinants and Eigenvalues
This course takes you through roughly three weeks of MATH 1554, Linear Algebra, as taught in the School of Mathematics at The Georgia Institute of Technology.
Course 2: Linear Algebra IV: Orthogonality & Symmetric Matrices and the SVD
This course takes you through roughly five weeks of MATH 1554, Linear Algebra, as taught in the School of Mathematics at The Georgia Institute of Technology.
Courses
- Linear Algebra III: Determinants and Eigenvalues
At the beginning of this course, we introduce the determinant, which yields two important concepts that you will use throughout. First, you will be able to apply an invertibility criterion for a square matrix that plays a pivotal role in, for example, the understanding of eigenvalues. You will also use the determinant to measure the amount by which a linear transformation changes the area of a region. This idea plays a critical role in computer graphics and in other more advanced courses, such as multivariable calculus.
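As a concrete illustration of these two roles of the determinant, here is a short NumPy sketch (not part of the course materials; the matrix is an arbitrary example): it checks invertibility via a nonzero determinant and reads off the factor by which the transformation x -> Ax scales area.

```python
# Illustrative sketch only; A is an assumed example matrix.
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

det_A = np.linalg.det(A)
print("det(A) =", det_A)
print("A is invertible:", not np.isclose(det_A, 0.0))

# The unit square has area 1; its image under x -> Ax is a parallelogram
# whose area equals |det(A)|.
print("area scaling factor:", abs(det_A))  # 5.0 here, up to rounding
```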
This course then moves on to eigenvalues and eigenvectors. The goal of this part of the course is to decompose the action of a linear transformation so that it may be visualized. The main applications described here are to discrete dynamical systems, including Markov chains. However, the basic concepts of eigenvectors and eigenvalues are useful throughout industry, science, engineering and mathematics.
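For readers who want to see the Markov-chain application in code, the following NumPy sketch (illustrative only; the transition matrix is an assumed example) finds the steady state of a two-state chain as the eigenvector belonging to the eigenvalue 1, and checks it against direct iteration of the dynamical system.

```python
# Illustrative sketch only; P is an assumed two-state transition matrix.
import numpy as np

# Column-stochastic transition matrix: each column sums to 1.
P = np.array([[0.9, 0.2],
              [0.1, 0.8]])

eigenvalues, eigenvectors = np.linalg.eig(P)
k = np.argmin(np.abs(eigenvalues - 1.0))   # locate the eigenvalue 1
steady = eigenvectors[:, k].real
steady = steady / steady.sum()             # normalize so entries sum to 1
print("steady-state vector:", steady)      # approximately [2/3, 1/3]

# Iterating the dynamical system x_{k+1} = P x_k converges to the same vector.
x = np.array([1.0, 0.0])
for _ in range(100):
    x = P @ x
print("after 100 steps:   ", x)
```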
Prospective students are encouraged to complete the linear equations and matrix algebra courses before starting this class.
- Linear Algebra IV: Orthogonality & Symmetric Matrices and the SVD
In the first part of this course you will explore methods to compute an approximate solution to an inconsistent system of equations, one that has no solution. Our overall approach is to center our algorithms on the concept of distance. To this end, you will first tackle the ideas of distance and orthogonality in a vector space. You will then apply orthogonality to identify the point within a subspace that is nearest to a point outside of it. This has a central role in the understanding of solutions to inconsistent systems. By taking the subspace to be the column space of a matrix, you will develop a method for producing approximate (“least-squares”) solutions for inconsistent systems.
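A brief NumPy sketch of this least-squares idea may help (the system below is an assumed example, not taken from the course): solving the normal equations A^T A x = A^T b projects b onto the column space of A, and np.linalg.lstsq returns the same approximate solution.

```python
# Illustrative sketch only; A and b are assumed example values.
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])   # Ax = b has no exact solution

# Normal equations give the least-squares solution x_hat.
x_hat = np.linalg.solve(A.T @ A, A.T @ b)
print("least-squares solution:", x_hat)    # [5., -3.]

# np.linalg.lstsq computes the same thing more robustly.
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)
print("lstsq solution:        ", x_lstsq)

# A @ x_hat is the point in Col(A) nearest to b; the residual is
# orthogonal to the column space.
residual = b - A @ x_hat
print("A^T residual (should be ~0):", A.T @ residual)
```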
You will then explore another application of orthogonal projections: creating a matrix factorization widely used in practical applications of linear algebra. The remaining sections examine some of the many least-squares problems that arise in applications, including the least-squares procedure with more general polynomials and functions.
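In standard treatments, the factorization built from orthogonal projections is the QR factorization, and that is what the following sketch assumes (the data values are made up): it computes a QR factorization and then fits a quadratic polynomial to sample data by least squares.

```python
# Illustrative sketch only; matrices and data points are assumed examples.
import numpy as np

# QR factorization: columns of Q are orthonormal, R is upper triangular.
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0],
              [1.0, 4.0]])
Q, R = np.linalg.qr(A)
print("Q^T Q ~ I:\n", np.round(Q.T @ Q, 10))
print("A ~ QR:", np.allclose(A, Q @ R))

# Least-squares fit of a quadratic y ~ c0 + c1*t + c2*t^2 to sample data.
t = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.0, 2.7, 5.8, 10.9, 17.2])
X = np.column_stack([np.ones_like(t), t, t**2])   # design matrix
coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
print("fitted quadratic coefficients:", coeffs)
```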
This course then turns to symmetric matrices, which arise more often in applications, in one way or another, than any other major class of matrices. You will construct the diagonalization of a symmetric matrix, which forms the basis for the remainder of the course.
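The sketch below (example matrices are assumed, not taken from the course) shows the orthogonal diagonalization S = P D P^T that NumPy's eigh routine produces for a symmetric matrix, followed by a singular value decomposition of a rectangular matrix, the tool the course builds toward.

```python
# Illustrative sketch only; S and A are assumed example matrices.
import numpy as np

S = np.array([[3.0, 1.0],
              [1.0, 3.0]])            # symmetric: S == S.T

# eigh is specialized for symmetric matrices; it returns real eigenvalues
# and an orthonormal set of eigenvectors.
eigenvalues, P = np.linalg.eigh(S)
D = np.diag(eigenvalues)
print("S ~ P D P^T:", np.allclose(S, P @ D @ P.T))

# The SVD works for any matrix, square or not.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
U, singular_values, Vt = np.linalg.svd(A, full_matrices=False)
print("singular values:", singular_values)
print("A ~ U Sigma V^T:", np.allclose(A, U @ np.diag(singular_values) @ Vt))
```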
Taught by
Greg Mayer
Related Courses
Computational Photography (Georgia Institute of Technology via Coursera)
Computer Graphics (University of California, San Diego via edX)
Interactive 3D Graphics (Autodesk via Udacity)
Introducción a la Programación para Ciencias e Ingeniería (Universidad Politécnica de Madrid via Miríadax)
Interactive Computer Graphics (University of Tokyo via Coursera)