Transformations and Automatic Differentiation in Computational Thinking - Lecture 3
Offered By: The Julia Programming Language via YouTube
Course Description
Overview
Syllabus
Introduction by MIT's Prof. Alan Edelman.
Agenda of Lecture (0:00-1:30): Transformations and Automatic Differentiation.
General Linear Transformation.
Shear Transformation.
Non-Linear Transformation (Warp).
Rotation.
Composing Transformations (Rotate followed by Warp); a short code sketch follows the syllabus.
More Transformations (xy, rθ).
Linear and Non-Linear Transformations.
Linear combinations of Images.
Functions in Maths and in Julia (short form, anonymous, and long form); sketched in code after the syllabus.
Automatic Differentiation of Univariates; an AD code sketch follows the syllabus.
Scalar-Valued Multivariate Functions.
Automatic Differentiation of Scalar-Valued Multivariate Functions.
Minimizing "loss function" in Machine Learning.
Transformations: Vector Valued Multivariate Functions.
Automatic Differentiation of Transformations.
But what is a transformation, really?
Significance of Determinants in Scaling.
Resource for Automatic Differentiation in 10 minutes with Julia.
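The shear, rotation, warp, and composition items above can each be written as an ordinary Julia function acting on a 2-vector. The sketch below is illustrative only and not the lecture's code; the names lin, shear, rotate, and warp are chosen here for convenience.

    # Illustrative sketch: transformations as Julia functions on 2-vectors.
    using LinearAlgebra

    # General linear transformation: multiplication by a 2x2 matrix.
    lin(A) = v -> A * v

    # Shear with parameter a; rotation by angle θ (both linear).
    shear(a)  = lin([1 a; 0 1])
    rotate(θ) = lin([cos(θ) -sin(θ); sin(θ) cos(θ)])

    # A nonlinear "warp": rotate each point by an angle that grows
    # with its distance from the origin.
    warp(α) = v -> rotate(α * norm(v))(v)

    # Composition with ∘: rotate first, then warp.
    rot_then_warp = warp(0.5) ∘ rotate(π / 4)
    rot_then_warp([1.0, 0.0])    # apply the composed map to a point

Because each transformation is just a function, composing two of them needs nothing beyond the function-composition operator ∘.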
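For the syllabus item on function forms, Julia provides three interchangeable ways to define the same function. A minimal sketch, using hypothetical names f, g, and h:

    # Short (one-line assignment) form
    f(x) = x^2 + 3x

    # Anonymous form, bound to a name here only so it can be called below
    g = x -> x^2 + 3x

    # Long form
    function h(x)
        return x^2 + 3x
    end

    f(2), g(2), h(2)   # all three return 10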
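For the automatic differentiation items, here is a minimal sketch assuming the ForwardDiff.jl package as one concrete AD tool in Julia; the lecture and the linked "10 minutes" resource may use a different implementation (for example, hand-written dual numbers), and the functions and numbers below are illustrative rather than taken from the lecture.

    using ForwardDiff, LinearAlgebra

    # Univariate: derivative of a scalar function of one variable.
    f(x) = x^2 + 3x
    ForwardDiff.derivative(f, 2.0)            # ≈ 7.0

    # Scalar-valued multivariate: gradient of a loss-like function,
    # and one gradient-descent step toward its minimizer.
    loss(v) = (v[1] - 1)^2 + (v[2] + 2)^2
    v = [0.0, 0.0]
    v -= 0.1 * ForwardDiff.gradient(loss, v)

    # Vector-valued multivariate (a transformation): its Jacobian matrix.
    T(v) = [v[1] + 0.5 * v[2], v[2]]          # a shear written as a transformation
    J = ForwardDiff.jacobian(T, [1.0, 1.0])

    # The determinant of the Jacobian is the local area-scaling factor,
    # the idea behind "Significance of Determinants in Scaling".
    det(J)                                    # ≈ 1.0: a shear preserves area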
Taught by
The Julia Programming Language
Related Courses
Introduction to Neural Networks and PyTorch (IBM via Coursera)
Regression with Automatic Differentiation in TensorFlow (Coursera Project Network via Coursera)
Neural Network from Scratch in TensorFlow (Coursera Project Network via Coursera)
Customising your models with TensorFlow 2 (Imperial College London via Coursera)
PyTorch Fundamentals (Microsoft via Microsoft Learn)