Gradients for Everyone: A Quick Guide to Autodiff in Julia
Offered By: The Julia Programming Language via YouTube
Course Description
Overview
Explore the world of automatic differentiation (AD) in Julia through this informative conference talk. Dive into the core concepts behind taking gradients of arbitrary computer programs, a crucial element in scientific and machine learning breakthroughs. Compare Julia's approach to AD with Python's fragmented frameworks, and discover the vision of making the entire Julia language differentiable. Learn about various AD packages in Julia, including ForwardDiff, ReverseDiff, Zygote, and Enzyme, and understand their distinct tradeoffs. Gain insights from both package developer and user perspectives, covering topics such as classification of AD systems, forward and reverse modes, making code differentiable, and using differentiable code effectively. Acquire the knowledge needed to make informed decisions about AD implementation in your Julia projects.
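To make the tradeoffs mentioned above concrete, here is a minimal sketch (not taken from the talk) of how the same gradient can be computed with ForwardDiff's forward mode and Zygote's reverse mode; the function `f` is an illustrative placeholder, and both packages are assumed to be installed.

```julia
using ForwardDiff, Zygote

# A plain Julia function; no special annotations are needed to differentiate it.
f(x) = sum(abs2, x) / length(x)

x = [1.0, 2.0, 3.0]

# Forward mode via ForwardDiff: propagates dual numbers through f.
g_forward = ForwardDiff.gradient(f, x)

# Reverse mode via Zygote: builds a pullback and runs it backward.
g_reverse = Zygote.gradient(f, x)[1]

g_forward ≈ g_reverse  # both should equal 2x / length(x)
```

As a rule of thumb, forward mode's cost scales with the number of inputs and reverse mode's with the number of outputs, which is one of the tradeoffs the talk weighs when choosing between packages.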
Syllabus
Gradients for everyone: a quick guide to autodiff in Julia | Dalle, Hill | JuliaCon 2024
Taught by
The Julia Programming Language