Raising the Ki of a SciML Model - Enhancing GOKU-nets for Scientific Machine Learning
Offered By: The Julia Programming Language via YouTube
Course Description
Overview
Explore advancements in Scientific Machine Learning (SciML) in this JuliaCon 2024 conference talk by Germán Abrevaya. Dive into GOKU-nets (Generative ODE Modeling with Known Unknowns) and their enhanced version, GOKU-UI (GOKU-nets with Ubiquitous Inference), which combine domain-aware, interpretable modeling with machine learning techniques to improve performance in reconstruction and forecasting tasks.

Learn how GOKU-nets are implemented in Julia, leveraging the SciML ecosystem and Flux, and examine the key enhancements introduced in GOKU-UI: attention mechanisms and a novel training strategy based on multiple shooting techniques. Gain insights into the models' applications on simulated and empirical data, particularly in capturing complex brain dynamics from resting-state fMRI recordings, and explore the potential of these advancements for research on brain function and the classification of psychiatric conditions.

Through experimental evidence and detailed explanations of the improvements, develop a comprehensive understanding of both the foundational GOKU-net model and its advanced iteration, GOKU-UI, and apply the lessons learned to enhance your own SciML models and contribute to the growing field of Scientific Machine Learning.
Syllabus
Raising the Ki of a SciML model | Abrevaya | JuliaCon 2024
Taught by
The Julia Programming Language
Related Courses
Neural Networks for Machine Learning - University of Toronto via Coursera
Good Brain, Bad Brain: Basics - University of Birmingham via FutureLearn
Statistical Learning with R - Stanford University via edX
Machine Learning 1—Supervised Learning - Brown University via Udacity
Fundamentals of Neuroscience, Part 2: Neurons and Networks - Harvard University via edX