Towards Robust and Efficient AI at Scale
Offered By: NHR@FAU via YouTube
Course Description
Overview
Explore cutting-edge research on robust and efficient AI at scale in this 51-minute NHR PerfLab seminar talk by Dr. Charlotte Debus. Delve into the challenges and opportunities of large-scale AI models, focusing on time series forecasting in scientific applications. Learn about transformer architectures, scalability issues, and the balance between computational power and energy efficiency. Discover innovative approaches to address the growing energy consumption of AI, including distributed optimization techniques and hyperparameter tuning. Gain insights into real-world applications such as electric load forecasting and power generation prediction. Examine the environmental impact of AI workloads and explore strategies for developing more sustainable AI solutions. Engage with topics like uncertainty quantification, attention matrix sparsity, and community outreach in the context of advancing AI technologies for scientific research and engineering.
Syllabus
Introduction
What does it do
Electric load forecasting
Transformer architecture
Robustness
AI methods
Transformer architectures
The curse of dimensionality
Large-scale AI
Scalable AI
Heat
Task scheduling
Neural networks
Distributed asynchronous and selective optimization
Hyperparameter optimization
Environmental footprint
Efficiency
Community outreach
Energy consumption
AI workload
Perun
Model energy consumption
Uncertainty quantification
Questions
Power generation forecasting
Attention matrix sparsity
Energy efficiency
Taught by
NHR@FAU
Related Courses
Scientific Computing (University of Washington via Coursera)
Biology Meets Programming: Bioinformatics for Beginners (University of California, San Diego via Coursera)
High Performance Scientific Computing (University of Washington via Coursera)
Practical Numerical Methods with Python (George Washington University via Independent)
Julia Scientific Programming (University of Cape Town via Coursera)