Learning Paradigms for Neural Networks: The Locally Backpropagated Forward-Forward Algorithm
Offered By: Inside Livermore Lab via YouTube
Course Description
Overview
Explore a cutting-edge approach to neural network training in this 57-minute talk by Fabio Giampaolo from the University of Naples Federico II. Delve into the Locally Backpropagated Forward-Forward training strategy, a novel method that combines the effectiveness of backpropagation with the appealing attributes of the Forward-Forward algorithm. Understand how this technique addresses the limitations of traditional training methods, particularly when integrating Deep Learning strategies into complex frameworks for physics-related problems. Learn about challenges such as incorporating non-differentiable components into neural architectures and implementing distributed learning on heterogeneous devices. Gain insight into how this approach can broaden the applicability of AI strategies in real-world settings where conventional methods fall short.
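For context on the baseline the talk builds on: in Hinton's Forward-Forward algorithm, each layer is trained locally, with no backpropagation through the rest of the network, by pushing a "goodness" score (the sum of squared activations) above a threshold for real ("positive") data and below it for contrastive ("negative") data. The sketch below is a minimal single-layer illustration of that idea, not the locally backpropagated variant presented in the talk; the toy data, layer sizes, and hyperparameters are all assumptions chosen for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

def goodness(h):
    # Goodness of a layer: sum of squared activations per sample.
    return (h ** 2).sum(axis=1)

class FFLayer:
    """One layer trained locally; gradients never cross layer boundaries."""
    def __init__(self, n_in, n_out, lr=0.05, theta=2.0):
        self.W = rng.standard_normal((n_in, n_out)) * 0.1
        self.b = np.zeros(n_out)
        self.lr, self.theta = lr, theta

    def _normalize(self, x):
        # Length-normalize inputs so goodness from the previous layer
        # cannot leak into this layer's decision.
        return x / (np.linalg.norm(x, axis=1, keepdims=True) + 1e-8)

    def forward(self, x):
        return np.maximum(0.0, self._normalize(x) @ self.W + self.b)

    def train_step(self, x_pos, x_neg):
        # sign = +1 pushes goodness above theta, -1 pushes it below.
        for x, sign in ((x_pos, +1.0), (x_neg, -1.0)):
            xn = self._normalize(x)
            pre = xn @ self.W + self.b
            h = np.maximum(0.0, pre)
            g = goodness(h)
            # Logistic loss log(1 + exp(-sign * (g - theta))); its gradient
            # depends only on this layer's own activations.
            p = 1.0 / (1.0 + np.exp(-sign * (g - self.theta)))
            dg = -sign * (1.0 - p)            # dL/dg per sample
            dpre = (2.0 * h) * dg[:, None] * (pre > 0)
            self.W -= self.lr * xn.T @ dpre / len(x)
            self.b -= self.lr * dpre.mean(axis=0)

# Toy data: positive samples cluster around +1, negatives around -1.
x_pos = rng.standard_normal((256, 8)) + 1.0
x_neg = rng.standard_normal((256, 8)) - 1.0

layer = FFLayer(8, 16)
for _ in range(500):
    layer.train_step(x_pos, x_neg)

g_pos = goodness(layer.forward(x_pos)).mean()
g_neg = goodness(layer.forward(x_neg)).mean()
print(g_pos > g_neg)  # positive data should end up with higher goodness
```

In a deep Forward-Forward network, each layer repeats this local objective on the (normalized) outputs of the layer below; the locally backpropagated variant discussed in the talk replaces parts of this purely local update with conventional backpropagation within a restricted scope.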
Syllabus
DDPS | Learning paradigms for neural networks: The locally backpropagated forward-forward algorithm
Taught by
Inside Livermore Lab
Related Courses
Deep Learning Fundamentals with Keras (IBM via edX)
Deep Learning Essentials (Université de Montréal via edX)
Deep Learning with TensorFlow 2.0 (Udemy)
Data Science: Deep Learning and Neural Networks in Python (Udemy)
Neural Networks and Deep Learning (DeepLearning.AI via Coursera)