YoVDO

Learning Paradigms for Neural Networks: The Locally Backpropagated Forward-Forward Algorithm

Offered By: Inside Livermore Lab via YouTube

Tags

Neural Networks Courses, Artificial Intelligence Courses, Machine Learning Courses, Deep Learning Courses, Backpropagation Courses

Course Description

Overview

Explore a cutting-edge approach to neural network training in this 57-minute talk by Fabio Giampaolo from the University of Naples Federico II. Delve into the Locally Backpropagated Forward-Forward training strategy, a novel method combining the effectiveness of backpropagation with the appealing attributes of the Forward-Forward algorithm. Understand how this technique addresses limitations of traditional training methods, particularly when integrating Deep Learning strategies into complex frameworks dealing with physics-related problems. Learn about challenges such as incorporating non-differentiable components in neural architectures and implementing distributed learning on heterogeneous devices. Gain insights into the potential of this approach to broaden the applicability of AI strategies in real-world situations, especially in contexts where conventional methods face limitations.
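
As background, the Forward-Forward algorithm that the talk builds on (Hinton, 2022) replaces the single end-to-end backward pass with purely local, per-layer updates: each layer is trained to produce high "goodness" (e.g., mean squared activation) on positive data and low goodness on negative data. The sketch below illustrates only this underlying per-layer idea in PyTorch, not the locally backpropagated variant presented in the talk; the layer sizes, threshold, learning rate, and random toy data are illustrative assumptions.

# Minimal sketch of the Forward-Forward idea: each layer has its own optimizer
# and a local "goodness" objective, so no gradients cross layer boundaries.
# Hyperparameters and toy data below are assumptions, not the speaker's setup.
import torch
import torch.nn as nn
import torch.nn.functional as F

class FFLayer(nn.Module):
    def __init__(self, dim_in, dim_out, threshold=2.0, lr=0.03):
        super().__init__()
        self.linear = nn.Linear(dim_in, dim_out)
        self.threshold = threshold
        self.opt = torch.optim.Adam(self.parameters(), lr=lr)

    def forward(self, x):
        # Normalize so only the direction of the previous layer's activity
        # (not its magnitude/goodness) is passed forward.
        x = x / (x.norm(dim=1, keepdim=True) + 1e-8)
        return F.relu(self.linear(x))

    def train_step(self, x_pos, x_neg):
        # Goodness = mean squared activation; push it above the threshold
        # for positive samples and below it for negative samples.
        g_pos = self.forward(x_pos).pow(2).mean(dim=1)
        g_neg = self.forward(x_neg).pow(2).mean(dim=1)
        loss = (F.softplus(self.threshold - g_pos) +
                F.softplus(g_neg - self.threshold)).mean()
        self.opt.zero_grad()
        loss.backward()   # local update: only this layer's weights change
        self.opt.step()
        # Detach outputs so the next layer trains on fixed inputs.
        return self.forward(x_pos).detach(), self.forward(x_neg).detach()

# Toy usage: two layers trained greedily, layer by layer, on random data
# standing in for real positive/negative examples.
layers = [FFLayer(784, 256), FFLayer(256, 256)]
x_pos, x_neg = torch.rand(32, 784), torch.rand(32, 784)
for layer in layers:
    x_pos, x_neg = layer.train_step(x_pos, x_neg)

Because each layer's update is self-contained, this style of training is attractive for the settings the talk highlights, such as architectures with non-differentiable components and distributed learning across heterogeneous devices.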

Syllabus

DDPS | Learning paradigms for neural networks: The locally backpropagated forward-forward algorithm


Taught by

Inside Livermore Lab

Related Courses

Introduction to Artificial Intelligence
Stanford University via Udacity
Natural Language Processing
Columbia University via Coursera
Probabilistic Graphical Models 1: Representation
Stanford University via Coursera
Computer Vision: The Fundamentals
University of California, Berkeley via Coursera
Learning from Data (Introductory Machine Learning course)
California Institute of Technology via Independent