Kolmogorov-Arnold Networks - Alternatives to Multi-Layer Perceptrons

Offered By: Valence Labs via YouTube

Tags

Artificial Intelligence Courses, Machine Learning Courses, Neural Networks Courses, Drug Discovery Courses, Activation Functions Courses

Course Description

Overview

Explore Kolmogorov-Arnold Networks (KANs) as promising alternatives to Multi-Layer Perceptrons (MLPs) in this comprehensive 1-hour-35-minute talk presented by Ziming Liu and hosted by Valence Labs. Learn how KANs, inspired by the Kolmogorov-Arnold representation theorem, place learnable activation functions on edges instead of fixed activation functions on nodes. Discover why smaller KANs can achieve accuracy comparable to or better than that of larger MLPs in data fitting and PDE solving, and how they exhibit faster neural scaling laws. Examine the interpretability advantages of KANs, including intuitive visualization and easy interaction with human users. Through examples from mathematics and physics, see how KANs can help scientists (re)discover mathematical and physical laws. The talk covers background material, comparisons between MLPs and KANs, accuracy and scaling, interpretability for scientific applications, strengths and weaknesses, philosophical aspects, and behind-the-scenes anecdotes, and it concludes with a Q&A session.
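
For orientation, the theorem the networks are named after states, in rough form, that any continuous function of n variables on a bounded domain can be written using only univariate functions and addition:

\[
f(x_1, \dots, x_n) \;=\; \sum_{q=1}^{2n+1} \Phi_q\!\left( \sum_{p=1}^{n} \phi_{q,p}(x_p) \right)
\]

A KAN layer follows the same pattern: each edge carries a learnable univariate function (typically a spline) and each node simply sums its incoming values, whereas an MLP places learnable linear weights on edges and a fixed nonlinearity on nodes; deeper KANs are built by stacking such layers.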

Syllabus

- Intro + Background
- From KART to KAN
- MLP vs KAN
- Accuracy: Scaling of KANs
- Interpretability: KAN for Science
- Q+A Break
- Strengths and Weaknesses
- Philosophy
- Anecdotes Behind the Scenes
- Final Thoughts
- Q+A


Taught by

Valence Labs

Related Courses

Drug Discovery
University of California, San Diego via Coursera
新药发现和药物靶点 | Drug Discovery and its Target
Peking University via edX
Principles and Applications of NMR Spectroscopy
Indian Institute of Science Bangalore via Swayam
Cell Culture Technologies
Indian Institute of Technology Kanpur via Swayam
Medicinal Chemistry
Indian Institute of Technology Madras via Swayam