Toward Length Extrapolatable Transformers

Offered By: Center for Language & Speech Processing (CLSP), JHU via YouTube

Tags

Transformers, Machine Learning, Neural Networks, Computational Linguistics, Attention Mechanisms, Language Models, Sequence Modeling

Course Description

Overview

Explore cutting-edge research on length extrapolation in Transformer models in this hour-long lecture by Ta-Chung Chi of Carnegie Mellon University. Delve into approaches for enabling Transformer architectures to generalize to sequences longer than those seen during training, a persistent challenge in natural language processing and machine learning. Gain insight into recent techniques aimed at improving the scalability and adaptability of these models across input lengths.
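
The listing does not say which methods the talk covers. As one concrete point of reference, the sketch below implements ALiBi (Attention with Linear Biases, Press et al. 2022), a widely cited length-extrapolation technique that replaces position embeddings with a distance-proportional penalty on attention scores; the function and parameter names here are illustrative, not taken from the lecture.

import torch

def alibi_bias(num_heads: int, seq_len: int) -> torch.Tensor:
    # One geometric slope per head, following Press et al. (2022).
    slopes = torch.tensor(
        [2.0 ** (-8.0 * (h + 1) / num_heads) for h in range(num_heads)]
    )
    # rel[i, j] = j - i: zero on the diagonal and increasingly negative
    # for tokens farther in the past, so attention to distant tokens is
    # penalized linearly with distance.
    pos = torch.arange(seq_len)
    rel = (pos[None, :] - pos[:, None]).float()
    # Shape (num_heads, seq_len, seq_len); added to the attention logits
    # before the causal mask and softmax.
    return slopes[:, None, None] * rel

# The bias is a pure function of relative distance, so it can be rebuilt
# for any sequence length at inference time, including lengths never seen
# during training -- the core idea behind length extrapolation.
bias = alibi_bias(num_heads=8, seq_len=16)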

Syllabus

Toward Length Extrapolatable Transformers -- Ta-Chung Chi (CMU)


Taught by

Center for Language & Speech Processing (CLSP), JHU

Related Courses

Neural Networks for Machine Learning
University of Toronto via Coursera
Good Brain, Bad Brain: Basics
University of Birmingham via FutureLearn
Statistical Learning with R
Stanford University via edX
Machine Learning 1—Supervised Learning
Brown University via Udacity
Fundamentals of Neuroscience, Part 2: Neurons and Networks
Harvard University via edX