Using MPT-7B in Hugging Face and LangChain
Offered By: James Briggs via YouTube
Course Description
Overview
Explore the implementation of MosaicML's new MPT-7B language model in Hugging Face transformers and LangChain. Learn how to use the various MPT-7B models, including the Instruct, Chat, and StoryWriter-65k+ versions, while gaining access to tooling such as AI agents and chatbot functionality. Follow along with Python setup, model initialization, tokenizer configuration, and text generation. Discover the potential of open-source LLMs and their integration with popular NLP libraries for advanced natural language processing tasks.
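As a rough illustration of the model-loading step covered in the course, a minimal sketch (not the instructor's exact code) of initializing MPT-7B-Instruct with Hugging Face transformers might look like the following; the dtype and evaluation-mode choices are assumptions based on the public model card.

```python
# Illustrative sketch of loading MPT-7B-Instruct with Hugging Face transformers.
# The dtype choice here is an assumption, not the course's exact setting.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mosaicml/mpt-7b-instruct"

# MPT ships custom model code on the Hub, so trust_remote_code=True is required.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
)
model.eval()

# MPT-7B reuses the EleutherAI GPT-NeoX-20B tokenizer.
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")
```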
Syllabus
Open Source LLMs like MPT-7B
MPT-7B Models in Hugging Face
Python setup
Initializing MPT-7B-Instruct
Initializing the MPT-7B tokenizer
Stopping Criteria and HF Pipeline
Hugging Face Pipeline
Generating Text with Hugging Face
Implementing MPT-7B in LangChain (a code sketch illustrating the pipeline and LangChain steps follows this syllabus)
Final Thoughts on Open Source LLMs
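The later syllabus steps (stopping criteria, the Hugging Face pipeline, text generation, and the LangChain wrapper) could be sketched roughly as below; the stop strings, generation parameters, and LangChain import path are assumptions and may differ from the video.

```python
# Minimal sketch of the stopping-criteria, pipeline, and LangChain steps above.
# Stop strings, generation settings, and import paths are assumptions.
import torch
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    StoppingCriteria,
    StoppingCriteriaList,
    pipeline,
)
from langchain.llms import HuggingFacePipeline  # moved to langchain_community in newer releases

model_id = "mosaicml/mpt-7b-instruct"
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, trust_remote_code=True
)
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")

# Stop generation on an end-of-text token or when a new "### Human:" turn begins.
stop_token_ids = [tokenizer(s)["input_ids"] for s in ["<|endoftext|>", "### Human:"]]

class StopOnTokens(StoppingCriteria):
    def __call__(self, input_ids, scores, **kwargs) -> bool:
        # Compare the tail of the generated ids against each stop sequence.
        for ids in stop_token_ids:
            if input_ids[0][-len(ids):].tolist() == ids:
                return True
        return False

generate = pipeline(
    task="text-generation",
    model=model,
    tokenizer=tokenizer,
    stopping_criteria=StoppingCriteriaList([StopOnTokens()]),
    max_new_tokens=128,
    repetition_penalty=1.1,
    return_full_text=True,
)

# Wrap the pipeline so it can be used as an LLM inside LangChain chains and agents.
llm = HuggingFacePipeline(pipeline=generate)
print(llm("Explain what MPT-7B is in one sentence."))
```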
Taught by
James Briggs
Related Courses
Neural Networks for Machine Learning - University of Toronto via Coursera
機器學習技法 (Machine Learning Techniques) - National Taiwan University via Coursera
Machine Learning Capstone: An Intelligent Application with Deep Learning - University of Washington via Coursera
Прикладные задачи анализа данных (Applied Problems of Data Analysis) - Moscow Institute of Physics and Technology via Coursera
Leading Ambitious Teaching and Learning - Microsoft via edX