Databricks to Local LLMs
Offered By: Duke University via Coursera
Course Description
Overview
By the end of this course, learners will be able to use Databricks to perform data engineering and data analytics tasks within data science workflows. Learners will also run local large language models such as Mixtral using Hugging Face Candle and Mozilla llamafile.
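As a taste of the local-LLM portion of the course, the sketch below queries a llamafile running on your own machine through its OpenAI-compatible chat endpoint. It assumes a Mixtral llamafile has already been started in server mode and is listening on the default local port; the model filename, port, and prompt are illustrative assumptions, not course materials.

```python
# Minimal sketch: query a locally running llamafile (e.g. a Mixtral build)
# through its OpenAI-compatible chat endpoint. The URL, model name, and
# prompt are assumptions for illustration only.
import json
import urllib.request

LLAMAFILE_URL = "http://localhost:8080/v1/chat/completions"  # assumed llamafile server default

payload = {
    "model": "local",  # llamafile serves a single local model; the name is nominal
    "messages": [
        {"role": "user", "content": "Summarize the Databricks Lakehouse in one sentence."}
    ],
    "temperature": 0.2,
}

request = urllib.request.Request(
    LLAMAFILE_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(request) as response:
    reply = json.loads(response.read())

print(reply["choices"][0]["message"]["content"])
```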
Syllabus
- Databricks Lakehouse Platform Fundamentals
- In this module, you will learn how to describe the Databricks architecture, create clusters, use notebooks for analysis, and share notebooks by completing hands-on labs and knowledge checks on these topics.
- Data Transformation and Pipelines
- In this module, you will learn how to read and transform data, create Delta Lake pipelines, and work with complex data types by implementing ETL solutions and passing code sample reviews (a minimal ETL sketch follows this list).
- Responsible Generative AI
- In this module, you will learn the foundations of generative AI and responsible deployment strategies so you can benefit from the latest advancements while maintaining safety, accuracy, and oversight. By applying concepts directly through hands-on labs and peer discussions, you will gain practical experience putting AI into production.
- Local LLMOps
- In this module, you will learn mitigation strategies, evaluate task performance, and operationalize workflows by identifying risks in notebooks and deploying an LLM application.
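To make the Data Transformation and Pipelines module concrete, here is a minimal sketch of a Delta Lake ETL step of the kind built on Databricks: read raw data, clean and reshape it, and write a Delta table. The storage path, column names, and table name are illustrative assumptions, not taken from the course.

```python
# Minimal sketch of a Delta Lake ETL step on Databricks.
# Paths, columns, and the target table name are assumptions for illustration.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically in Databricks notebooks

# Extract: read raw CSV data from mounted cloud storage (path is an assumption)
raw = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("/mnt/raw/orders.csv")
)

# Transform: normalize types, drop incomplete rows, derive a new column
orders = (
    raw.withColumn("order_date", F.to_date("order_date"))
    .withColumn("amount", F.col("amount").cast("double"))
    .dropna(subset=["order_id", "amount"])
    .withColumn("is_large_order", F.col("amount") > 1000)
)

# Load: write the result as a Delta table that downstream pipelines can query
orders.write.format("delta").mode("overwrite").saveAsTable("silver.orders")
```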
Taught by
Noah Gift, Alfredo Deza and Derek Wales
Related Courses
- Distributed Computing with Spark SQL (University of California, Davis via Coursera)
- Apache Spark (TM) SQL for Data Analysts (Databricks via Coursera)
- Building Your First ETL Pipeline Using Azure Databricks (Pluralsight)
- Implement a data lakehouse analytics solution with Azure Databricks (Microsoft via Microsoft Learn)
- Perform data science with Azure Databricks (Microsoft via Microsoft Learn)