Making Your Company LLM-native
Offered By: MLOps.community via YouTube
Course Description
Overview
Explore the concept of making your company LLM-native in this insightful podcast episode featuring Francisco Ingham, Founder of Pampa Labs. Delve into the strategic application of LLMs for scaling businesses, enhancing user experiences, and integrating AI into daily workflows. Discover unexpected LLM applications, learn about experiment tracking optimizations, and understand the importance of SEO expertise in AI integration. Gain valuable insights on AI operating systems, agents, and the spectrum of RAG approaches. Examine the differences between search and retrieval in AI, compare recommender systems with RAG, and uncover key considerations for LLM interface design.
Syllabus
Francisco's preferred coffee
Takeaways
A Literature Geek
LLM-native company
Integrating LLMs into workflows
Unexpected LLM applications
LLMs in the development process
Vibe check to evaluation
Experiment tracking optimizations
LLMs as judges discussion
Automated presentations for the podcast
AI operating system and agents
Importance of SEO expertise
Experimentation and evaluation
AI integration strategies
RAG approach spectrum analysis
Search vs. retrieval in AI
Recommender systems vs. RAG
LLMs in recommender systems
LLM interface design insights
Taught by
MLOps.community
Related Courses
Machine Learning Operations (MLOps): Getting Started - Google Cloud via Coursera
Design and Implementation of Machine Learning Systems (Проектирование и реализация систем машинного обучения) - Higher School of Economics via Coursera
Demystifying Machine Learning Operations (MLOps) - Pluralsight
Machine Learning Engineer with Microsoft Azure - Microsoft via Udacity
Machine Learning Engineering for Production (MLOps) - DeepLearning.AI via Coursera