The Future of AI: LLMs, AGI, and Beyond - Part 2
Offered By: Data Science Dojo via YouTube
Course Description
Overview
Explore the future of AI in this insightful 49-minute podcast episode featuring Raja Iqbal and Bob van Luijt. Delve into the evolution of generative AI, the current LLM landscape, and future prospects. Examine key concepts like RAG, fine-tuning, vector search, and context windows. Discuss the potential of Small Language Models, enterprise adoption challenges, and the interplay between art, science, engineering, and design in AI development. Investigate the possibility of Artificial General Intelligence (AGI) and its implications. Gain valuable insights into GPU vs. CPU usage, generative feedback loops in vector databases, and the concept of the uncanny valley in AI applications.
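For readers new to the vector search and RAG terms mentioned above, the sketch below illustrates the retrieval step those terms refer to: embed documents, embed a query, and return the most similar documents as context for an LLM. It is a minimal, self-contained illustration with invented toy embeddings, not material from the episode; real systems use an embedding model and a vector database rather than an in-memory NumPy array.

```python
# Minimal sketch of the vector-search step behind RAG, using toy data.
import numpy as np

documents = [
    "Fine-tuning adapts a base LLM to a narrow task.",
    "Vector search retrieves documents by embedding similarity.",
    "Bigger context windows let a model read more text at once.",
]
# Hypothetical 4-dimensional embeddings, one row per document (illustrative values).
doc_embeddings = np.array([
    [0.9, 0.1, 0.0, 0.2],
    [0.1, 0.8, 0.3, 0.0],
    [0.0, 0.2, 0.9, 0.1],
])

def cosine_top_k(query_emb, embeddings, k=2):
    """Return indices of the k embeddings most similar to the query (cosine similarity)."""
    q = query_emb / np.linalg.norm(query_emb)
    m = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    scores = m @ q
    return np.argsort(scores)[::-1][:k]

# A query embedding that happens to sit closest to the "vector search" document.
query = np.array([0.2, 0.7, 0.2, 0.1])
for idx in cosine_top_k(query, doc_embeddings):
    print(documents[idx])  # retrieved context that would be passed to the LLM prompt
```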
Syllabus
- Introduction
- Current LLM landscape
- Evolution of Generative AI and its future outlook
- Small Language Models
- The interrelation between art, science, engineering, and design
- GPU vs. CPU
- Context windows - Are bigger context windows always better?
- Artificial General Intelligence (AGI)
- Enterprise adoption risks and challenges with RAG and fine-tuning based solutions
- Generative feedback loop in vector databases
- Vector search as a feature vs. AI-native applications
- Uncanny valley
Taught by
Data Science Dojo
Related Courses
- Pinecone Vercel Starter Template and RAG - Live Code Review Part 2 (Pinecone via YouTube)
- Will LLMs Kill Search? The Future of Information Retrieval (Aleksa Gordić - The AI Epiphany via YouTube)
- RAG But Better: Rerankers with Cohere AI - Improving Retrieval Pipelines (James Briggs via YouTube)
- Advanced RAG - Contextual Compressors and Filters - Lecture 4 (Sam Witteveen via YouTube)
- LangChain Multi-Query Retriever for RAG - Advanced Technique for Broader Vector Space Search (James Briggs via YouTube)