Bringing GenAI to the Modern Enterprise - A Production Use-Case in Serverless Java
Offered By: Devoxx via YouTube
Course Description
Overview
Explore a hands-on workshop that demonstrates how to build, test, and deploy cutting-edge Generative AI applications in a modern enterprise setting using serverless Java. Learn to create a well-balanced, end-to-end, multi-modal RAG application that serves as a reference architecture for enterprise-level GenAI solutions. Gain practical experience with AI orchestration frameworks like SpringAI and Langchain4J, and deploy applications in serverless environments such as Cloud Run. Discover how to leverage multiple Large Language Models (LLMs) in managed environments like Google VertexAI, local environments with Ollama and Testcontainers, and Kubernetes using vLLM. Walk away with a complete codebase, configuration, and deployment instructions to implement powerful GenAI features in Java applications, catering to both seasoned developers and newcomers to Generative AI.
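For readers who want a concrete starting point before the workshop, the sketch below shows how a LangChain4j chat model might be wired against a local Ollama instance of the kind the description mentions. This is a minimal illustration, not code from the workshop repository; the base URL, model name, and prompt are assumptions, and the exact method names vary between LangChain4j versions.

```java
// Minimal sketch (illustrative only, not the workshop's codebase):
// calling a locally running Ollama model through LangChain4j.
import dev.langchain4j.model.chat.ChatLanguageModel;
import dev.langchain4j.model.ollama.OllamaChatModel;

public class LocalChatSketch {

    public static void main(String[] args) {
        // Assumes Ollama is running locally (e.g. via `ollama serve` or a
        // Testcontainers-managed container) and that the "llama3" model has
        // already been pulled -- both are assumptions for this sketch.
        ChatLanguageModel model = OllamaChatModel.builder()
                .baseUrl("http://localhost:11434") // default Ollama endpoint
                .modelName("llama3")               // illustrative model name
                .build();

        // Sends a single user message and prints the model's reply.
        // (In LangChain4j 1.x the equivalent call is chat(...) on ChatModel.)
        String answer = model.generate("Summarize what a RAG pipeline does in one sentence.");
        System.out.println(answer);
    }
}
```

The same builder pattern carries over to the managed and Kubernetes-hosted backends the workshop covers (Vertex AI, vLLM): only the model implementation and its connection settings change, while the application code keeps talking to the ChatLanguageModel abstraction.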
Syllabus
Bringing GenAI to the Modern Enterprise. A production use-case. In Serverless Java! by Dan Dobrin
Taught by
Devoxx
Related Courses
The GenAI Stack - From Zero to Database-Backed Support Bot (Docker via YouTube)
Ollama Crash Course: Running AI Models Locally Offline on CPU (1littlecoder via YouTube)
AI Anytime, Anywhere - Getting Started with LLMs on Your Laptop (Docker via YouTube)
Rust Ollama Tutorial - Interfacing with Ollama API Using ollama-rs (Jeremy Chone via YouTube)
Ollama: Libraries, Vision Models, and OpenAI Compatibility Updates (Sam Witteveen via YouTube)