Deploying Serverless Inference Endpoints - Setup and Cost Comparison
Offered By: Trelis Research via YouTube
Course Description
Overview
Learn how to deploy serverless inference endpoints in this 20-minute video tutorial. Explore the benefits and use cases of serverless APIs, and follow a step-by-step guide to set up your own serverless endpoint. Learn how to run inference against a serverless endpoint and compare the costs of serverless solutions versus traditional GPU rentals. Gain access to valuable resources, including setup guides, inference scripts, and additional repositories for advanced topics such as fine-tuning, vision, and transcription. By the end of this tutorial, you'll have the knowledge to implement efficient and cost-effective serverless inference solutions for your projects.
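As a rough illustration of what "inferencing a serverless endpoint" involves (this is a minimal sketch, not the exact setup used in the video), the snippet below calls a hypothetical OpenAI-compatible serverless endpoint over HTTP. The endpoint URL, model name, and API key environment variable are placeholders you would replace with your own provider's values.

```python
import os
import requests

# Hypothetical values: substitute your provider's endpoint URL, model name, and credentials.
ENDPOINT_URL = "https://api.example-serverless-provider.com/v1/chat/completions"
API_KEY = os.environ.get("SERVERLESS_API_KEY", "")
MODEL = "my-deployed-model"  # placeholder model identifier

def query_endpoint(prompt: str) -> str:
    """Send a single chat completion request to the serverless endpoint."""
    response = requests.post(
        ENDPOINT_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={
            "model": MODEL,
            "messages": [{"role": "user", "content": prompt}],
            "max_tokens": 200,
        },
        timeout=120,  # generous timeout to absorb the cold starts serverless endpoints can incur
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(query_endpoint("Summarize the benefits of serverless inference in one sentence."))
```

Because a serverless endpoint scales to zero between requests, the first call after an idle period is typically slower than subsequent ones, which is why the sketch uses a long request timeout.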
Syllabus
Deploying a Serverless API Endpoint
Serverless Demo
Video Overview
Serverless Use Cases
Setting up a Serverless API
Inferencing a Serverless Endpoint
Serverless Costs versus GPU Rental (see the cost sketch after this syllabus)
Accessing Instructions and Scripts
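For the cost comparison topic in the syllabus, the key variable is utilization: serverless billing accrues only while requests are running, while a rented GPU is billed for every hour it is reserved. The back-of-the-envelope sketch below uses purely hypothetical prices (not figures from the video) to show where the break-even point can fall.

```python
# Hypothetical prices for illustration only; real rates vary by provider and GPU type.
SERVERLESS_PRICE_PER_SECOND = 0.0005   # $ per second of active GPU time
RENTAL_PRICE_PER_HOUR = 1.20           # $ per hour for a reserved GPU

def monthly_cost(active_seconds_per_day: float) -> tuple[float, float]:
    """Return (serverless_cost, rental_cost) in dollars over a 30-day month."""
    serverless = active_seconds_per_day * 30 * SERVERLESS_PRICE_PER_SECOND
    rental = 24 * 30 * RENTAL_PRICE_PER_HOUR  # a rented GPU is billed around the clock
    return serverless, rental

for active in (600, 3600, 6 * 3600, 24 * 3600):  # 10 min, 1 h, 6 h, 24 h of traffic per day
    s, r = monthly_cost(active)
    print(f"{active / 3600:>5.1f} h/day active -> serverless ${s:8.2f} vs rental ${r:8.2f}")
```

Under these assumed prices, serverless is far cheaper for bursty or low-volume traffic because idle time costs nothing, while a rented GPU becomes the cheaper option once the endpoint is busy for most of the day. That utilization trade-off is what the cost comparison section of the video examines.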
Taught by
Trelis Research
Related Courses
Introduction to Cloud Infrastructure Technologies (Linux Foundation via edX)
Cloud Computing (Indian Institute of Technology, Kharagpur via Swayam)
Elastic Cloud Infrastructure: Containers and Services en Español (Google Cloud via Coursera)
Kyma – A Flexible Way to Connect and Extend Applications (SAP Learning)
Modernize Infrastructure and Applications with Google Cloud (Google Cloud via Coursera)