Deploying Serverless Inference Endpoints - Setup and Cost Comparison
Offered By: Trelis Research via YouTube
Course Description
Overview
Learn how to deploy serverless inference endpoints in this 20-minute video tutorial. Explore the benefits and use cases of serverless APIs, and follow a step-by-step guide to set up your own serverless endpoint. See how to run inference against a serverless endpoint and compare the cost of serverless billing with traditional GPU rentals. Gain access to valuable resources, including setup guides, inference scripts, and additional repositories covering advanced topics such as fine-tuning, vision, and transcription. By the end of this tutorial, you'll have the knowledge to implement efficient and cost-effective serverless inference for your projects.
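As an illustration of the inference step, below is a minimal sketch of calling a serverless endpoint over HTTP, assuming an OpenAI-compatible chat completions route. The endpoint URL, API key, and model name are placeholders for your own deployment, not values taken from the video.

```python
import os
import requests

# Placeholder values: substitute the endpoint URL, API key, and model name
# from your own serverless deployment (these are assumptions, not from the video).
API_URL = os.environ.get("ENDPOINT_URL", "https://your-endpoint.example.com/v1/chat/completions")
API_KEY = os.environ.get("ENDPOINT_API_KEY", "")

payload = {
    "model": "your-model-name",  # assumption: the model you deployed behind the endpoint
    "messages": [
        {"role": "user", "content": "Summarise serverless inference in one sentence."}
    ],
    "max_tokens": 128,
}

response = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {API_KEY}", "Content-Type": "application/json"},
    json=payload,
    timeout=120,  # serverless cold starts can add tens of seconds to the first request
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```

The generous timeout reflects a key trade-off of serverless endpoints: the first request after idle time may wait for a worker to spin up, whereas a rented GPU that is always on responds immediately.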
Syllabus
Deploying a Serverless API Endpoint
Serverless Demo
Video Overview
Serverless Use Cases
Setting up a Serverless API
Inferencing a Serverless Endpoint
Serverless Costs versus GPU Rental (a worked cost sketch follows this syllabus)
Accessing Instructions and Scripts
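As a companion to the cost comparison topic above, here is a back-of-the-envelope sketch of when serverless billing beats renting a GPU by the hour. All prices are illustrative assumptions, not figures quoted in the video; substitute your provider's current rates.

```python
# Back-of-the-envelope comparison of serverless billing vs. an always-on GPU rental.
# All prices below are illustrative assumptions; check your provider's current rates.
SERVERLESS_PRICE_PER_SECOND = 0.0005  # assumed $ per second of active GPU time
GPU_RENTAL_PRICE_PER_HOUR = 0.80      # assumed $ per hour for a rented GPU left running

active_seconds_per_day = 2 * 60 * 60  # e.g. two hours of actual inference work per day

serverless_cost_per_day = active_seconds_per_day * SERVERLESS_PRICE_PER_SECOND
rental_cost_per_day = 24 * GPU_RENTAL_PRICE_PER_HOUR
print(f"Serverless: ${serverless_cost_per_day:.2f}/day  Rental: ${rental_cost_per_day:.2f}/day")

# Fraction of a 24-hour day at which both options cost the same:
break_even_utilisation = GPU_RENTAL_PRICE_PER_HOUR / (SERVERLESS_PRICE_PER_SECOND * 3600)
print(f"Break-even utilisation: {break_even_utilisation:.0%} of a 24-hour day")
```

With these assumed rates, a workload that keeps the GPU busy only a couple of hours a day is far cheaper on serverless, while sustained utilisation above the break-even fraction favours renting the GPU outright.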
Taught by
Trelis Research
Related Courses
Discrete Inference and Learning in Artificial Vision - École Centrale Paris via Coursera
Teaching Literacy Through Film - The British Film Institute via FutureLearn
Linear Regression and Modeling - Duke University via Coursera
Probability and Statistics - Stanford University via Stanford OpenEdx
Statistical Reasoning - Stanford University via Stanford OpenEdx