Intel® Solutions Pro – Principles of AI Everywhere
Offered By: Intel via Coursera
Course Description
Overview
AI is transforming how we work and live every day, and it is evolving rapidly. Intel is delivering a full spectrum of hardware and software platforms, offering open and modular solutions to expedite time-to-value in this era of exponential growth. Intel integrates AI seamlessly across its hardware and software technologies, supporting generative AI workloads and driving innovations like AI PC and AI at the edge.
In this curriculum, you'll delve into Deep Learning, Machine Learning, and Generative AI, and learn to navigate AI challenges using industry models tailored to data parameters. Learn how to offer solutions from Intel's diverse portfolio, including CPUs, GPUs, accelerators, technologies, software, and toolkits, for ease of AI solution deployment. The curriculum is primarily aimed at first- and second-year undergraduates interested in engineering or science, along with high school students and professionals with an interest in programming.
Syllabus
- Intel Bringing AI Everywhere
- What does AI Everywhere mean? What product should I consider for what applications or AI development stage? Learn all this and get sales guidance around AI from client to edge and cloud.
- Bringing AI Everywhere...in the Data Center
- Contrary to popular belief, Nvidia GPUs are not the ONLY viable AI solution in the data center. Intel delivers outstanding solutions ranging from Intel® Xeon® CPUs to GPUs (GPU Max and GPU Flex) to Intel® Gaudi® 2 AI accelerators. This module highlights how Intel solves business challenges with AI and offers compelling alternatives to Nvidia.
- Bringing AI Everywhere...in the Client
- Establishing that AI runs on PCs is hugely important for Intel. Intel® Core™ processors and Intel® ARC™ GPUs enable many inference use cases on client systems. This module covers these AI applications and how Intel Core CPUs with a neural processing unit (NPU) and Intel ARC GPUs enable them.
- Bringing AI Everywhere...at the Edge
- AI is enabling business transformations everywhere across the network and edge. Vision, language, and other use cases deploy AI across the edge--across a broad array of locations--in manufacturing, smart cities, transportation, and networking. Learn about the tools and enablement for edge deployments in this module.
- AI on Intel® Xeon® Processors in the Cloud, Datacenter and Enterprise
- Intel® Xeon® processors can be a great fit for AI, from ML/DL applications to Generative AI. Note that while inference is the primary target AI use case, you can also sell into retraining and fine-tuning with Intel Xeon processors.
- Introduction to Intel® Gaudi® AI Accelerators
- This module will provide a foundational overview of Intel® Gaudi® AI accelerators, ensuring that everyone can grasp the core concepts including MLPerf results.
- Intel® Xeon® CPU with Intel® AMX and High Bandwidth Memory - Supercharged AI Acceleration
- The matrix multiplication acceleration provided by Intel® Advanced Matrix Extensions (Intel® AMX) in Intel® Xeon® CPU Max makes it an exceptional value for AI. Pairing that acceleration with the increased memory bandwidth of the Intel® Xeon® CPU Max provides even better performance on many workloads and can greatly speed up LLM inference as well as workflows where AI is used to augment HPC. This submodule summarizes the technical characteristics, their benefits, and performance results, showing how customers and users can apply these technologies to solve their problems within the Intel® Xeon® CPU ecosystem they already know and love (a brief code sketch follows the syllabus).
- Meet the Growing Needs of AI in HPC with Intel® Data Center GPU Max Series
- Understand how Intel® Data Center GPU Max Series is a viable solution for your growing demand for AI and general-purpose workloads. In this course you will learn the basics of the technology, its software and framework readiness, and examples of where to use it effectively.
- Meet the Growing Needs of AI Visual Inference at the Edge and in the Cloud with Intel® Data Center GPU Flex Series
- Understand how Intel® Data Center GPU Flex Series is a viable solution for your growing demand for AI and general-purpose workloads. In this course, you will learn the basics of the technology, its software and framework readiness, and examples of where to use it effectively.
- The Intel® AI Software Value
- Learn the basics of positioning Intel AI software: the oneAPI AI Toolkit, Intel® Extension for PyTorch (IPEX), and the OpenVINO™ toolkit (see the OpenVINO sketch after the syllabus).
- Generative AI and Large Language Models for the Real World
- ChatGPT and other massive models represent an amazing step forward in AI that is moving at light speed. This course will survey how the AI ecosystem has worked non-stop to take these all-purpose multi-task models and optimize them so they can be used by organizations to address domain-specific problems. Learn how Intel can help you become a trusted thought leader who can demystify this topic for your partners and customers.
- The Intel® AI Workstation: Purpose-Built for AI Development, Data Science, and Classical Machine Learning
- A lot of AI preparation, development, prototyping, and increasingly, deployment is happening on workstations. Workstations liberate the AI developer and data scientist from negotiating server time while also providing the increased memory capacity and cores to handle larger AI datasets that would cripple a consumer PC or laptop. With an AI Workstation from Intel, organizations benefit from a robust platform for AI experimentation, thus avoiding expensive production costs. Finally, with the growth of Generative AI and Small Language Models (SLMs), the AI Workstation from Intel offers a compelling solution for enterprises to maximize their AI investments. By using industry-specific and proprietary data with SLMs in a workstation, enterprises can achieve multiple objectives: efficiency, accuracy, customization, and security.
- Confidential AI - Your Path to More Secure & Compliant AI
- AI is the defining workload of our time, and organizations are racing to adopt AI into their businesses and services. At the same time, governments around the world are passing new regulations to help ensure AI evolves in a way that is secure, trustworthy, and respectful of individual privacy. Confidential AI is a method to protect the data and model while it is actively in use, helping organizations stay compliant with regulations and protect their IP.
- Congratulations - Course Complete
- Congratulations, you have completed the course.
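
The Intel® AMX module above describes bfloat16 matrix-multiplication acceleration on Intel® Xeon® processors. Here is a minimal sketch of how that acceleration is typically reached from PyTorch, assuming PyTorch and Intel® Extension for PyTorch are installed on an AMX-capable Xeon system; the tiny two-layer model is a hypothetical stand-in for a real workload.

```python
import torch
import intel_extension_for_pytorch as ipex  # assumed installed

# Hypothetical toy model standing in for a real inference workload.
model = torch.nn.Sequential(
    torch.nn.Linear(1024, 1024),
    torch.nn.ReLU(),
    torch.nn.Linear(1024, 10),
).eval()

# ipex.optimize applies CPU-friendly optimizations; dtype=torch.bfloat16
# lets matrix multiplications dispatch to Intel AMX instructions when the
# processor supports them.
model = ipex.optimize(model, dtype=torch.bfloat16)

x = torch.randn(32, 1024)
with torch.no_grad(), torch.autocast("cpu", dtype=torch.bfloat16):
    out = model(x)
print(out.shape)
```

On processors without AMX the same code still runs, falling back to the regular bfloat16 or float32 kernels, which is why the bf16 path is a low-risk default to suggest.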
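The client and AI software modules above mention OpenVINO™ and on-device inference targets such as the NPU and Intel® ARC™ GPU. The sketch below shows device selection with the OpenVINO Python API, assuming the openvino package is installed and a model has already been converted to OpenVINO IR; the model.xml file is hypothetical, and the device names that actually appear depend on the platform.

```python
import numpy as np
import openvino as ov  # assumed installed

core = ov.Core()
# Lists the inference devices exposed on this machine,
# e.g. ['CPU', 'GPU', 'NPU'] on an AI PC.
print(core.available_devices)

# Hypothetical IR file produced earlier by OpenVINO model conversion.
model = core.read_model("model.xml")

# Swap "CPU" for "GPU" (Intel ARC) or "NPU" to target the client accelerators.
compiled = core.compile_model(model, "CPU")

# Dummy input shaped to the model's first input; replace with real data.
input_tensor = np.random.rand(*compiled.inputs[0].shape).astype(np.float32)
result = compiled(input_tensor)[compiled.outputs[0]]
print(result.shape)
```

Because the same compiled-model call works regardless of the device string, the same application code can be positioned across CPU, GPU, and NPU targets.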
Taught by
Jennifer James
Related Courses
- 4.0 Shades of Digitalisation for the Chemical and Process Industries (University of Padova via FutureLearn)
- A Day in the Life of a Data Engineer (Amazon Web Services via AWS Skill Builder)
- FinTech for Finance and Business Leaders (ACCA via edX)
- Accounting Data Analytics (University of Illinois at Urbana-Champaign via Coursera)
- Accounting Data Analytics (Coursera)