Dyalog and Apache Kafka Integration - True?
Offered By: Dyalog User Meetings via YouTube
Course Description
Overview
Explore the potential integration of Dyalog APL with Apache Kafka in this 26-minute conference talk from Dyalog '23. Delve into the fundamentals of Apache Kafka, a distributed event streaming platform used for building real-time data pipelines and streaming applications. Learn about Kafka's key concepts, including messages, topics, producers, consumers, partitions, and message retention. Discover how Kafka guarantees message ordering and the various ways to run it. Examine Stefan Kruger's experimental Dyalog interface to Apache Kafka, focusing on DKaf.NET and its use of C# generics. Follow along with practical examples, including reading a London bike rental stream and implementing a string producer. Gain insights into the possibilities of combining Dyalog's powerful array programming capabilities with Kafka's robust distributed streaming architecture.
        
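For readers who want to see the core ideas from the overview (producers, consumers, topics, partitions, and string messages) in code before watching the talk, the following is a minimal sketch using the widely available kafka-python client, not the DKaf.NET interface the talk presents (whose API is not documented here); the broker address and topic name are placeholders chosen for illustration.

    # Minimal Kafka string producer and consumer sketch (kafka-python).
    # Assumes a broker is reachable at localhost:9092 and uses the
    # hypothetical topic name "demo-strings".
    from kafka import KafkaProducer, KafkaConsumer

    # Produce a few string messages to a topic (values are sent as bytes).
    producer = KafkaProducer(bootstrap_servers="localhost:9092")
    for text in ("hello", "from", "a", "string", "producer"):
        producer.send("demo-strings", text.encode("utf-8"))
    producer.flush()  # block until the broker has acknowledged the messages

    # Consume the messages back, reading from the beginning of the topic.
    consumer = KafkaConsumer(
        "demo-strings",
        bootstrap_servers="localhost:9092",
        auto_offset_reset="earliest",
        consumer_timeout_ms=5000,  # stop iterating when no new messages arrive
    )
    for record in consumer:
        print(record.partition, record.offset, record.value.decode("utf-8"))

Printing the partition and offset alongside each value shows how Kafka orders messages within a partition, one of the guarantees the talk discusses.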
Syllabus
 About Stefan
 Apache Kafka distributed streaming platform
 Messages
 Topics
 Producers and consumers
 Partition
 Message retention
 Kafka guarantees message ordering
 How to run Kafka
 A Dyalog Kafka interface
 DKaf.NET
 C# generics
 Example: reading a London bike rental stream
 String producer example
 Summary
Taught by
Dyalog User Meetings
Related Courses
Processing Real-Time Data Streams in Azure - Microsoft via edX
Gérez des flux de données temps réel - CentraleSupélec via OpenClassrooms
Data Streaming - Udacity
Taming Big Data with Apache Spark and Python - Hands On! - Udemy
Python & Cryptocurrency API: Build 5 Real World Applications - Udemy