A Python script that streams tweets from Twitter's API using basic Confluent Kafka components.
Updated Dec 12, 2022 - Python
A data ingestion pipeline in which sensor data is passed to a Kafka broker (using Confluent Kafka) and then stored in a NoSQL database, MongoDB.
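A minimal sketch of the producer side of such a pipeline, assuming JSON-encoded sensor readings. The broker address, topic name, and helper names are illustrative, not taken from the repository; `producer` is assumed to be a `confluent_kafka.Producer`, and a separate consumer process would read the topic and insert documents into MongoDB (e.g. with pymongo's `insert_one`).

```python
import json


def make_producer_config(bootstrap_servers: str) -> dict:
    # Minimal confluent-kafka Producer config; the broker address is
    # an illustrative assumption.
    return {"bootstrap.servers": bootstrap_servers}


def forward_reading(producer, topic: str, reading: dict) -> None:
    """Publish one sensor reading as a JSON-encoded message.

    `producer` is assumed to be a confluent_kafka.Producer instance
    created from the config above.
    """
    producer.produce(topic, json.dumps(reading).encode("utf-8"))
    producer.flush()  # block until the message is delivered
```

In a real deployment, flushing after every message trades throughput for simplicity; batching with periodic flushes is the more common choice.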
Some basic use cases of Kafka.
Connect to Kafka using Spring, exchanging messages in JSON format.
DC/OS examples
Kafka Schema Registry V1
Replicate a table from a Source SQL Server to a Sink SQL Server via Kafka.
Example of how to set up Confluent Kafka with Docker Compose.
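A single-broker setup along these lines can be sketched in a Compose file; the image tags, ports, and environment values below are illustrative assumptions, not taken from the repository.

```yaml
# Minimal single-broker Kafka + ZooKeeper stack (values are illustrative)
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:7.3.0
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
  broker:
    image: confluentinc/cp-kafka:7.3.0
    depends_on:
      - zookeeper
    ports:
      - "9092:9092"
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
```

The replication factor of 1 is only appropriate for a local single-broker setup.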
Construct a streaming event pipeline around Apache Kafka and its ecosystem. Using public data from the Chicago Transit Authority, the project simulates and displays the status of train lines in real time.
Utility to test slow consumer behaviour
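One way to simulate a slow consumer is to add an artificial delay between polls; a minimal sketch, where `consumer` is assumed to be a `confluent_kafka.Consumer` and the topic name and delay are illustrative:

```python
import json
import time


def slow_consume(consumer, topic: str, delay_s: float = 2.0, max_msgs: int = 10):
    """Poll JSON messages, sleeping between each to simulate slow processing.

    `consumer` is assumed to be a confluent_kafka.Consumer; the delay lets
    you observe lag build up on the consumer group.
    """
    consumer.subscribe([topic])
    out = []
    while len(out) < max_msgs:
        msg = consumer.poll(timeout=1.0)
        if msg is None:
            break  # no more messages within the timeout
        if msg.error():
            continue  # skip broker-side error events
        out.append(json.loads(msg.value()))
        time.sleep(delay_s)  # artificial processing delay
    return out
```

Watching `kafka-consumer-groups --describe` while this runs shows the growing offset lag that slow consumers cause.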
Confluent Kafka template for the AsyncAPI Generator
Demo app showing how to set up a local configuration for Kafka infrastructure (broker, ZooKeeper, Schema Registry) with a topic, a producer, and a consumer.
This Confluent Kafka Pipeline project is designed to ingest data from sensors, process it using Kafka Streams, and then store the processed data in MongoDB. It provides a structured data pipeline for real-time data processing and storage.
Schema Registry using Kafka Avro.
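In an Avro-based setup, each topic's value schema is registered with Confluent Schema Registry and used by the serializer. A hypothetical schema for a sensor reading (the record and field names are illustrative assumptions):

```python
import json

# Hypothetical Avro value schema; in a real setup this string would be
# registered with Schema Registry and passed to an Avro serializer from
# confluent-kafka's schema-registry support.
SENSOR_SCHEMA = json.dumps({
    "type": "record",
    "name": "SensorReading",
    "fields": [
        {"name": "sensor_id", "type": "string"},
        {"name": "temperature", "type": "double"},
        {"name": "ts", "type": "long"},
    ],
})
```

Keeping the schema as data (rather than hand-written strings) makes it easy to validate and evolve under the registry's compatibility checks.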
GitOps enabled repo to provision Kafka as a service for a single team.
Simple local running Apache Kafka implementation using confluent-kafka