🚀 Traffic Sentinel: A scalable IoT system using Fog nodes and Apache Flink to process 📷 IP camera streams, powered by YOLO for intelligent 🚗 traffic monitoring on highways. 🛣️
Apache Flink examples designed to be run on AWS Kinesis Data Analytics (KDA).
A simple example of counting accesses in Apache logs using Apache Flink.
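A minimal sketch of what such a log-counting job could look like with PyFlink's DataStream API; the sample log lines and the "first token is the client IP" assumption are illustrative, not taken from the repository.

```python
# Count accesses per client IP from Apache access-log lines (illustrative sketch).
from pyflink.common import Types
from pyflink.datastream import StreamExecutionEnvironment

env = StreamExecutionEnvironment.get_execution_environment()

# Stand-in for a real source (e.g. a file or socket of Apache combined-log lines).
log_lines = env.from_collection([
    '127.0.0.1 - - [09/Dec/2023:10:00:00 +0000] "GET /index.html HTTP/1.1" 200 1043',
    '10.0.0.5 - - [09/Dec/2023:10:00:01 +0000] "GET /about.html HTTP/1.1" 200 512',
    '127.0.0.1 - - [09/Dec/2023:10:00:02 +0000] "GET /index.html HTTP/1.1" 404 0',
], type_info=Types.STRING())

counts = (
    log_lines
    .map(lambda line: (line.split(" ")[0], 1),
         output_type=Types.TUPLE([Types.STRING(), Types.INT()]))  # (client_ip, 1)
    .key_by(lambda t: t[0])                                        # group by client IP
    .reduce(lambda a, b: (a[0], a[1] + b[1]))                      # running access count
)

counts.print()
env.execute("apache-log-access-count")
```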
A makeshift data infrastructure setup for datafirstjobs.com.
A setup for real-time data streaming using Kafka, Flink, Pinot, MySQL, Postgres, and Superset.
A streaming data pipeline that uses Kafka as the backbone and Flink for data processing and transformations. Kafka Connect writes the streams to S3-compatible blob stores and to Redis (a low-latency KV store for real-time ML inference). Spark runs the batch job that backfills the ML feature data.
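A rough sketch of the Kafka-to-Flink leg of such a pipeline using the PyFlink Table API; the topic names, field names, and broker address are placeholders, and the Kafka SQL connector JAR would need to be on the classpath for it to run.

```python
# Kafka -> Flink -> Kafka leg of a feature pipeline (illustrative sketch only).
from pyflink.table import EnvironmentSettings, TableEnvironment

t_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())

# Source: raw events arriving on a Kafka topic as JSON.
t_env.execute_sql("""
    CREATE TABLE events (
        user_id STRING,
        feature_value DOUBLE,
        event_time TIMESTAMP(3),
        WATERMARK FOR event_time AS event_time - INTERVAL '5' SECOND
    ) WITH (
        'connector' = 'kafka',
        'topic' = 'raw-events',
        'properties.bootstrap.servers' = 'localhost:9092',
        'properties.group.id' = 'flink-feature-job',
        'scan.startup.mode' = 'latest-offset',
        'format' = 'json'
    )
""")

# Sink: transformed features written back to Kafka, where Kafka Connect
# (configured separately) could fan them out to S3 and Redis.
t_env.execute_sql("""
    CREATE TABLE features (
        user_id STRING,
        avg_value DOUBLE,
        window_end TIMESTAMP(3)
    ) WITH (
        'connector' = 'kafka',
        'topic' = 'ml-features',
        'properties.bootstrap.servers' = 'localhost:9092',
        'format' = 'json'
    )
""")

# A per-user tumbling-window average stands in for the "transformation" step.
t_env.execute_sql("""
    INSERT INTO features
    SELECT user_id,
           AVG(feature_value) AS avg_value,
           TUMBLE_END(event_time, INTERVAL '1' MINUTE) AS window_end
    FROM events
    GROUP BY user_id, TUMBLE(event_time, INTERVAL '1' MINUTE)
""")
```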
Automated deployment of an Apache Flink cluster on your Grid'5000 reserved nodes.
We are thrilled to announce our new PoC project, which aims to provide a complete real-time extraction, transformation, and exposure architecture for the new provincial transportation systems.
Stream processing of website click data using Kafka, monitored and visualized with Prometheus and Grafana.
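A small sketch of how a click-stream job could expose a custom counter through Flink's metrics system; with Flink's Prometheus metric reporter enabled in the cluster configuration, Prometheus could scrape it and Grafana could chart it. The event shape and metric name are made up for illustration.

```python
# Click-stream pass-through that increments a custom Flink metric (illustrative sketch).
from pyflink.common import Types
from pyflink.datastream import StreamExecutionEnvironment
from pyflink.datastream.functions import MapFunction, RuntimeContext


class CountClicks(MapFunction):
    """Passes click events through while incrementing a custom counter."""

    def open(self, runtime_context: RuntimeContext):
        # Registered under the job's metric group; the name is a placeholder.
        self.clicks = runtime_context.get_metrics_group().counter("clicks_processed")

    def map(self, value):
        self.clicks.inc()
        return value


env = StreamExecutionEnvironment.get_execution_environment()

# Stand-in for a Kafka source of (user_id, page_url) click events.
clicks = env.from_collection(
    [("u1", "/home"), ("u2", "/pricing"), ("u1", "/docs")],
    type_info=Types.TUPLE([Types.STRING(), Types.STRING()]),
)

clicks.map(CountClicks(),
           output_type=Types.TUPLE([Types.STRING(), Types.STRING()])).print()
env.execute("click-stream-with-metrics")
```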
This project demonstrates data cleaning and processing with Apache Spark and Apache Flink, both locally and on AWS EMR.
This project showcases a real-time data streaming pipeline using Apache Flink, Apache Spark, and Grafana. It streams data, stores it in Parquet format, and performs aggregations for insights, with seamless visualization via Grafana dashboards.
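A sketch of the "store the stream as Parquet" step with the PyFlink Table API; the schema, paths, checkpoint interval, and the datagen stand-in source are assumptions, and the filesystem sink needs the flink-parquet format JAR plus checkpointing so that in-progress files get committed.

```python
# Write a stream to Parquet files via the filesystem connector (illustrative sketch).
from pyflink.table import EnvironmentSettings, TableEnvironment

t_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())
# Streaming Parquet part files are finalized on checkpoints (recent PyFlink versions).
t_env.get_config().set("execution.checkpointing.interval", "30 s")

# Stand-in source: the datagen connector replaces the project's real streaming input.
t_env.execute_sql("""
    CREATE TABLE readings (
        sensor_id STRING,
        reading DOUBLE,
        ts TIMESTAMP(3)
    ) WITH (
        'connector' = 'datagen',
        'rows-per-second' = '10'
    )
""")

# Sink: Parquet files on a local path (placeholder; could be S3/HDFS in practice).
t_env.execute_sql("""
    CREATE TABLE readings_parquet (
        sensor_id STRING,
        reading DOUBLE,
        ts TIMESTAMP(3)
    ) WITH (
        'connector' = 'filesystem',
        'path' = 'file:///tmp/readings',
        'format' = 'parquet'
    )
""")

t_env.execute_sql("INSERT INTO readings_parquet SELECT * FROM readings")
```

Downstream aggregations and Grafana dashboards would then read from the Parquet output or from a separate aggregated sink.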