Kafka
Kafka-related tips & tricks, issues & resolutions
Change Data Capture from PostgreSQL to Kafka
As data becomes increasingly critical to businesses, the need to capture and process changes in real-time has never been more important. In this article, we'll explore how to read changed data from a PostgreSQL database and write it to a Kafka topic as event messages using Debezium's PostgreSQL CDC Source Connector.
Updated 20 Apr, 2026
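The linked article walks through the full setup; as a rough sketch, registering a Debezium PostgreSQL source connector amounts to posting a JSON configuration like the one below to the Kafka Connect REST API. The connector name, hostname, credentials, database, table list, and topic prefix here are all placeholder assumptions, not values from the article.

```python
import json

# Minimal sketch of a Debezium PostgreSQL CDC source connector config.
# All connection details below are placeholder assumptions.
pg_source_config = {
    "name": "pg-cdc-source",  # assumed connector name
    "config": {
        "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
        "database.hostname": "postgres",          # assumed host
        "database.port": "5432",
        "database.user": "debezium",              # assumed credentials
        "database.password": "dbz-secret",
        "database.dbname": "inventory",           # assumed database
        "topic.prefix": "dbserver1",              # prefix for emitted topic names (Debezium 2.x)
        "plugin.name": "pgoutput",                # logical decoding plugin built into PostgreSQL 10+
        "table.include.list": "public.customers", # assumed table filter
    },
}

# Registering the connector is a POST to Kafka Connect, e.g.:
#   curl -X POST -H "Content-Type: application/json" \
#        --data @pg-source.json http://localhost:8083/connectors
print(json.dumps(pg_source_config, indent=2))
```

With `topic.prefix` set to `dbserver1`, change events for `public.customers` would land on the topic `dbserver1.public.customers`.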
Change Data Capture from MySQL to Kafka
As data becomes increasingly critical to businesses, the need to capture and process changes in real-time has never been more important. In this article, we'll explore how to read changed data from a MySQL database and write it to a Kafka topic as event messages using Debezium's MySQL CDC Source Connector.
Updated 20 Apr, 2026
Change Data Capture from MSSQL Server to Kafka
As data becomes increasingly critical to businesses, the need to capture and process changes in real-time has never been more important. In this article, we'll explore how to read changed data from a Microsoft SQL Server database and write it to a Kafka topic as event messages using Debezium's SQL Server CDC Source Connector.
Updated 20 Apr, 2026
Change Data Capture from MongoDB to Kafka
As data becomes increasingly critical to businesses, the need to capture and process changes in real-time has never been more important. In this article, we'll explore how to read changed data from a MongoDB database and write it to a Kafka topic as event messages using Debezium's MongoDB CDC Source Connector.
Updated 20 Apr, 2026
Change Data Capture from Oracle to Kafka
As data becomes increasingly critical to businesses, the need to capture and process changes in real-time has never been more important. In this article, we'll explore how to read changed data from an Oracle database and write it to a Kafka topic as event messages using Confluent's Oracle CDC Source Connector.
Updated 20 Apr, 2026
Kafka Connect to AWS S3 Sink
In this article, we will learn how to write Kafka event messages to AWS S3 using Kafka Connect. We will use the Amazon S3 Sink Connector to write the messages as Parquet files to our S3 data lake. We will also write Kafka tombstone records to a separate file to handle downstream delete operations.
Updated 20 Apr, 2026
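The article covers the end-to-end sink setup; as a rough sketch, an Amazon S3 Sink Connector configuration writing Parquet files might look like the following. The bucket name, region, and source topic are placeholder assumptions, and the article's separate handling of tombstone records is only hinted at in a comment here.

```python
import json

# Minimal sketch of a Confluent S3 sink connector config writing Parquet.
# Bucket, region, and topic names are placeholder assumptions.
s3_sink_config = {
    "name": "s3-parquet-sink",  # assumed connector name
    "config": {
        "connector.class": "io.confluent.connect.s3.S3SinkConnector",
        "topics": "dbserver1.public.customers",  # assumed source topic
        "s3.bucket.name": "my-datalake",         # assumed bucket
        "s3.region": "eu-west-1",                # assumed region
        "storage.class": "io.confluent.connect.s3.storage.S3Storage",
        "format.class": "io.confluent.connect.s3.format.parquet.ParquetFormat",
        "flush.size": "1000",                    # records per S3 object
        # Tombstones (null-value records) are skipped here; routing them to
        # a separate file, as the article describes, needs extra handling.
        "behavior.on.null.values": "ignore",
    },
}

print(json.dumps(s3_sink_config, indent=2))
```

`flush.size` controls how many records are buffered before an object is committed to S3, which directly shapes the size and number of Parquet files in the data lake.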
Kafka Connect Debezium Source to AWS S3 Sink
In this article, we will learn how to write Kafka event messages from Debezium source database topics to AWS S3 using Kafka Connect. We will use the Amazon S3 Sink Connector to write the messages as Parquet files to our S3 data lake. We will also write Kafka tombstone records to a separate file to handle downstream delete operations.
Updated 20 Apr, 2026