Every week, the OCTOlog will explore data streaming technologies, how we make decisions about the software we build, and ultimately, how we can be more successful.
How to use Kafka Streams to aggregate change data capture (CDC) messages from a relational database into transactional messages, powering a scalable microservices architecture.
With the rise of service meshes and microservices, the control plane/data plane distinction has become commonplace. This post shows you how to apply security and governance controls to your Kafka system.
There are two main types of consumer group membership in Apache Kafka®: static and dynamic. Here’s how each works, how they affect rebalancing, and which to choose for your application.
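The rebalancing difference between the two membership types can be sketched with a toy model of a group coordinator. This is an illustration only, not Kafka's actual protocol: a static member presents a fixed `group.instance.id` and is recognized on restart, while a dynamic member gets a fresh broker-assigned id each time it joins, triggering another rebalance.

```python
import uuid

class GroupCoordinator:
    """Toy model of how a coordinator treats dynamic vs. static members."""

    def __init__(self):
        self.members = {}     # member id -> assignment slot
        self.rebalances = 0

    def join(self, group_instance_id=None):
        # Static members present a fixed group.instance.id; dynamic members
        # are identified only by a freshly generated member id.
        member_id = group_instance_id or str(uuid.uuid4())
        if member_id not in self.members:
            self.members[member_id] = len(self.members)
            self.rebalances += 1   # an unknown member triggers a rebalance
        return self.members[member_id]

coord = GroupCoordinator()
# A static member restarting reuses its id: no extra rebalance.
coord.join(group_instance_id="consumer-1")
coord.join(group_instance_id="consumer-1")
assert coord.rebalances == 1
# A dynamic member restarting gets a fresh id: each join rebalances.
coord.join()
coord.join()
assert coord.rebalances == 3
```

Real Kafka static membership works analogously: setting `group.instance.id` in the consumer config lets a restarted consumer rejoin within `session.timeout.ms` without forcing a rebalance.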
Apache Kafka might be free, but using it at scale is incredibly expensive. Here’s how Michelin used Confluent to enable a modern IT infrastructure and real-time analytics, while saving money.
Co-partitioning ensures that two joined streams read from topics with the same number of partitions, keyed and partitioned the same way. Learn how to implement co-partitioning, the criteria it requires, key considerations, and more.
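Why co-partitioning makes joins possible can be shown with a small sketch. The hash below is a simplified stand-in for Kafka's default (murmur2-based) partitioner, and the topic names are hypothetical: the point is that with equal partition counts and the same partitioning strategy, records sharing a key land in the same partition number on both topics, so a single stream task can join them locally.

```python
import zlib

def partition_for(key: str, num_partitions: int) -> int:
    # Simplified stand-in for Kafka's default partitioner: a deterministic
    # hash of the record key, modulo the topic's partition count.
    return zlib.crc32(key.encode("utf-8")) % num_partitions

# Two topics are co-partitioned when they have the same number of
# partitions and their records are keyed and partitioned identically.
ORDERS_PARTITIONS = 6
CUSTOMERS_PARTITIONS = 6

for key in ["cust-1", "cust-2", "cust-3"]:
    # Matching keys map to the same partition on both topics, so the
    # stream task owning that partition sees both sides of the join.
    assert partition_for(key, ORDERS_PARTITIONS) == partition_for(key, CUSTOMERS_PARTITIONS)
```

If the partition counts differed, matching keys could land in different partition numbers and the join would silently miss records, which is why Kafka Streams enforces co-partitioning for joins.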
Treehouse Software and Confluent allow simple, modern data management across applications, databases, data warehouses, and legacy systems without disrupting critical workloads.
Confluent’s Stream Governance feature enables a data streaming system that makes real-time data reliable, discoverable, and secure across every part of the business.
CDC is a software design pattern that identifies and captures changes made to data in a database. Learn how CDC works, the best solutions, and how to get started with various implementations.
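The core of the CDC pattern can be sketched as a change event plus a replay function. The envelope below loosely follows Debezium's convention (`op`, `before`, `after`, `source`) but is illustrative, not a wire-format spec, and the `customers` table is hypothetical.

```python
# A minimal, Debezium-style change event for an UPDATE to one row:
# the row's "before" and "after" images plus operation metadata.
change_event = {
    "op": "u",  # c = create, u = update, d = delete
    "before": {"id": 42, "email": "old@example.com"},
    "after":  {"id": 42, "email": "new@example.com"},
    "source": {"table": "customers", "ts_ms": 1700000000000},
}

def apply_change(state: dict, event: dict) -> dict:
    """Replay one change event against an in-memory copy of the table."""
    row_id = (event["after"] or event["before"])["id"]
    if event["op"] == "d":
        state.pop(row_id, None)      # deletes remove the row
    else:
        state[row_id] = event["after"]  # creates/updates install the new image
    return state

table = {42: {"id": 42, "email": "old@example.com"}}
apply_change(table, change_event)
assert table[42]["email"] == "new@example.com"
```

A downstream consumer that replays such events in order can rebuild the source table, which is what makes CDC useful for feeding caches, search indexes, and analytics systems.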
Introducing fully managed Apache Kafka® + Flink for the most robust, cloud-native data streaming platform with stream processing, integration, and streaming analytics in one.
As businesses move to meet modern demands, these technologies ensure not only digital transformation but also data transformation, with new use cases built around real-time data.