Learn how US Foods built a digital-first ecommerce platform, using data streaming to future-proof its infrastructure and implement data mesh principles.
Explore the intersection of AI and data streaming with essential resources for developers. Discover recorded talks, blog posts, and upcoming events to boost your knowledge.
Discover the top 5 best practices for building event-driven architectures using Confluent and AWS Lambda. Learn how to optimize your architecture for scalability, reliability, and performance.
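As a taste of the pattern, here's a minimal sketch of a Lambda function consuming records delivered by an event source mapping on a Kafka topic. The `OrderEventHandler` class name and the topic contents are illustrative assumptions, not details from the article; keys and values arrive base64-encoded.

```java
import java.util.Base64;
import java.util.List;
import java.util.Map;
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.KafkaEvent;

// Handler invoked by a Lambda event source mapping on a Kafka topic.
public class OrderEventHandler implements RequestHandler<KafkaEvent, Void> {
    @Override
    public Void handleRequest(KafkaEvent event, Context context) {
        // Records arrive grouped by topic-partition.
        for (Map.Entry<String, List<KafkaEvent.KafkaEventRecord>> entry :
                event.getRecords().entrySet()) {
            for (KafkaEvent.KafkaEventRecord record : entry.getValue()) {
                // Values are base64-encoded by the event source mapping.
                String value = new String(Base64.getDecoder().decode(record.getValue()));
                context.getLogger().log("Record from " + record.getTopic()
                        + " partition " + record.getPartition() + ": " + value);
            }
        }
        return null; // returning without throwing lets Lambda advance the offsets
    }
}
```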
Learn how Apache Kafka message compression works, why and how to use it, the five compression types, how to configure the compression type, and how messages are decompressed.
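To make the configuration concrete, here's a minimal producer sketch. The broker address, topic name, and choice of zstd are assumptions for illustration; `compression.type` also accepts none, gzip, snappy, and lz4.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class CompressedProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        // Batches are compressed on the producer and decompressed by the consumer.
        props.put(ProducerConfig.COMPRESSION_TYPE_CONFIG, "zstd");
        // A small linger lets batches fill, giving the codec more data to work with.
        props.put(ProducerConfig.LINGER_MS_CONFIG, 20);

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("orders", "key-1", "hello, compressed world"));
        }
    }
}
```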
Data streaming breaks down silos and fosters innovation across financial services, including risk, capital markets, consumer banking, payments, insurance, and more.
Event-driven microservice architecture transforms the way this service marketplace connects customers and tradespeople with jobs, scheduling, payments, and more.
Get an in-depth introduction to Flink SQL. Learn how it relates to Flink's other APIs, explore its built-in functions and operations, see which queries to try first, and walk through syntax examples.
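Here's a small, self-contained sketch of the kind of query the article covers, run through Flink's Java TableEnvironment. The `orders` schema and the datagen source are hypothetical stand-ins for a real Kafka-backed table.

```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class FlinkSqlTumblingWindow {
    public static void main(String[] args) {
        TableEnvironment tEnv = TableEnvironment.create(
                EnvironmentSettings.inStreamingMode());

        // Demo source table backed by the datagen connector for synthetic rows.
        tEnv.executeSql(
            "CREATE TABLE orders (" +
            "  order_id BIGINT," +
            "  amount DOUBLE," +
            "  order_time TIMESTAMP(3)," +
            "  WATERMARK FOR order_time AS order_time - INTERVAL '5' SECOND" +
            ") WITH ('connector' = 'datagen', 'rows-per-second' = '5')");

        // A good first query to try: per-minute revenue via a tumbling window.
        tEnv.executeSql(
            "SELECT window_start, SUM(amount) AS revenue " +
            "FROM TABLE(TUMBLE(TABLE orders, DESCRIPTOR(order_time), INTERVAL '1' MINUTE)) " +
            "GROUP BY window_start, window_end").print();
    }
}
```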
See how you can leverage a data streaming platform to build event-driven microservices and do stream processing for faster, better customer onboarding.
At the Sydney stop of the Confluent Data in Motion Tour 2023, Kmart's Principal Architect for Enterprise Technology, Duane Gomes, gave us the lowdown on how the company uses data streaming to power its digital loyalty program, OnePass.
Kafka-on-Windows tutorials are everywhere, but most run Kafka directly on Windows. Here's how to run Kafka on Windows inside a Linux environment backed by WSL 2, maximizing performance and stability.
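As a quick sanity check of that setup, a small Java AdminClient program run from Windows can confirm the broker inside WSL 2 is reachable. The `localhost:9092` address is an assumption based on the default listener and WSL 2's localhost port forwarding.

```java
import java.util.Properties;
import java.util.concurrent.ExecutionException;
import org.apache.kafka.clients.admin.Admin;
import org.apache.kafka.clients.admin.AdminClientConfig;

public class WslBrokerCheck {
    public static void main(String[] args) throws ExecutionException, InterruptedException {
        Properties props = new Properties();
        // WSL 2 forwards localhost ports, so a broker inside the Linux VM
        // is typically reachable from Windows on localhost:9092.
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        try (Admin admin = Admin.create(props)) {
            String clusterId = admin.describeCluster().clusterId().get();
            System.out.println("Connected to Kafka cluster: " + clusterId);
        }
    }
}
```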
Explore how Confluent and Apache Kafka are reshaping the data streaming landscape in FinTech. Learn how these powerful tools are fostering efficiency and scalability, featuring customers like Kredivo Holdings.
See how microservices, Confluent connectors, and stream processing combine applicant data, historical data, actuarial tables, and predictive models to instantly generate insurance quotes.
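A minimal Kafka Streams sketch of the shape of that pipeline: applicant events are scored one by one and emitted as quotes. The topic names and the `score` stub (standing in for the actuarial tables and predictive models) are hypothetical.

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class QuoteTopology {
    // Hypothetical scoring stub standing in for the actuarial/predictive model.
    static String score(String applicantJson) {
        return "{\"premium\": 42.00, \"applicant\": " + applicantJson + "}";
    }

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "quote-generator");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        // Applicant events stream in (e.g., from a source connector)...
        KStream<String, String> applicants = builder.stream("applicants");
        // ...and each one is scored and emitted as a quote, event by event.
        applicants.mapValues(QuoteTopology::score).to("quotes");

        new KafkaStreams(builder.build(), props).start();
    }
}
```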