"The expectation these days is that banks should always be available. Customers constantly want real-time experiences. One of the best ways to facilitate that is by leveraging a platform like Confluent."
"We know that Confluent Platform and Apache Kafka will continue to help us scale."
"Confluent plays an integral role in accelerating our journey to becoming a data-first and digital business… We use Confluent Cloud as an essential piece of our data infrastructure to unlock data and stream it in real time."
"Kafka technology, combined with Confluent's enterprise features and high-performance Intel architecture, supports our mission to make it safe for Intel to go fast."
"We needed a real-time event streaming architecture to improve the time to market for new apps, reduce the time it takes to stand up new clusters, and transform a couple of key use cases. Confluent Platform is really helping us, enabling us to move the data in more flexible and reliable ways."
"Availability of data is crucial, and having a fully managed platform for the data distribution lets us provide solutions to our customers at speed."
"With Confluent, it was easier for the team to set up, configure, and scale to support the increasing volume and throughput of data from our source systems. This allowed us to build more microservices that process and aggregate data to deliver fit-for-purpose topics."
"One of the things we did right on this project was work with Confluent early on."
"With Confluent Cloud we quickly had a state-of-the-art Kafka cluster up and running perfectly smoothly. And if we run into any issues, we have experts at Confluent to help us look into them and resolve them. That puts us in a great position as a team and as a company."
"Notion grew by millions of members in 2021, driven by increasing demand for remote collaboration tools during the pandemic. But we realized that our legacy messaging architecture that powered event logging wasn't going to scale with this kind of rapid growth. At the same time, we needed to optimize costs to ensure long-term sustainability."
"We wanted to be able to transform the data into the shape we needed by building our own connectors to other locations that weren't supported by our existing tools."
"Since deployment this year, we've never had any downtime while processing millions of requests from our customers."
"You have real-time data: stock prices, things that are ticking. You have data that's pretty static: terms and conditions, things like that. And then you have data that's updating periodically: things like position updates. If you can use a tool like Kafka to pull all that data together and combine it in ways that you display to end users—whether they be traders, salespeople, managers—and you provide them analytics across that data, it's extremely powerful."
"Fans care about what seats are still available, how much they cost, and what packages they can get with the tickets. And the venue wants to know who's buying the tickets, whether we're having problems converting people, whether we need to change the offering, or whether we need to put another tour date on the calendar because this one is super popular. These are all different ways of looking at the same data. So we put all of this into an inventory stream which gets placed in a different data store for the various uses of that data. Confluent and Kafka have allowed us to get to the position where it's now fairly low-friction for the data science team to roll out new capabilities with our data."
"If you want more compute, it just shows up out of nowhere. With Confluent, it’s taken care of."