"With Kafka, we can liberate data from our heritage systems and combine it with real-time signals from our customers to deliver a hyper-personalized experience. For example, if our clickstream data shows a customer lingering over a product they looked at in the past, we can push a voucher to them in real time to create more compelling propositions for customers. You just can’t do that with a data lake and batch processing."
"As we bring in more private data, the security and monitoring capabilities of Confluent Platform are becoming more and more important. We are very bullish on using Confluent Control Center to improve the monitoring of our pipelines, from source to target."
"From a technology perspective at KeyBank, we want to continue to invest in new platforms that deliver new capabilities and improve our ability to deliver faster. When we saw the demand for harnessing data in motion growing, working with Confluent and investing in our center of excellence enabled us to implement a solution in a way that best benefits the enterprise by reducing complexity, costs, and time to market."
"By capturing events all along the food value chain, processing those events in Confluent Cloud and sharing insights with interested parties throughout the chain, we are not only increasing the value we provide but also making food production as a whole more intelligent and efficient."
"Confluent Cloud has become a vital service for us—the 100% availability and uptime have significantly benefited both our organization and our valued customers."
“The visual representation of data sources and topics with Stream Lineage is very helpful because a lot of the time, the idea of streaming is new, and this makes it possible to have discussions about where data is coming from and how it’s moving. I can’t stress enough how nice it is to have it be stable, always work, and always be there.”
“Notion grew by millions of members in 2021, driven by increasing demand for remote collaboration tools during the pandemic. But we realized that our legacy messaging architecture that powered event logging wasn’t going to scale with this kind of rapid growth. At the same time, we needed to optimize costs to ensure long-term sustainability.”
“We wanted to be able to transform the data into the shape we needed by building our own connectors to other locations that weren’t supported by our existing tools.”
“When we're using services like Confluent for data delivery, we don't need to think about it. It just gets delivered.”
"Confluent and Google Cloud enabled us to address our large database footprint and retire our legacy data platform, which was in many ways our Achilles heel. After moving to real-time streaming on a cloud-based modern architecture, we can now deliver new features and capabilities to our customers and know that they won’t be slowed by an outdated architecture."
"Confluent has become a linchpin in harnessing Apache Kafka® for improved operational transparency and timely data-driven decisions."
“Think of Confluent as a Swiss Army knife: it can do so many things in terms of data processing. We want to reduce latency and increase data availability as much as we can, and we know we can scale even further with Confluent.”
"Using Confluent’s out-of-the-box tools really helped us to focus on building new things, rather than solving problems in-house that have already been solved for us at scale by Confluent."
"It was a massive project, and a highly reliable message brokering infrastructure that could scale just by adding more nodes as our business needs changed was key to the entire effort."
"When we started, Kafka was a new technology to us, and one that we had decided to use for a very critical application in our system. With Confluent we felt supported in our decision and we knew we had the right level of expertise to get prepared and to help if we encountered any issues. That was a key element in our success."