- "When it comes to batch processing versus stream processing, both of them work. But one is a much more efficient way of doing it. A lot of the cost savings over the last twelve months have come from moving to a more efficient way of developing applications with stream processing."
- "With Confluent, it was easier for the team to set up, configure, and scale to support the increasing volume and throughput of the source systems we’re generating. This allowed us to build more microservices that are processing and aggregating data to deliver fit-for-purpose topics."
- "Confluent serves as a central nervous system connecting core ACERTUS systems, including our transportation management, title and registration, and car haul systems."
- "Confluent has been instrumental in the company’s ability to scale. It’s not unusual to have 25 million concurrent users for a live match, and on a daily basis seven to eight terabytes of data come into the platform. According to Disney+ Hotstar, powering so many use cases at this scale …"
- "Running Kafka ourselves was a bit of a no-go given our team size and requirements. As a team we decided we wanted to focus on the technical functions of the platform. We wanted to spend more energy on these functionalities, and leave other things to Confluent who can do it …"
- "Kafka scales way better than most other data-exchange platforms, as it doesn’t use HTTP for data exchange and offers blazingly fast throughput out-of-the-box."
- "The Connection Platform is a tremendous advantage for us – it’s the core of our IT landscape for the next 10 to 15 years. It connects applications with channels through one vehicle that assures coherence of data."
- "The Connection Platform is a game-changer for Generali. The data streaming solution we built with Attunity and Confluent allows us to replicate and stream data not in hours or days as in the past, but in a few seconds."
- "As an insurer, at the end of the day, we have always been in a data industry. Our specialty is using data to price risk, and we are pretty good at it. Now, what we are getting better at is using data to engage with our customers."
- "We shouldn’t care where the data sits. We should be able to share and move data seamlessly between environments. Streaming is the key. Otherwise you’re copying and shipping data and it’s a moment in time. And our strategy based on what we’re doing with Confluent Kafka is saying ‘the moment …"
- "With our Kafka-backed data pipeline, we are able to support our partners, who every year create more services, more features, more data instrumentation, and even more granular data than the year before."
- "Providing accurate and current data across the Trimble Platform requires streaming data pipelines that connect our internal services and data systems across the globe. Custom connectors will allow us to quickly bridge our in-house event service and Kafka without setting up and managing the underlying connector infrastructure. We …"
- "Confluent and Google Cloud enabled us to address our large database footprint and retire our legacy data platform, which was in many ways our Achilles heel. After moving to real-time streaming on a cloud-based modern architecture, we can now deliver new features and capabilities to our customers and know that …"
- "Confluent is a strategic platform for us. With every project we look at, we now think about how we use Confluent to move things around and join things together."
- "Running a modern bank serving fast-growing enterprises requires data in motion we can trust. Schema Registry allows us to provide customers with reliable, real-time experiences for tasks like loan processing while we maintain focus on furthering innovations. We’re excited about new discovery capabilities within Stream Governance which will enable …"