Confluent provides the industry’s only enterprise-ready Event Streaming Platform, driving a new paradigm for application and data infrastructure. With Confluent Platform you can leverage data as a continually updating stream of events rather than as discrete snapshots.
Kafka Project @ ERGO
In 2019, ERGO decided to implement Confluent Platform with its trusted implementation partner, beON consult.
Challenge
Improve ERGO's customer experience through data while staying compliant with industry regulations
Solution
beON implemented Confluent Platform for ERGO to stream data in real time, migrate to an event-based architecture and embark on a microservices transformation
What is Apache Kafka?
Apache Kafka® is a distributed streaming platform that:
- Publishes and subscribes to streams of records, similar to a message queue or enterprise messaging system.
- Stores streams of records in a fault-tolerant durable way.
- Processes streams of records as they occur.
Kafka is used for these broad classes of applications:
- Building real-time streaming data pipelines that reliably get data between systems or applications.
- Building real-time streaming applications that transform or react to the streams of data.
Source: Confluent
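The three capabilities above (publish/subscribe, durable storage, stream processing) all rest on one core abstraction: an append-only log per topic, with each consumer tracking its own read offset. A minimal in-memory sketch of that model, purely illustrative (a real Kafka cluster distributes partitioned, replicated logs across brokers and is accessed through a client library):

```python
# Toy model of Kafka's core abstraction: an append-only log per topic,
# with consumers that track their own read positions (offsets).
# Illustrative only -- not a Kafka client.

class Topic:
    def __init__(self):
        self.log = []                     # append-only stream of records

    def publish(self, record):
        self.log.append(record)           # "stores streams of records"
        return len(self.log) - 1          # offset of the new record

class Consumer:
    def __init__(self, topic):
        self.topic = topic
        self.offset = 0                   # each consumer keeps its own offset

    def poll(self):
        records = self.topic.log[self.offset:]   # records published since last poll
        self.offset = len(self.topic.log)        # advance past what was read
        return records

# Usage: two records are published, one consumer reads them in order.
claims = Topic()
claims.publish({"policy": "P-100", "event": "claim_opened"})
claims.publish({"policy": "P-100", "event": "claim_approved"})

reader = Consumer(claims)
print(reader.poll())   # both records, in publish order
print(reader.poll())   # [] -- nothing new since the last poll
```

Because consumers own their offsets, many independent applications can read the same stream at their own pace without the publisher knowing about any of them.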
What motivated ERGO to go for Kafka?
The transformation at ERGO …
- improves ERGO's customer experience with streaming data.
- capitalizes on the next generation of open-source, cloud-enabled, real-time and responsive application and software development.
- speeds up the design of new services to meet the needs of ERGO's diverse client base.
- helps the business lines bring up new services faster while helping the development teams navigate complex data ownership, security, cloud and regulatory compliance.
- enables the insurer to become a technology- and data-driven organization.
- integrates data from various backend systems into a forward cache, improving performance, simplifying consumer-facing application development, and reducing demand on costly mainframes.
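The last point describes the forward-cache pattern: change events from backend systems are streamed into a fast local store, so consumer-facing reads hit the cache instead of the mainframe. A hedged sketch of the idea, with hypothetical event and field names (not ERGO's actual schema):

```python
# Sketch of the forward-cache pattern: fold a stream of change events
# from a backend system into a local key-value view, so reads no longer
# hit the costly system of record. Event shape is illustrative.

def apply_event(cache, event):
    """Apply one backend change event to the cache."""
    key = event["policy_id"]
    if event["type"] == "deleted":
        cache.pop(key, None)              # record removed upstream
    else:                                 # "created"/"updated" carry full data
        cache[key] = event["data"]
    return cache

# A consumer-facing application would now read from `cache`
# instead of querying the mainframe directly.
cache = {}
stream = [
    {"policy_id": "P-100", "type": "created", "data": {"status": "active"}},
    {"policy_id": "P-100", "type": "updated", "data": {"status": "lapsed"}},
    {"policy_id": "P-200", "type": "created", "data": {"status": "active"}},
    {"policy_id": "P-200", "type": "deleted"},
]
for event in stream:
    apply_event(cache, event)

print(cache)   # {'P-100': {'status': 'lapsed'}}
```

Because the cache is rebuilt purely from the event stream, it can be dropped and replayed at any time, which is what makes the mainframe offload safe.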
What are the results for ERGO?
Streaming Data Empowers ERGO to Be Data-Driven
- Higher-quality, faster, and more scalable customer services
- Reduced mainframe costs
- Accelerated delivery of new solutions
- Establishment of a platform for future expansion
- Facilitated innovation
- Increased efficiency in application building
- Anomaly detection time lowered from weeks to seconds
- Data reuse across teams for relevant business insights
- Streaming across on-premises and public or private clouds
- Mission-critical reliability: streaming at enterprise scale, delivering sub-25 ms latency at GB/s throughput
- A secure ERGO streaming platform with enterprise-grade encryption, authentication, and authorization