This talk was given by Daniel Rossos, a Software Engineer Intern at IBM.
As event-driven architecture increasingly dominates industry data pipelines, a variety of infrastructure patterns have emerged. By using Kafka and Apache Ignite hosted on an IBM OpenShift cluster, we were able to build a real-time credit-card-transaction processor.
This pattern uses Kafka for processing and mapping the credit-card data. It also uses GridGain Kafka Connect for syncing and sourcing data between Apache Ignite and Kafka, and uses Apache Ignite not only as a database but also to run an in-memory continuous query that flags fraudulent transactions. These flagged transactions are then placed into a separate cache and, by way of a sync connector, are transferred back into Kafka under a “fraud” topic.
This pattern combines the event-processing power of Kafka with the in-memory computing and querying speed of Apache Ignite.
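To make the flagging step concrete, the sketch below shows what the in-memory continuous query could look like with Ignite's Java API. The cache names, the `Transaction` class, and the `isSuspicious` rule are illustrative assumptions, not details from the talk; the actual fraud heuristics and configuration were not described in this abstract.

```java
import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteCache;
import org.apache.ignite.Ignition;
import org.apache.ignite.cache.query.ContinuousQuery;
import org.apache.ignite.cache.query.QueryCursor;

import javax.cache.Cache;
import javax.cache.event.CacheEntryEvent;
import java.math.BigDecimal;

public class FraudDetector {
    public static void main(String[] args) {
        // Join the Ignite cluster as a client node (config path is deployment-specific).
        Ignite ignite = Ignition.start("ignite-client-config.xml");

        // Hypothetical cache names: one cache fed from Kafka, one for flagged transactions.
        IgniteCache<String, Transaction> txCache = ignite.getOrCreateCache("transactions");
        IgniteCache<String, Transaction> fraudCache = ignite.getOrCreateCache("fraud-transactions");

        ContinuousQuery<String, Transaction> qry = new ContinuousQuery<>();

        // The local listener fires for every entry streamed into the transactions cache
        // (e.g. by a Kafka connector) and copies suspicious entries into the fraud cache,
        // which a connector can then forward back to Kafka as a "fraud" topic.
        qry.setLocalListener(events -> {
            for (CacheEntryEvent<? extends String, ? extends Transaction> e : events) {
                Transaction tx = e.getValue();
                if (isSuspicious(tx)) {
                    fraudCache.put(e.getKey(), tx);
                }
            }
        });

        // Keep the cursor open so the continuous query stays active.
        QueryCursor<Cache.Entry<String, Transaction>> cursor = txCache.query(qry);
    }

    // Placeholder rule; the real detection logic is not part of this abstract.
    private static boolean isSuspicious(Transaction tx) {
        return tx.amount.compareTo(new BigDecimal("10000")) > 0;
    }

    // Minimal transaction record used only for this sketch.
    static class Transaction {
        String cardNumber;
        BigDecimal amount;
        long timestamp;
    }
}
```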
This session walks through the process of creating this architecture, from writing the Apache Ignite Helm chart to configuring the continuous query. We discuss what we have learned, as well as the next steps for building on a design like this.