In August, Aiven organized an Apache Kafka meetup in Helsinki to discuss hot topics surrounding Kafka.
Due to its popularity, we decided to make it an ongoing event and held our second meetup yesterday.
This time, around 30 people gathered at Lifeline Ventures' office in downtown Helsinki to hear two presenters:
- Niklas Nylund of Paf, an operator of slots, lotteries, poker and casino games as well as betting both online and in casinos.
- Heikki Nousiainen, our CTO
Apache Kafka reduces the spaghetti
Paf collects Change Data Capture (CDC) events from its databases and sends them to Apache Kafka; the events are then consumed and imported into Kudu for analytics work. Although Kafka has its quirks, Nylund is satisfied with how Paf has been able to use it to streamline their architecture.
Or, as he vividly put it, "Reducing the spaghetti."
Check out his presentation slides here to get a thorough understanding of Paf's integration process from beginning to end.
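To make the pipeline concrete, here is a minimal sketch (not Paf's actual code) of the pattern described above: wrap a row-level database change in a CDC-style envelope and publish it to a Kafka topic, from which a downstream consumer would import it into Kudu. The topic name, field names, and broker address are all illustrative assumptions.

```python
import json
import time


def make_cdc_event(table, op, row):
    """Wrap a row-level change in a simple CDC envelope (hypothetical schema)."""
    return {
        "table": table,                  # source table the change came from
        "op": op,                        # "insert", "update" or "delete"
        "row": row,                      # the changed row as a dict
        "ts": int(time.time() * 1000),   # event timestamp in milliseconds
    }


def publish(event, topic="cdc.events", bootstrap="localhost:9092"):
    """Send one CDC event to Kafka; requires a running broker.

    Uses the kafka-python client (pip install kafka-python).
    """
    from kafka import KafkaProducer

    producer = KafkaProducer(
        bootstrap_servers=bootstrap,
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )
    producer.send(topic, event)
    producer.flush()  # block until the event is actually delivered
```

A Kudu importer on the other side of the topic would simply consume these JSON envelopes and apply each `op` to the corresponding analytics table.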
Apache Kafka Connect simplifies integration
Using Python code to interact with the services, Nousiainen demonstrated how easily data can be pushed and pulled between Kafka and external systems.
With a large number of available connectors from the Kafka community, integrating Kafka with other systems can be quite straightforward.
This in turn enables quick wins and a gradual migration toward real-time stream analytics with a Kafka-centric architecture.
Check out Nousiainen's presentation slides to get a better idea of the use cases for Kafka Connect.
Join the next Kafka discussion
As the transition from monolithic to microservices architectures continues, the case for adopting Kafka as a streaming platform will only strengthen.
This is evidenced by the growing attendance at our events, where many developers, whether they use Kafka or not, gather to learn what Kafka is, what its use cases are, and the best practices for implementing it.