The top requirements for building performant real-time applications

Olivier de Garrigues from Lenses.io outlines a few things that engineering teams should consider when building and launching real-time apps — particularly in our world of fierce, fast innovation.

05 August 2020
Olivier de Garrigues, Head of Alliances at Lenses.io

A real-time application is a program that sends data to the user in what is perceived to be real-time. These apps underpin nearly every aspect of our daily lives — from consumer apps that allow us to order and track a ride to business apps that let us collaborate in real-time.

When you see exactly where your Uber driver is after requesting a ride, you’re using a real-time app. When you’re collaborating within a Google Doc, you guessed it — real-time app.

The teams behind these applications include some of the best engineers in the world, but talent is only one piece of the puzzle. To succeed and pull ahead of the competition, you need to combine it with carefully selected, innovative technology and best practices.

From architecture choices and technology considerations to data governance, let’s look at five key requirements when building best-in-class, real-time applications in greater detail.

Requirements for building successful real-time applications

1. Use a microservice architecture

Real-time apps most commonly run within a microservices architecture, a software development method where apps are structured as loosely coupled services. This is common practice for good reason — the architecture is more flexible, scalable, and reliable than a monolithic one.

Microservice architectures provide greater speed and scale for several reasons. For instance, you can more easily add, configure, and remove components with little to no downstream impact because they aren’t delivered as part of a single, logical executable.

But every architecture comes with trade-offs; for microservices, the main one is latency. You can mitigate it with sound architectural choices and the right communications model and technology, such as publish-subscribe messaging and Apache Kafka.
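To make the publish-subscribe model concrete, here is a minimal in-process sketch of the pattern. This is illustrative only (Kafka provides the same decoupling durably and at scale); the topic name and services are hypothetical:

```python
from collections import defaultdict

class Broker:
    """Toy in-process publish-subscribe broker, for illustration only."""

    def __init__(self):
        # topic name -> list of subscriber callbacks
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        # Every subscriber receives the message; the publisher neither
        # knows nor cares who is listening, so services stay loosely coupled.
        for callback in self._subscribers[topic]:
            callback(message)

# Two services react to the same event without referencing each other.
broker = Broker()
received = []
broker.subscribe("ride-requested", lambda msg: received.append(("dispatch", msg)))
broker.subscribe("ride-requested", lambda msg: received.append(("billing", msg)))
broker.publish("ride-requested", {"rider": "alice"})
```

Because the dispatch and billing services only share a topic name, either one can be added, replaced, or removed without touching the other.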

2. Utilize an open source data streaming platform

To zero in on real-time data and translate it into business value, Apache Kafka is an infrastructure must-have due to its low latency, scalability, and fault-tolerance. Kafka acts as the central communication hub for your microservices, thereby simplifying your application.

Kafka enables real-time stream processing through the Kafka Streams API, which lets you transform and enrich your data on a per-record basis with millisecond latency, applying complex aggregations or joins of input streams and writing the results to an output stream of processed data.
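The Kafka Streams API itself is a Java library, but the per-record transform-and-join idea can be sketched in a few lines of Python. This is a simplified illustration of the processing logic, not the Streams API; the record shapes and field names are made up:

```python
def enrich(records, reference_table):
    """Per-record enrichment, as a streams topology would apply it:
    each input record is joined against a lookup table by key and
    emitted to the output one record at a time (no batching)."""
    for key, value in records:
        # join the incoming record with reference data keyed the same way
        extra = reference_table.get(key, {})
        yield key, {**value, **extra}

# a stream of ride events, enriched with rider-profile reference data
rides = [("alice", {"fare": 12.5}), ("bob", {"fare": 8.0})]
profiles = {"alice": {"tier": "gold"}}
out = list(enrich(rides, profiles))
```

Because each record flows through individually rather than in batches, the latency of the pipeline stays at the per-record level, which is what makes millisecond-scale processing possible.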

Apache Kafka also has a diverse ecosystem of connectors that allow you to quickly connect Kafka to other systems like databases, file systems, search indexes, and key-value stores.
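For a sense of what wiring up a connector looks like, here is a sketch of a Kafka Connect configuration for Confluent's JDBC sink connector, which streams records from a topic into a relational database. The connector name, topic, and connection URL are hypothetical, and a real deployment would need credentials and schema settings as well:

```json
{
  "name": "orders-jdbc-sink",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "topics": "orders",
    "connection.url": "jdbc:postgresql://db.example.com:5432/analytics",
    "auto.create": "true"
  }
}
```

The point is that moving data between Kafka and an external system is a matter of configuration, not custom integration code.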

But there's a catch: Kafka can be notoriously difficult to manage. Many users who initially deploy and self-manage Kafka clusters run into serious headaches as they scale. Often, they reach a point where they avoid updating their clusters for fear of the downstream impact.

That’s why we recommend using a managed Kafka service like Aiven for Kafka. A good managed service allows you to deploy clusters immediately in the cloud and region of your choice while also handling the nasty bits of management, such as when a node goes down.

3. Use self-healing nodes

Time spent babysitting your real-time app’s underlying data infrastructure is time not spent on building, deploying, and refining the app itself. Therefore, your team needs to be able to create Kafka clusters on demand without worrying about unhealthy nodes and surprise downtime.

Starting from zero? A managed Kafka service removes the inherent difficulty of setting up a self-hosted Kafka cluster. Already set up but struggling to scale? A good managed service will ensure high availability and security, and mitigate the unexpected costs of undetected failures.

A managed service takes care of countless mundane tasks such as replacing nodes when they go down (they will) and applying any patches/updates required for running the cluster within preselected maintenance windows.

4. Don’t forget about observability

Observability cuts two ways: you need to know that both your Kafka service and the data flowing through it are healthy. Aiven for Kafka provides free metrics and logging integrations for real-time, fine-grained performance and security insights out of the box.

On the data side, what does it mean to have "healthy data"? Kafka itself is still a black box, and the blind spot only grows as teams go remote and work in physical silos. To get a full overview of app health, teams need a purpose-built workspace for their data operations.

Lenses.io workspaces give you full data visibility from all angles. A Lenses workspace allows nearly anyone across the business who speaks SQL to troubleshoot a problem on a streaming application within minutes, rather than waiting days or weeks for a Kafka wizard to free up.

Lenses.io is fully compatible with Aiven for Kafka. This means teams can get a fully managed service that seamlessly and efficiently snaps into the foundation of their streaming apps, and then peer into their data pipelines and orchestrate access across teams as they scale.

5. Build in data governance

Mature governance practices lead to great real-time applications. An enterprise framework and features for governance help secure sensitive data and give your teams the ability to go to market faster with data-dependent experiences and app features.

To scale a Kafka project across your business, it is best to represent data governance controls visually in a UI for non-engineers. This is particularly useful in helping executive stakeholders sign off faster and avoiding slowdowns for mission-critical app teams.

Governance tools need to be in place before disaster strikes to reduce potential downtime. You can try to build monitoring and auditing features with your relatively small team, but you risk downtime while they work through that backlog; manual configuration is difficult and error-prone.

Lenses lets teams secure and audit data in any Kafka environment, create role-based groups, and set up data policies that automatically detect and redact metadata across real-time apps. Being able to see the trail of all actions across users gives decision-makers the confidence to open up their data platform across the business.

Aiven and Lenses.io

Lenses and Aiven have partnered to provide a full-spectrum, unified Kafka solution to ensure real-time apps make it to production in highly regulated environments and give them the best chance of success on the high, often turbulent seas of innovation.

When it comes to your application layer, you can trust Lenses to make app teams more efficient and productive. And when it comes to the stability and reliability of your infrastructure, you can trust Aiven to manage your Kafka deployments and assorted open-source data infrastructure.

