Everactive, a pioneer in a new category of batteryless Internet of Things technology, sought a pipeline for huge amounts of sensor data, a solution that they wouldn’t have to manage themselves. In Aiven they found a flexible, reliable and secure partner, and Aiven for Apache Kafka® now forms the backbone of their data infrastructure.
Case study highlights
- Groundbreaking green innovation in sensor technology.
- Using Apache Kafka as a message broker between IoT devices and long-term storage databases.
- Small team + big data: needed a managed solution.
"With Aiven, things just work better than they did before."
Principal Software Engineer at Everactive
Founded in 2012 with roots going back to pioneering M.I.T. research, Everactive provides an IoT platform based on self-powered sensor devices and low-power wireless communication. “We’re trying to remove the batteries from the Internet of Things,” says Carlos Olmos, Senior Principal Software Engineer at Everactive.
The company’s low-power devices run on small amounts of energy harvested from the environment, for example from solar panels, radio waves, thermal gradients and vibrations. The feats of electrical engineering that created their offering, however, are only a part of the story. They also have to provide their customers with a performant data pipeline from sensor to storage to visualization.
“It’s not enough just to have hardware,” says Rob Cook, Principal Software Engineer at Everactive. “For the system to be useful, you have to make it easy for people to utilize it, not just in terms of building other hardware on top of it but also consuming the streams of data that it generates.”
The company’s first application was a self-powered Steam Trap Monitoring solution, released in 2018, which reduces energy waste and carbon emissions. In 2020 they made available a Machine Health Monitoring solution that analyzes vibrations of industrial rotating machines like motors and compressors. These are definitely not just hardware solutions; they put data to work, employing innovative algorithms to extract insights that no other systems can.
When Everactive constructed their first data pipeline in 2014, they started with Apache NiFi and OpenTSDB. Very soon they outgrew this setup. “We weren’t able to automate anything in that system, and management was a pain,” says Rob. “Then we ran into performance issues, and upgrading into a cluster wasn’t feasible.” If they couldn’t manage one single server, how would they ever manage a whole cluster?
Next they tried out an Apache Pulsar cluster, but this had essentially the same problems. “In theory, it was inexpensive to run our own cluster, but we couldn’t both run it and do our actual jobs,” Rob explains. “And we couldn’t find anybody we could pay money to run it for us.”
In the meantime, performance issues were accumulating. Sensor installations were taking too long: each sensor’s signal had to travel through a bottlenecked system, and checking a single sensor could take up to 5 minutes.
At this point, too, the concept was fully commercialized and business was really taking off. They needed a solution. Now, at least, they had a much better idea of what they were looking for: a system that would be…
- … able to ingest huge amounts of data and pass it into a time-series database for processing.
- … easily scalable.
- … fault tolerant.
- … managed for Everactive by experts.
Fortunately, there was Apache Kafka®, a widely used solution available as a managed service.
"[With Kafka and Terraform] we’ve managed to automate almost everything: configuration, deployment, and maintenance."
Senior Principal Software Engineer at Everactive
Apache Kafka is designed for rapid high-volume throughput and is a staple of IoT and streaming architectures. It’s no wonder that Everactive found it suitable for their needs, too. Today, Aiven for Apache Kafka is very much at the heart of their data infrastructure. It receives time series data from millions of self-powered sensors and stores it temporarily. In the original use case, the events are then ingested into a PostgreSQL database where customer systems can retrieve them.
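The case study doesn’t detail how Everactive moves events from Kafka into PostgreSQL; one common pattern for this step is a Kafka Connect JDBC sink connector. A minimal sketch of such a connector configuration, with hypothetical topic, database and credential names:

```json
{
  "name": "sensor-readings-pg-sink",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "topics": "sensor-readings",
    "connection.url": "jdbc:postgresql://pg.example.com:5432/telemetry",
    "connection.user": "connect_user",
    "connection.password": "********",
    "insert.mode": "insert",
    "auto.create": "true",
    "pk.mode": "none"
  }
}
```

A configuration like this would be posted to the Kafka Connect REST API; each event on the topic then becomes a row in a PostgreSQL table, ready for customer systems to query.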
Everactive has moved forward from the initial concept, however. Now their Apache Kafka instance also serves data to their monitoring system and other consumers, thanks to readily available connectors.
Is it more expensive for Everactive? “Possibly,” Carlos says, “if you just count the wages of the staff. But every hour we spend trying to reboot a server is an hour we don't spend developing our core business.”
He continues: “Having our clusters managed by experts is also an insurance policy. Just one event, if it’s bad enough, can destroy the entire value you’ve built. If you’re buying a managed service, you know that there's a whole company devoted to preventing or fixing that event, and they’ll do it much more efficiently than we would do it internally.”
Everactive has now had their managed Apache Kafka service running for more than a year, and they are very happy with how things have turned out.
“With Aiven, things just work better than they did before,” says Rob.
According to Carlos, the clearest benefit is change management in the configuration of the system. “With Kafka, thanks to its APIs, and to the Terraform provider that Aiven put out there, we’ve managed to automate almost everything: configuration, deployment, and maintenance. The automation has given us a lot of speed in the development work. It’s also provided security, not just in terms of protecting against malicious things, but also in terms of not making a mistake. So we can change our configuration and scale up. We got rid of those performance problems!”
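As an illustration of what that automation can look like, here is a minimal sketch using the Aiven Terraform provider to declare a managed Kafka service and a topic; the project name, plan, region and topic are hypothetical, not Everactive’s actual configuration:

```hcl
terraform {
  required_providers {
    aiven = {
      source = "aiven/aiven"
    }
  }
}

provider "aiven" {
  api_token = var.aiven_api_token
}

# A managed Kafka service; plan and region are illustrative.
resource "aiven_kafka" "sensor_kafka" {
  project      = "everactive-demo" # hypothetical project name
  cloud_name   = "google-us-east1"
  plan         = "business-4"
  service_name = "sensor-kafka"
}

# A topic for incoming sensor readings.
resource "aiven_kafka_topic" "sensor_readings" {
  project      = aiven_kafka.sensor_kafka.project
  service_name = aiven_kafka.sensor_kafka.service_name
  topic_name   = "sensor-readings"
  partitions   = 6
  replication  = 3
}
```

With the whole service declared in code like this, configuration changes become reviewable pull requests and `terraform apply` handles the rollout, which is what makes the “automate almost everything” workflow possible.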
Another clear benefit is improved observability. “Before, it was really hard to understand the health of the system; we didn’t really know if it was about to be overloaded or not. It was only when a problem happened that we knew it was in bad shape.”
Thanks to the metrics that Apache Kafka offers, that’s now history. “We can react early when something is happening, and our DevOps person sleeps better at night. Besides, I can track only the metrics I’m interested in, and not get flooded with data I wasn’t asking for.”
And those sensor installations? Signal checking time has gone from 5 minutes to 1 second. The credit goes not only to Apache Kafka itself, but also to the new microservices that are easy to build on top of it.
Rob says innovation is easy with Apache Kafka. “We've been able to offer some additional services that we wouldn't have been able to offer before. For example, our customers can now receive their readings via webhook, and soon also via an MQTT streaming service. With Apache Kafka, we can automatically spin up services without any manual steps."
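The webhook delivery described above can be sketched as a small consumer-side service: read a record from a Kafka topic, shape it into a JSON payload, and POST it to the customer’s URL. The field names and endpoint below are illustrative assumptions, not Everactive’s actual API:

```python
import json
import urllib.request


def to_webhook_payload(record: dict) -> bytes:
    """Shape a raw sensor record into the JSON body for a customer webhook.

    The field names here are illustrative, not Everactive's actual schema.
    """
    payload = {
        "sensor_id": record["sensor_id"],
        "timestamp": record["timestamp"],
        "readings": record.get("readings", {}),
    }
    return json.dumps(payload).encode("utf-8")


def deliver(url: str, record: dict) -> None:
    """POST one record to a customer-supplied webhook URL."""
    req = urllib.request.Request(
        url,
        data=to_webhook_payload(record),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    # A production service would add retries, timeouts and authentication.
    urllib.request.urlopen(req)
```

In a real deployment, a loop over a Kafka consumer would call `deliver()` for each event on the topic; because the data is already flowing through Kafka, adding a service like this is just one more consumer group.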
Both Rob and Carlos are happy enough with their current setup to want more of the same. “We're always changing our systems, of course. Next we’re planning to put Aiven for Apache Kafka at the very center of everything, providing data to and from every endpoint in the system.”
"Having our clusters managed by experts is also an insurance policy."
Senior Principal Software Engineer at Everactive
Related case studies
Explore more customer success stories relevant to this one or browse all Aiven case studies
Energy and Utilities
OVO is a UK-based energy company. It uses an Apache Kafka-based data infrastructure to meet its sustainability goals.
Ometria's marketing platform delivers a consistent, personalized customer experience across many channels using Artificial Intelligence and PostgreSQL.
Software and Internet
NetSpyGlass offers advanced network monitoring automation. They use Terraform with their complex Apache Kafka and PostgreSQL-based data backend to simplify service management.
You might also like
What is Apache Kafka®?
Have you ever been confused by all this talk about Kafka and streaming? Get the basics in this post full of information and resources.
Introduction to event-driven architecture
Goodbye request-response, welcome producers and consumers! Read on to discover event-driven architecture, the best way to build microservice applications.