Apache Kafka users need a quick and easy way to view the state of their message queues and see if recent messages are being ingested correctly. Previously, you’d have needed external tools, monitoring systems, or APIs to view messages residing on a specific Kafka topic.
Now you can view Kafka messages live in the Aiven Console.
Viewing the message queues dynamically in the console allows Aiven for Apache Kafka users to get a feel for their data, as well as troubleshoot their event streams, message content, and schemas.
The message-viewing feature lets you fetch messages by partition and offset, with a configurable timeout and maximum fetch size. This adds simplicity and provides a degree of granularity useful for troubleshooting complex message queues.
How to use Aiven Console to produce and view messages
Messages written to Aiven for Apache Kafka topics appear under the Topics section of your Kafka instance in the Console. Let’s generate a few messages to show how the feature works.
Viewing messages from the console
NOTE: You’ll need an Aiven for Apache Kafka Business-4 plan or better to see Kafka messages in Aiven Console. And you’ll need to enable Kafka REST API to use the feature, as follows:
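Enabling Kafka REST doesn't have to happen in the browser. As a minimal sketch, assuming you have the Aiven CLI (`avn`) installed and authenticated, and using `demo-kafka` as a placeholder service name:

```shell
# Enable the Kafka REST API on an existing Aiven for Apache Kafka service.
# "demo-kafka" is a placeholder -- substitute your own service name.
avn service update demo-kafka -c kafka_rest=true
```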
In the “Topics” tab on Aiven Console, create a couple of topics, for example jsontest.
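If you prefer the command line, topics can also be created with the Aiven CLI. A sketch, again assuming a placeholder service name of `demo-kafka`:

```shell
# Create a topic named "jsontest" with 3 partitions and replication factor 2.
# "demo-kafka" is a placeholder -- substitute your own service name.
avn service topic-create demo-kafka jsontest --partitions 3 --replication 2
```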
Now, let’s try producing and viewing messages directly from the jsontest topic within Aiven Console. First, click the expand icon in the jsontest topic row; this opens the message view window.

Next, press the produce button to invoke the Produce Message dialog. Enter a valid JSON message, and hit the produce button. Once you are looking at the message view again, just hit the fetch button to update the list.
Here’s how that looks altogether:
Of course, the messages would also appear here if you’d piped them in from a producer, or used, for example, curl to generate new messages to your topic. (You’d simply press the fetch button to refresh the list.)
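As a sketch of the curl approach: the request below targets a placeholder service URI and credentials (substitute your own), and the payload follows the Confluent REST Proxy v2 format that Aiven's Kafka REST implements:

```shell
# Produce one JSON message to the "jsontest" topic over the Kafka REST API.
# The host, port, and credentials are placeholders -- use your service's own.
curl -s -X POST \
  "https://avnadmin:yourpassword@demo-kafka-myproject.aivencloud.com:443/topics/jsontest" \
  -H "Content-Type: application/vnd.kafka.json.v2+json" \
  -d '{"records": [{"value": {"message": "hello from curl"}}]}'
```

After running this, pressing the fetch button in the message view should show the new record.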
The ability to produce and fetch messages from within the Aiven Console makes it a lot easier for Kafka users to test their clusters, event streams, and topic configurations. These troubleshooting tools remove some of the guesswork and extra tooling needed for users to get a feel for their data as it moves through their message queue.