Create a Debezium source connector from MongoDB to Apache Kafka®
The Debezium source connector captures changes from a MongoDB replica set or sharded cluster and writes them to an Apache Kafka® topic in a standard format, where they can be transformed and read by multiple consumers.
Aiven supports multiple Debezium versions through multi-version support, including versions 1.9.7, 2.5.0, 2.7.4, and 3.1.0.
Debezium 2.5 introduced changes to connector configuration and behavior. To prevent
unintentional upgrades during maintenance updates, pin the connector version using the
plugin_versions configuration property.
For details, see Manage connector versions.
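As a sketch of what pinning can look like with the Aiven CLI, assuming the plugin_versions property accepts a list of plugin_name/version pairs as described in Manage connector versions, and using kafka-demo as a placeholder service name:

# Pin the Debezium connector plugin to a specific version
avn service update kafka-demo \
  -c plugin_versions='[{"plugin_name": "debezium", "version": "2.5"}]'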
If you use Debezium for PostgreSQL version 1.9.7 with the wal2json replication format,
do not upgrade to version 2.0 or later until you migrate to a supported format such as
pgoutput.
To upgrade from version 1.9.7, use multi-version support to test your configuration before applying changes in production.
For further assistance, contact Aiven support.
Prerequisites
To configure a Debezium source connector for MongoDB, you need either an Aiven for Apache Kafka service with Apache Kafka Connect enabled or a dedicated Aiven for Apache Kafka Connect cluster.
You can view the full set of available parameters and configuration options in the connector's documentation.
Before you begin, gather the necessary information about your source MongoDB database:
- MONGODB_HOST: The database hostname
- MONGODB_PORT: The database port
- MONGODB_USER: The database user to connect
- MONGODB_PASSWORD: The database password for the MONGODB_USER
- MONGODB_DATABASE_NAME: The name of the database to include in the replication
- MONGODB_REPLICA_SET_NAME: The name of MongoDB's replica set
- APACHE_KAFKA_HOST: The hostname of the Apache Kafka service, only needed when using Avro as data format
- SCHEMA_REGISTRY_PORT: The Apache Kafka schema registry port, only needed when using Avro as data format
- SCHEMA_REGISTRY_USER: The Apache Kafka schema registry username, only needed when using Avro as data format
- SCHEMA_REGISTRY_PASSWORD: The Apache Kafka schema registry user password, only needed when using Avro as data format
With Aiven for Apache Kafka, you can gather the necessary Apache Kafka details from the service's Overview page in the Aiven Console or by using the avn service get command with the Aiven CLI.
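For example, a minimal Aiven CLI call to print the service details, assuming kafka-demo is the name of your Aiven for Apache Kafka service:

# Print service details, including hostname and port, as JSON
avn service get kafka-demo --json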
Set up a MongoDB Debezium source connector with Aiven Console
The following example demonstrates how to set up a Debezium source connector from a MongoDB database to Apache Kafka using the Aiven Console.
Define a Kafka Connect configuration file
Create a configuration file named debezium_source_mongodb.json with the following
connector configurations. While optional, creating this file helps you organize your
settings in one place and copy/paste them into the
Aiven Console later.
{
    "name": "CONNECTOR_NAME",
    "connector.class": "io.debezium.connector.mongodb.MongoDbConnector",
    "mongodb.hosts": "MONGODB_REPLICA_SET_NAME/MONGODB_HOST:MONGODB_PORT",
    "mongodb.name": "MONGODB_DATABASE_NAME",
    "mongodb.user": "MONGODB_USER",
    "mongodb.password": "MONGODB_PASSWORD",
    "tasks.max": "NR_TASKS",
    "key.converter": "io.confluent.connect.avro.AvroConverter",
    "key.converter.schema.registry.url": "https://APACHE_KAFKA_HOST:SCHEMA_REGISTRY_PORT",
    "key.converter.basic.auth.credentials.source": "USER_INFO",
    "key.converter.schema.registry.basic.auth.user.info": "SCHEMA_REGISTRY_USER:SCHEMA_REGISTRY_PASSWORD",
    "value.converter": "io.confluent.connect.avro.AvroConverter",
    "value.converter.schema.registry.url": "https://APACHE_KAFKA_HOST:SCHEMA_REGISTRY_PORT",
    "value.converter.basic.auth.credentials.source": "USER_INFO",
    "value.converter.schema.registry.basic.auth.user.info": "SCHEMA_REGISTRY_USER:SCHEMA_REGISTRY_PASSWORD"
}
The configuration file contains the following entries:

- name: The connector name. Replace CONNECTOR_NAME with the name you want to use for the connector.
- MONGODB_HOST, MONGODB_PORT, MONGODB_DATABASE_NAME, MONGODB_USER, MONGODB_PASSWORD and MONGODB_REPLICA_SET_NAME: Source database parameters collected in the prerequisite phase.
- tasks.max: Maximum number of tasks to execute in parallel. By default this is 1; the connector can use at most one task per defined collection. Replace NR_TASKS with the number of parallel tasks, based on the number of input collections.
- key.converter and value.converter: Define the message data format in the Apache Kafka topic. The io.confluent.connect.avro.AvroConverter converter pushes messages in Avro format. Message schemas are stored in Aiven's Karapace schema registry, as specified by the schema.registry.url parameter and related credentials.

Note: The key.converter and value.converter sections are only needed when pushing data in Avro format. Without them, messages default to JSON format.

Note: USER_INFO is not a placeholder and does not require any parameter substitution.
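As a sketch of the JSON-format alternative mentioned in the note above, the same connector can be defined without any converter settings. Messages are then written in JSON format, and no schema registry parameters are needed:

{
    "name": "CONNECTOR_NAME",
    "connector.class": "io.debezium.connector.mongodb.MongoDbConnector",
    "mongodb.hosts": "MONGODB_REPLICA_SET_NAME/MONGODB_HOST:MONGODB_PORT",
    "mongodb.name": "MONGODB_DATABASE_NAME",
    "mongodb.user": "MONGODB_USER",
    "mongodb.password": "MONGODB_PASSWORD",
    "tasks.max": "NR_TASKS"
}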