Kafka Topic
Apache Kafka is an event streaming platform that:
- Publishes (writes) and subscribes to (reads) streams of events from sources like databases, cloud services, and software applications.
- Stores these events durably and reliably for as long as you want.
- Processes and reacts to the event streams in real-time and retrospectively.
Those events are organized and durably stored in topics. These topics are then partitioned over a number of buckets located on different Kafka brokers.
Event streaming thus ensures a continuous flow and interpretation of data so that the right information is at the right place, at the right time for your key use cases.
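To make the topic concept concrete, here is a minimal producer sketch. It assumes the kafka-python client and a broker at localhost:9092; the topic and event values are hypothetical and not part of this connector's configuration.

```python
from kafka import KafkaProducer
import json

# Connect to the cluster via its bootstrap servers.
producer = KafkaProducer(
    bootstrap_servers=["localhost:9092"],
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Each event is produced to a named topic; Kafka assigns it to one of the
# topic's partitions (events with the same key land in the same partition).
producer.send("orders", key=b"customer-42", value={"item": "widget", "qty": 3})
producer.flush()  # block until the broker acknowledges the event
```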
The Kafka Topic destination supports batch and real-time syncs.
The following table outlines the mandatory and optional parameters you will find on the Destination tab (Image 1).
Destination Details
The following parameters will help to define your data sync destination and how it functions.
Parameter | Description | Example |
---|---|---|
Destination | Mandatory. Select your destination from the drop-down menu. | Kafka Topic |
Bootstrap Servers | Mandatory. Bootstrap Servers are a list of host/port pairs used to establish the initial connection to the Kafka cluster. This parameter should be a comma-separated list of "broker host" or "host:port" values. | localhost:9092,another.host:9092 |
Topic Name | Mandatory. The name of the Kafka Topic that messages will be produced to. | |
Use SSL | Optional. Check this if you want to connect to Kafka over SSL. | |
SASL Mechanism | Mandatory. Select the SASL (Simple Authentication and Security Layer) mechanism to use for authentication: None, PLAIN, SCRAM-SHA-256, SCRAM-SHA-512, OAUTHBEARER (default), or OAUTHBEARER (OIDC). | |
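These parameters correspond to standard Kafka client settings. As a rough sketch of how they map onto a producer configuration (kafka-python shown purely for illustration; the connector manages the connection itself, and the server list, credentials, and topic name below are placeholders):

```python
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092,another.host:9092",  # Bootstrap Servers
    security_protocol="SASL_SSL",                          # Use SSL checked
    sasl_mechanism="SCRAM-SHA-256",                        # SASL Mechanism
    sasl_plain_username="sync-user",                       # placeholder credentials
    sasl_plain_password="sync-password",
)
producer.send("my-sync-topic", b'{"Name": "example"}')     # Topic Name
producer.flush()
```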
Column Mapping
The Column Mapping section is where you define which source columns you want to sync to which destination columns. You can repeat the values for multiple columns.
The names you specify in the "Target Column" field become attributes in a JSON payload that is constructed and pushed to Kafka (see the sketch after the table below). The name of each target column can be whatever you choose, but we recommend maintaining a consistent naming convention across columns for simplicity.
Parameter | Description | Example |
---|---|---|
Source Column | Mandatory. The name of your column as it appears in the source. | Name |
Target Column | Mandatory. The name of your column as it appears in the destination. | Name |
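As a minimal sketch of the mapping behavior described above (the column names are hypothetical), each Target Column becomes an attribute in the JSON payload produced to the topic:

```python
import json

# Hypothetical mapping: source column -> target (JSON attribute) name.
column_mapping = [
    {"source_column": "Name", "target_column": "Name"},
    {"source_column": "Email", "target_column": "ContactEmail"},
]

source_row = {"Name": "Ada Lovelace", "Email": "ada@example.com"}

# Rename each mapped source column to its target to build the payload.
payload = {m["target_column"]: source_row[m["source_column"]] for m in column_mapping}

print(json.dumps(payload))
# {"Name": "Ada Lovelace", "ContactEmail": "ada@example.com"}
```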

Image 2: Define your Destination
- If you are running a batch sync, click Jobs > Start a Job to begin your sync.
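Once the job finishes, one way to spot-check the results is to consume the topic and inspect the payloads; a kafka-python sketch (the broker address and topic name are placeholders):

```python
from kafka import KafkaConsumer
import json

consumer = KafkaConsumer(
    "my-sync-topic",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",   # read from the beginning of the topic
    consumer_timeout_ms=5000,       # stop iterating when no new messages arrive
)
for message in consumer:
    print(json.loads(message.value))
```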