Kafka Topic

1. Overview

Apache Kafka is a distributed event streaming platform that:

  • Publishes (writes) and subscribes to (reads) streams of events from sources like databases, cloud services, and software applications.

  • Stores these events durably and reliably for as long as you want.

  • Processes and reacts to the event streams in real-time and retrospectively.

Those events are organized and durably stored in topics. These topics are then partitioned over a number of buckets located on different Kafka brokers.
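Partitioning can be illustrated with a short sketch: the producer maps each event's key to one of the topic's partitions, so events with the same key always land in the same partition (and are read in order). The snippet below is a minimal illustration only; it uses `md5` as a deterministic stand-in for Kafka's actual default partitioner (a murmur2 hash of the key modulo the partition count), and `pick_partition` is a hypothetical name, not a Kafka API.

```python
import hashlib

def pick_partition(key: bytes, num_partitions: int) -> int:
    """Map an event key to a partition index.

    Kafka's default partitioner hashes the record key (murmur2) modulo
    the partition count; hashlib.md5 stands in here as a deterministic
    illustration, not Kafka's real hash function.
    """
    digest = hashlib.md5(key).digest()
    return int.from_bytes(digest[:4], "big") % num_partitions

# Events with the same key land in the same partition (bucket),
# which is what preserves per-key ordering across brokers.
assert pick_partition(b"order-42", 6) == pick_partition(b"order-42", 6)
```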

Event streaming thus ensures a continuous flow and interpretation of data so that the right information is at the right place, at the right time for your key use cases.

Prior to setting up your data sync destination, ensure that you've configured your Source.

The Kafka Topic destination supports batch and real-time syncs.

2. Destination Tab

The following table outlines the mandatory and optional parameters you will find on the Destination tab (Image 1). These parameters define your data sync destination and how it functions.

| Parameter | Description | Example |
| --- | --- | --- |
| Destination | Mandatory. Select your destination from the drop-down menu. | Kafka Topic |
| Bootstrap Servers | Mandatory. Bootstrap Servers are a list of host/port pairs used to establish the initial connection to the Kafka cluster. This parameter should be a comma-separated list of `host` or `host:port` entries. | localhost:9092,another.host:9092 |
| Topic Name | Mandatory. The name of the Kafka topic that messages will be produced to. | |
| Use SSL | Check this if you want to connect to Kafka over SSL. | |
| SASL Mechanism | Mandatory. Select the SASL (Simple Authentication and Security Layer) mechanism to use for authentication: None, PLAIN, SCRAM-SHA-256, SCRAM-SHA-512, OAUTHBEARER (default), or OAUTHBEARER (OIDC). | |
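The parameters above correspond to standard Kafka client connection settings. The following is a minimal sketch of how they might be assembled into a client configuration, assuming librdkafka-style property names (`bootstrap.servers`, `security.protocol`, `sasl.mechanism`); the `kafka_destination_config` helper is illustrative, not part of the product or any Kafka library.

```python
def kafka_destination_config(bootstrap_servers, use_ssl=False, sasl_mechanism=None):
    """Assemble Destination-tab settings into a Kafka client config dict.

    Illustrative helper only; property names follow librdkafka conventions.
    """
    config = {
        # Comma-separated host:port pairs for the initial cluster connection.
        "bootstrap.servers": ",".join(bootstrap_servers),
    }
    if sasl_mechanism:
        config["sasl.mechanism"] = sasl_mechanism
        # SASL authentication can run over SSL or plaintext transport.
        config["security.protocol"] = "SASL_SSL" if use_ssl else "SASL_PLAINTEXT"
    else:
        config["security.protocol"] = "SSL" if use_ssl else "PLAINTEXT"
    return config

cfg = kafka_destination_config(
    ["localhost:9092", "another.host:9092"],
    use_ssl=True,
    sasl_mechanism="SCRAM-SHA-512",
)
```

Note how "Use SSL" and "SASL Mechanism" jointly determine the security protocol: SSL alone gives `SSL`, SASL alone gives `SASL_PLAINTEXT`, and both together give `SASL_SSL`.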

4. Next Steps
