Kafka Topic

Overview

Apache Kafka is a distributed event streaming platform that:

  • Publishes (writes) and subscribes to (reads) streams of events from sources like databases, cloud services, and software applications.

  • Stores these events durably and reliably for as long as you want.

  • Processes and reacts to the event streams in real-time and retrospectively.

Those events are organized and durably stored in topics. These topics are then partitioned over a number of buckets located on different Kafka brokers.

Event streaming thus ensures a continuous flow and interpretation of data so that the right information is at the right place, at the right time for your key use cases.
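
To make the topic model concrete, here is a minimal sketch of publishing an event to a topic, assuming the confluent-kafka Python client; the broker address, topic name, and login-metric payload are illustrative placeholders rather than values from this documentation.

```python
# Minimal sketch: publish one event to a Kafka topic.
# Assumes the confluent-kafka Python client; the broker, topic, and payload
# below are placeholders, not values from this documentation.
import json

from confluent_kafka import Producer

producer = Producer({"bootstrap.servers": "localhost:9092"})

# Each event is appended to the topic; Kafka routes it to one of the
# topic's partitions (by key hash when a key is supplied).
event = {"user": "jdoe", "action": "login", "timestamp": "2024-01-01T00:00:00Z"}
producer.produce("user-login-metrics", key="jdoe", value=json.dumps(event))

# Block until queued messages have been delivered to the brokers.
producer.flush()
```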

Example use case

You currently use Kafka to store the metrics for user logins, but being stuck in the Kafka silo means that you can't easily use this data across a range of business use cases or teams. You can use a real-time sync to liberate your data into Cinchy.

The Kafka Topic source supports real-time syncs.
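
For illustration only, the sketch below shows the consuming side of this scenario with the confluent-kafka Python client: a subscriber reads each new login-metric event from the topic as it arrives, which is conceptually what a real-time sync does before writing the data into Cinchy. The broker address, topic name, and consumer group are hypothetical, and this is not Cinchy's internal implementation.

```python
# Illustrative consumer loop: read login-metric events as they arrive.
# Assumes the confluent-kafka Python client; the broker, topic, and group id
# are placeholders. This is not Cinchy's implementation.
import json

from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "login-metrics-sync",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["user-login-metrics"])

try:
    while True:
        msg = consumer.poll(1.0)   # wait up to 1 second for a new message
        if msg is None:
            continue
        if msg.error():
            raise RuntimeError(msg.error())
        event = json.loads(msg.value())
        print(event)               # a sync would map this record to target columns
finally:
    consumer.close()
```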

Info tab

The Info tab contains the following parameters (Image 1).

Values

Parameter   | Description | Example
Title       | Mandatory. Input a name for your data sync. | Website Metrics
Variables   | Optional. Review our documentation on Variables here for more information about this field. |
Permissions | Data syncs are role-based access systems where you can give specific groups read, write, execute, or admin (all of the above) access. Inputting at least an Admin Group is mandatory. |

Source tab

The following table outlines the mandatory and optional parameters you will find on the Source tab (Image 2).

The following parameters will help to define your data sync source and how it functions.

Parameter | Description | Example
Source    | Mandatory. Select your source from the drop-down menu. | Kafka Topic

Next steps
