Kafka Topic
Apache Kafka is an end-to-end event streaming platform that:
- Publishes (writes) and subscribes to (reads) streams of events from sources like databases, cloud services, and software applications.
- Stores these events durably and reliably for as long as you want.
- Processes and reacts to the event streams in real time and retrospectively.
Those events are organized and durably stored in topics. These topics are then partitioned over a number of buckets located on different Kafka brokers.
Event streaming thus ensures a continuous flow and interpretation of data so that the right information is at the right place, at the right time.
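To make the topic-and-partition model concrete, here is a minimal sketch using the open-source kafka-python client. The broker address (localhost:9092) and topic name (example-topic) are placeholders, not values from this page. Records sent with the same key are routed to the same partition:

```python
from kafka import KafkaProducer

# Connect to a Kafka broker (address is a placeholder).
producer = KafkaProducer(bootstrap_servers="localhost:9092")

# Records with the same key always hash to the same partition,
# which is how a topic is spread across buckets on different brokers.
for i in range(3):
    future = producer.send("example-topic", key=b"user-1042", value=f"event {i}".encode())
    metadata = future.get(timeout=10)  # wait for the broker to acknowledge the write
    print(f"partition={metadata.partition} offset={metadata.offset}")

producer.flush()
```

Because the key is constant, all three writes should report the same partition number.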
Example Use Case: You currently use Kafka to store the metrics for user logins, but being stuck in the Kafka silo means that you can't easily use this data across a range of business use cases or teams. You can use a batch sync to liberate your data into Cinchy.
The Kafka Topic source supports batch syncs.
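On the producing side of that use case, login metrics might be written to the topic as JSON events. The sketch below is illustrative only: the broker address, the user-login-metrics topic name, and the event fields are all assumptions, with kafka-python again used as the client:

```python
import json
from kafka import KafkaProducer

# Serialize each event dict to a UTF-8 JSON payload.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# A hypothetical login-metric event; a data sync would later map
# these fields to target columns.
event = {
    "user_id": 1042,
    "event": "login",
    "timestamp": "2024-01-01T00:00:00Z",
    "success": True,
}
producer.send("user-login-metrics", value=event)
producer.flush()  # ensure the event is delivered before exiting
```

Keeping each event as self-describing JSON makes it straightforward to map message fields to sync columns in the Schema section further down.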
You can review the parameters found in the Info tab below (Image 1).
| Parameter | Description | Example |
| --- | --- | --- |
| Title | Mandatory. Input a name for your data sync. | Website Metrics |
| Version | Mandatory. This is a pre-populated field containing a version number for your data sync. You can override it if you wish. | 1.0.0 |
| Parameters | Optional. Review our documentation on Parameters for more information about this field. | |
The following table outlines the mandatory and optional parameters you will find on the Source tab.
The following parameters will help to define your data sync source and how it functions.
| Parameter | Description | Example |
| --- | --- | --- |
| Source | Mandatory. Select your source from the drop down menu. | Kafka Topic |
The Schema section is where you define which source columns you want to sync in your connection. You can repeat the values for multiple columns.
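For instance, assuming the topic carries JSON login events like the hypothetical one produced earlier, each top-level field is a candidate source column (user_id, event, timestamp, success):

```json
{
  "user_id": 1042,
  "event": "login",
  "timestamp": "2024-01-01T00:00:00Z",
  "success": true
}
```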
You have the option to add a source filter to your data sync. Please review the documentation here for more information on source filters.
Next steps:

- Configure your Destination.
- Define your Sync Actions.
- Add in your Post Sync Scripts, if required.
- Define your Permissions.
- To run a real-time sync, set up your Listener Config and enable it to begin your sync.
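Once the sync is enabled, a quick sanity check is to read the topic back and confirm events are arriving. This minimal consumer sketch makes the same assumptions as the earlier snippets (a local broker, the hypothetical user-login-metrics topic, and the kafka-python client):

```python
import json
from kafka import KafkaConsumer

# Read the topic from the beginning and stop after 5 seconds of silence.
consumer = KafkaConsumer(
    "user-login-metrics",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",
    consumer_timeout_ms=5000,
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

for message in consumer:
    # Each record carries its partition, offset, and deserialized payload.
    print(message.partition, message.offset, message.value)
```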