This page outlines the parameters that should be included in your listener configuration when setting up a real-time sync with a Kafka stream source. This process currently supports messages in **JSON string or AVRO serialized format.**
To listen in on multiple topics, you will need to configure multiple listener configs.
To set up a Stream Source, navigate to the Listener Config table and insert a new row for your data sync (Image 1). Most of the columns within the Listener Config table persist across all Stream Sources; however, exceptions will be noted. You can find all of these parameters and their relevant descriptions in the tables below.
Image 1: The Listener Config table
The following column parameters can be found in the Listener Config table:
The table below can be used to help create the Topic JSON needed to set up a real-time sync.
Example Topic JSON

```json
{
  "topicName": "<(mandatory) The Kafka topic name to listen for messages on>",
  "messageFormat": "<(optional) Put AVRO here if the messages are serialized in AVRO>"
}
```
The table below can be used to help create the Connection Attributes JSON needed to set up a real-time sync.
"bootstrapServers": "< (mandatory) kafka bootstrap servers in a comma-separated list in the form of host:port>",
"url": "<(optional) This is required if your data follows a schema when serialized in Avro. A comma-separated list of URLs for schema registry instances that are used to register or lookup schemas. >",
"basicAuthCredentialsSource": "<(optional) Specifies the Kafka configuration property "schema.registry.basic.auth.credentials.source" that provides the basic authentication credentials, this can be "UserInfo" | "SaslInherit">",
"basicAuthUserInfo": "<(optional) Basic Auth credentials specified in the form of username:password>",
"sslKeystorePassword": "<(optional) The client keystore (PKCS#12) password>"
"securityProtocol": "Plaintext | SaslPlaintext | SaslSsl"