Sync Stream Sources are an integral part of the real-time sync experience in Connections, and must be configured prior to executing your data sync.
Stream Sources are configured and enabled via the Listener Configuration table or the Connections UI in Cinchy.
While a Stream Source can feasibly be configured at any point during the creation of a real-time data sync, we recommend that you do so as a final step after setting up your Source, Destination, and any other relevant additional settings.
Stream Sources are only used for real-time syncs.
A Salesforce Push Topic is a supported Sync Source that you can use in your Cinchy data syncs. The below documentation will refer to the parameters necessary to set up your Push Topic as part of your sync configuration.
You can use a Push Topic already configured in Salesforce, or have Cinchy Event Listener create the Push Topic for you.
When the listener starts, Cinchy compares the Topic JSON with the properties of the push topic in Salesforce, matching by name. If all of the attributes match, the listener starts listening on the push topic. If any of the attributes don't match, Cinchy syncs the push topic from Salesforce into Cinchy and disables the listener.
If the Push Topic name doesn't exist in Salesforce, Cinchy attempts to create the Push Topic. If it's successful, it syncs in the Id from Salesforce and starts listening on the push topic.
To set up a Stream Source, you must navigate to the Listener Config table and insert a new row for your data sync (Image 1). Most of the columns within the Listener Config table persist across all Stream Sources; however, exceptions will be noted. You can find all of these parameters and their relevant descriptions in the tables below.
The following column parameters can be found in the Listener Config table:
The below table can be used to help create your Topic JSON needed to set up a real-time sync.
Example Topic JSON
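The original example was lost from this extract. The sketch below is assembled from the PushTopic parameters documented on this page; the field names follow the Salesforce PushTopic object, but confirm the exact shape against your Cinchy version:

```json
{
  "Name": "LeadsTopic",
  "Query": "SELECT Id, Name, Email FROM Lead",
  "ApiVersion": "47.0",
  "NotifyForOperationCreate": true,
  "NotifyForOperationUpdate": true,
  "NotifyForOperationUndelete": true,
  "NotifyForOperationDelete": true,
  "NotifyForFields": "Referenced"
}
```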
The below table can be used to help create your Connection Attributes JSON needed to set up a real-time sync.
The MongoDB stream source works similarly to Cinchy's Change Data Capture functionality. The listener subscribes to the change stream of a specific collection in the database of the MongoDB server. Any actions performed on document(s) inside of that collection are picked up by the listener and sent to the queue.
In order to use change streams in MongoDB, there are a few requirements your environment must meet.
The database must be part of a replica set or sharded cluster.
The database must use the WiredTiger storage engine.
The replica set or sharded cluster must use replica set protocol version 1 (pv1).
To set up a Stream Source, you must navigate to the Listener Config table and insert a new row for your data sync (Image 1). Most of the columns within the Listener Config table persist across all Stream Sources; however, exceptions will be noted. You can find all of these parameters and their relevant descriptions in the tables below.
The following column parameters can be found in the Listener Config table:
The below table can be used to help create your Topic JSON needed to set up a real-time sync.
Example Topic JSON
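The original example was lost from this extract. The sketch below is assembled from the documented parameters and the described filter (documents with an ID between 0 and 10,000 AND the location set to Montreal, OR a 'delete' operation). The field names database, collection, and pipelineStages are assumptions, and the pipeline stage follows standard MongoDB $match syntax:

```json
{
  "database": "Cinchy",
  "collection": "Employee",
  "pipelineStages": [
    {
      "$match": {
        "$or": [
          {
            "$and": [
              { "fullDocument._id": { "$gte": 0, "$lte": 10000 } },
              { "fullDocument.location": "Montreal" }
            ]
          },
          { "operationType": "delete" }
        ]
      }
    }
  ]
}
```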
The below table can be used to help create your Connection Attributes JSON needed to set up a real-time sync.
Parameter | Description | Example
---|---|---|
Name | Mandatory. Provide a name for your Listener Config. | SF Push Topic Real-Time Sync |
Event Connector Type | Mandatory. Select your Connector type from the drop-down menu. | Salesforce Push Topic |
Topic | Mandatory. This field expects a JSON formatted value specific to the connector type you are configuring. | See the Topic tab. |
Connection Attributes | Mandatory. This field expects a JSON formatted value specific to the connector type you are configuring. | See the Connection Attributes tab. |
Status | Mandatory. This value refers to whether your listener config/real-time sync is turned on or off. Keep this set to Disabled until you are confident the rest of your data sync is properly configured. | Disabled |
Data Sync Config | Mandatory. This drop-down lists all of the data syncs on your platform. Select the one that you want to use for your real-time sync. | Salesforce Data Sync |
Subscription Expires On | This value is only relevant for Salesforce Stream Sources. This field is a timestamp that's auto-populated when it has successfully subscribed to a topic. | |
Message | Leave this value blank when setting up your configuration. This field will auto-populate during the running of your sync with any relevant messages, such as Cinchy listener is running or Listener is disabled. | |
Auto Offset Reset | Earliest, Latest or None. When the listener starts and there is either no last message ID or an invalid last message ID (because it was deleted or the listener is new), this column is used as a fallback to determine where to start reading events from. Earliest starts reading from the beginning of the queue (when CDC was enabled on the table); this may suit use cases that are recoverable or re-runnable and need to reprocess all events to ensure accuracy. Latest fetches the last value after whatever was last processed; this is the typical configuration. None won't start reading any events. You can switch between Auto Offset Reset types after your initial configuration: 1. Navigate to the Listener Config table. 2. Re-configure the Auto Offset Reset value. 3. Set the Status column of the Listener Config to Disabled. 4. Navigate to the Event Listener State table. 5. Find the row that pertains to your data sync's Listener Config and delete it. 6. Navigate back to the Listener Config table. 7. Set the Status column of the Listener Config to Enabled for your new Auto Offset Reset configuration to take effect. | Latest |
Parameter | Description | Example
---|---|---|
Id | The Id of the PushTopic in Salesforce. When Cinchy creates the Push Topic for you, this value is synced in from Salesforce. | |
Name | Mandatory. Descriptive name of the PushTopic. Note that there is a 25 character limit on this field. | LeadsTopic |
Query | Mandatory. The SOQL query statement that determines which record changes trigger events to be sent to the channel. Note that there is a 1,300 character limit on this field. | SELECT Id, Name, Email FROM Lead |
ApiVersion | Mandatory. The API version to use for executing the query specified in Query. It must be an API version greater than 20.0. If your query applies to a custom object from a package, this value must match the package's ApiVersion. | 47.0 |
NotifyForOperationCreate | Set this to true if a create operation should generate a notification, otherwise false. Defaults to true. | true |
NotifyForOperationUpdate | Set this to true if an update operation should generate a notification, otherwise false. Defaults to true. | true |
NotifyForOperationUndelete | Set this to true if an undelete operation should generate a notification, otherwise false. Defaults to true. | true |
NotifyForOperationDelete | Set this to true if a delete operation should generate a notification, otherwise false. Defaults to true. | true |
NotifyForFields | Specifies which fields are evaluated to generate a notification. Possible values are: All, Referenced (default), Select, Where. | Referenced |
Parameter | Description | Example
---|---|---|
ApiVersion | Mandatory. Your Salesforce API Version. Note that this needs to be an exact match; for instance, "47.0" can't be written as simply 47. | 47.0 |
GrantType | This value should be set to password. | password |
ClientId | The encrypted Salesforce Client ID. You can encrypt this value using the Cinchy CLI. | Bn8UmtiLydmYQV6//qCL5dqfNUMhqchdk959hu0XXgauGMYAmYoyWN8FD+voGuMwGyJa7onrc60q1Hu6QFsQXHVA== |
ClientSecret | The encrypted Salesforce Client Secret. You can encrypt this value using the Cinchy CLI. | DyU1hqde3cWwkPOwK97T6rzwqv6t3bgQeCGq/fUx+tKI= |
Username | The encrypted Salesforce username. You can encrypt this value using the Cinchy CLI. | dXNlcm5hbWVAZW1haWwuY29t |
Password | The encrypted Salesforce password. You can encrypt this value using the Cinchy CLI. | cGFzc3dvcmRwYXNzd29yZA== |
InstanceAuthUrl | The authorization URL of the Salesforce instance. | |
Parameter | Description | Example
---|---|---|
Name | Mandatory. Provide a name for your Listener Config. | MongoDB Real-Time Sync |
Event Connector Type | Mandatory. Select your Connector type from the drop-down menu. | MongoDB |
Topic | Mandatory. This field expects a JSON formatted value specific to the connector type you are configuring. | See the Topic tab. |
Connection Attributes | Mandatory. This field expects a JSON formatted value specific to the connector type you are configuring. | See the Connection Attributes tab. |
Status | Mandatory. This value refers to whether your listener config/real-time sync is turned on or off. Keep this set to Disabled until you are confident the rest of your data sync is properly configured. | Disabled |
Data Sync Config | Mandatory. This drop-down lists all of the data syncs on your platform. Select the one that you want to use for your real-time sync. | MongoDB Data Sync |
Subscription Expires On | This value is only relevant for Salesforce Stream Sources. This field is a timestamp that's auto-populated when it has successfully subscribed to a topic. | |
Message | Leave this value blank when setting up your configuration. This field will auto-populate during the running of your sync with any relevant messages, such as Cinchy listener is running or Listener is disabled. | |
Auto Offset Reset | Earliest, Latest or None. When the listener starts and there is either no last message ID or an invalid last message ID (because it was deleted or the listener is new), this column is used as a fallback to determine where to start reading events from. Earliest starts reading from the beginning of the queue (when CDC was enabled on the table); this may suit use cases that are recoverable or re-runnable and need to reprocess all events to ensure accuracy. Latest fetches the last value after whatever was last processed; this is the typical configuration. None won't start reading any events. You can switch between Auto Offset Reset types after your initial configuration: 1. Navigate to the Listener Config table. 2. Re-configure the Auto Offset Reset value. 3. Set the Status column of the Listener Config to Disabled. 4. Navigate to the Event Listener State table. 5. Find the row that pertains to your data sync's Listener Config and delete it. 6. Navigate back to the Listener Config table. 7. Set the Status column of the Listener Config to Enabled for your new Auto Offset Reset configuration to take effect. | Latest |
Parameter | Description | Example
---|---|---|
Database | Mandatory. The name of your MongoDB database. | Cinchy |
Collection | Mandatory. The name of your MongoDB collection. | Employee |
Pipeline Stages | Optional. This parameter allows you to specify pipeline stages with filters. | See the Example Topic JSON below. Our example config uses a filter to return documents with an ID between 0 and 10,000 AND documents with the location set to Montreal, OR where the operation type is 'delete'. |

Parameter | Description | Example
---|---|---|
connectionString | Mandatory. Your MongoDB connection string. | mongodb://localhost:9877 |
A Salesforce Platform Event is a supported Sync Source that you can use in your Cinchy data syncs. The below documentation will refer to the parameters necessary to set up your Platform Event as part of your sync configuration.
To set up a Stream Source, you must navigate to the Listener Config table and insert a new row for your data sync (Image 1). Most of the columns within the Listener Config table persist across all Stream Sources; however, exceptions will be noted. You can find all of these parameters and their relevant descriptions in the tables below.
The following column parameters can be found in the Listener Config table:
The below table can be used to help create your Topic JSON needed to set up a real-time sync.
Example Topic JSON
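The original example was lost from this extract. Since the Platform Event topic takes only the event's Name (per the parameter table on this page), a minimal sketch could be as simple as the following, assuming the field is spelled Name as in the table:

```json
{
  "Name": "Notification__e"
}
```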
The below table can be used to help create your Connection Attributes JSON needed to set up a real-time sync.
The Listener Configuration table lets you set up your stream source configurations for real-time syncs; the configuration details for each stream source can be found on its respective page. It's also important to track your real-time syncs. As of Cinchy v5.7, you can set up your listener configuration on the Sources tab, under the Listener section of the Connections experience. If your data sync requires more than one listener, you must set up the additional configurations via the Listener Config table on your Cinchy platform.
You can also use this table to manage and review all the configurations currently set up on your system (Image 1). Most of the columns within the Listener Config table persist across all Stream Sources; however, exceptions will be noted. You can find all of these parameters and their relevant descriptions in the tables below.
The following column parameters can be found in the Listener Config table:
Information about setting up your Topic JSON can be found on the individual real-time sync stream source configuration pages.
The Cinchy Event Broker/CDC is an event streaming source used to listen for changes on Cinchy tables and push those changes to various data sync destinations.
To set up a Stream Source, you must navigate to the Listener Config table and insert a new row for your data sync (Image 1). Most of the columns within the Listener Config table persist across all Stream Sources; however, exceptions will be noted. You can find all of these parameters and their relevant descriptions in the tables below.
The following column parameters can be found in the Listener Config table:
The below table can be used to help create your Topic JSON needed to set up a real-time sync.
If you are creating a CDC listener config for a Cinchy Event Triggered REST API data source, keep in mind the following unique constraints:
Column names in the listener config shouldn't contain spaces. If they do, they will be automatically removed. (For example, a column named First Name will become @FirstName.)
The replacement variable names are case sensitive.
Column names in the listener config shouldn't be prefixes of other column names. For example, if you have a column called Name, you shouldn't have another called Name2, as the value of @Name2 may end up being replaced by the value of @Name suffixed with a 2.
Example Topic JSON
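The original example was lost from this extract. The sketch below is assembled from the CDC topic parameters documented on this page; the exact field casing and shape may differ in your Cinchy version, and the filter string reuses the New/Old filter example shown on this page:

```json
{
  "tableGuid": "16523e54-4242-4156-835a-0e572e862304",
  "columns": ["Name", "Age"],
  "batchSize": 10,
  "filter": "New.[Approval State] = 'Approved' AND Old.[Approval State] != 'Approved'"
}
```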
You don't need to provide Connection Attributes when using the Cinchy CDC Stream Source; however, you can't leave the field blank. Instead, insert the below text into the column:
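The original snippet was not preserved in this extract. An empty JSON object is a common placeholder value for a mandatory JSON field, though you should confirm the expected value against your platform's documentation:

```json
{}
```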
Each of your Event Listener messages contains a message key. By default, this key is dictated by the Cinchy Id of the record being changed.
When the worker processes your Event Listener messages, it does so in batches, and for efficiency and to guarantee order, messages that contain the same key won't be processed in the same batch.
The messageKeyExpression property allows you to change the default message key to something else.
This allows records with the same message key to be updated in the proper order, reflecting an accurate collaboration log history.
In this example, we want the message key to be based on the [Employee Id] and [Name] column of the table that CDC is enabled on.
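A sketch of what that configuration could look like in the listener's Topic JSON. The CONCAT-style expression syntax here is an assumption for illustration only, not confirmed Cinchy syntax; check the expression language supported by your listener version:

```json
"messageKeyExpression": "CONCAT(New.[Employee Id], '-', New.[Name])"
```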
The Cinchy Event Broker/CDC Stream Source has the unique capability to use Old and New parameters when filtering data. This filter can be a powerful tool for ensuring that you sync only the specific data that you want.
The "New" and "Old" parameters are based on updates to single records, not columns/rows.
"New" Example:
In the below filter, we only want to sync data where the [Approval State] of a record is newly 'Approved'. For example, if a record was changed from 'Draft' to 'Approved', the filter would sync the record.
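Using the New/Old filter syntax, a filter for this scenario might look like the following; the Old clause excludes records that were already 'Approved' before the change:

```json
"filter": "New.[Approval State] = 'Approved' AND Old.[Approval State] != 'Approved'"
```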
Due to internal logic, newly created records will be tagged as both New and Old.
"Old" Example:
In the below filter, we only want to sync data where the [Status] of a record was 'In Progress' but has since been updated to any other [Status]. For example, if a record was changed from 'In Progress' to 'Done', the filter would sync the record.
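A filter for this scenario might look like the following, a sketch using the same New/Old syntax; the New clause excludes records that are still 'In Progress':

```json
"filter": "Old.[Status] = 'In Progress' AND New.[Status] != 'In Progress'"
```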
Due to internal logic, newly created records will be tagged as both New and Old.
In MongoDB, an aggregation pipeline consists of one or more stages that process documents.
Information regarding setting up your Connection Attributes can be found on the individual real-time sync stream source configuration pages.
Parameter | Description | Example
---|---|---|
Name | Mandatory. Provide a name for your Listener Config. | SF Platform Event Real-Time Sync |
Event Connector Type | Mandatory. Select your Connector type from the drop-down menu. | Salesforce Platform Event |
Topic | Mandatory. This field expects a JSON formatted value specific to the connector type you are configuring. | See the Topic tab. |
Connection Attributes | Mandatory. This field expects a JSON formatted value specific to the connector type you are configuring. | See the Connection Attributes tab. |
Status | Mandatory. This value refers to whether your listener config/real-time sync is turned on or off. Keep this set to Disabled until you are confident the rest of your data sync is properly configured. | Disabled |
Data Sync Config | Mandatory. This drop-down lists all of the data syncs on your platform. Select the one that you want to use for your real-time sync. | Salesforce Data Sync |
Subscription Expires On | This value is only relevant for Salesforce Stream Sources. This field is a timestamp that's auto-populated when it has successfully subscribed to a topic. | |
Message | Leave this value blank when setting up your configuration. This field will auto-populate during the running of your sync with any relevant messages, such as Cinchy listener is running or Listener is disabled. | |
Auto Offset Reset | Earliest, Latest or None. When the listener starts and there is either no last message ID or an invalid last message ID (because it was deleted or the listener is new), this column is used as a fallback to determine where to start reading events from. Earliest starts reading from the beginning of the queue (when CDC was enabled on the table); this may suit use cases that are recoverable or re-runnable and need to reprocess all events to ensure accuracy. Latest fetches the last value after whatever was last processed; this is the typical configuration. None won't start reading any events. You can switch between Auto Offset Reset types after your initial configuration: 1. Navigate to the Listener Config table. 2. Re-configure the Auto Offset Reset value. 3. Set the Status column of the Listener Config to Disabled. 4. Navigate to the Event Listener State table. 5. Find the row that pertains to your data sync's Listener Config and delete it. 6. Navigate back to the Listener Config table. 7. Set the Status column of the Listener Config to Enabled for your new Auto Offset Reset configuration to take effect. | Latest |
Parameter | Description | Example
---|---|---|
Name | Mandatory. The name of the Platform Event, as it appears in Salesforce, that you want to subscribe to. | Notification__e |
Parameter | Description | Example
---|---|---|
ApiVersion | Mandatory. Your Salesforce API Version. Note that this needs to be an exact match; for instance, "47.0" can't be written as simply 47. | 47.0 |
GrantType | This value should be set to password. | password |
ClientId | The encrypted Salesforce Client ID. You can encrypt this value using the Cinchy CLI. | Bn8UmtiLydmYQV6//qCL5dqfNUMhqchdk959hu0XXgauGMYAmYoyWN8FD+voGuMwGyJa7onrc60q1Hu6QFsQXHVA== |
ClientSecret | The encrypted Salesforce Client Secret. You can encrypt this value using the Cinchy CLI. | DyU1hqde3cWwkPOwK97T6rzwqv6t3bgQeCGq/fUx+tKI= |
Username | The encrypted Salesforce username. You can encrypt this value using the Cinchy CLI. | dXNlcm5hbWVAZW1haWwuY29t |
Password | The encrypted Salesforce password. You can encrypt this value using the Cinchy CLI. | cGFzc3dvcmRwYXNzd29yZA== |
InstanceAuthUrl | The authorization URL of the Salesforce instance. | |
Parameter | Description | Example
---|---|---|
Name | Mandatory. Provide a name for your Listener Config. | CDC Real-Time Sync |
Event Connector Type | Mandatory. Select your Connector type from the drop-down menu. | Cinchy CDC |
Topic | Mandatory. This field expects a JSON formatted value specific to the connector type you are configuring. | See the Topic tab. |
Connection Attributes | Mandatory. This field expects a JSON formatted value specific to the connector type you are configuring. | See the Connection Attributes tab. |
Status | Mandatory. This value refers to whether your listener config/real-time sync is turned on or off. Keep this set to Disabled until you are confident the rest of your data sync is properly configured. | Disabled |
Data Sync Config | Mandatory. This drop-down lists all of the data syncs on your platform. Select the one that you want to use for your real-time sync. | CDC Data Sync |
Subscription Expires On | This value is only relevant for Salesforce Stream Sources. This field is a timestamp that's auto-populated when it has successfully subscribed to a topic. | |
Message | Leave this value blank when setting up your configuration. This field will auto-populate during the running of your sync with any relevant messages, such as Cinchy listener is running or Listener is disabled. | |
Auto Offset Reset | Earliest, Latest or None. When the listener starts and there is either no last message ID or an invalid last message ID (because it was deleted or the listener is new), this column is used as a fallback to determine where to start reading events from. Earliest starts reading from the beginning of the queue (when CDC was enabled on the table); this may suit use cases that are recoverable or re-runnable and need to reprocess all events to ensure accuracy. Latest fetches the last value after whatever was last processed; this is the typical configuration. None won't start reading any events. You can switch between Auto Offset Reset types after your initial configuration: 1. Navigate to the Listener Config table. 2. Re-configure the Auto Offset Reset value. 3. Set the Status column of the Listener Config to Disabled. 4. Navigate to the Event Listener State table. 5. Find the row that pertains to your data sync's Listener Config and delete it. 6. Navigate back to the Listener Config table. 7. Set the Status column of the Listener Config to Enabled for your new Auto Offset Reset configuration to take effect. | Latest |
Parameter | Description | Example
---|---|---|
Name | Mandatory. Provide a name for your listener config. | CDC Real-Time Sync |
Event Connector Type | Mandatory. Select your Connector type from the drop-down menu. | Cinchy CDC |
Topic | Mandatory. This field expects a JSON formatted value specific to the connector type you are configuring. | See the Topic tab. |
Connection Attributes | Mandatory. This field expects a JSON formatted value specific to the connector type you are configuring. | See the Connection Attributes tab. |
Status | Mandatory. This value refers to whether your listener config/real-time sync is turned on or off. Keep this set to Disabled until you are confident the rest of your data sync is properly configured. | Disabled |
Data Sync Config | Mandatory. This drop-down lists all of the data syncs on your platform. Select the one that you want to use for your real-time sync. | CDC Data Sync |
Subscription Expires On | This value is only relevant for Salesforce Stream Sources. This field is a timestamp that's auto-populated when it has successfully subscribed to a topic. | |
Message | Leave this value blank when setting up your configuration. This field will auto-populate during the running of your sync with any relevant messages. For example, **Cinchy listener is running**, or **Listener is disabled**. | |
Auto Offset Reset | Earliest, Latest or None. When the listener starts and there is either no last message ID or an invalid last message ID (because it was deleted or the listener is new), this column is used as a fallback to determine where to start reading events from. | Latest |
Parameter | Description | Example
---|---|---|
Table GUID | Mandatory. The GUID of the table whose notifications you wish to consume. You can find this in the Design Table screen. | 16523e54-4242-4156-835a-0e572e862304 |
Column(s) | Mandatory. The names of the columns you wish to include in your sync. Note: If you will be using the parameter in your data sync, you only need to include the Cinchy Id in the topic JSON. | Name, Age |
BatchSize | The desired result batch size. This will default to 1 if not passed in. The maximum batch size is 1,000; using a number higher than that will result in a Bad Request response. | 10 |
Filter | Optional. When CDC is enabled, you can set a filter on columns where you are capturing changes in order to receive specific data. | In the below example, we only trigger changes on newly approved records: the New filter includes all records where the [Approval State] equals 'Approved', and the Old filter excludes records where the [Approval State] was already 'Approved' (already synced to the target). Example: "filter": "New.[Approval State] = 'Approved' AND Old.[Approval State] != 'Approved'" |
This page outlines the parameters that should be included in your listener configuration when setting up a real-time sync with a Kafka stream source. This process currently supports **JSON string or AVRO serialized format**.
To listen in on multiple topics, you will need to configure multiple listener configs.
To set up a Stream Source, you must navigate to the Listener Config table and insert a new row for your data sync (Image 1). Most of the columns within the Listener Config table persist across all Stream Sources; however, exceptions will be noted. You can find all of these parameters and their relevant descriptions in the tables below.
The following column parameters can be found in the Listener Config table:
The below table can be used to help create your Topic JSON needed to set up a real-time sync.
Example Topic JSON
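The original example was not preserved in this extract. At minimum, the Topic JSON identifies the Kafka topic the listener subscribes to; the field name topicName below is an assumption, so confirm the exact schema for your Cinchy version:

```json
{
  "topicName": "worker-queue"
}
```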
The below table can be used to help create your Connection Attributes JSON needed to set up a real-time sync.
Version 5.4 of the Cinchy platform introduced data polling, which uses the Cinchy Event Listener to continuously monitor and sync data entries from your SQL Server or DB2 server into your Cinchy table. This makes data polling a much easier, more effective, and streamlined process, avoiding the complex orchestration logic that was previously necessary.
This page outlines the necessary Listener Config values that need to be used prior to setting up your data sync.
To set up a Stream Source, you must navigate to the Listener Config table and insert a new row for your data sync (Image 1). Most of the columns within the Listener Config table persist across all Stream Sources; however, exceptions will be noted. You can find all of these parameters and their relevant descriptions in the tables below.
The following column parameters can be found in the Listener Config table:
The below table can be used to help create your Topic JSON needed to set up a real-time sync.
Parameter | Description | Example
---|---|---|
FromClause | Mandatory. This must contain at least the table name but can also contain joined tables as written in SQL language. | [Source Table] |
CursorColumn | Mandatory. Column name that's used in any 'WHERE' condition(s) and for ordering the result of a query. | [Id] |
BatchSize | Mandatory. Minimum size of a batch of data per query. This can be larger to prevent infinite loops if the CursorColumn isn't unique. | 100 |
FilterCondition | All filtering options used in any 'WHERE' condition(s) of the query. | Name IS NOT NULL |
Columns | Mandatory. A list of columns that we want to show in a result. | Id, Name |

Together, these parameters produce a polling query of the form: FROM [SourceTable] WHERE Id IN (SELECT TOP (100) Id FROM [SourceTable] WHERE Id > 0 AND Name IS NOT NULL ORDER BY Id) AND Id > 0 AND Name IS NOT NULL

ReturnDataConfiguration holds the parameters used in more complex queries. In our example, there are 2 related tables, but we want to show the contents of one of them based on the CursorColumn from a second table. Since Timestamp values aren't unique, we need to find all combinations of Id, Timestamp that match the filter condition in a subquery, and then join this result with the outer query to get the final result. Note that in ReturnDataConfiguration, the parameters of concern are everything outside of the first open parenthesis ( and the last closing parenthesis ).

Parameter | Description | Example
---|---|---|
CursorAlias | Mandatory. This is the alias for a subquery result table. It's used in 'JoinClause', and can be used in 'Columns' if we want to return values from a subquery table. | "t" |
JoinClause | Mandatory. Our result table to which we join the subquery result, plus the condition of the join. | [Table1] ts ON ts.[Id] = t.[Id] |
FilterCondition | All filtering options used in any 'WHERE' conditions. | "ts.[Id] > 0" |
OrderByClause | Mandatory. This is the column(s) that we want to order our final result by. | "Id" |
Columns | Mandatory. A list of columns that we want to show in the final result. | "ts.[Id]", "ts.[name]" |
Delay | Mandatory. This represents the delay, in seconds, between data sync cycles once it no longer finds any new data. | 10 |
messageKeyExpression | Optional, but recommended to mitigate data loss. See Appendix A for more information on this parameter. | id |
Example Topic JSON
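The original example was lost from this extract. The sketch below assembles the documented parameters into one possible shape; the CursorConfiguration wrapper name and exact nesting are assumptions, while the parameter names and example values come from the table on this page:

```json
{
  "CursorConfiguration": {
    "FromClause": "[Source Table]",
    "CursorColumn": "Id",
    "BatchSize": 100,
    "FilterCondition": "Name IS NOT NULL",
    "Columns": ["Id", "Name"],
    "ReturnDataConfiguration": {
      "CursorAlias": "t",
      "JoinClause": "[Table1] ts ON ts.[Id] = t.[Id]",
      "FilterCondition": "ts.[Id] > 0",
      "OrderByClause": "Id",
      "Columns": ["ts.[Id]", "ts.[name]"]
    },
    "Delay": 10
  },
  "messageKeyExpression": "id"
}
```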
The below table can be used to help create your Connection Attributes JSON needed to set up a real-time sync.
The messageKeyExpression parameter is an optional, but recommended, parameter that can be used to ensure that you aren't faced with a unique constraint violation during your data sync. This violation could occur if both an insert and an update statement happened at nearly the same time. If you choose not to use the messageKeyExpression parameter, you could face data loss in your sync.
This parameter was added to the Data Polling event stream in Cinchy v5.6.
Each of your Event Listener messages contains a message key. By default, this key is unique for every message in the queue.
When the worker processes your Event Listener messages, it does so in batches. For efficiency and to guarantee order, messages that contain the same key won't be processed in the same batch.
The messageKeyExpression property allows you to change the default message key to something else.
Example:
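The original example was lost from this extract. Based on the documented parameters, the property sits in the topic JSON; the value below reuses the documented example value id:

```json
"messageKeyExpression": "id"
```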
Earliest will start reading from the beginning of the queue (when the CDC was enabled on the table). This might be a suggested configuration if your use case is recoverable or re-runnable and if you need to reprocess all events to ensure accuracy. Latest will fetch the last value after whatever was last processed. This is the typical configuration. None won't start reading any events. You are able to switch between Auto Offset Reset types after your initial configuration through the process outlined above.
"filter": "New.[Approval State] = 'Approved' AND Old.[Approval State] != 'Approved'" (Hint: Click on the below image to enlarge)
| Parameter | Description | Example |
| --- | --- | --- |
| Name | Mandatory. Provide a name for your Listener Config. | Kafka Real-Time Sync |
| Event Connector Type | Mandatory. Select your Connector type from the drop-down menu. | Kafka Topic |
| Topic | Mandatory. This field expects a JSON value specific to the connector type you are configuring. | See the Topic tab. |
| Connection Attributes | Mandatory. This field expects a JSON value specific to the connector type you are configuring. | See the Connection Attributes tab. |
| Status | Mandatory. Whether your listener config/real-time sync is turned on or off. Keep this set to Disabled until you are confident the rest of your data sync is properly configured. | Disabled |
| Data Sync Config | Mandatory. This drop-down lists all of the data syncs on your platform. Select the one you want to use for your real-time sync. | Kafka Data Sync |
| Subscription Expires On | Only relevant for Salesforce Stream Sources. This timestamp is auto-populated when the listener successfully subscribes to a topic. | |
| Message | Leave this value blank when setting up your configuration. This field auto-populates while your sync runs with any relevant messages, such as "Cinchy listener is running" or "Listener is disabled". | |
| Auto Offset Reset | Earliest, Latest or None. When the listener starts and there is no last message ID, or the last message ID is invalid (because it was deleted or the listener is new), this column is used as a fallback to determine where to start reading events from. Earliest starts reading from the beginning of the queue (from when CDC was enabled on the table); this is a suggested configuration if your use case is recoverable or re-runnable and you need to reprocess all events to ensure accuracy. Latest starts from the latest offset, after whatever was last processed; this is the typical configuration. None won't start reading any events. To switch between Auto Offset Reset types after your initial configuration: 1. Navigate to the Listener Config table. 2. Re-configure the Auto Offset Reset value. 3. Set the Status column of the Listener Config to Disabled. 4. Navigate to the Event Listener State table. 5. Find the row that pertains to your data sync's Listener Config and delete it. 6. Navigate back to the Listener Config table. 7. Set the Status column of the Listener Config to Enabled for your new Auto Offset Reset configuration to take effect. | Latest |
| Parameter | Description |
| --- | --- |
| topicName | Mandatory. The Kafka topic to listen for messages on. |
| messageFormat | Optional. Put "AVRO" if your messages are serialized in AVRO; otherwise leave blank. |
| bootstrapServers | List the Kafka bootstrap servers in a comma-separated list, in the form host:port. |
| saslMechanism | Either PLAIN, SCRAM-SHA-256, or SCRAM-SHA-512. SCRAM-SHA-256 must be formatted as SCRAMSHA256; SCRAM-SHA-512 must be formatted as SCRAMSHA512. |
| saslPassword | The password for your chosen SASL mechanism. |
| saslUsername | The username for your chosen SASL mechanism. |
| url | Required if your data follows a schema when serialized in AVRO. A comma-separated list of URLs for schema registry instances used to register or look up schemas. |
| basicAuthCredentialsSource | Specifies the Kafka configuration property "schema.registry.basic.auth.credentials.source" that provides the basic authentication credentials. This can be "UserInfo" or "SaslInherit". |
| basicAuthUserInfo | Basic Auth credentials specified in the form username:password. |
| sslKeystorePassword | The client keystore (PKCS#12) password. |
| securityProtocol | Kafka supports cluster encryption and authentication, which can encrypt data-in-transit between your applications and Kafka. Use this field to specify which protocol will be used for communication between client and server. Cinchy currently supports: Plaintext (unauthenticated, non-encrypted), SaslPlaintext (SASL-based authentication, non-encrypted), or SaslSsl (SASL-based authentication, TLS-based encryption). If no parameter is specified, this defaults to Plaintext. |
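Putting the attributes above together, a Kafka Connection Attributes JSON might look like the sketch below. Every value here (topic name, broker hosts, credentials) is a hypothetical placeholder, and which fields you actually need depends on how your cluster's authentication and schema registry are set up.

```json
{
  "topicName": "sales-orders",
  "messageFormat": "",
  "bootstrapServers": "broker1.example.com:9092,broker2.example.com:9092",
  "saslMechanism": "SCRAMSHA256",
  "saslUsername": "sync-user",
  "saslPassword": "<password>",
  "securityProtocol": "SaslSsl"
}
```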
| Parameter | Description | Example |
| --- | --- | --- |
| Name | Mandatory. Provide a name for your Listener Config. | Data Polling Real-Time Sync |
| Event Connector Type | Mandatory. Select your Connector type from the drop-down menu. | Data Polling |
| Topic | Mandatory. This field expects a JSON value specific to the connector type you are configuring. | See the Topic tab. |
| Connection Attributes | Mandatory. This field expects a JSON value specific to the connector type you are configuring. | See the Connection Attributes tab. |
| Status | Mandatory. Whether your listener config/real-time sync is turned on or off. Keep this set to Disabled until you are confident the rest of your data sync is properly configured. | Disabled |
| Data Sync Config | Mandatory. This drop-down lists all of the data syncs on your platform. Select the one you want to use for your real-time sync. | Data Polling Data Sync |
| Subscription Expires On | Only relevant for Salesforce Stream Sources. This timestamp is auto-populated when the listener successfully subscribes to a topic. | |
| Message | Leave this value blank when setting up your configuration. This field auto-populates while your sync runs with any relevant messages, such as "Cinchy listener is running" or "Listener is disabled". | |
| Auto Offset Reset | Earliest, Latest or None. When the listener starts and there is no last message ID, or the last message ID is invalid (because it was deleted or the listener is new), this column is used as a fallback to determine where to start reading events from. Earliest starts reading from the beginning of the queue (from when CDC was enabled on the table); this is a suggested configuration if your use case is recoverable or re-runnable and you need to reprocess all events to ensure accuracy. Latest starts from the latest offset, after whatever was last processed; this is the typical configuration. None won't start reading any events. To switch between Auto Offset Reset types after your initial configuration: 1. Navigate to the Listener Config table. 2. Re-configure the Auto Offset Reset value. 3. Set the Status column of the Listener Config to Disabled. 4. Navigate to the Event Listener State table. 5. Find the row that pertains to your data sync's Listener Config and delete it. 6. Navigate back to the Listener Config table. 7. Set the Status column of the Listener Config to Enabled for your new Auto Offset Reset configuration to take effect. | Latest |
CursorConfiguration
Mandatory. The parameters here are used in a basic query which searches for all records in a particular table.
Note that in our example we need to use a sub-query to prevent an infinite loop if the "CursorColumn" parameter isn't unique.
Example basic query:
databaseType
Mandatory. TSQL or DB2
TSQL
connectionString
Mandatory. This should be the connection string for your data source.
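Combining the two parameters above, a Data Polling Connection Attributes JSON might be sketched as follows. The server, database, and credential values are hypothetical placeholders; the exact connection string syntax depends on your database and driver.

```json
{
  "databaseType": "TSQL",
  "connectionString": "Server=sqlserver.example.com;Database=SourceDb;User ID=sync-user;Password=<password>;"
}
```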