Cinchy Event Broker/CDC


Overview

The Cinchy Event Broker/CDC (Change Data Capture) source allows you to capture data changes on your table and use these events in your data syncs.

Use Case

To mitigate the labour and time costs of hosting information in silos, and to remove the costly integration tax on your IT teams, you want to connect your legacy systems to Cinchy and take advantage of the platform's sync capabilities.

To do this, you can set up a real-time sync between a Cinchy Table and Salesforce that updates Salesforce any time data is added, updated, or deleted on the Cinchy side. If you enable change notifications on your Cinchy table, you can set up a data sync and listener config with your source as the Cinchy Event Broker/CDC.

The Cinchy Event Broker/CDC supports both batch syncs and real-time syncs (most common).

Remember to set up your listener config if you are creating a real-time sync.

Info tab

You can find the parameters in the Info tab below (Image 1).

Values

| Parameter | Description | Example |
| --- | --- | --- |
| Title | Mandatory. Input a name for your data sync. | CDC |
| Variables | Optional. Review the documentation on Variables for more information about this field. | |
| Permissions | Data syncs use role-based access: you can give specific groups read, write, execute, or all of the above (admin) access. Inputting at least an Admin Group is mandatory. | |

Source tab

The following table outlines the mandatory and optional parameters you will find on the Source tab (Image 2).

The following parameters will help to define your data sync source and how it functions.

| Parameter | Description | Example |
| --- | --- | --- |
| Source | Mandatory. Select your source from the drop-down menu. | Cinchy Event Broker/CDC |
| Run Query | Optional. If true, executes a saved query, using the Cinchy ID of the changed record as a parameter. The query results are then used as the sync source, rather than the Cinchy table where the data change originated. Review Appendix A for further details on this feature. | |
| Path to Iterate | Optional. For the Cinchy Event Broker/CDC, the Path to Iterate function can be used to provide the JSON path to the array of items that you want to sync (provided that your event message contains JSON values). | |
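For illustration, a hedged sketch of Path to Iterate (the message shape below is hypothetical, and a JSONPath-style expression is assumed): if your event message's value nests the records you care about under data.items, a Path to Iterate of $.data.items would sync each element of that array as its own record:

```json
{
  "data": {
    "items": [
      { "id": 1, "name": "First record" },
      { "id": 2, "name": "Second record" }
    ]
  }
}
```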

To set up a real-time sync, you must configure your Listener values. You can do so through the Connections UI. If there is more than one listener associated with your data sync, you will need to configure the additional listeners via the Listener Config table.

If you are creating a CDC listener config for a Cinchy Event Triggered REST API data source, keep in mind the following unique constraints (illustrated in the sketch after this list):

  • Column names in the listener config shouldn't contain spaces. If they do, they will be automatically removed. For example, a column named First Name will become @FirstName.

  • The replacement variable names are case sensitive.

  • Column names in the listener config shouldn't be prefixes of other column names. For example, if you have a column called Name, you shouldn't have another called "Name2" as the value of @Name2 may end up being replaced by the value of @Name suffixed with a 2.
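A minimal sketch of how these replacement variables are used (the endpoint and columns are hypothetical): a CDC-enabled table with columns Employee Id and First Name exposes @EmployeeId and @FirstName, which you can then reference in the REST API source, for example in its endpoint URL:

```
https://api.example.com/employees/@EmployeeId?firstName=@FirstName
```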

Reset behaviour

| Parameter | Description | Example |
| --- | --- | --- |
| Auto Offset Reset | Earliest, Latest or None. When the listener starts and there is no last message ID, or the last message ID is invalid (because it was deleted or the listener is new), this value is the fallback that determines where to start reading events from. | None |

Earliest starts reading from the beginning of the queue (when CDC was enabled on the table). This is a suggested configuration if your use case is recoverable or re-runnable and you need to reprocess all events to ensure accuracy. Latest fetches the last value after whatever was last processed; this is the typical configuration. None won't read any events. You can switch between Auto Offset Reset types after your initial configuration.

Topic JSON

A Topic JSON is necessary for all real-time syncs. Enter your JSON parameters through the Connections UI, or edit them directly through the Listener Config table.

Topic JSON parameters

| Parameter | Description | Example Value |
| --- | --- | --- |
| tableGuid | Mandatory. GUID of the table you are reading from. | ef6710ca-6e59-4b4a-86d3-f6d24ed7658b |
| fields | Array of objects specifying columns to fetch. | See the Fields Section below. |
| filter | WHERE clause for filtering records. | New.[Is Valid] = 1 AND (New.[Is Excluded] = 0 OR New.[Is Excluded] IS NULL) |
| messageKeyExpression | Specifies a key that the listener application uses to route messages to specific partitions within a Kafka topic. See the messageKeyExpression section below for more information. | value |
| next_cursor | Serves as an offset marker for paginated data retrieval in API requests. It helps fetch large data sets chunk by chunk, making the process more manageable and efficient. | ABCS |
| batchSize | Number of records read per request. | 1000 |

Fields Section

The following expands on the available parameters within the fields section above.

| Field Property | Description | Example Value |
| --- | --- | --- |
| column | Column name to fetch from the table. | Cinchy Id |
| alias | Alias for the column. | CinchyId |
| deserializeJsonValue | Converts text to JSON on read-out. | true |
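For instance, a hedged sketch of a fields entry using all three properties (the Payload column is hypothetical; deserializeJsonValue assumes the column holds JSON text):

```json
{
  "column": "Payload",
  "alias": "PayloadJson",
  "deserializeJsonValue": true
}
```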

Topic JSON example

The following is a topic JSON example:

```json
{
  "tableGuid": "420c1851-31ed-4ada-a71b-31659bca6f92",
  "fields": [
    {
      "column": "Cinchy Id",
      "alias": "CinchyId"
    },
    {
      "column": "First Name",
      "alias": "Firstname"
    },
    {
      "column": "Last Name",
      "alias": "Lastname"
    }
  ],
  "messageKeyExpression": "CONCAT(New.[CinchyId], '-', New.[Name])",
  "filter": "New.[Lastname] IS NOT NULL OR (New.[CinchyId] IS NOT NULL)",
  "batchSize": 100
}
```

Listener Configuration Parameters

messageKeyExpression

Each of your Event Listener messages has a message key. By default, this key is the Cinchy ID of the record being changed.

When the worker processes your Event Listener messages, it does so in batches, and for efficiency and to guarantee order, messages that contain the same key won't be processed in the same batch.

The messageKeyExpression property allows you to change the default message key to something else.

Use Case

  • Ensuring records with the same message key can be updated with the proper ordering to reflect an accurate collaboration log history.

Example Syntax

In this example, we want the message key to be based on the [Employee Id] and [Name] columns of the table that CDC is enabled on.

```json
{ "messageKeyExpression": "CONCAT(New.[Employee Id], '-', New.[Name])", … }
```

Old vs New Filter

The Cinchy Event Broker/CDC Stream Source has the unique capability to use "Old" and "New" parameters when filtering data. This filter can be a powerful tool for ensuring that you sync only the specific data that you want.

The "New" and "Old" parameters are based on updates to single records, not columns/rows.

"New" Example:

In the below filter, we only want to sync data where the [Approval State] of a record is newly Approved. For example, if a record was changed from Draft to Approved, the filter would sync the record.

Due to internal logic, newly created records will be tagged as both **New** and **Old**.

"filter": "New.[Approval State] = 'Approved'

"Old" Example:

In the below filter, we only want to sync data where the [Status] of a record was In Progress but has since been updated to any other [Status]. For example, if a record was changed from In Progress to Done, the filter would sync the record.

Due to internal logic, newly created records will be tagged as both **New** and **Old**.

"filter": "Old.[Status] = `In Progress`

Connection Attributes

You don't need to provide Connection Attributes when using the Cinchy Event Broker/CDC Stream Source.

If you're inputting your configuration via the Listener Config table, you must insert the below text into the Connection Attributes column:

{}

Schema section

The Schema section is where you define which source columns you want to sync in your connection. You have the option to add the following column types:

  • Standard

  • Calculated

  • Conditional

  • JavaScript

You can repeat the values for multiple columns.

| Parameter | Description | Example |
| --- | --- | --- |
| Name | Mandatory. The name of your column as it appears in the source. | Name |
| Alias | Optional. You may choose to use an alias on your column so that it has a different name in the data sync. | |
| Data Type | Mandatory. The data type of the column values. | Text |
| Description | Optional. You may choose to add a description to your column. | |

Advanced parameters

Select Show Advanced for more options for the Schema section.

| Parameter | Description | Example |
| --- | --- | --- |
| Mandatory | If both Mandatory and Validated are checked on a column, rows where the column is empty are rejected. If just Mandatory is checked, all rows are synced with an execution log status of failed and the source error "Mandatory Rule Violation". If just Validated is checked, all rows are synced. | |
| Validate Data | If both Mandatory and Validated are checked on a column, rows where the column is empty are rejected. If just Validated is checked, all rows are synced. | |
| Max Length | Optional if data type = text. A numerical value representing the maximum length of the data that can be synced in your column. If the value is exceeded, the row is rejected (you can find this error in the Execution Log). | |
| Trim Whitespace | Optional if data type = text. For Text data types, you can choose whether to trim the whitespace. | |

String replacement

You can choose to add in a Transformation > String Replacement by inputting the following:

| Parameter | Description | Example |
| --- | --- | --- |
| Pattern | Mandatory if using a Transformation. The pattern for your string replacement. | |
| Replacement | What you want to replace your pattern with. | |

Note that you can have more than one String Replacement.
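For example, a hedged sketch (assuming the Pattern field accepts a regular expression): to keep only the digits in a phone-number column, you could pair a pattern matching every non-digit character with an empty replacement:

```
Pattern:     [^0-9]
Replacement: (empty string)
```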

Filter

You have the option to add a source filter to your data sync. Please review the documentation on source filters for more information.

Next steps

  • Configure your Destination.

  • Define your Sync Actions.

  • Add in your Post Sync Scripts, if required.

  • If more than one listener is needed for a real-time sync, configure them via the Listener Config table.

  • To run a real-time sync, enable your Listener from the Execution tab.

Appendix A - Source Parameters

The following sections outline more information about specific parameters you can find on this source.

Run Query

The Run Query parameter is available as an optional value for the Cinchy Event Broker/CDC connector. If set to true, it executes a saved query; whichever record triggered the event becomes a parameter in that query. The query, rather than the table itself, then becomes the source.

You are able to use any parameters defined in your listener config.
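A minimal sketch of what such a saved query might look like (the domain, table, and columns are hypothetical; @id stands in for the parameter defined in the listener config):

```sql
SELECT [Product], [Created Timestamp]
FROM [Product].[Tickets]
WHERE [Cinchy Id] = @id
```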

Example

The example below is a data sync using the Event Broker/CDC as a source. Our Listener Config has been set with the CinchyID attribute (Image 4).

We can enable the Run Query function to use the saved query "CDC Product Ticket Datestamps" as our source instead (Image 5). If we change the data in Record A in our source table to trigger our event, the Query Parameters below show that the Cinchy ID of Record A will be used in the query. This query is now our source.

It would appear in the data sync config XML as follows:

```xml
<CinchyEventBrokerDataSource runQuery="true" domain="Product" name="CDC Product Ticket Datestamp" parameters="{&quot;@id&quot;: &quot;Cinchy Id&quot; }">
```

(Figures: Image 1 - The Info Tab; Image 2 - The Source Tab; Images 3-5 - Run Query.)