Cinchy Event Broker/CDC


1. Overview

The Cinchy Event Broker/CDC is an event streaming source used to listen for changes on Cinchy tables and push those changes to various data sync destinations.

2. The Listener Config Table

To set up a Stream Source, navigate to the Listener Config table and insert a new row for your data sync (Image 1: The Listener Config table). Most of the columns in the Listener Config table apply to all Stream Sources; exceptions are noted where they apply. You can find all of these parameters and their descriptions in the tables below.

The following column parameters can be found in the Listener Config table:

  • Name: Mandatory. Provide a name for your Listener Config. Example: CDC Real-Time Sync

  • Event Connector Type: Mandatory. Select your connector type from the drop-down menu. Example: Cinchy CDC

  • Topic: Mandatory. This field expects a JSON value specific to the connector type you are configuring. See the Topic parameters below.

  • Connection Attributes: Mandatory. This field expects a JSON value specific to the connector type you are configuring. See the Connection Attributes note below.

  • Status: Mandatory. Indicates whether your listener config/real-time sync is turned on or off. Keep this set to Disabled until you are confident that the rest of your data sync is properly configured. Example: Disabled

  • Data Sync Config: Mandatory. This drop-down lists all of the data syncs on your platform. Select the one you want to use for your real-time sync. Example: CDC Data Sync

  • Subscription Expires On: Only relevant for Salesforce Stream Sources. This timestamp is auto-populated once the listener has successfully subscribed to a topic.

  • Message: Leave this value blank when setting up your configuration. It auto-populates while your sync runs with any relevant messages, such as "Cinchy listener is running" or "Listener is disabled".

  • Auto Offset Reset: Earliest, Latest, or None. When the listener starts and there is either no last message ID or the last message ID is invalid (because it was deleted or the listener is new), this value is used as a fallback to determine where to start reading events from. Earliest starts reading from the beginning of the queue (from when CDC was enabled on the table); this is a suggested configuration if your use case is recoverable or re-runnable and you need to reprocess all events to ensure accuracy. Latest starts reading after whatever was last processed; this is the typical configuration. None does not start reading any events. Example: Latest

You can switch between Auto Offset Reset types after your initial configuration:
  1. Navigate to the Listener Config table.
  2. Re-configure the Auto Offset Reset value.
  3. Set the "Status" column of the Listener Config to "Disabled".
  4. Navigate to the Event Listener State table.
  5. Find the row that pertains to your data sync's Listener Config and delete it.
  6. Navigate back to the Listener Config table.
  7. Set the "Status" column of the Listener Config to "Enabled" for your new Auto Offset Reset configuration to take effect.

The parameters below can be used to help create the Topic JSON needed to set up a real-time sync.

If you are creating a CDC listener config for a Cinchy Event Triggered REST API data source, keep in mind the following constraints:

  • Column names in the listener config should not contain spaces. If they do, the spaces will be automatically removed; for example, a column named First Name becomes @FirstName (see the sketch after this list).

  • The replacement variable names are case sensitive.

  • Column names in the listener config should not be prefixes of other column names. E.g. if you have a column called "Name", you shouldn't have another called "Name2" as the value of @Name2 may end up being replaced by the value of @Name suffixed with a "2".
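
As a hedged illustration of the space-removal rule (the table GUID here is reused from the example further below purely as a placeholder), a topic that includes a column named First Name would be referenced in the Cinchy Event Triggered REST API destination as @FirstName, not @First Name:

{
    "tableGuid": "16523e54-4242-4156-835a-0e572e862304",
    "fields": [
        {
            "column": "First Name"
        },
        {
            "column": "Age"
        }
    ]
}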

  • Table GUID: Mandatory. The GUID of the table whose notifications you wish to consume. You can find this in the Design Table screen. Example: 16523e54-4242-4156-835a-0e572e862304

  • Column(s): Mandatory. The names of the columns you wish to include in your sync. Note: if you will be using the runQuery=true parameter in your data sync, you only need to include the Cinchy Id in the topic JSON. Example: Name, Age

  • BatchSize: The desired result batch size. This defaults to 1 if not passed in. The maximum batch size is 1000; using a higher number will result in a Bad Request response. Example: 10

  • Filter: Optional. When CDC is enabled, you can set a filter on the columns where you are capturing changes in order to receive only the specific data you want. In the example below, we only trigger on newly approved records: the "New" condition includes records where the current [Approval State] is 'Approved' (i.e., the record has not yet been synced to the target), and the "Old" condition excludes records whose previous [Approval State] was already 'Approved' (i.e., the record has already been synced to the target). Example: "filter": "New.[Approval State] = 'Approved' AND Old.[Approval State] != 'Approved'"

Example Topic JSON

{
    "tableGuid": "16523e54-4242-4156-835a-0e572e862304",
    "fields": [
        {
            "column": "Name"
        },
        {
            "column": "Age"
        }
    ],
"filter": "New.[Approval State] = 'Approved' AND Old.[Approval State] != 'Approved'",
   "batchSize": 10
}
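
If your data sync uses the runQuery=true parameter, only the Cinchy Id needs to be included in the topic JSON, as noted above. A minimal sketch, assuming the column is referenced as "Cinchy Id", might look like this:

{
    "tableGuid": "16523e54-4242-4156-835a-0e572e862304",
    "fields": [
        {
            "column": "Cinchy Id"
        }
    ],
    "batchSize": 10
}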

You do not need to provide Connection Attributes when using the Cinchy CDC Stream Source; however, you cannot leave the field blank. Instead, insert the below text into the column:

{}

Appendix A

messageKeyExpression

Each of your Event Listener messages has a message key. By default, this key is dictated by the Cinchy ID of the record being changed.

When the worker processes your Event Listener messages, it does so in batches, and for efficiency and to guarantee order, messages that contain the same key will not be processed in the same batch.

The messageKeyExpression property allows you to change the default message key to something else.

Possible Use Case

  • Ensuring records with the same message key can be updated with the proper ordering to reflect an accurate collaboration log history.

Example Syntax

In this example, we want the message key to be based on the [Employee Id] and [Name] columns of the table that CDC is enabled on.

{ "messageKeyExpression": "CONCAT(New.[Employee Id], '-', New.[Name])", … }

Appendix B

Old vs New Filter

The Cinchy Event Broker/CDC Stream Source has the unique capability to use "Old" and "New" parameters when filtering data. This filter can be a powerful tool for ensuring that you sync only the specific data that you want.

The "New" and "Old" parameters are based on updates to single records, not columns/rows.

"New" Example:

In the below filter, we only want to sync data where the [Approval State] of a record is newly 'Approved'. For example, if a record was changed from 'Draft' to 'Approved', the filter would sync the record.

Due to internal logic, newly created records will be tagged as both "New" and "Old".

"filter": "New.[Approval State] = 'Approved'

"Old" Example:

In the below filter, we only want to sync data where the [Status] of a record was 'In Progress' but has since been updated to any other [Status]. For example, if a record was changed from 'In Progress' to 'Done', the filter would sync the record.

Due to internal logic, newly created records will be tagged as both "New" and "Old".

"filter": "Old.[Status] = 'In Progress'
