Polling event


Overview

Version 5.4 of the Cinchy platform introduced data polling, a source option that uses the Cinchy Event Listener to continuously monitor and sync data entries from your Oracle, SQL Server, or DB2 server into your Cinchy table. Data polling makes this process easier, more effective, and more streamlined, and avoids the complex orchestration logic that was previously necessary.

The Polling Event source supports real-time syncs against Oracle, DB2, and SQL Server databases.

Info tab

You can find the parameters in the Info tab below (Image 1).

Image 1: The Info Tab

| Parameter | Description | Example |
| --- | --- | --- |
| Title | Mandatory. Input a name for your data sync. | Polling Event Sync |
| Variables | Optional. Review our documentation on Variables for more information about this field. | |
| Permissions | Data syncs are role-based access systems where you can give specific groups read, write, and execute permissions, or all of the above with admin access. Inputting at least an Admin Group is mandatory. | |

Source tab

The following table outlines the mandatory and optional parameters you will find on the Source tab (Image 2). These parameters define your data sync source and how it functions.

Image 2: The Source Tab

| Parameter | Description | Example |
| --- | --- | --- |
| Source | Mandatory. Select your source from the drop-down menu. | Polling Event |

To set up a real-time sync, you must configure your Listener values. You can do so through the Connections UI. Note that if there is more than one listener associated with your data sync, you will need to configure the additional listeners via the Listener Configuration table.

Reset behaviour

| Parameter | Description | Example |
| --- | --- | --- |
| Auto Offset Reset | Earliest, Latest, or None. When the listener starts and there is either no last message ID, or the last message ID is invalid (because it was deleted or the listener is new), the listener uses this value as a fallback to determine where to start reading events from. | None |

  • Earliest will start reading from the beginning of the queue (when CDC was enabled on the table). This might be a suggested configuration if your use case is recoverable or re-runnable and you need to reprocess all events to ensure accuracy.

  • Latest will fetch the last value after whatever was last processed. This is the typical configuration.

  • None won't read or start reading any events.

You can switch between Auto Offset Reset types after your initial configuration through the process outlined here.

Optional AppSettings configurations

  • DataPollingConcurrencyIndex: This property limits the number of threads that can run queries against the source database concurrently, which reduces the load on the database.

    • The default number of threads is 12.

    • To configure this property, navigate to your appSettings.json deployment file and set "DataPollingConcurrencyIndex": <numberOfThreads>.

  • QueueWriteConcurrencyIndex: This property limits the number of threads that can concurrently send messages to the queue. This provides more consistent batching by the worker and reduces batching errors.

    • The default number of threads is 12.

    • To configure this property, navigate to your appSettings.json deployment file and set "QueueWriteConcurrencyIndex": <numberOfThreads>.

    • Note that this index is shared across all listener configs: if it's set to 1, only one listener config will be pushing messages to the queue at any moment in time.

```
// App Settings JSON Example
// Example of the configurable properties: DataPollingConcurrencyIndex (set to 1) and QueueWriteConcurrencyIndex (set to 1)
"AppSettings": {
    "GetNewListenerConfigsInterval": "",
    "StateFileWriteDelaySeconds": "",
    "KafkaClientConfig": {
      "BootstrapServers": ""
    },
    "KafkaRealtimeDatasyncTopic": "",
    "KafkaJobCancellationTopic": "",
    "DataPollingConcurrencyIndex": 1,
    "QueueWriteConcurrencyIndex": 1
  }
```

Topic JSON

The table below can help you create the Topic JSON needed to set up a real-time sync.

| Parameter | Description | Example |
| --- | --- | --- |
| CursorConfiguration | Mandatory. The parameters here are used in a basic query that searches for all records in a particular table. Note that the example uses a sub-query to prevent an infinite loop if the CursorColumn parameter isn't unique. See the example basic query below this table. | |
| CursorConfiguration.FromClause | Mandatory. This must contain at least the table name, but can also contain joined tables as written in SQL. | [Source Table] |
| CursorConfiguration.CursorColumn | Mandatory. The column name that's used in any WHERE condition(s) and for ordering the result of a query. | [Id] |
| CursorConfiguration.BatchSize | Mandatory. The minimum size of a batch of data per query. This can be larger to prevent infinite loops if the CursorColumn isn't unique. | 100 |
| CursorConfiguration.FilterCondition | All filtering options used in any WHERE condition(s) of the query. | Name IS NOT NULL |
| CursorConfiguration.Columns | Mandatory. A list of columns that you want to show in the result. | Id, Name |
| CursorConfiguration.CursorColumnDataType | Mandatory. This property works in tandem with an update that ensures the database query always moves the offset, regardless of whether the query returned any records. This helps ensure that the performance of the source database isn't weighed down by constantly running heavy queries on a wide range of records when the queries return no data. The value of this property must match the column type of the source database system for proper casting of parameters. | int |
| CursorConfiguration.Distinct | Mandatory. A true/false Boolean that, when set to true, applies a DISTINCT clause on your query to avoid duplicate records. | true |
| ReturnDataConfiguration | The parameters here are used in more complex queries. The example below has two related tables, but shows the contents of one of them based on the CursorColumn from the second table. Since Timestamp values aren't unique, the query finds all combinations of Id and Timestamp that match the filter condition in a sub-query, and then joins this result with the outer query to get the final result. See the example complex query below this table. | |
| ReturnDataConfiguration.CursorAlias | Mandatory. The alias for the sub-query result table. It's used in JoinClause, and can be used in Columns if you want to return values from the sub-query table. | "t" |
| ReturnDataConfiguration.JoinClause | Mandatory. The result table to which the sub-query result is joined, plus the condition of the join. | [Table1] ts ON ts.[Id] = t.[Id] |
| ReturnDataConfiguration.FilterCondition | All filtering options used in any WHERE condition(s). | "ts.[Id] > 0" |
| ReturnDataConfiguration.OrderByClause | Mandatory. The column(s) to order the final result by. | "Id" |
| ReturnDataConfiguration.Columns | Mandatory. A list of columns that you want to show in the final result. | "ts.[Id]", "ts.[Name]" |
| Delay | Mandatory. The delay, in seconds, between data sync cycles once the sync no longer finds any new data. | 10 |
| messageKeyExpression | Optional, but recommended to mitigate data loss. See Appendix A for more information on this parameter. | id |

Example basic query, as generated from the CursorConfiguration parameters:

```sql
SELECT Id, Name
FROM [SourceTable]
WHERE Id IN (SELECT TOP (100) Id
    FROM [SourceTable]
    WHERE Id > 0 AND Name IS NOT NULL
    ORDER BY Id)
 AND Id > 0 AND Name IS NOT NULL
```

In ReturnDataConfiguration, the parameters of concern are everything outside of the first opening parenthesis ( and the last closing parenthesis ). For example:

```sql
FROM
(
...
) AS t INNER JOIN [Table1] ts ON ts.[Id] = t.[Id]
WHERE ts.[Id] > 0
ORDER BY Id
```

Example complex query:

```sql
SELECT ts.[Id], ts.[Name] FROM
(
SELECT Id, Timestamp
FROM [Table2]
WHERE Timestamp IN (SELECT TOP (2)
Timestamp
FROM [Table2]
WHERE Timestamp > '2022-11-18 11:34:09 AM'
AND Timestamp <= '2022-11-19 11:34:09 AM'
AND 1=1
ORDER BY Timestamp)
AND Timestamp > '2022-11-18 11:34:09 AM'
AND Timestamp <= '2022-11-19 11:34:09 AM'
AND 1=1
) AS t
INNER JOIN [Table1] ts ON ts.[Id] = t.[Id]
WHERE ts.[Id] > 0
ORDER BY Id
```

**Example Topic JSON**

```json
{
  "CursorConfiguration": {
    "FromClause": "[Source Table]",
    "CursorColumn": "Id",
    "BatchSize": 100,
    "FilterCondition": "Name IS NOT NULL",
    "Columns": [
      "Id", "Name"
    ],
    "Distinct": "true",
    "CursorColumnDataType": "int"
  },
  "ReturnDataConfiguration": {
    "CursorAlias": "t",
    "JoinClause": "[Table1] ts ON ts.[id] = t.[id]",
    "FilterCondition": "ts.[id] > 0",
    "OrderByClause": "id",
    "Columns": [
      "ts.[Id]",
      "ts.[Name]"
    ]
  },
  "Delay": 10
}
```

Connection Attributes

The table below can help you create the Connection Attributes JSON needed to set up a real-time sync.

| Parameter | Description | Example |
| --- | --- | --- |
| databaseType | Mandatory. TSQL, Oracle, or DB2. | TSQL |
| connectionString | Mandatory. The connection string for your data source. | See the example below |

```json
{
  "databaseType": "TSQL",
  "connectionString": "Server=;Database=;User ID=cinchy;password=example;Trusted_Connection=False;Connection Timeout=30;Min Pool Size=10;"
}
```
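If your source is Oracle or DB2 rather than SQL Server, the same two attributes apply with a driver-appropriate connection string. The snippet below is an illustrative sketch only; the host and service names are placeholders, and the exact connection string format depends on your driver and environment:

```json
{
  "databaseType": "Oracle",
  "connectionString": "Data Source=MyOracleHost:1521/MyServiceName;User Id=cinchy;Password=example;"
}
```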
Schema

The Schema section is where you define which source columns you want to sync in your connection. You can repeat the values for multiple columns.

| Parameter | Description | Example |
| --- | --- | --- |
| Name | Mandatory. The name of your column as it appears in the source. | Name |
| Alias | Optional. You may choose to use an alias on your column so that it has a different name in the data sync. | |
| Data Type | Mandatory. The data type of the column values. | Text |
| Description | Optional. You may choose to add a description to your column. | |

Select Show Advanced for more options for the Schema section.

| Parameter | Description |
| --- | --- |
| Mandatory | If both Mandatory and Validated are checked on a column, rows where the column is empty are rejected. If just Mandatory is checked, all rows are synced with an execution log status of failed and the source error "Mandatory Rule Violation". If just Validated is checked, all rows are synced. |
| Validate Data | If both Mandatory and Validated are checked on a column, rows where the column is empty are rejected. If just Validated is checked, all rows are synced. |
| Trim Whitespace | Optional if data type = Text. For Text data types, you can choose whether to trim the whitespace. |
| Max Length | Optional if data type = Text. You can input a numerical value that represents the maximum length of the data that can be synced in your column. If the value is exceeded, the row will be rejected (you can find this error in the Execution Log). |
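When a data sync is expressed as an XML config file instead of through the UI, these schema settings typically map to attributes on Column elements. The sketch below is an assumption-based illustration (attribute names may vary by Cinchy version), not a verified configuration:

```xml
<Schema>
  <!-- name and dataType mirror the Name and Data Type fields above -->
  <Column name="Id" dataType="Number" isMandatory="true" validateData="true" />
  <!-- alias, trimWhitespace, and maxLength correspond to the optional and advanced settings -->
  <Column name="Name" alias="FullName" dataType="Text" trimWhitespace="true" maxLength="255" />
</Schema>
```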

You can choose to add a Transformation > String Replacement by inputting the following:

| Parameter | Description |
| --- | --- |
| Pattern | Mandatory if using a Transformation. The pattern for your string replacement. |
| Replacement | What you want to replace your pattern with. |

Note that you can have more than one String Replacement.
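For example, assuming the Pattern field accepts a regular expression (a hypothetical reading of this field; verify against your Cinchy version), you could strip a country code prefix from phone numbers:

```
Pattern:     ^\+1-
Replacement: (leave empty)
```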

You have the option to add a source filter to your data sync. Please review the documentation here for more information on source filters.

Next steps

  • Configure your Destination.

  • Define your Sync Actions.

  • Add in your Post Sync Scripts, if required.

  • If more than one listener is needed for a real-time sync, configure it/them via the Listener Config table.

  • To run a real-time sync, enable your Listener from the Execution tab.

Appendix A

messageKeyExpression

The messageKeyExpression parameter is optional, but recommended, to ensure that you aren't faced with a unique constraint violation during your data sync. This violation could occur if an insert and an update statement happen at nearly the same time. If you choose not to use the messageKeyExpression parameter, you could face data loss in your sync.

This parameter was added to the Data Polling event stream in Cinchy v5.6.

Each of your Event Listener messages has a message key. By default, this key is unique for every message in the queue.

When the worker processes your Event Listener messages, it does so in batches and, for efficiency and to guarantee order, messages that contain the same key won't be processed in the same batch.

The messageKeyExpression property allows you to change the default message key to something else, such as a column value. For example, setting the expression to a record's id means that an insert and an update for the same record share a message key, so they are processed in order rather than racing each other in the same batch.

Example:

```json
  "MessageKeyExpression": "id"
```
