Troubleshooting

General troubleshooting

If your data sync configuration has failed, here are a few things to check:

  • Have your credentials changed in either the source or the target (for example, an expired password)? For certain sources/destinations, you can validate your credentials with the "Test Connection" button. It returns a "Connection Successful" pop-up for properly defined credentials and a "Connection Failed" pop-up otherwise. Failed connections also include a link to an error log to help with your troubleshooting.

  • Is your sync key unique in your source and target?

  • Is the configuration entered in the [Cinchy].[Data Sync Configurations] table?

  • If the source is a file, does it exist at the specified location? (A scriptable version of this check is sketched below.)
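
For file-based sources, you can automate that last check before a scheduled run. A minimal PowerShell sketch, assuming a hypothetical source path (substitute the location your sync configuration actually points at):

```powershell
# Pre-flight check for a file-based source: abort early if the file is missing.
# The path below is a placeholder, not a Cinchy convention.
$sourceFile = "C:\SyncData\source.csv"

if (-not (Test-Path $sourceFile)) {
    Write-Error "Source file not found: $sourceFile"
    exit 1
}
Write-Host "Source file found; proceeding with sync."
```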

Changing your Auto Offset Reset configuration

You can switch between Auto Offset Reset types after your initial configuration with the following steps:

  1. Navigate to the Listener Config table.

  2. Change the Auto Offset Reset value to the desired setting.

  3. Set the "Status" column of the Listener Config to Disabled.

  4. Navigate to the Event Listener State table.

  5. Find the row that pertains to your data sync's Listener Config and delete it.

  6. Navigate back to the Listener Config table.

  7. Set the "Status" column of the Listener Config to Enabled so that your new Auto Offset Reset configuration takes effect.

Skipping messages

If messages were queued after a listener config was turned off, you can skip processing them with the following steps:

  1. Navigate to the Event Listener State table on your Cinchy platform.

  2. Find the row containing the state that corresponds to your listener config.

  3. Delete that row.

  4. Navigate to the Listener Config table on your Cinchy platform.

  5. Navigate to the row containing the data sync configuration you want to change.

  6. Set the "Auto Offset Reset" column to Latest. This ensures that when you turn your listener back on, it starts from the latest messages and skips the queued ones.

Error logging

Cinchy Tables

When you run a data sync interactively, the first line of the output displays the result of the job. There are two potential outcomes:

  • Data sync completed successfully

  • Data sync completed with errors (see <temp folder> for error logs)

If the data sync runs on a schedule, two tables in the Cinchy domain can be reviewed to determine the outcome:

  1. Execution Log Table - where you can find the output and status of any executed job.

    Note: you can see when the job ran from the Created timestamp on the record (Display Columns -> add the Created column to the view).

  2. Execution Errors Table - may contain one or more records for a job that failed with synchronization or validation errors.

Execution Log Table schema

| Column | Definition |
| --- | --- |
| Execution ID | The number assigned to the executed job, incremented by one for each subsequent job. |
| Command | The CLI command that was executed (Data Sync, Data Export, etc.). |
| Server Name | The name of the server where the CLI was executed. If you run the CLI from a personal computer, this is the name of your computer. |
| File Path | For a Data Sync, if the source is a file, a link to the file; for a Data Export, a link to the file created by the export. These paths are local to the server where the CLI was executed. |
| Parameters | Any parameters passed to the command line. |
| State | The state of the job (Succeeded, Failed, or Running). |
| Execution Output | The output that would have been displayed if the job had been executed from the command prompt. |
| Execution Time | How long the job took to execute. |
| Data Sync Config | A link to the name of your configuration file. |

Execution Errors Table schema

| Column | Description |
| --- | --- |
| Error Class | The category of error generated (Invalid Row, Invalid Column, Invalid File, etc.). |
| Error Type | The reason for the error (Unresolved Link, Invalid Format Exception, Malformed Row, Max Length Violation, etc.). |
| Column | The name of the column that generated the error. |
| Row | The row number(s) of the source records that generated the error. |
| Row Count | The number of records affected by this error. |
| Execution ID | A link that ties the error back to the Execution Log. |

Tip

To check programmatically whether the job succeeded, the CLI returns one of three exit codes:

  • 0 - Completed without errors

  • 1 - Execution failed

  • 2 - Completed with validation errors

Sample code

```powershell
# Run the data sync via the Cinchy CLI, then branch on its exit code.
$CLICommand = "dotnet C:\CinchyCLI\Cinchy.CLI.dll syncdata -q ..."
Invoke-Expression $CLICommand
switch ($LASTEXITCODE) {
    0 { Write-Host "Completed without errors" }
    1 { Write-Host "Execution failed" }
    2 { Write-Host "Completed with validation errors" }
}
```
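
If this script runs under a scheduler such as Windows Task Scheduler, consider ending it with `exit $LASTEXITCODE` so the scheduled task itself reports the sync's outcome.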

Logs

The syncdata command uses the folder indicated by the -d parameter to create and store temporary files. If the data sync succeeds, all the temporary files are purged automatically. However, if there is an error, the following CSV files remain (a sketch for collecting them follows the list):

  • ExecutionLogID_SourceErrors.csv

  • ExecutionLogID_SyncErrors.csv

  • ExecutionLogID_TargetErrors.csv
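
After a failed run, you can summarize these files programmatically. A minimal sketch, assuming the folder passed to -d was C:\CinchyCLI\Temp (a placeholder) and that the file names end in "Errors.csv" as listed above:

```powershell
# Summarize any error CSVs left in the CLI temp folder after a failed sync.
# C:\CinchyCLI\Temp is a placeholder for the folder passed via -d.
$tempFolder = "C:\CinchyCLI\Temp"

Get-ChildItem -Path $tempFolder -Filter "*Errors.csv" | ForEach-Object {
    $rows = @(Import-Csv $_.FullName)   # force an array so .Count works for a single row
    Write-Host "$($_.Name): $($rows.Count) error row(s)"
}
```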

Source & Target Error logs

The SourceErrors and TargetErrors CSV files have the following three columns (a parsing sketch follows the list):

  • Row - the row number of the rejected record. Note that only data rows are counted; if the source is a file with header rows, add the number of header rows to this value to find the actual failing row in the source.

  • Rejected - either Yes or No. Yes indicates the full record was skipped. No indicates valid fields were inserted/updated while fields with validation errors weren't.

  • Errors - a list of the fields causing validation errors, or an error affecting the whole record, like "Malformed Row".
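
Since these are plain CSVs, they're easy to filter. A sketch (the file path is a placeholder) that lists only the fully rejected records:

```powershell
# List fully rejected records from a SourceErrors file.
# The path is a placeholder; use your execution's actual file.
Import-Csv "C:\CinchyCLI\Temp\123_SourceErrors.csv" |
    Where-Object { $_.Rejected -eq "Yes" } |
    Select-Object Row, Errors |
    Format-Table -AutoSize
```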

Sync Error log

The SyncErrors file also has three columns (a summary sketch follows the list):

  • Failed Operations - the operation that failed (INSERT, UPDATE, or DELETE)

  • Error - an error message explaining why the operation failed

  • Record Id - the unique ID in the target system. For Cinchy this is the Cinchy ID; most systems have their own unique identifier (such as a Salesforce ID).
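
A similar sketch (again with a placeholder path) that summarizes a SyncErrors file by failed operation:

```powershell
# Count sync errors per failed operation (INSERT, UPDATE, DELETE).
Import-Csv "C:\CinchyCLI\Temp\123_SyncErrors.csv" |
    Group-Object "Failed Operations" |
    Select-Object Name, Count |
    Format-Table -AutoSize
```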

Error messages

Source/Target errors

| Error | Description |
| --- | --- |
| Duplicate Key | The sync key values aren't unique; duplicated records are rejected. |
| Malformed Row | The row couldn't be parsed based on the source schema. For example, the record may not have the number of columns defined in the source section of the CLI configuration. |
| Invalid Format Exception | Check the value for this column; there may be a mismatched data type (such as a non-digit character in a Number column). |
| Max Length Violation | The text you're trying to insert or update in the target field is too long. |
| Mandatory Rule Violation | No (or an incorrect) value was provided for a mandatory column. |
| Unresolved Link | Check whether the values the CLI is trying to insert/update exist in the linked Cinchy target table. |

Sync errors

Records may fail to insert, update, or delete due to sync errors, which come from the target system when the CLI tries to write data to it. Each target system returns its own errors; here are some examples from Cinchy. Note that these are the same errors you see when pasting data into that table in the Manage Data screen:

| Error | Description |
| --- | --- |
| Value must be a number | Check the value for this column; there may be a mismatched data type, like a non-digit character in a Number column. |
| Value must be a valid date | The value passed couldn't be interpreted as a valid date. |
| Value must be Yes or No | The value passed wasn't a Boolean. |
| Value must be selected from the available options | The value from the source doesn't correspond to any of the values in the target Cinchy choice column. |
