
Error Logging and Troubleshooting

Troubleshooting

General Troubleshooting

If your data sync has failed, here are a few things to check:

  • Have your credentials changed in either the source or target (e.g. an expired password)?

  • Is your sync key unique in both your source and target?

  • Is the configuration entered in the [Cinchy].[Data Sync Configurations] table?

  • If the source is a file, does it exist at the specified location? (A pre-flight sketch for this check follows below.)
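For file-based sources, a quick pre-flight check on the machine running the CLI can rule out a missing file before the job starts. Below is a minimal PowerShell sketch; the source file path is a hypothetical placeholder, and the CLI command line is the same elided example used later on this page:

# Pre-flight sketch: confirm the source file exists before invoking the CLI.
# $SourceFile is a hypothetical placeholder path.
$SourceFile = "C:\Data\source.csv"

if (-not (Test-Path $SourceFile)) {
    Write-Host "Source file not found: $SourceFile"
    exit 1
}

# File exists; run the data sync as usual.
Invoke-Expression "dotnet C:\CinchyCLI\Cinchy.CLI.dll syncdata -q ..."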

Changing your Auto Offset Reset configuration

You can change your Auto Offset Reset type after your initial configuration through the following steps:

  1. Navigate to the Listener Config table.

  2. Set the Auto Offset Reset value to the desired type.

  3. Set the "Status" column of the Listener Config to "Disabled".

  4. Navigate to the Event Listener State table.

  5. Find the row that pertains to your data sync's Listener Config and delete it.

  6. Navigate back to the Listener Config table.

  7. Set the "Status" column of the Listener Config to "Enabled" in order for your new Auto Offset Reset configuration to take effect.

Skipping Messages

If a large number of messages have queued up while a listener config was turned off, you can skip processing them with the following steps:

  1. Navigate to the Event Listener State table on your Cinchy platform.

  2. Find the row containing the state that corresponds to your listener config.

  3. Delete that row.

  4. Navigate to the Listener Config table on your Cinchy platform.

  5. Navigate to the row containing the data sync configuration you want to configure.

  6. Set the "Auto Offset Reset" column to "Latest". This will ensure that when you turn your listener back on it will start listening from the latest messages, thus skipping the queued ones.

Error Logging

Cinchy Tables

When running a data sync interactively, the output screen displays the result of the job on the first line. There are two (2) potential outcomes:

  • Data sync completed successfully

  • Data sync completed with errors (see <temp folder> for error logs)

If the data sync runs on a schedule, there are two (2) tables in the Cinchy domain that can be reviewed to determine the outcome:

  1. Execution Log Table - this is where you can find the output and status of any executed job (see the query sketch below).

    Please note, you can see when the job ran from the Created timestamp on the record (Display Columns -> add the Created column to the view).

  2. Execution Errors Table - this table may have one or more records for a job that failed with synchronization or validation errors.
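If you prefer to review recent runs with a query rather than browsing the table, a CQL statement along the following lines can be saved and reused. This is a sketch only: the Execution Log table sits in the Cinchy domain per this page, but the column names are assumptions based on the schema below, so verify them in your environment. The here-string simply keeps the query alongside the PowerShell examples on this page:

# Sketch of a CQL query for reviewing recent runs; paste the here-string
# contents into Cinchy's query editor. Column names are assumptions based
# on the schema described below.
$ReviewRunsCql = @"
SELECT [State], [Execution Time], [Data Sync Config]
FROM [Cinchy].[Execution Log]
ORDER BY [Created] DESC
"@
Write-Host $ReviewRunsCql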

Execution Log Table Schema

  • Execution ID - The number assigned to the executed job, incremented by one (1) for each subsequent job.

  • Command - The CLI command that was executed (e.g. Data Sync, Data Export).

  • Server Name - The name of the server where the CLI was executed. If you run the CLI from a personal computer, this is the name of your computer.

  • File Path - For a Data Sync, if the source is a file, this field contains a link to the file. For a Data Export, the field is a link to the file created by the export. Note that these are paths local to the server where the CLI was executed.

  • Parameters - Any parameters passed to the command line.

  • State - The state of the job (e.g. Succeeded, Failed or Running).

  • Execution Output - The output that would have been displayed if the job had been executed from the command prompt.

  • Execution Time - How long it took to execute the job.

  • Data Sync Config - A link to the name of your configuration file.

Execution Errors Table Schema

  • Error Class - The category of error generated (e.g. Invalid Row, Invalid Column, Invalid File).

  • Error Type - The reason for the error (e.g. Unresolved Link, Invalid Format Exception, Malformed Row, Max Length Violation).

  • Column - The name of the column that generated the error.

  • Row - The row number(s) of the source records that generated the error.

  • Row Count - The number of records affected by this error.

  • Execution ID - A link that ties the error back to the Execution Log (a query sketch for pulling these records follows).
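To pull error details for review, a similar query sketch against the Execution Errors table can be used. Again, treat this as a sketch: the table name follows this page's description and the column names come from the schema above, but confirm them in your environment:

# Sketch of a CQL query listing recent execution errors; paste the
# here-string contents into Cinchy's query editor. Names are assumptions
# based on the schema above.
$ExecutionErrorsCql = @"
SELECT [Error Class], [Error Type], [Column], [Row], [Row Count]
FROM [Cinchy].[Execution Errors]
ORDER BY [Created] DESC
"@
Write-Host $ExecutionErrorsCql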

Tip

To automatically check whether a job was successful, inspect its exit code. There are three (3) possible values:

  • 0 - Completed without errors

  • 1 - Execution failed

  • 2 - Completed with validation errors

Sample Code

$CLICommand = "dotnet C:\CinchyCLI\Cinchy.CLI.dll syncdata -q ..." 
Invoke-Expression $CLICommand 
switch ($LASTEXITCODE) { 
0 { Write-Host "Completed without errors" } 
1 { Write-Host "Execution failed" } 
2 { Write-Host "Completed with validation errors" } 
}
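If this script runs under a scheduler such as Windows Task Scheduler, consider ending it with exit $LASTEXITCODE so that a failed sync also marks the scheduled task itself as failed.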

Logs

The syncdata command uses the folder specified after the -d parameter on the command line to create and store temporary files. If the data sync is successful, all temporary files are automatically purged. However, if there is an error, the following CSV files will remain (the sketch after this list collects them):

  • ExecutionLogID_SourceErrors.csv

  • ExecutionLogID_SyncErrors.csv

  • ExecutionLogID_TargetErrors.csv
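After a failed run, you can gather whatever error files were left behind. A minimal sketch, where $TempDir is a hypothetical placeholder for the folder you passed with -d:

# List any error CSVs remaining in the temp folder after a failed run.
# $TempDir is a hypothetical placeholder for the -d folder.
$TempDir = "C:\CinchyCLI\Temp"

Get-ChildItem -Path $TempDir -Filter "*Errors.csv" |
    ForEach-Object { Write-Host $_.FullName }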

Source & Target Error Logs

The SourceErrors and TargetErrors CSV files have the following three (3) columns:

  • Row - identifies the row number of the rejected record. Please note, only data rows are counted; if the source is a file with header rows, add the number of header rows to this value to get the actual row in the source causing the failure (the parsing sketch below applies this offset).

  • Rejected - either Yes or No. Yes indicates that the full record was skipped. No means valid fields were inserted or updated, while fields with validation errors were not.

  • Errors - contains a list of fields causing validation errors, or an error affecting the whole record, such as "Malformed Row".
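Because these are plain CSV files, they can be inspected with Import-Csv. The sketch below assumes the error file itself includes a header row with the three column names above, and that the original source file had one header row; the file path and header count are hypothetical placeholders:

# Report the true source row of each error, applying the header-row offset.
# $ErrorFile and $HeaderRows are hypothetical placeholders.
$ErrorFile = "C:\CinchyCLI\Temp\123_SourceErrors.csv"
$HeaderRows = 1   # header rows in the original source file

Import-Csv $ErrorFile | ForEach-Object {
    $sourceRow = [int]$_.Row + $HeaderRows
    Write-Host "Source row $sourceRow (Rejected: $($_.Rejected)): $($_.Errors)"
}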

Sync Error Log

The SyncErrors file also has three (3) columns (see the summary sketch after this list):

  • Failed Operations - the operation that failed (e.g. INSERT, UPDATE or DELETE).

  • Error - an error message explaining why the operation failed.

  • Record Id - the unique ID of the record in the target system. For Cinchy this is the Cinchy ID. Most systems have their own unique identifier (e.g. the Salesforce ID).
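The same approach gives a quick summary of sync errors by operation type. A minimal sketch; the file path is a hypothetical placeholder and the column name follows the list above:

# Summarize sync errors by failed operation (INSERT, UPDATE, DELETE).
$SyncErrorFile = "C:\CinchyCLI\Temp\123_SyncErrors.csv"

Import-Csv $SyncErrorFile |
    Group-Object 'Failed Operations' |
    ForEach-Object { Write-Host "$($_.Name): $($_.Count) record(s)" }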

Error Messages

Source/Target Errors

  • Duplicate Key - The sync key values are not unique; duplicated records are rejected.

  • Malformed Row - The row could not be parsed based on the source schema. For example, the record may not have the number of columns specified in the source section of the CLI configuration.

  • Invalid Format Exception - Check the value for this column; there may be a mismatched data type (e.g. inserting a non-digit character into a Number column).

  • Max Length Violation - The text being inserted or updated in a target field is too long.

  • Mandatory Rule Violation - No (or an incorrect) value was provided for a mandatory column.

  • Unresolved Link - Check whether the values the CLI is trying to insert or update exist in the linked Cinchy target table.

Sync Errors

Records may fail to be inserted, updated, or deleted due to sync errors. These come from the target system when the CLI tries to write data to it. Each target system returns its own errors; here are some examples from Cinchy. Note that these are the same errors you would see when pasting data into that table through the Manage Data screen:

  • Value must be a number - Check the value for this column; there may be a mismatched data type, such as a non-digit character inserted into a Number column.

  • Value must be a valid date - Check the value for this column; it could not be parsed as a valid date.

  • Value must be Yes or No - The value passed was not a Bool.

  • Value must be selected from the available options - The value from the source does not correspond to the values in the target Cinchy choice column.
