If your data sync configuration has failed, here are a few things to check:

- Have your credentials changed in either the source or target (for example, an expired password)? For certain sources and destinations, you can validate your credentials with the "Test Connection" button. It returns a "Connection Successful" pop-up when the credentials are properly defined and a "Connection Failed" pop-up when they aren't. Failed connections also include a link to an error log to help with troubleshooting.
- Is your sync key unique in both your source and target?
- Is the configuration entered in the [Cinchy].[Data Sync Configurations] table?
- If the source is a file, does it exist at the specified location?
You can switch between Auto Offset Reset types after your initial configuration with the following steps:

1. Navigate to the Listener Config table.
2. Change the Auto Offset Reset value to the type you want.
3. Set the "Status" column of the Listener Config to Disabled.
4. Navigate to the Event Listener State table.
5. Find the row that pertains to your data sync's Listener Config and delete it.
6. Navigate back to the Listener Config table.
7. Set the "Status" column of the Listener Config to Enabled for your new Auto Offset Reset configuration to take effect.
If messages have been queued after a listener config is turned off, you can skip processing them with the following steps:

1. Navigate to the Event Listener State table on your Cinchy platform.
2. Delete the row containing the state that corresponds to your listener config.
3. Navigate to the Listener Config table on your Cinchy platform.
4. Navigate to the row containing the data sync configuration you want to change.
5. Set the "Auto Offset Reset" column to Latest. This ensures that when you turn your listener back on, it starts listening from the latest messages and skips the queued ones.
When running a data sync interactively, the output screen displays the result of the job on the first line. There are two (2) potential outcomes:

- Data sync completed successfully
- Data sync completed with errors (see <temp folder> for error logs)
If the data sync runs on a schedule, there are two (2) tables in the Cinchy domain that can be reviewed to determine the outcome:

- Execution Log table - this is where you can find the output and status of any executed job. You can see when the job ran from the Created timestamp on the record (Display Columns -> add the Created column to the view).
- Execution Errors table - this table may have one or more records for a job that failed with synchronization or validation errors.
To automatically check whether the job was successful, the job returns one of three (3) exit codes:

- 0 - Completed without errors
- 1 - Execution failed
- 2 - Completed with validation errors
The `syncdata` command uses the folder indicated after the `-d` parameter in the command line to create and store temporary files. If the data sync is successful, all the temporary files are automatically purged. However, if there's an error, the following CSV files will exist:
- `ExecutionLogID_SourceErrors.csv`
- `ExecutionLogID_SyncErrors.csv`
- `ExecutionLogID_TargetErrors.csv`
The SourceErrors and TargetErrors CSV files have the following three (3) columns:

- Row - the row number of the rejected record. Note that only data rows are counted; if the source is a file with header rows, add the number of header rows to this value to get the actual row in the source that's causing the failure.
- Rejected - either Yes or No. Yes indicates that the full record was skipped; No means valid fields were inserted or updated while fields with validation errors weren't.
- Errors - a list of fields causing validation errors, or an error affecting the whole record, such as "Malformed Row"
The SyncErrors file also has three (3) columns:

- Failed Operations - the operation that failed (INSERT, UPDATE, or DELETE)
- Error - an error message explaining why the operation failed
- Record Id - the unique ID in the target system. For Cinchy this is the Cinchy ID; most systems have their own unique identifier (such as the Salesforce ID).
Records may fail to insert, update, or delete due to sync errors, which come from the target system when the CLI tries to write data to it. Each target system returns its own errors. The Cinchy examples below are the same errors you see when pasting data into the table from the Manage Data screen.
The Execution Log table contains the following columns:

Column | Definition |
---|---|
Execution ID | The number assigned to the executed job, incremented by one (1) for each subsequent job |
Command | The CLI command that was executed (Data Sync, Data Export, etc.) |
Server Name | The name of the server where the CLI was executed. If you run the CLI from a personal computer, this is the name of your computer. |
File Path | For a Data Sync with a file source, a link to the file; for a Data Export, a link to the file created by the export. Note that these paths are local to the server where the CLI was executed. |
Parameters | Any parameters passed to the command line |
State | The state of the job (Succeeded, Failed, or Running) |
Execution Output | The output that would have been displayed if the job had been executed from the command prompt |
Execution Time | How long it took to execute the job |
Data Sync Config | A link to the name of your configuration file |

The Execution Errors table contains the following columns:

Column | Description |
---|---|
Error Class | The category of errors generated (Invalid Row, Invalid Column, Invalid File, etc.) |
Error Type | The reason for the error (Unresolved Link, Invalid Format Exception, Malformed Row, Max Length Violation, etc.) |
Column | The name of the column that generated the error |
Row | The row number(s) of records from the source that generated the error |
Row Count | The number of records affected by this error |
Execution ID | A link tying the error back to the Execution Log |

Common validation errors include:

Error | Description |
---|---|
Duplicate Key | The sync key values aren't unique and duplicated records are rejected |
Malformed Row | The row couldn't be parsed based on the source schema. For example, the record may not have the number of columns specified in the source section of the CLI configuration. |
Invalid Format Exception | Check the value for this column; there may be a mismatched data type (such as inserting a non-digit character in a number column) |
Max Length Violation | The text being inserted into or updated in the target field is too long |
Mandatory Rule Violation | No (or an incorrect) value was provided for a mandatory column |
Unresolved Link | Check whether the values the CLI is trying to insert/update exist in the linked Cinchy table target |

Synchronization errors returned by Cinchy include:

Error | Description |
---|---|
Value must be a number | Check the value for this column; there may be a mismatched data type, such as a non-digit character in a Number column |
Value must be a valid date | The value provided isn't in a valid date format |
Value must be Yes or No | The value passed wasn't a Boolean |
Value must be selected from the available options | The value from the source doesn't correspond to the values in the Cinchy target choice column |