
Binary file

Overview

A binary file is a computer file that isn't a text file; its content is a series of sequential bytes, each eight bits in length.

You can use binary files from a Local upload, Amazon S3, or Azure Blob Storage in your data syncs.

Some benefits of using binary files include:

  • Better efficiency via compression.

  • Better security, through the ability to create custom encoding standards.

  • Better speed: because the data is stored in a raw format rather than encoded with a character encoding standard, it's faster to read and store.
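To make the raw-format point concrete, here's a minimal sketch (using a hypothetical fixed-width employee record layout, not a Cinchy-specific format) of writing and reading fields as raw bytes:

```python
import struct

# Hypothetical layout: a 20-byte name field followed by a 4-byte
# little-endian integer ID. No character-encoding metadata is stored;
# the reader just knows where each field's bytes begin and end.
RECORD = struct.Struct("<20sI")

with open("employees.bin", "wb") as f:
    f.write(RECORD.pack(b"Jane Doe".ljust(20), 1001))
    f.write(RECORD.pack(b"John Smith".ljust(20), 1002))

with open("employees.bin", "rb") as f:
    while chunk := f.read(RECORD.size):
        name, emp_id = RECORD.unpack(chunk)
        print(name.rstrip().decode("ascii"), emp_id)
```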

Example use case

You have a binary file that contains your Employee information. You want to use a batch sync to pull this info into a Cinchy table and liberate your data.

The Binary File source supports batch syncs.

Info tab

You can find the parameters in the Info tab below (Image 1: The Info Tab).

  • Title: Mandatory. Input a name for your data sync. Example: Employee Sync
  • Variables: Optional. Review our documentation on Variables for more information about this field. When uploading a local file, set this to filepath. Example: @Filepath
  • Permissions: Data syncs are role-based access systems where you can give specific groups read, write, execute, or all of the above with Admin access. Inputting at least an Admin Group is mandatory.

Source tab

The following list outlines the mandatory and optional parameters you will find on the Source tab (Image 2: Define your Source).

The following parameters will help to define your data sync source and how it functions.

  • (Sync) Source: Mandatory. Select your source from the drop-down menu. Example: Binary File
  • Source: The location of the source file: either a Local upload, Amazon S3, or Azure Blob Storage. The following authentication methods are supported per source: Amazon S3 (Access Key ID/Secret Access Key) and Azure Blob Storage (Connection String). Example: Local
  • Header Rows to Ignore: Mandatory. The number of records from the top of the file to ignore before the data starts (includes the column header). Example: 1
  • Footer Rows to Ignore: Mandatory. The number of records from the bottom of the file to ignore. Example: 0
  • Encoding: Optional. The encoding of the file. This defaults to UTF8, but UTF8_BOM, UTF16, and ASCII are also supported.
  • Path: Mandatory. The path to the source file to load. To upload a local file, you must first insert a Variable in the Info tab of the connection (ex: filepath). Then reference that same value in this field (ex: @Filepath). This will trigger a File Upload option to import your file. Example: @Filepath
  • Auth Type: This field defines the authentication type for your data sync. Cinchy supports "Access Key" and "IAM" role. When selecting Access Key, you must provide the key and key secret. When selecting IAM role, a new field will appear for you to paste in the role's Amazon Resource Name (ARN). You must also ensure that the role is configured with at least read access to the source, and that the Connections pods' role has permission to assume the role specified in the data sync config.
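To clarify what the two AWS auth types amount to, here's a sketch of the underlying calls using boto3. The bucket, key, credentials, and role ARN are placeholders, and this illustrates the AWS-side mechanics only, not how Connections implements them internally:

```python
import boto3

BUCKET, KEY = "my-bucket", "employees.bin"  # placeholders

# Option 1: Access Key authentication (key + secret supplied directly).
s3 = boto3.client(
    "s3",
    aws_access_key_id="AKIA...",    # placeholder
    aws_secret_access_key="...",    # placeholder
)

# Option 2: IAM role authentication. The caller's own role must be
# allowed to assume the target role, and the target role needs at
# least read access to the source.
sts = boto3.client("sts")
creds = sts.assume_role(
    RoleArn="arn:aws:iam::123456789012:role/connections-reader",  # placeholder
    RoleSessionName="binary-file-sync",
)["Credentials"]
s3 = boto3.client(
    "s3",
    aws_access_key_id=creds["AccessKeyId"],
    aws_secret_access_key=creds["SecretAccessKey"],
    aws_session_token=creds["SessionToken"],
)

data = s3.get_object(Bucket=BUCKET, Key=KEY)["Body"].read()
```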

Test Connection

You can use the "Test Connection" button to ensure that your credentials are properly configured to access your source. If configured correctly, a "Connection Successful" pop-up will appear. If configured incorrectly, a "Connection Failed" pop-up will appear along with a link to the applicable error logs to help you troubleshoot.

The Schema section is where you define which source columns you want to sync in your connection. You can repeat the values below for multiple columns.

  • Name: Mandatory. The name of your column as it appears in the source. Example: Name
  • Alias: Optional. You may choose to use an alias on your column so that it has a different name in the data sync.
  • Data Type: Mandatory. The data type of the column values. Example: Text
  • Description: Optional. You may choose to add a description to your column.

Parse Content By (Only for Standard Columns)

Binary File sources have a unique, mandatory parameter for Standard Columns. Parse Content By defines how your content is parsed into fields; choose from the following three options:

  • Byte Length: The content length, in number of bytes.
  • Trailing Byte Sequence: The trailing sequence, in base64, that indicates the end of the field.
  • Succeeding Byte Sequence: The sequence, in base64, that indicates the start of the next field, and thus the end of this one.

Example: Byte Length
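To illustrate the three parsing modes, here's a sketch of how each one slices a field out of a raw byte buffer. The record layout and delimiter bytes are hypothetical, and this shows the concepts only, not the Connections implementation:

```python
import base64

# Hypothetical record: a 20-byte name, then an ID terminated by 0x00,
# then a title that runs until the next field's 0xFF marker.
record = b"Jane Doe".ljust(20) + b"1001\x00" + b"Analyst" + b"\xff"

def by_byte_length(buf: bytes, length: int):
    """Byte Length: the field is a fixed number of bytes."""
    return buf[:length], buf[length:]

def by_trailing_sequence(buf: bytes, b64_seq: str):
    """Trailing Byte Sequence: the field ends at (and consumes) a
    base64-encoded delimiter; "AA==" decodes to the byte 0x00."""
    seq = base64.b64decode(b64_seq)
    end = buf.index(seq)
    return buf[:end], buf[end + len(seq):]

def by_succeeding_sequence(buf: bytes, b64_seq: str):
    """Succeeding Byte Sequence: the field ends where the next field's
    marker starts; the marker stays in the buffer for the next field."""
    seq = base64.b64decode(b64_seq)
    end = buf.index(seq)
    return buf[:end], buf[end:]

name, rest = by_byte_length(record, 20)             # b'Jane Doe            '
emp_id, rest = by_trailing_sequence(rest, "AA==")   # b'1001'
title, rest = by_succeeding_sequence(rest, "/w==")  # b'Analyst'
```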

Select Show Advanced to reveal more options for the Schema section.

  • Mandatory (see the sketch after this list):
    • If both Mandatory and Validated are checked on a column, then rows where the column is empty are rejected.
    • If just Mandatory is checked on a column, then all rows are synced with the execution log status of failed, and the source error of "Mandatory Rule Violation".
    • If just Validated is checked on a column, then all rows are synced.
  • Validate Data:
    • If both Mandatory and Validated are checked on a column, then rows where the column is empty are rejected.
    • If just Validated is checked on a column, then all rows are synced.

  • Trim Whitespace: Optional if data type = Text. For Text data types, you can choose whether to trim the whitespace.
  • Max Length: Optional if data type = Text. For Text data types, you can set a maximum length for values in this column.
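As promised above, here's a compact restatement of the Mandatory/Validate Data rules as decision logic (a sketch for readability, not Cinchy's actual implementation):

```python
def row_outcome(value, mandatory: bool, validated: bool) -> str:
    """Outcome for a row based on one column's flags, per the rules above."""
    empty = value is None or value == ""
    if not empty:
        return "synced"
    if mandatory and validated:
        return "rejected"
    if mandatory:
        return "synced (execution log: failed, 'Mandatory Rule Violation')"
    return "synced"

print(row_outcome("", mandatory=True, validated=True))   # rejected
print(row_outcome("", mandatory=True, validated=False))  # synced, logged as failed
print(row_outcome("", mandatory=False, validated=True))  # synced
```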

You can choose to add in a Transformation > String Replacement by inputting the following:

  • Pattern: Mandatory if using a Transformation. The pattern for your string replacement.
  • Replacement: What you want to replace your pattern with.

Note that you can have more than one String Replacement.
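For reference, a string replacement of this shape is a pattern-based substitution. Assuming regex-style patterns (an assumption worth verifying against your Connections version), the equivalent operation in Python looks like this, including chaining more than one replacement:

```python
import re

value = "Jane   Doe "

# Replacement 1: collapse runs of whitespace into a single space.
value = re.sub(r"\s+", " ", value)
# Replacement 2: more than one String Replacement can be applied in turn.
value = re.sub(r" $", "", value)

print(value)  # "Jane Doe"
```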

You also have the option to add a source filter to your data sync. Please review the documentation here for more information on source filters.

Next steps

  • Configure your Destination.

  • Define your Sync Actions.

  • Add in your Post Sync Scripts, if required.

  • Click Jobs > Start a Job to begin your sync.