
Snowflake table



Overview

Snowflake provides a single platform for data warehousing, data lakes, data engineering, data science, data application development, and secure sharing and consumption of real-time/shared data.

Snowflake enables data storage, processing, and analytic solutions.

Before you set up your data sync destination, make sure to configure your Source.

The Snowflake Table destination supports batch and real-time syncs.

Loading data into Snowflake

For batch syncs of 10 or fewer records, single Insert/Update/Delete statements are executed to perform operations against the target Snowflake table.

For batch syncs exceeding 10 records, the operations are performed in bulk.

The bulk operation process consists of the following steps (a SQL sketch follows the list):

  1. Generating a CSV containing a batch of records

  2. Creating a temporary table in Snowflake

  3. Copying the generated CSV into the temp table

  4. If needed, performing Insert operations against the target Snowflake table using the temp table

  5. If needed, performing Update operations against the target Snowflake table using the temp table

  6. If needed, performing Delete operations against the target Snowflake table using the temp table

  7. Dropping the temporary table
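For illustration, here's a minimal sketch of steps 2 through 7 in Snowflake SQL. All table, stage, and column names are illustrative assumptions, not the literal statements Cinchy generates:

```sql
-- 2. Create a temporary table mirroring the target's structure
CREATE TEMPORARY TABLE EMPLOYEES_STAGE LIKE EMPLOYEES;

-- 3. Copy the generated CSV (uploaded to a stage) into the temp table
COPY INTO EMPLOYEES_STAGE
FROM @~/batch.csv
FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

-- 4. If needed: insert rows that don't yet exist in the target
INSERT INTO EMPLOYEES
SELECT s.*
FROM EMPLOYEES_STAGE s
WHERE NOT EXISTS (
  SELECT 1 FROM EMPLOYEES t WHERE t.EMPLOYEE_ID = s.EMPLOYEE_ID
);

-- 5. If needed: update rows that already exist in the target
UPDATE EMPLOYEES
SET NAME = s.NAME
FROM EMPLOYEES_STAGE s
WHERE EMPLOYEES.EMPLOYEE_ID = s.EMPLOYEE_ID;

-- 6. If needed: delete target rows missing from the batch
DELETE FROM EMPLOYEES
WHERE NOT EXISTS (
  SELECT 1 FROM EMPLOYEES_STAGE s WHERE s.EMPLOYEE_ID = EMPLOYEES.EMPLOYEE_ID
);

-- 7. Drop the temporary table
DROP TABLE EMPLOYEES_STAGE;
```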

Real-time sync volume is based on a dynamic batch size, up to a configurable threshold.

Considerations

  • The temporary table generated in the bulk flow process for high-volume scenarios transforms all columns of data type Number to type NUMBER(38, 18). This may cause precision loss if the number scale in the target table is higher.
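For example, casting a value with 19 decimal digits to NUMBER(38, 18) rounds the last digit away; the same rounding happens implicitly when higher-scale data passes through the temp table:

```sql
-- NUMBER(38, 18) keeps at most 18 digits after the decimal point
SELECT CAST(0.1234567890123456789 AS NUMBER(38, 18));
-- Returns 0.123456789012345679 (the 19th decimal digit is rounded off)
```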

Destination tab

The following parameters, which you will find on the Destination tab (Image 1), define your data sync destination and how it functions.

  • Destination: Mandatory. Select your destination from the drop-down menu. Example: Snowflake Table

  • Connection String: Mandatory. The encrypted connection string used to connect to your Snowflake instance. You can review Snowflake's Connection String guide and parameter descriptions here. Unencrypted example: account=wr38353.ca-central-1.aws;user=myuser;password=mypassword;db=CINCHY;schema=PUBLIC (an expanded example follows this list)

  • Table: Mandatory. The name of the table in Snowflake that you wish to sync. Example: Employees

  • ID Column: Mandatory if you want to use the "Delete" action in your sync behaviour configuration. The name of the identity column that exists in the target (or a single column that's guaranteed to be unique and automatically populated for every new record). Example: Employee ID

  • ID Column Data Type: Mandatory if using the ID Column parameter. The data type of the above ID Column: Text, Number, Date, Boolean, Geography, or Geometry. Example: Number
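For reference, here's a sketch of an unencrypted connection string that also sets a warehouse and role, two additional parameters accepted by Snowflake's .NET connector in the same key=value format (all values are illustrative; remember that the string must be encrypted before use):

```
account=wr38353.ca-central-1.aws;user=myuser;password=mypassword;db=CINCHY;schema=PUBLIC;warehouse=COMPUTE_WH;role=SYSADMIN
```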

Test Connection

You can use the "Test Connection" button to ensure that your credentials are properly configured to access your destination. If configured correctly, a "Connection Successful" pop-up will appear. If configured incorrectly, a "Connection Failed" pop-up will appear along with a link to the applicable error logs to help you troubleshoot.

Column Mapping

The Column Mapping section is where you define which source columns you want to sync to which destination columns (Image 2: Define your Destination). You can repeat the values for multiple columns. When specifying the Target Column, note that all names are case-sensitive.

  • Source Column: Mandatory. The name of your column as it appears in the source. Example: Name

  • Target Column: Mandatory. The name of your column as it appears in the destination. Example: Name

You have the option to add a destination filter to your data sync. Please review the documentation here for more information on destination filters.

Next steps

  • Define your Sync Actions.

  • Add in your Post Sync Scripts, if required.

  • Define your Permissions.

  • If you are running a batch sync, click Jobs > Start a Job to begin your sync.

  • If you are running a real-time sync, set up your Listener Config and enable it to begin your sync.