
Package the data experience

This page outlines Step 2 of Deploying CinchyDXD: Packaging the Data Experience


Download the CinchyDXD utility

The CinchyDXD utility takes all the components (tables, queries, views, formatting rules) of a DX and packages them up so they can be moved from one environment to another.

Remember that all objects need to be created in one source environment (ex: DEV). From there, DXD will be used to push them into others (ex: SIT, UAT, Production).

The CinchyDXD utility is only required (made accessible) for the environment that's packing up the data experience. It's not required for the destination (or target) environment.

For CinchyDXD to work, you must have CinchyCLI installed. For further installation instructions, please refer to the CLI documentation (https://cli.docs.cinchy.com/).

To access the Data Experience Deployment utility please contact Cinchy support (support@cinchy.com).

To download the Utility:

  1. Login to Cinchy

  2. Navigate to the Releases Table

  3. Select the Experience Deployment Utility View

  4. Locate and download the utility (Cinchy DXD v1.7.0.zip)

The CinchyDXD utility is only compatible with Cinchy version 4.6 and higher.

  1. Unzip the utility and place the folder at any location on a computer that also has CinchyCLI installed

  2. Create a new folder in the same directory that will hold all of the DX exports generated (CinchyDXD_Output) (Image 1).

This folder will then hold all your deployment packages.

  1. Launch a PowerShell console window

  2. From the console, navigate to the CinchyDXD directory (Images 2 and 3).

From within your file explorer window, type “PowerShell” into the file path bar and hit Enter. This launches a PowerShell window already set to that folder path.
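As a minimal sketch of the navigation step, assuming the utility was unzipped to C:\CinchyDXD (a hypothetical path; substitute your own):

cd "C:\CinchyDXD"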

Initial setup: PowerShell

PowerShell requires an initial setup when using CinchyDXD.

  1. From your PowerShell window type cin

  2. Hit Tab on your keyboard (Image 4).

  3. Hit Enter on your keyboard (Image 5).

You will get an error message (above) that CinchyDXD.ps1 can't be loaded because running scripts is disabled on this system.

To resolve this error:

  1. From your start menu, search for PowerShell and select Run as Administrator (Image 6).

  2. When prompted if you want to allow this app to make changes on your device, select Yes.

  3. In your PowerShell Administrator window, enter Set-ExecutionPolicy RemoteSigned (Image 7).

  4. Hit Enter on your keyboard (Image 8).

  5. When prompted with the Execution Policy Change, enter A for “Yes to All”

  6. Close the PowerShell Administrator window

  7. Navigate back to the PowerShell window that's open at your CinchyDXD folder

  8. From your PowerShell window type cin

  9. Hit Tab and then Enter on your keyboard (Image 9).

The basic CinchyDXD instructions will be displayed. You will be able to execute commands such as exporting and installing a Data Experience.
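The commands from the setup above, as a minimal sketch (the folder path is hypothetical; use wherever you placed the utility):

# In the PowerShell window opened with Run as Administrator:
Set-ExecutionPolicy RemoteSigned   # answer A ("Yes to All") when prompted, then close the Administrator window

# Back in your regular PowerShell window, from the CinchyDXD folder:
cd "C:\CinchyDXD"
.\CinchyDXD.ps1                    # running the script with no arguments displays the basic instructions and available commands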

Cinchy DXD tables overview

Cinchy uses four tables for packing up and deploying a Data Experience (Image 10).

The Data Experience is defined and packaged in what will be referred to from here on as the Source Environment. The environment that the Data Experience is deployed to is referred to as the Target Environment.

  1. Data Experience Definition Table: Where the data experience is defined (tables, queries, views, formatting rules, UDFs, etc.)

  2. Data Experience Reference Data Table: Where we define any data that needs to move with the Data Experience for the experience to work (lookup values, static values that may need to exist in tables - it typically would not be the physical data itself)

  3. Data Experience Releases Table: Once a Data Experience is exported, an entry is created in this table for the export containing:

    • Version Number

    • Release Binary: The location where you can archive or back up your release history in Cinchy. Please note: if you have your own release management system, you can opt out of archiving releases in Cinchy and check the release into your own source control instead.

    • Release Name

    • Data Experience

  4. Data Experience Release Artifact Table: Stores all of the files that are part of the Data Experience package as individual records, along with the binary content for each record

Define the data experience

When setting up a Data Experience definition, you will need one definition for each Data Experience you wish to package and deploy to a given number of Target Environments.

  1. Locate and open the Data Experience Definitions table (Image 11).

  • GUID: This value is calculated. Note that it will be required as one of your export parameters in PowerShell.

  • Name: The name of your Data Experience.

  • Tables: Select all tables that are part of the Data Experience.

  • Views: Select all views (in the data browser) that are part of the Data Experience.

  • Integrated Clients: Select any integrated clients (for example: Tableau, PowerBI, custom integrations) that are part of the Data Experience.

  • Data Sync Configurations: Select any data syncs (CLIs the experience needs in order to work) that are part of the Data Experience.

  • Listener Configurations: Select any Listener Config rows that refer to a Data Sync Configuration that's part of the Data Experience.

  • Reference Data: Select any reference data that's part of the Data Experience. Note that the reference data itself is set up in the Data Experience Reference Data table (see step 2 below for setup details).

  • Secrets: Select any Secrets you'd like to include that are used by Data Sync Configurations or Listener Configs that are part of this Data Experience.

  • Webhooks: Select any webhooks that are part of this Data Experience.

  • User Defined Functions: Select any user defined functions (for example: validate phone, validate email) that are part of the Data Experience.

  • Models: Select any custom models that override columns or tables in your Data Experience; if there are none, leave this blank.

  • Groups: Select any groups that are part of the Data Experience (moving a group also moves all of its table access [design] controls).

  • System Colours: Select a system colour (if defined) for the Data Experience.

  • Saved Queries: Select any queries that are part of the Data Experience.

  • Applets: Select any applets that are part of the Data Experience.

  • Pre-install Scripts: Select any pre-install scripts (Saved Queries) that should run before the installation of this Data Experience.

  • Post-install Scripts: Select any post-install scripts (Saved Queries) that should run after the installation of this Data Experience. A common use case is to rectify data that may differ between environments.

  • Formatting Rules: Select any formatting rules that are part of the Data Experience.

  • Literal Groups: Select any literals associated with the Data Experience (for example: key values with English and French definitions).

  • Builders: Select the builder(s) who have permission to export the Data Experience.

  • Builder Groups: Select the builder group(s) that have permission to export the Data Experience.

Note: Best practice is to use a group rather than an individual user. Users within a group can fluctuate, whereas the group (or role) remains, which means less maintenance going forward.

  • Sync GUID: Leave this column blank.

  2. Complete the following (Image 12):

  • Name: Currency Converter

  • Tables: Currency Exchange Rate (Sandbox)

  • Saved Queries: Currency Converter

  • Builder Groups: Currency Converters

If you make changes to the DX in the future, you aren't required to build a new Data Experience Definition in this table; instead, update the existing definition. If you need to review what the definition looked like historically, you can view it via the Collaboration Log.

Define the reference data

When setting up a Data Experience Reference Data definition, you will need one (1) definition for each Reference Data table you wish to package and deploy with your Data Experience to the Target Environment.

This table setup is similar to setting up a CLI.

  1. Locate and open the Data Experience Reference Data table (Image 13).

  • Name: The name of your reference data table. This name can be anything and doesn't have to replicate the actual table name.

  • Ordinal: The ordinal number identifies the order in which the data is loaded, based on dependencies within the Data Experience. For example, if your tables contain hierarchies, you need to load the parent records first and then the child records, which then resolves any links in the table.

  • Filter: A WHERE clause that selects the records to load. For example, a table with a hierarchy requires two rows in the Data Experience Reference Data table: one whose filter is a WHERE clause selecting the parent records, and a second whose filter is a WHERE clause selecting the child records.

  • New Records: Identify the behaviour of a new record (INSERT, UPDATE, DELETE, IGNORE).

  • Change Records: Identify the behaviour of a changed record (INSERT, UPDATE, DELETE, IGNORE).

  • Dropped Records: Identify the behaviour of a dropped record (INSERT, UPDATE, DELETE, IGNORE).

  • Table: Identify the table that you are exporting data from.

  • Sync Key: Required (need definition).

  • Expiration Timestamp Field: If Dropped Records is set to “Expire”, then a timestamp column is required.

Based on the configuration set up in this table, Cinchy will export the data and create CSV and CLI files.

This example doesn't have Reference Data as part of the Data Experience.

Export the data experience

Using PowerShell you will now export the Data Experience you have defined within Cinchy.

  1. Launch PowerShell and navigate to your CinchyDXD folder (Image 14).

Reminder: you can launch PowerShell right from your file explorer window in the CinchyDXD folder by typing “PowerShell” into the folder path and hitting Enter on your keyboard, saving you the extra step of navigating to the CinchyDXD folder manually in PowerShell (Image 15).

  2. In the PowerShell window, type cin and hit Tab on your keyboard (Image 16).

  3. Hit Enter on your keyboard; you will see a list of commands that are available to execute (Image 17).

  4. In the PowerShell command line, hit your “up” arrow key to bring back the last command and type export next to it (Image 18).

  5. Hit Enter on your keyboard (Image 19).

The PowerShell window will provide you with the required and optional components to export the data experience.

  6. You must now set up any mandatory export parameters

The parameters can be passed to PowerShell on a single line; however, for legibility they're shown below on separate lines. If you put your parameters on separate lines, you must end each line with a backtick (`) for the command to execute.

Please treat the sample below as a template only. You will be required to provide values that correspond to:

  • the URL of the source environment

  • the User ID for the user who is performing the export

  • the Password for the user who is performing the export

  • your folder path for where CLI is stored

  • your folder path for where the CLI output files are written to

  • the GUID for the Data Experience that's generated in the Data Experience Definition table

  • your own version naming convention

  • your folder path for where your CinchyDXD output files are written to

Sample:

.\CinchyDXD.ps1 export `
-s "<cinchy source url>" `
-u "<source user id>" `
-p "<source password>" `
-c "C:\Cinchy CLI v4.0.2" `
-d "C:\CLI Output Logs" `
-g "8C4D08A1-C0ED-4FFC-A695-BBED068507E9" `
-v "1.0.0" `
-o "C:\CinchyDXD_Output"
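As noted above, the same command can also be issued on a single line, in which case the backticks are omitted. A minimal sketch with the same placeholder values:

.\CinchyDXD.ps1 export -s "<cinchy source url>" -u "<source user id>" -p "<source password>" -c "C:\Cinchy CLI v4.0.2" -d "C:\CLI Output Logs" -g "<data experience GUID>" -v "1.0.0" -o "C:\CinchyDXD_Output"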

  7. Enter the export parameters into the PowerShell window (Image 20).

  8. Hit Enter on your keyboard to run the export command.

PowerShell will begin to process the export. Once the export is complete, PowerShell will provide you with an export complete message (Image 21).

Validate export

  1. Ensure that the DXD Export Folder is populated (Image 22).

  2. Ensure that the Data Experience Release table is populated in the source environment (Image 23).

  3. Ensure that the Data Experience Release Artifacts table is populated in the source environment (Image 24).

Image 1: Creating your new folder
Image 2: Navigate to your directory
Image 3: Navigate to your directory
Image 4: Setting up
Image 5: Setting up, cont.
Image 6: Run as administrator
Image 7: Set-ExecutionPolicy RemoteSigned
Image 8: Hit Enter
Image 9: Finishing your set up
Image 10: Data Experience tables
Image 11: Data Experience Definitions table
Image 12: Enter in information
Image 13: Data Experience Reference Data table
Image 14: Launch PowerShell
Image 15: Launch PowerShell
Image 16: Type cin
Image 17: List of commands
Image 18
Image 19
Image 20: Enter your export parameters
Image 21: Wait for the export to complete
Image 22: Validate that your DXD Export Folder is populated
Image 23: Validate that the Data Experience Release Table is populated
Image 24: Validate that the Data Experience Release Artifacts table is populated