Package the data experience

Overview

This section covers everything you need to export a release package from a data experience.

Prerequisites

Before you start, make sure you have access to the following tables:

  • Data Experience Definitions table
  • Data Experience Reference Data table

Define the data experience

You must define your data experience in the Data Experience Definitions table. Each row in the table defines one data experience that you want to package and export.

A definition contains the entities that make up the experience. Some examples of entities are tables, domains, reference data, and user-defined functions (UDFs).

For a complete list of all fields, see the Data Experience Definitions table reference page.

Best practices

Cinchy recommends packaging entities based on their dependencies on one another. For example, entities such as Applets, System Colours, Literal Groups, and Models can be packaged separately. This makes versioning these entities easier when exporting to other environments.

Updating data experiences

If you make changes to the data experience (DX) in the future, update the relevant Data Experience definition. You don't need to create a new definition. If you need to review what the definition looked like historically, you can view it via the Collaboration log.

Define the reference data

For each table of reference data, you must define an entry in the Data Experience Reference Data system table. This entry is deployed alongside your Data Experience Definition.

Treat this reference data similarly to a Data Sync Configuration for batch synchronization. It should move data from a CSV file to a Cinchy Table with matching attributes. The sync key column should contain unique values and shouldn't be a system or calculated column.

Export the data experience

After you define the data experience and the reference data that goes with it, you can use PowerShell to export your experience.

You must have PowerShell 7 or later to use CinchyDXD.

To export your data experience:

  1. Launch PowerShell and navigate to your CinchyDXD folder (such as C:\DxDvX.X.X).

  2. Run .\CinchyDXD.ps1 export with the required and optional parameters.

Export arguments

Use the arguments below to create your data export with CinchyDXD.

-s "<cinchy source url>" 
-u "<source user name>" 
-p "<source password>" 
-d "<folder path for where the CLI output files are written to>" 
-g "<the GUID for the DX that is generated in the Data Experience Definition table." 
-v "<version of the experience package, update every time if there are changes in the DX>" 
-o "<folder path for where your CinchyDXD output files to install are written to>"

Example

The following example shows the use of arguments for a data package export in Cinchy.

.\CinchyDXD.ps1 export -s "sandbox.cinchy.net/source-url-environment" -u "JohnDoe" -p "123456" -d "C:\Logs\CLI-Output-Logs" -g "d98a1f18-f4bf-8695-ef04ghfa47" -v "1.0.0" -o "C:\CinchyDXDOutput"
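
If you run exports often, you can keep the argument values in variables and prompt for the password instead of typing it inline. The following is a minimal sketch only; the URL, user name, folder paths, GUID, and version shown are placeholders to replace with your own values.

# Prompt for the source password rather than hard-coding it in the command (sketch).
$password = Read-Host "Source password" -AsSecureString
$plainPassword = [System.Net.NetworkCredential]::new('', $password).Password

# Run the export with the documented arguments; replace the placeholder values below.
.\CinchyDXD.ps1 export `
    -s "sandbox.cinchy.net/source-url-environment" `
    -u "JohnDoe" `
    -p $plainPassword `
    -d "C:\Logs\CLI-Output-Logs" `
    -g "<DX GUID from the Data Experience Definitions table>" `
    -v "1.0.0" `
    -o "C:\CinchyDXDOutput"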

Validate export process

To validate your export, navigate to the target output path and make sure it's populated with the necessary files.
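
For example, from PowerShell you can list the contents of the output folder you passed with -o to confirm it contains files (a minimal sketch; adjust the path to your own output folder):

# List everything CinchyDXD wrote to the output folder supplied with -o.
Get-ChildItem -Path "C:\CinchyDXDOutput" -Recurse | Select-Object FullName, Length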

Export failures list

Error: Release already exists

The export will fail if a record with the same Name, GUID, and Version already exists in the Data Experience Releases system table in the lower environment.

Solutions:

  1. Update package version: Required if the Data Experience definition has been modified since the last installed version in the higher environment. Re-run the export with a new version number (see the example after this list).

  2. Delete the release: Required if the version isn't yet installed in the higher environment.
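
If you choose the first option, re-run the export with a new value for -v; all other arguments stay the same as your original export. A sketch using the documented arguments with placeholder values:

# Re-run the export with an incremented package version; other arguments are unchanged.
.\CinchyDXD.ps1 export -s "<cinchy source url>" -u "<source user name>" -p "<source password>" -d "<CLI output folder>" -g "<DX GUID>" -v "1.0.1" -o "<CinchyDXD output folder>"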

Error: References to deleted metadata

This error indicates that Data Experience Definitions are pointing to a deleted entity (such as a saved query in the Recycle Bin).

Solution:

Manually update the definition to remove the reference to the deleted data.

Next steps

For a complete list of all columns, see the Data Experience Reference table reference page.

For a list of the available parameters, see the Export section of the CinchyDXD commands reference page.

Once you validate your export, proceed to Install the data experience.