Data management

This page covers the various ways to work with data in Cinchy tables: entering, updating, removing, loading, and extracting it.


Data entry

Users are only able to enter data into Cinchy based on their access. Users can also copy and paste data from external sources.

Insert/Delete data rows

Users can only insert or delete rows based on their access. If you have permission to insert or delete a row of data, those options appear when you right-click a row (Image 1).

Import data

Importing data adds new rows of data to an existing table; it acts as a smart copy-and-paste of new data. If you want to perform a sync instead, refer to the CLI.

When you import a CSV, its first row is treated as a header row and matched against the column names in your table. Headers that can't be matched are ignored, as are any columns you don't have edit permissions for.

Users can import data from a CSV file into an existing table in Cinchy. Importing data into a Cinchy table only adds records; it doesn't update existing ones.
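
As a quick illustration of the header-matching behaviour described above, the sketch below writes a CSV whose first row mirrors the target table's column names. The column names and file name here are hypothetical placeholders; substitute the columns of your own table.

```python
import csv

# Hypothetical column names -- replace these with the column names of your target Cinchy table.
# Cinchy matches this header row against the table's columns on import; headers that don't
# match a column (or columns you can't edit) are ignored.
columns = ["Name", "Email", "Department"]

rows = [
    ["Ada Lovelace", "ada@example.com", "Engineering"],
    ["Grace Hopper", "grace@example.com", "Research"],
]

with open("new_records.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(columns)  # header row matched to table column names
    writer.writerows(rows)
```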

To import data into a table, complete the following:

  1. From within the table, click the Import button on the top toolbar of the table (Image 2).

  2. Click Choose File to locate and import your file.

  3. Validate the imported columns and click Next (Image 3).

  4. Click the Import button.

  5. Click the OK button on the Import confirmation window.

Import errors

If there are import errors, click the download button next to Rejected Rows on the Import Succeeded with Errors window (Image 4).

You will get back a file containing all the rejected rows, plus two added columns: Cinchy Import Errors and Cinchy Import Original Row Number.

Cinchy original row number

This provides a reference to the row number in the original file you imported in case you need to check it.

Fix the errors in the error file and import that file again; since successful rows are omitted from it, only the corrected rows are added.
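
As a rough illustration of working with this file, the sketch below reads a rejected-rows export, prints each error message alongside the original row number, and writes the rows back out without the two error columns so the corrected file can be re-imported. The file names are hypothetical; the two error column names are the ones Cinchy adds, as described above.

```python
import csv

# The two columns Cinchy appends to the rejected-rows file.
ERROR_COLUMNS = ["Cinchy Import Errors", "Cinchy Import Original Row Number"]

# Hypothetical file name for the downloaded rejected-rows file.
with open("rejected_rows.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

# Review each rejected row and its error message before correcting the data.
for row in rows:
    print(f"Row {row['Cinchy Import Original Row Number']}: {row['Cinchy Import Errors']}")

# After fixing the values, drop the error columns and save a clean file for re-import.
if rows:
    fieldnames = [name for name in rows[0] if name not in ERROR_COLUMNS]
    with open("corrected_rows.csv", "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows({k: v for k, v in row.items() if k not in ERROR_COLUMNS} for row in rows)
```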

Export data

You can export your data from a table view in CSV or TSV format. The export starts at the first record. Cinchy doesn't currently support pagination, so the maximum export is 250,000 records. To export a table with more than 250,000 records, use the CLI to export the entire table at once.

Once data is exported out of the network, it's just a copy and is no longer connected to Cinchy.

To export data from a table, complete the following:

  1. From within the table, click the Export button in the table toolbar.

  2. Select the export file type (CSV or TSV) (Image 5).

  3. Open your file in Excel, or any other CSV-compatible program, to view it.
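
If you're unsure whether a table exceeded the export cap, a quick check like the sketch below can help: it counts the records in the exported file and flags when the count reaches the 250,000-record limit, which is the signal to switch to the CLI. The file name is a placeholder.

```python
import csv

EXPORT_LIMIT = 250_000  # maximum records per export from the table view

# Hypothetical file name; pass delimiter="\t" to csv.reader if you exported TSV instead.
with open("my_table_export.csv", newline="", encoding="utf-8") as f:
    record_count = sum(1 for _ in csv.reader(f)) - 1  # subtract the header row

print(f"Exported {record_count} records")
if record_count >= EXPORT_LIMIT:
    print("Export reached the 250,000-record cap; use the CLI to export the full table.")
```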

Approve/Reject data

Cinchy supports change approvals for when data is added to or removed from a table view: an approval process can be put in place for the addition or removal of specific data. If you have been identified as an approver of data, you have the ability to:

  • Approve a cell of data

  • Approve a row of data

  • Reject a row of data

To approve or reject a cell/row of data, complete the following:

  1. Right-click on the desired row/cell

  2. Select Approve row/cell or Reject row/cell

Collaboration log

The Collaboration log is accessible from every table within Cinchy (including metadata). It shows the version history of ALL changes that have been made to an individual row of data.

To access the Cinchy Collaboration Log:

  1. Open the desired table

  2. Locate the desired row > Right Click > View Collaboration Log (Image 6).

Once the Collaboration Log is open, you can view the full version history of the row selected in the table.

Users have the ability to revert to a prior version of the record. To do so, click the Revert button for the desired version (Image 7).

If a version's Revert button is unavailable, that version is identical to the current version of the record in the table. Hovering over the Revert button displays a tooltip explaining this.

Data erasure and compression policies

By default, Cinchy doesn't delete any data or metadata from within the Data Fabric.

For more information on data erasure and compression policies in Cinchy, see the Data erasure and Data compression pages under Data controls.

Audit for data synchronization

Data loaded into Cinchy via data synchronization (batch or real-time using the Cinchy CLI), or changed through any saved queries exposed as APIs to external clients, is audit-logged the same way as data entered directly by a user. All data synced into Cinchy has corresponding line items in the Collaboration Log, just as when a user enters or modifies data in Cinchy.
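
For context, here is a minimal sketch of the kind of external client call this refers to: invoking a saved query that has been exposed as an API. The host, domain, query name, and token below are placeholders (see API authentication and API saved queries for the exact URL pattern and how to obtain a token); if the query inserts or updates data, those changes are recorded in the Collaboration Log like any other edit.

```python
import requests

# Placeholders -- substitute your environment URL, domain, saved query name, and a bearer
# token obtained as described on the API authentication page.
CINCHY_URL = "https://cinchy.example.com"
DOMAIN = "HR"
SAVED_QUERY = "AddEmployee"
ACCESS_TOKEN = "<bearer token>"

# Call the saved query exposed as an API endpoint (URL pattern assumed per the API Guide).
response = requests.get(
    f"{CINCHY_URL}/API/{DOMAIN}/{SAVED_QUERY}",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    timeout=30,
)
response.raise_for_status()
print(response.json())  # assumes the endpoint returns JSON results
```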

Collaboration log performance considerations

Collaboration Log data is itself stored in Cinchy as data, so the logs are available through queries and to any downstream consumers. The logs require no separate performance considerations; they rely on the Cinchy platform's own performance measures.
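
Because log entries are stored as regular Cinchy data, a downstream consumer can pull them the same way it would any other data. The sketch below assumes a Builder has created a saved query (hypothetically named RowChangeHistory in an Operations domain) that surfaces the log fields of interest and has exposed it as an API; it fetches the results and hands them to a downstream consumer, here a local JSON extract. All names and the URL pattern are assumptions to adapt to your environment.

```python
import json
import requests

# Placeholders -- a hypothetical saved query surfacing Collaboration Log fields,
# plus your environment URL and a bearer token (see the API Guide for both).
CINCHY_URL = "https://cinchy.example.com"
DOMAIN = "Operations"
SAVED_QUERY = "RowChangeHistory"
ACCESS_TOKEN = "<bearer token>"

response = requests.get(
    f"{CINCHY_URL}/API/{DOMAIN}/{SAVED_QUERY}",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    timeout=30,
)
response.raise_for_status()

# Hand the log data to a downstream consumer -- here, a local JSON extract.
with open("collaboration_log_extract.json", "w", encoding="utf-8") as f:
    json.dump(response.json(), f, indent=2)
```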

Recycle bin

All data records that have been deleted are put into Cinchy’s Recycle Bin. Data that resides in the Recycle Bin can be restored if required.

To restore data from the recycle bin:

  1. From the left-hand navigation, click Recycle Bin (Image 8).

  2. Locate the row to restore.

  3. Right-click and select Restore Row.

The restored row will now be visible in your table.

If Change Approvals are turned on, that row will need to be approved.

Image 1: Inserting/Deleting Rows
Image 2: Step 1, Clicking the import button
Image 3: Step 3, validate your columns
Image 4: Import Errors
Image 5: Exporting Data
Image 6: Step 2, Open the Collaboration Log
Image 7: Reverting Data in the Collaboration Log
Image 8: The Recycle Bin