Cinchy v4.0
4.2 Release Notes

Summary Stats

The total number of records in a table is now displayed in the bottom right-hand corner. We also display the sum and average of any selected numeric fields, as well as a count (at the cell and record level) of the selection.

Miscellaneous

Fixed a few small bugs relating to exporting data from the Manage Data screen, selecting columns in the View Editor, the logic for when the recalculate existing values option is displayed, and resizing the query builder sections.

4.8 Release Notes

Miscellaneous

Toggling Change Approval

You will be prevented from turning off change approval if you still have any records in a draft state. Note that you may have to check the Recycle Bin for draft records if there are none currently in the All Data view.

Formatting Rule Bug

Fixed a bug where some viewable row filters prevented formatting from being applied.

Editable Row Filter Bug

Fixed a bug where data changes that caused a cell to become editable were not reflected in the UI.

UDFs Calling External APIs

Fixed a bug with making certain HTTP requests from a UDF.

French Translations

Tweaked some translations for consistency.

Real Time Sync

4.8.1 includes changes that allow you to set up a real-time data sync. Currently this capability is only available for listening to push topics from Salesforce, but a Cinchy source will be added soon.

For more information, click here.

ExecuteSavedQuery

There is a new Cinchy built-in function, executeSavedQuery, which allows you to get back a scalar value or a list of values. This is useful for replacing subqueries, both for cleanliness and performance.

For more details, click here.
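
As an illustration only, a call might look like the sketch below. The domain, query name, table, and column names are hypothetical, and it is assumed here that the function takes the saved query's domain and name as arguments.

-- Filter a table using the values returned by a saved query,
-- replacing what would otherwise be an inline subquery.
SELECT [Full Name], [Department]
FROM [HR].[Employees]
WHERE [Employee Id] IN (executeSavedQuery('HR', 'Team Members'))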


Dataware Overview

This page provides a brief overview of Dataware.

You are currently browsing the Cinchy v4.0 platform documentation. For documentation pertaining to other versions of the platform, please navigate to the relevant space(s) via the drop down menu in the upper left of the page.

What’s the purpose of Dataware?

Dataware solves the costly, time-consuming, and ineffective integration processes born from silo-ing your data in a traditional app-centric environment. Instead of your data serving your applications, dataware refocuses and pivots to a model where your data is at the forefront and is being served by your apps.

Harnessing the power of Data Collaboration, Cinchy connects unlimited data sources within a networked architecture, offering persistent delivery of real-time solutions, without complex integrations. And the more data you connect into your data network, the more powerful your processes can be.

The root causes of IT delay and frustration

When a new IT project is green-lit, you often pay a hefty toll known as the integration tax: continuously building new integrations between applications just to reuse data that is already available in your systems.

Over time, this never-ending cycle of copying data between fragmented apps gets more complex, resulting in delayed launches, budget overruns, and “shadow IT” projects.

This process of making copies now consumes up to 50% of the resources on large IT projects, and it's the reason that delivery often takes months, sometimes years, and costs millions.

How can this be fixed?

Data Collaboration: Using Dataware for accelerated solutions delivery

With Dataware, you shift your approach from integration for data sharing, to access for data collaboration.


For every app you build where you leverage dataware, you’re able to access the network to reuse information for future apps. Your IT team will find that previous projects have already connected many of the data sources they need to the network.

Cinchy's Dataware platform does for application development what the power grid does for individual buildings. In the same way that buildings no longer need to generate their own power thanks to the power grid, with a dataware platform applications no longer need to manage, integrate, and protect their own data.

Organizations can build applications on dataware in half the time, while enabling effortless and copyless data sharing across applications. Cinchy is unique in that it eliminates the "integration tax" that today consumes half of most enterprise IT budgets; that is to say, data collaboration makes integration obsolete.

Now, instead of connecting apps to gain access to data through costly point-to-point integration, your apps serve your data by leveraging and connecting it all together via Cinchy. In this way, you can still gain the best usage out of your apps through zero-copy integration while avoiding the disadvantages of data silo-ing. You have both full access to and full control of your data.

Not just connected, but autonomous.

Simply putting pipes between data silos and centralizing a few housekeeping tasks is not data collaboration. What that actually does is lead you down a path of managing endless copies. True data collaboration not only connects your data but upgrades it as part of an interconnected, autonomous network.

Autonomous data exists independently of any application. It is self-controlled, self-protected, and self-describing. This creates a number of benefits compared to traditional app-dependent data, including the ability to simplify cross-application usage and reporting. And when you use autonomous data in an interconnected network, wherein individual contributors maintain their roles and priorities as they apply their unique skills and leadership autonomy in a problem solving process, you get: Collaborative Autonomy.

Collaborative Autonomy is thus the principle underpinning Collaborative Intelligence, the entire basis of Dataware and Cinchy.

Individuals are not homogenized, as in consensus-driven processes, nor equalized through quantitative data processing, as in collective intelligence. Consensus is not required. Problem resolution is achieved through systematic convergence toward coherent results. Collaborative intelligence relies on the principle of collaborative autonomy to overcome “the consensus barrier” and succeed where other methods have failed.

Universal access controls, automated governance

One of the most significant advantages of dataware is the ease with which data owners can set universal data access controls at the cellular level, and automate data quality (Data Governance) with a “golden record” of data.

In effect, it removes the need to maintain access controls within individual apps, centralizing these functions in an incredibly efficient way.

Compare this with designing and maintaining controls within thousands of apps and systems independently. It’s not only incredibly challenging and costly, but virtually impossible to maintain consistency.

Game Changer: Network Effects for IT delivery

Dataware is a game changer for IT delivery: it produces network effects, where each new solution actually speeds up delivery times and reduces costs.

Network-based designs scale beautifully and become more efficient as they expand. Consider the human brain: its neuroplasticity helps it learn more as it grows, and the more interconnected it gets, the better. Its neural pathways reorganize themselves so that fewer connections are needed to do the same work, making information more efficient to operationalize.

It's the same with dataware. The more you connect your data, the harder and better it works. The platform takes care of your whole data journey and transformation, so you don't have to manage changes across all your applications, regression testing, QA, and so on.

It's also your time machine: you can have applications based on your data at different points in time, and it's all done through network-based design.

Now that you know how efficient and secure things can be, there is no going back.

Let's build the connected future, together.

4.4 Release Notes

Create from Existing View

You can now create a view from an existing view rather than starting from scratch each time. This will pre-fill the view with the existing display columns, sorts, and filters.

Collaboration Log

We made some improvements to the collaboration log, including:

  • Freezing the first few columns as well as the header row

  • Allowing you to more easily copy and paste values

  • Keyboard navigation through the grid (tab and arrow keys)

Keyboard Navigation for Design Query and Query Builder

At Cinchy, we are constantly looking to improve our accessibility and usability. The Query Builder screen, including Design Query, is now navigable using only the keyboard.

Please reach out if you have any suggestions to improve accessibility or usability.

Miscellaneous

  • If you execute a saved query through the CLI or another query, it will now return in the result type of the saved query (i.e. Approved/Draft/Version History) rather than the context of what is making that request.

  • Display seconds in the system Modified/Created/Deleted columns.

  • When creating a filter in Manage Data, it defaults to contains rather than equals

4.1 Release Notes

SSO Changes

Support for SAML2 Grant Type

You can now obtain a Cinchy bearer token using your SAML2 token. Simply pass in your base64-encoded SAML token instead of your username and password. See Authentication for details.

Update Groups on Log In

User group memberships will now be updated any time a user logs in. Note that group memberships can still be synced via the CLI (see AD Group Integration for details) at a regular interval or after large group membership changes.

Restoring Deleted Users when Auto User Creation is On

Now, when deleted users attempt to log in to Cinchy while SSO Auto User Creation is turned on, their user will be restored from the Recycle Bin. If you want to revoke a user's access to Cinchy but keep their SSO account active, you can disable the user by checking the Is Disabled box in the Cinchy.Users table.

Additional App Settings

The Configuring ADFS documentation page has been revamped. Three additional attributes have been added to the app settings so that you can set your maximum SAML request size; this limit was originally preventing users with a large number of groups from being synced into Cinchy.

Syncing Users with Multiple Groups

In 4.0 and earlier, we support syncing multiple groups where the attributes are separate:
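
<saml:AttributeValue>GROUP-1</saml:AttributeValue>
<saml:AttributeValue>GROUP-2</saml:AttributeValue>
<saml:AttributeValue>GROUP-3</saml:AttributeValue>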

Now we also support comma separated lists in the response.
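
<saml:AttributeValue>GROUP-1,GROUP-2,GROUP-3</saml:AttributeValue>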

Tableau

Some tweaks have been made to our Tableau Web Connector to be compatible with more versions of Tableau. A few bugs have also been fixed.

Please see the Tableau documentation on how to get started.

4.5 Release Notes

GETDATE() and User Timezone

User Preferences

The Language User Link Table has been renamed to User Preferences, and a time zone column has been added. Your timezone can be set in your profile.

System Properties

A new system property has been added called Default Time Zone. On an upgrade, this will be automatically added and set to Eastern Standard Time for backwards compatibility. If this value is removed or invalid, the default will be UTC.

GETDATE()

getdate() previously returned the time in your application server's time zone; it will now return the time in the user's time zone (and if that is not set, the system default time zone).

Cinchy System [Created], [Modified] and [Deleted] columns

These columns were previously returned in Eastern Standard Time. They will now display in the user's time zone.

User-created Date fields are not converted, so if you are comparing a user-created date column with the current time or a system column, it is recommended that you use GETUTCDATE() instead for consistency.
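
For example, a comparison along these lines keeps both sides in UTC; the table and column names below are hypothetical.

-- [End Date] is a user-created Date column and is not converted,
-- so comparing it against GETUTCDATE() keeps both sides consistent.
SELECT [Contract Name], [End Date]
FROM [Legal].[Contracts]
WHERE [End Date] < GETUTCDATE()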

Miscellaneous

  • Links and Hierarchies have been merged into Columns. They are now simply a drop down option.

  • Fixed a bug with nested display columns in a link dropdown, as well as an issue where those columns could not be searched.

4.7 Release Notes

View Changes

Continuing from last release, here are additional changes to Manage Data.

Manual Row Resize

In addition to the row height presets, you can now also manually resize a row (or multiple rows if you select several). You can also double-click to auto-expand the row height.

Filters

Filters are now applied to the table rather than the view. This allows you to flip between different views (ex. All Open vs My Open) while keeping your filters.

Recently Viewed

Now when you click on a recently viewed table, it will take you to the last state it was in, rather than resetting to the default view.

Can Delete Default View

Default views can now be deleted, since there is always a system All Data view. This will also allow you to clean up tables where a specific view was never specified and a custom All Data view was created.

Miscellaneous

Version Number in Collaboration Log

When comparing two values in the Collaboration Log, you can now see which version you are looking at, so you can revert to the correct version.


Release Notes

Cinchy 4.0 comes with a Custom Data Network Visualizer, Multi-Lingual Support, the ability to use User Defined Functions in Calculated Columns, and Docker support.

New Features

Custom Data Network Visualizer

You can visualize your own custom data networks using Cinchy. See Custom Data Network Visualizer.

Network of Networks

Multi-Lingual Support

Cinchy now has multi-lingual support. You can go to your user profile and change your language and region preferences. French translations are pre-populated, but you can also add other language translations.

The same system can also be used by Applets through our language API.

See Multi-Lingual Support for details.

User Defined Functions in Calculated Columns

You can call a scalar-valued user defined function in a calculated column. This allows you to do a variety of operations:

  • Create more complex functions or aggregations from other tables

  • Trigger an API call to retrieve external data

  • Create a notification or create a record in another table

Docker Support

Cinchy is now available via Docker. For deployment instructions please see Docker Deployment.

Improvements

New Options for Calculated Columns

Cache

Currently, calculated columns are cached, meaning their values are stored for fast retrieval when querying. In some cases, you may want to run the calculations in real time (for example, if the calculated column calls a User Defined Function that calls an external API for immediate real-time information). In these cases, you can uncheck the Cache option.

A calculated column that is not cached is not persisted; it is calculated on the fly. It works essentially like a saved formula that is executed when you read the record.

Calculate Existing Values

Another option now available on Calculated Columns is calculate existing values. This option is available when you change a formula for a persisted calculated column on an existing table. Previously we would always recalculate all the values in the column.

If you uncheck the box, the column will not be calculated on save and will be blank. Any updates to that record or newly inserted records will have the column calculated. You can use this to set up triggers (since the calculation will run whenever a change occurs on the record) or schedule the calculations separately in batches in case you have a lot of data and may hit external API rate limits during the calculation.

Query Builder

Result Type

We included the result type drop down in the query builder screen so it is easier to switch result types when running your query. This is also useful when running different sections of your query for debugging purposes.

Query Timeout

The timeout for a query is now configurable; the default is 30 seconds. If you need to run a query that is known to take longer, you can modify the timeout, and this can be saved so the query will always allow for that longer timeout. This is generally only recommended after you have already ensured your query is correct on a smaller dataset and you know the query will take longer than 30 seconds to execute.

Nested Display Columns

Additional display columns can now come from links further down, rather than only from the linked table.

Keyboard Navigation for Manage Data & My Network

At Cinchy, we are constantly looking to improve our accessibility and usability. The My Network screen as well as the Manage Data screen can now both be navigated entirely via the keyboard.


4.6 Release Notes

View Changes

All Data View

There is now always an All Data view.

  • Similar to Recycle Bin, this is a system view that is not editable

  • This view has no filters or sorts, and contains all the columns in the table (access controls still apply)

  • Includes the Cinchy Id column
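
Reorder your columns in Design Table to change the order in All Data.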

Links

Links in Cinchy now take you to the All Data view rather than the default view. This allows you to set your default view to a more filtered down view.

Permissions

Views now have permissions, so you can limit who can see a view. They default to All Users, since that is the previous behavior.

Miscellaneous

HTML <a> Tag

Previously, we auto-detected hyperlinks and made them clickable. We now also support limited functionality with <a> tags within Cinchy text fields.

Supported Attributes:

  • href

  • download

  • target

The default target is _blank for absolute URLs.

_self, _parent, and _top are only available for relative URLs (where the destination is within Cinchy).
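
For example:

<a href="https://www.google.com" target="_blank"> Click Me! </a>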

Expand Row Height

You can now expand the height of rows.

  • Collapsed is the default (1 row only)

  • Expanded will open to a max height of 5

  • Full View will open to a max height of 9

This is an experimental feature.

Look out for more UI improvements to the feature in upcoming sprints.

4.10 Release Notes

Pivot Table Reporting URLs

You can now save pivoted reports you create from a Saved Query. This can then be added as an applet to be bookmarked and shared with other users. For more details, see Displaying Query Results.

Custom Masking on Date Fields

You can now choose your own date/time masking on Date fields (we also added a few more defaults to pick from).

SameSite Changes with Google Chrome

Updated CinchySSO to work with changes that Google made to SameSite settings.

4.11 Release Notes

Bug Fixes

Link with "\" in Collaboration Log

Fixed a bug where the collaboration log could not be rendered if a link value contained a "\".

Sorting by System Columns

Fixed a bug where a user could not sort by certain system columns if they did not have a View/Edit/Approve All Columns entitlement.

ResolveLink

Fixed a bug where the resolveLink function was not using the same connection and transaction.

Tableau Web Browser

Removed an overly strict restriction on the browser name reported by Tableau that was causing problems when using the Tableau Connector with Cinchy.

Accessibility

Cinchy completed its first iteration of WCAG 2.0 Accessibility. We are always looking to improve the usability of the product, so we would love to hear from you at [email protected].

If you would like a copy of our report, please speak with your Account Manager.

Other Integrations

In this section you will find other methods of connecting to Cinchy, for example via ODBC and JDBC.

Advanced Settings

Here are some additional features that require system setup to use.

4.17 Release Notes

New Features

  • We now support screen readers on Cinchy tables

Changes

  • Enhancements have been made to the model loader:

    • Resolved issue where reloading the same model throws an error

    • Added logging to model load operation to indicate which tables are updated/created

    • Removed re-saving of tables that have no changes

    • Resolved an inconsistency issue when loading large models

  • Saved Queries support name-based routing: <baseurl>/query/execute/domain/savedqueryname

  • DXD Reference Data table now has an additional column called Target Filters

Bug Fixes

  • Collaboration log now supports the quotation mark character in linked field records

  • Fixed an issue where filters applied to columns with the Yes/No data type were reset in the UI

  • Fixed retry logic in Listener when retrieving the Salesforce access token fails

  • Better error handling for unique constraint violations

Integration

BI Integrations


Building Forms

For more information on building forms within Cinchy, please refer to the forms documentation.

4.12 Release Notes

Added Expiry to ExecuteSavedQuery()

ExecuteSavedQuery now takes a third parameter specifying the cache expiry time for that particular instance of the query. This can be used to improve performance on slow-changing data (ex. retrieving an employee's team). See Cinchy Functions for more details.
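
A rough sketch of the call is shown below; the domain, query name, table, and column names are hypothetical, and the expiry value is assumed to be expressed in seconds.

-- Cache the saved query's result for the assumed expiry period so repeated
-- reads of slow-changing data avoid re-executing the saved query.
SELECT [Full Name]
FROM [HR].[Employees]
WHERE [Team] IN (ExecuteSavedQuery('HR', 'Team Members', 300))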

Indexes can now include Included Columns

This is equivalent to:

CREATE INDEX included_columns
ON product.changes(status,developer)
INCLUDE(summary_for_release_notes);

Miscellaneous

  • Platform performance improvements

  • Model changes to support Salesforce Platform Events in Real Time Sync (see CLI documentation).

4.14 Release Notes

Data Erasure

We've introduced the ability for data owners to delete data for compliance and regulatory requirements. For more details, see Erasure.

Miscellaneous

  • You can also set a data compression policy.

  • Changed the default row height in Manage Data to expanded rather than collapsed

  • Resolved a few accessibility bugs

  • Resolved a few bugs with formatting rules

  • Fixed a bug where double clicking a row to adjust the height did not take into account link columns

  • Fixed a bug where the UI export only exported the first page even when you were on a different page

  • Fixed a bug with subqueries where the subquery was only returning 100 records when called through the UI

Data Controls (formerly Design Controls)

Data Controls allow you to set up permissions for who can view, edit, or approve data within a table.

Data Controls can be selected in the left navigation menu.

Currently anyone in the Cinchy Administrators group has access to perform any action on any objects.

You can use multiple rows to provide entitlements to a user.

In the above scenario, John Smith is part of the Developers group. He is able to view all columns via the entitlement to the Developers group, and he is able to edit both the First Name and Last Name column through different entitlements.

See Table Level Entitlements and Column Level Entitlements for a detailed description of the available entitlement options.

Power BI

Below are the steps you can follow to establish a connection to Cinchy from PowerBI.

  1. Queries in Cinchy are what PowerBI connects to. If you don't have one that represents your dataset, you'll need to create that first. In this example, we will use a query called API Test:

  2. When you open the query you'll notice on the right-hand side a green button that says REST API

  3. Click on the REST API button and you'll see the below popup. In the textbox is the URL for the API endpoint. You can click the clipboard icon to copy the URL.

  4. In this example the URL is: http://your.cinchy.instance.domain/API/YourQueryDomain/API Test You'll notice that the structure here is <your Cinchy instance URL>/API/<your query domain>/<the name of your query> and optionally at the end you may have querystring parameters. For access via PowerBI we're going to use Basic authentication and a result format of CSV, which uses a slightly different URL endpoint. Instead of /API/ in the URL, it's going to be /BasicAuthAPI/ and we're going to add a querystring parameter - ResultFormat=CSV. For this example your URL for accessing this dataset through PowerBI is going to be: http://your.cinchy.instance.domain/BasicAuthAPI/YourQueryDomain/API Test?ResultFormat=CSV

  5. Launch PowerBI. To access the dataset, click on Get Data and then select Blank Query from the menu:

  6. In the window that launches there is a textbox where you can enter an expression. Here you will enter the below text (note your modified URL from Step 4 between quotes): =Csv.Document(Web.Contents("http://your.cinchy.instance.domain/BasicAuthAPI/YourQueryDomain/API Test?ResultFormat=CSV"))

  7. Once you've entered that text either click the checkmark to the left of the input box or click away and it will automatically attempt to run the expression. What you should then see is a prompt to edit credentials.

  8. Click the Edit Credentials button and you should see the following popup:

  9. Select Basic on the left, enter the credentials for a Cinchy User Account that has access to run this query and select the level at which to apply these settings (only if you want to, by default it's the root URL). This process of entering your credentials won't occur with each query, it's just the first time and then they're saved locally.

  10. Once you click Connect you should see the data

  11. You can now apply any transformations to the dataset that you wish. For instance, in this example we need to click the button at the top that says Use First Row as Headers, but you may have additional changes. In this example we also changed the name from Query1 to Product Roadmap

  12. Once you're done, click Close & Apply. Now the metadata shows up on the right hand side and you can begin to use it to create your visualizations

APIs

You can query or manipulate any data on the Cinchy Platform through CQL (see Cinchy Query Language for more details on syntax). You can save a query as a Saved Query, which allows it to be accessed via an API endpoint.

You will first need to obtain a Cinchy bearer token (see Authentication), and then you can either access a pre-defined Saved Query via the Saved Query endpoints, or perform freeform querying via the ExecuteCQL endpoint.

Note that regardless of how you query or manipulate data on the platform, it is associated back to an account.

Table Level Entitlements

These are entitlements that apply to the entire table.

Marketplace

Marketplace allows a user to see the table within the marketplace. They can search for the table on the My Network screen.

This entitlement also allows a user to see and interact with the Manage Data screen for that table. There are specific scenarios where you may want a user to be able to access the data but not be able to do so via the Manage Data screen.

Bulk Export

This permission allows a user to export data from the table via the Manage Data screen.

Direct Query

This permission allows a user to query the data from the table directly in the Query Builder.

Design Table

This permission allows a user to alter the structure of the table.

This is a builder/administrative function and should not generally be granted to end users.

Design Controls

This permission allows a user to change the permissions on the table.

This is a builder/administrative function and should not generally be granted to end users.

Pre-Requisites

  1. An instance of SQL Server 2012+

  2. A Windows Server 2012+ machine with IIS 7.5+ installed

    Install these IIS features in addition to the defaults:

    • Application Development Features:

      • ASP.NET 3.5

      • ASP.NET 4.7

    • Common HTTP Features

      • HTTP Redirection

    • Health & Diagnostics

      • Request Monitor

    • Performance Features

      • Dynamic Content Compression

    • Web Management Tools

      • IIS Management Scripts and Tools

      • IIS Management Service

  3. Install .Net Framework 4.7.2 on the server

  4. Install the .NET Core Hosting Bundle version 3.1 from https://www.microsoft.com/net/download/dotnet-core/3.1

    • Specifically, install the ASP.NET Core/.NET Core Runtime & Hosting Bundle

Cinchy Platform 4.18.0+ uses .NET Core 3.1; previous versions use .NET Core 2.1.

See the Deployment Planning Guide for additional details on the architecture and system requirements.

Maintenance

Requires CLI v4.7+

Cinchy performs maintenance tasks through the CLI. This currently includes the data erasure and data compression deletions.
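
The maintenance command takes the following form:

Cinchy.CLI.exe maintenance -s "cinchyBaseURL" -u username -p "encryptedPassword" -t 180

  • -s: Server, i.e. the Cinchy Base URL (ex. cinchy.com/Cinchy/)

  • -u: Username; this needs to be an account that is part of the Cinchy Administrators group

  • -p: Encrypted password (you can encrypt your password by using Cinchy.CLI.exe encrypt -t "plaintextpassword")

  • -t: The maintenance time window in minutes. Maintenance tasks will stop executing after the allotted time frame, which allows you to run this during an allotted maintenance window.

  • -h: This flag must be added if you are accessing Cinchy over https.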

To schedule maintenance from 2am to 5am every day, use a scheduling program to run the command above at 2am every day with the -t parameter set to 180 (3 hours = 180 minutes).

4.20 Release Notes

New Features

  • Connections: It is now possible to create, edit, and run data syncs via the Connections experience in the Cinchy marketplace. For more information, see the Connections documentation.

  • Data Change Notifications: Users can now enable Data Change Notifications on any Cinchy table. When enabled, data changes are logged in the Change Log table, which can be queried via the Cinchy Data Change (CDC) API.

Enhancements

  • CQL support for Multi.Multi notation: This supports more specific queries in scenarios with multiple linked data elements.

  • Increased left nav width: Allows longer view names and menu options to be more legible.

  • Firefox support: The Cinchy data browser now fully supports Mozilla Firefox; previous issues related to the scroll bar and favourites icon have been resolved.

  • New Event Connector Type options in Listener Config table: Support for Cinchy CDC and AWS SQS connectors is now available.

Bug Fixes

  • Resolved an issue where users could not sort by a linked display column in the "All Data" view.

  • Improved a specific querying scenario related to formatting rules that was caused by the use of try_cast, try_convert, and the OVER clause in CQL


4.15 Release Notes

API Changes

A few changes have been made to improve the Saved Query API experience.

401 Response when your credentials are incorrect

The Saved Query API endpoint will return a 401 error code when you provide invalid credentials. This includes not providing credentials, expired credentials, and incorrect credentials.

RETURN is now supported in CQL

This allows you to use RAISERROR together with RETURN to surface a specific error for any validation issues you run into in your saved query.
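
A minimal sketch of the pattern is shown below, assuming RAISERROR follows the familiar T-SQL signature; the parameter, table, and message are placeholders.

-- Stop execution and surface a validation error before the main SELECT runs.
IF (@quantity < 0)
BEGIN
    RAISERROR('Quantity must be zero or greater.', 16, 1)
    RETURN
END

SELECT [Product], [Quantity]
FROM [Sales].[Orders]
WHERE [Quantity] = @quantity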

ELMAH Logging

Any 400 errors that are from a Saved Query API will show up in the ELMAH log as a 400 HTTP error.

Model Loader

Calculated Columns

  • Model Loader now updates the formula in a calculated column

  • After the formula is updated, if it is a cached column, it will be recalculated during the model load

Indexes

  • Model Loader now loads indexes, unique indexes, and included columns

Enhancements

  • Row height and summary statistics are added to Saved Queries grid

Forbidden Passwords

A forbidden passwords list ([Cinchy].[Forbidden Passwords]) has been added. You will not be able to set your password to one of these. If you are an administrator, you can change the list. For more details, see System Properties.

Bug Fixes

  • Resolved an issue with searching link column dropdowns when you do not have access to view one of the display columns in the dropdown.

  • Resolved a bug with parameters and Saved Queries when there are INSERT statements

  • Relative links respect the target attribute (when set to parent or top)

  • Percentage copy and paste behaves like regular numbers

  • Resolved an issue with displaying Create View on Mac and certain zoom levels

  • Resolved an issue with reverting hierarchy links in the collaboration log

Tableau Bug Fixes

  • Resolved a bug that was causing Saved Queries with trailing whitespaces to cause problems with Tableau Web Connector

Miscellaneous

  • Fixed Cinchy to work with new SameSite updates in Google Chrome

  • Applets have been renamed to Experiences

System Columns

Cinchy contains system columns that are used to support platform functionality. These columns cannot be modified directly by a user, and you cannot create a column with the same name as a system column.

Cinchy Id

Cinchy Id is a unique identifier assigned automatically to all records within a normal table. The Cinchy Id is associated with the record permanently and is never reassigned even if the record is deleted.

Version & Draft Version

Version and Draft Version are used to track changes and versions.

Without Maker/Checker Enabled

Any change made to a record increments the Version. Draft Version is always 0.

With Maker/Checker Enabled

Any data approval increments the Version and resets the Draft Version to 0. Any proposed change increments the Draft Version.

Approval State

Legacy column - always blank.

Created By & Created

Created By is a linked column to the [Cinchy].[Users] table, of the user who created the record.

Created is the time when the record was created.

Without Maker/Checker Enabled

Created By and Created will be the same for all records with the same Cinchy Id.

With Maker/Checker Enabled

Created By and Created are based on the first user to make changes to an approved record, i.e. the user for whom Draft Version = 1.

Modified By & Modified

Modified By is a linked column to the [Cinchy].[Users] table, of the user who last modified the record.

Without Maker/Checker Enabled

The last user to modify the record, and when it happened.

With Maker/Checker Enabled

The last user to either modify the record (Draft Version != 0) or approve the record (Draft Version = 0). The timestamp for when that version was generated.

Deleted By & Deleted

If a record is deleted, it will show up in the Recycle Bin.

Without Maker/Checker Enabled

A deleted record will have Deleted By and Deleted filled in.

With Maker/Checker Enabled

Deleted By and Deleted are based on the user/time when the Delete Request was created, not when it was approved.

Replaced

There is only ever one latest, up-to-date record at a time. Any time changes are made to a record, a new version (normal or draft) is created, and the previous version is updated with a Replaced timestamp.

Any record where Replaced is empty is the current version of that record.

Saved Queries

Saved queries allow you to query any data within Cinchy (respecting entitlements) and save those queries to be used as APIs by external systems.

You can access your Saved Query directly by either CinchyID or the domain and name of the Saved Query.

<baseurl>/Query/Execute?queryId=<cinchyid>

<baseurl>/Query/Execute/<domain>/<saved query name>

For more information and documentation on Cinchy Query Language (CQL), please click here.

Column Level Entitlements

These are entitlements that apply to specific columns.

View All Columns

This permission allows a user to view all columns within the table.

Note that this applies to any new columns that are added to the table after providing this permission as well.

View Specific Columns

This is a drop down where you can select the specific columns you want to grant view access to for users.

Edit All Columns

This permission allows a user to edit all columns within the table.

Note that this applies to any new columns that are added to the table after providing this permission as well.

Giving a user edit permission will also give them view permission.

Edit Specific Columns

This is a drop down where you can select the specific columns you want to grant edit access to for users.

Giving a user edit permission will also give them view permission.

Approve All Columns

This permission allows a user to approve all columns within the table. This also allows users to approve Create and Delete requests.

Note that this applies to any new columns that are added to the table after providing this permission as well.

Approve permissions only apply with Maker/Checker turned on.

Giving a user approve permission will also give them view permission.

Approve Specific Columns

This is a drop down where you can select the specific columns you want to grant approve access to for users.

Approve permissions only apply with Maker/Checker turned on.

Giving a user approve permission will also give them view permission.

Link Columns

Link columns require both permission to the column within this table and permission to the referenced column in the linked table.

4.16 Release Notes

New Features

  • The CompressJSON flag is now usable with the ExecuteCQL endpoint as well

  • We now support the ability to encrypt or sign SAML responses

Changes

  • Significant changes have been made to the model loader to deal with more complex scenarios.

  • We now ignore Cinchy tables in a model import - so you can load models from lower Cinchy versions to higher Cinchy versions without worrying about corrupting the system tables. We do not recommend modifying system tables.

  • CreateDXVersion (and DXD) now check a user's permission for the selected tables, rather than using the builders and builder groups column

  • "allow-downloads" is now enabled with embedded applets so you can build an integrated applet that allows the user to download files

  • We've made some changes to percent to improve the user experience. Here's a quick demo of what you can expect from typing/copying/pasting in a number versus a percent column.

Bug Fixes

  • Navigating and using Data Controls no longer creates a 404 HTTP error in ELMAH logs

  • Yes/No columns now load properly into Tableau from Cinchy.

  • Link display columns now have proper encoding for < >, so you can display linked hyperlinks properly.

  • Fixed a link dropdown type ahead issue where there are link filters with an OR clause.


4.21 Release Notes

New Features

  • Webhook Ingestion: It is now possible to ingest webhook notifications from external applications and platforms.

    • A new system table called Webhooks (only available to administrators) allows you to create a unique key for each webhook, and specify a saved query that can be run when the event is triggered.

    • See our step-by-step instructions for configuring webhooks to learn more.

Configure webhook ingestion in the Webhooks table.

  • Anonymous User: This release creates a new Cinchy user called "Anonymous".

    • This user can be given permissions directly, or added to groups, similar to any other user. However, note that the anonymous user is programmatically limited to read-only access. Even if it is directly given insert or update privileges, or added to a group that has such entitlements, these will not apply when accessing Cinchy anonymously.

    • Unauthenticated calls to the ExecuteCQL and ExecuteSavedQuery APIs will now run based on the permissions granted to the anonymous user; i.e. Cinchy will return any and all data that the anonymous user has permission to view.

    • Note that API calls that provide incorrect credentials will still return a 401 - Unauthorized response.

    • Currently, there is no way for the anonymous user to access Cinchy directly via the data browser.

Enhancements

  • Login Screen and Home Page Improvements: Watch the below video for a quick tour of what's changed.

  • Minimum Contrast: Contrast ratios have been adjusted to support WCAG guidelines.

Bug Fixes

  • Fixed error when trying to pass a variable to a table-valued User Defined Function (UDF)

  • Resolved error when filtering on a linked date column

  • Resolved an issue that occasionally caused session timeouts for data browser users

  • Adjusted timeout length on executeMatch UDF (used in Cinchy MDM experience)

User

Autonomous Data Fabric Platform

For ‘End-Users’

Cinchy’s Autonomous Data Fabric technology introduces an entirely new way for an increasingly diverse set of end-users (employees, customers, and partners) to manage their data. The technology was created on the belief that there should be an easier way for people, systems, and AI/ML to collaborate in a secure manner across the enterprise. In the traditional technology paradigm, the only way for users to interact with data is through the ‘application experience’, which is rigid and limits our interactions to the data that 'lives' within the app.

There are two types of individuals that use the platform. The first is ‘Builders’, who are the creators of the Data Fabric. Builders are responsible for building out the organization's Fabric, including table schemas, establishing access controls, and building application experiences for end-users. The second is end-users, who are able to view and manage all data that they have been granted access to through Cinchy’s Universal Data Browser.

View and Manage Data Through a Single UI

Cinchy’s Autonomous Data Fabric platform features a Universal Data Browser that allows users to view, change, analyze, and otherwise interact with all data on the Fabric. The Data Browser even enables non-technical business users to manage and update data, build models, and set controls, all through an easy and intuitive UI.

Only See the Data You Have Been Granted Access

Data on the Fabric is protected by cellular-level access controls, data-driven entitlements, and superior data governance. This means that users can only view, edit, or operationalize data to which they have been granted access by the data owner.

Track Version History for Full Lifecycle of Data

All data is automatically version-controlled and can easily be reverted to previous states with the proper permissions. On all data tables, you can see changes made by users, systems, or external applications through Data Synchronization or by using a Collaboration Log.

Access Saved Queries on the Fabric

Users can easily access and run saved queries that are available to them through the Data Marketplace. All queries respect Universal Access Controls meaning you will only see the data that you have access to.

Bookmark and Manage Data in a Secure Marketplace

Users can also access all accessible tables, queries, and applets through the Cinchy Marketplace. Here you can also order tiles and bookmark favourites.

Access Data through Rich Application Experiences

Users can also experience data on the Fabric through custom application experiences that are created by Builders on the platform. All application experiences also respect Universal Access Controls, meaning you will only be able to see the data you have been granted access to.

4.19 Release Notes

New Features

Spatial tables are now available. You can create geometry and geography columns in a spatial table, as well as geospatial indexes (directly on the column itself). Note that all existing tables are standard tables, which support partitions but not geospatial columns.

  • Creating a Spatial Table

  • Geography and Geometry Column Types

  • dataType to use when syncing geospatial data

  • How to interact with geospatial data in CQL

New Create Table modal with Spatial Table option
An example of what geometry and geography data looks like in a table.
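
As a rough sketch of querying a geography column, assuming Cinchy CQL follows SQL Server-style spatial syntax; the table, column, and coordinates below are hypothetical.

-- Return sites whose [Location] geography value lies within 5,000 metres
-- of the given latitude/longitude point (SRID 4326).
SELECT [Site Name]
FROM [Facilities].[Sites]
WHERE [Location].STDistance(geography::Point(43.6532, -79.3832, 4326)) < 5000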

Enhancements

  • Upgrading a model via the model loader no longer deletes any columns that were not part of the previous model. This applies to Cinchy system tables as well, so you can extend Cinchy tables with your own custom columns (ex. create a calculated column in Cinchy.Tables to navigate to the table directly) without an upgrade deleting those columns. Note that any modifications made to a column from a model will only be overwritten if the new model has changes on the column.

  • There is a new system property (defaulted to 0) that indicates whether an administrator has scheduled a maintenance job to run on the Cinchy instance. When this is set to 0, a warning message will appear when someone sets up a Data Erasure or Data Compression policy to let them know that this feature has not been enabled. They will still be able to save their settings, so if maintenance is set up at a later date they do not have to come back and save the policy afterwards. If you are an administrator and have been asked to set up maintenance, please see: Maintenance

The system property Maintenance Enabled simply toggles the warning message above; it does not serve any functional purpose. It is the administrator's job to ensure the maintenance job is running successfully at a regular interval. Maintenance can also be run ad hoc, so if you do not wish to permanently delete records or version history in a table, do not turn on Data Erasure or Compression even if you see that banner.

  • Additional logging has been added to the IdP to facilitate troubleshooting SSO issues.

  • A script is now shipped with the Cinchy Platform that enables setting up a Cinchy database user without db_owner privileges.

  • Added support for try_parse, try_convert, and try_cast
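
Assuming these behave like their T-SQL counterparts and return NULL rather than raising an error when a conversion fails, a small illustrative query could look like the following; the table and column names are hypothetical.

SELECT [Reference Number],
       TRY_CAST([Reference Number] AS int) AS [Reference As Int],
       TRY_CONVERT(date, [Received On]) AS [Received Date],
       TRY_PARSE([Amount] AS decimal(18,2) USING 'en-US') AS [Amount Value]
FROM [Finance].[Invoices]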

Bug Fixes

  • Resolved a limitation of data erasure on tables with a lot of columns

Displaying Query Results

1. Displaying a Saved Query

  1. Once you've set up your saved query, you can find it on your homepage.

2. Clicking the query will allow you to "Execute Query" and show you the result set (if there is a SELECT at the end). Sometimes the query will have parameters you need to fill out first before executing.

1.1 Pivot Reports

  1. Once you execute a query, you can switch the Display to Pivot Mode to see different visualizations of the data.

1.2 Pivot URLs

If you want to share the report, you can click the Pivot URL button on the top right to copy the URL to that pivoted report. Simply add it as an applet and bookmark it to return to the pivoted view!

Erasure

Data erasure allows you to permanently delete data in Cinchy. As the data owner, you can set an erasure policy on your table if you need to delete data for compliance reasons.

The actual erasing of data happens during the maintenance window. Please check with your system administrators to confirm when maintenance is scheduled.

Once data is erased, any links pointing to erased data will look like this:

Change Approval Enabled Tables

The time is counted based on the record's modified time stamp, not the deleted time stamp. This means for change approval records it is the time when the pending delete request was approved and moved to the Recycle Bin, not when the delete request was made.

Query

Queries are requests for information within Cinchy that users can create. This section covers the following:

  • Execute Queries

  • Generate Pivot Tables

  • Generate Charts

  • Build Shared Visualizations

Create Queries

For more information on how to create queries, see creating your first query.

Execute Queries

Users can execute pre-built queries based on their access. The result can be exported into a CSV or TSV format.

Generate Pivot Tables

Once you have executed the query, click the Grid drop down list and select Pivot. Here is where you can take your standard table view and slice and dice your data.

Generate Charts

From within your pivot view, open the drop down list with the value “table” and select the type of chart you want to use to display the data.

Build Shared Visualizations

Once you have a desired visualization, that visualization can be made available for others as an applet in Cinchy. Grab the Pivot URL and send it to your Cinchy builder to create your mini applet that can be shared and leveraged!

To copy the Pivot URL so that a visualization can be created, complete the following:

  1. From within the Pivot, locate the blue Pivot URL

  2. Click Pivot URL button

  3. Click the Copy button

  4. Send the copied URL to your Cinchy builder to create your applet that can be shared and leveraged!

You can also open that visualization by clicking Open in new tab if need be.

My Data Network

Legacy Data Network Visualizer Applet

The legacy standalone applet is no longer supported. For instructions on the standalone applet version, see Setup Data Network Visualizer in v2.2.0.

My Data Network

The Data Network Visualizer now ships with Cinchy as a system applet called My Data Network. It uses the user's entitlements for viewable tables and linked columns. You will find the My Data Network data experience in the Marketplace:

Cinchy Production Data Network

My Data Network is another way to view and navigate the data you have access to within Cinchy.

Each node represents a table you have access to within Cinchy, and each edge is one link between two tables. The size of the table is determined by the number of links referencing that table. The timeline on the bottom allows you to check out your data network at a point in the past and look at the evolution of your network.

Cinchy Employees Table

When you click on a node, you will see its description in the top right hand corner. You can click the Open button to navigate to the table.

Custom Data Network Visualizer

If you want to create your own custom data network visualizer, see Custom Data Network Visualizer

Application Experiences

Rather than building traditional code-centric applications, which create data silos, you can build application experiences on the Cinchy platform. These look and feel like regular applications, but they persist their data on the data fabric autonomously rather than managing their own persistence.

Once you deploy your UI, API and logic, you will need to create an integrated client to leverage the data fabric for persistence and controls. If you would like a link to your experience from the data fabric, you will need to create an Experience in the Applets table. See Setting Up Experiences to see how to set up both.

The Cinchy Platform also comes with a built-in Experience called My Data Network, a tool to help you visualize your data through its connections. You can read more about My Data Network, or create your own custom data network visualizer on your own data.

Enable Data At Rest Encryption

How to enable data-at-rest encryption, and other related information

Cinchy 2.0 has added the feature to encrypt data at rest. This means that you can encrypt data in the database such that users with access to view data in the database will see ciphertext in those columns. However, all users with authorized access to the data via Cinchy will see the data as plain text.

In order to use this feature, your database administrator will need to create a database master key (see below for instructions).

Create Master Key in Database

Connect directly to the database Cinchy is currently using.

Run the query below to create your master key. The password used should adhere to your organization's password policy.

CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'password'; 

You can now encrypt data via the user interface. The Encryption option is available for supported column types in the UI.

Backup Master Key

After you have created your master key you can create a backup file of that key in case any data corruption occurs in future. You will need the password you used to create your master key in order to complete this operation.

BACKUP MASTER KEY TO FILE = 'path_to_file'   
    ENCRYPTION BY PASSWORD = 'password' 

Further documentation.

Restore Master Key

If you need to restore your master key due to data corruption, use the command below to do so. You will need the password you used to create your master key in order to complete this operation.

RESTORE MASTER KEY FROM FILE = 'path_to_file'   
    DECRYPTION BY PASSWORD = 'password'  
    ENCRYPTION BY PASSWORD = 'password'  
    [ FORCE ] 

Further documentation.

Builder

Autonomous Data Fabric Platform

For ‘Builders’

Cinchy’s Autonomous Data Fabric platform introduces a complete re-think of the ‘integration-based’ paradigm, where up to 50% of the resources on IT projects are spent on integration alone. The platform combines a network-based architecture with the unique ability to decouple data from applications (“Autonomous Data”). This powerful combination renders the 40-year-old paradigm of “integration” obsolete, which means what used to require half your time and energy now takes just a tiny fraction.

There are two types of individuals that use the platform. The first are end-users, who are able to view and manage all data that they have been granted access to through Cinchy’s Universal Data Browser. The second are ‘Builders’, who are the creators of the Data Fabric. Builders are responsible for building out the organization's Fabric, including table schemas, establishing access controls, and building application experiences for end-users. By significantly reducing the heavy reliance on integration, Builders can produce enterprise-grade solutions for an unlimited number of end-users in weeks rather than the months they would otherwise take.

Builders can leverage Cinchy as one platform to simplify solutions delivery

  1. Connect

  2. Protect

  3. Collaborate

  4. Build

  5. Reuse

Connect

Create your new network of data

An Autonomous Data Fabric eliminates point-to-point integration, reducing cost and complexity (including real-time, batch, on-prem, cloud, legacy, and SaaS services) and allowing custom data-sync configurations. This drives faster time to market, lower costs and improved usability.

Connect data to make silos and integration obsolete

Once a data source is connected to your Data Fabric, its data can be used in conjunction with data from any other source on the Fabric with no further integration efforts. The more sources you connect, the more powerful and efficient your Data Fabric becomes. You can extend data on the Fabric with attributes and entirely new entities, including calculated data, derived data, AI models, and user-managed data.

Protect

Manage and protect data down to the individual cell

Data on your Autonomous Data Fabric is protected by cellular-level access controls, data-driven entitlements, and superior data governance. This includes meta architecture, versioning, and write-specific business functions that restrict user views, such as a managed hierarchy. Owner-defined permissions are universally enforced, significantly reducing the effort of managing them at the enterprise level. You can use existing Active Directory and SSO access policies to set controls for an individual user, external system, or user-defined functions (such as approving updates row by row or using bulk approvals).

Collaborate

Track version history for full lifecycle of data

All data is automatically version-controlled and can easily be reverted to previous states. You can see changes made by users, systems, or external applications through Data Synchronization or by using a Collaboration Log.

View and manage data through a single UI

Use the universal Data Browser to view, change, analyze, and otherwise interact with ALL data on the Fabric. Non-technical business users can manage and update data, build models, and set controls, all through an easy and intuitive UI.

Build

Create and share queries

Cinchy’s Autonomous Data Fabric platform features an intuitive Drag and Drop Query Builder that allows Builders to create queries using the Cinchy Query Language (CQL), a proprietary language specific to Cinchy’s Autonomous Data Fabric technology. All queries can be easily saved and shared, and query results automatically generate a full no-code API.

Consolidate legacy systems, create new solutions

By decoupling the data from the application, our Autonomous Data Fabric lets you consolidate legacy applications to increase operational agility and reduce overhead. You can create enterprise-grade solutions using the Application SDK as quickly and easily as you would build Excel-based solutions and without the operational risk. Application SDK (React Native, Angular, and REST SDKs) lets you build custom applications for end users.

Add third-party visualization tools

For even more flexibility, connect your Data Fabric to third-party Data Visualization tools. You’ll be able to run cross-queries against all data on the Fabric while maintaining universal access policies for users, systems, and external applications.

Re-use

The more you use your Data Fabric, the more it’s capable of doing.

Any new data you add to the Fabric will work in conjunction with any previously existing data, instantly and easily. This means you can re-use data in new ways, with no time-consuming integration efforts. Teams can collaborate directly via the Fabric in real-time, allowing it to act as a central data hub while simplifying integration. Unlike traditional data architecture projects, which grow more complicated as they involve more data sources, the Autonomous Data Fabric delivers solutions faster and faster as more data is added to it.

User Preferences

Your Cinchy Profile has three (3) components that can be changed from user preferences:

My Photo

To add a photo to your profile, complete the following:

  1. From My Network, click the Avatar icon

  2. Select My Profile

  3. From the settings page, click the My Photo image

  4. Locate and upload your photo

My Profile

From the settings page in the My Profile section you are able to update the language, region, and time zone.

My Password

If you do not see the password option in My Profile, you are logging in to Cinchy using Single Sign-On and will not need to update your password within Cinchy.

To change your password, complete the following:

  1. In the Old Password field, enter in your existing password

  2. In the New Password field, enter a new password

  3. In the Confirm New Password field, re-enter in your new password

Cinchy passwords must be a minimum of 8 characters long and must contain a number and a symbol.

Excel

Below are the steps you can follow to establish a connection to Cinchy from Microsoft Excel.

  1. Queries in Cinchy are what Excel connects to. If you don't have one that represents your dataset, you'll need to create that first. In this example, we will use a query called API Test:

  2. When you open the query you'll notice on the right-hand side a green button that says REST API

  3. Click on the REST API button and you'll see the below popup. In the textbox is the URL for the API endpoint. You can click the clipboard icon to copy the URL.

  4. In this example the URL is: http://your.cinchy.instance.domain/API/YourQueryDomain/API Test You'll notice that the structure here is <your Cinchy instance URL>/API/<the name of your query> and optionally at the end you may have querystring parameters. For access via Excel we're going to use Basic authentication and a result format of CSV, which uses a slightly different URL endpoint. Instead of /API/ in the URL, it's going to be /BasicAuthAPI/ and we're going to add a querystring parameter - ResultFormat=CSV. For this example your URL for accessing this dataset through Excel is going to be: http://your.cinchy.instance.domain/BasicAuthAPI/YourQueryDomain/API Test?ResultFormat=CSV (a programmatic example using the same endpoint follows these steps)

  5. Launch Excel. To access the dataset, click on Data in the menu bar, then Get Data > From Other Sources > Blank Query from the menu:

  6. In the window that launches there is a textbox where you can enter an expression. Here you will enter the below text (note your modified URL from Step 4 between quotes): =Csv.Document(Web.Contents("http://your.cinchy.instance.domain/BasicAuthAPI/YourQueryDomain/API Test?ResultFormat=CSV"))

  7. Once you've entered that text, either click the check mark to the left of the input box or click away and it will automatically attempt to run the expression. The data may return in HTML format initially and not be what you're expecting. To correct this, click the Data Source Settings button, then under Credentials click the Edit button:

  8. Select Basic on the left, enter the credentials for a Cinchy User Account that has access to run this query and click OK. Click OK in the Edit Permissions dialog and click Close in the Data Source Settings dialog. This process of entering your credentials won't occur with each query; it's only needed the first time, after which they're saved locally.

  9. Click Refresh Preview and you should see the data. Click Close & Load and your dataset will now be displayed in the Excel worksheet.


My Network

My Network in Cinchy is your home page. This is where you will access and organize all of your tables, queries and applets in Cinchy. This section will cover the following features:

  • My Network

  • My Marketplace

  • Searching

  • My Bookmarks

  • My Data Network

My Network

Once you log in to Cinchy, you'll be on the My Network screen. From here, you can navigate to a variety of tables, queries, and applets you have access to.

You can return to the My Network page at any time by clicking the Cinchy logo in the top left corner.

My Marketplace

All objects you have access to within Cinchy will show up in My Marketplace. You can find the objects that you are interested in through searching and filtering.

Searching

‌All objects you have access to in your Marketplace (including bookmarks) are searchable and can be filtered by typing the partial or full name of the object you are searching for in the search bar.

The ‘Reset’ button on the right side of the search bar can be used to clear the search.

You can also narrow your My Network view by object type, simply by selecting or deselecting an object type (such as Queries) in the toolbar.

My Bookmarks

‌You can bookmark your most often used objects and rearrange them to your liking within your bookmarks.

To bookmark an object, complete the following:

  1. Locate the object

  2. Click the yellow star in the top right corner

The object will appear in your “My Bookmarks” section.

To rearrange your bookmarks simply drag and drop the object into the desired order.

My Data Network

Your “My Data Network” shows a visualization of all tables in Cinchy you have access to and how they are all connected.

Each of the coloured circles represents an object in Cinchy. The lines between them demonstrate the links between them.

You are able to search and open tables from this view using the search bar on the left.

You can see what the network looked like in the past by clicking and dragging the pink circle along the timeline at the bottom.

Tableau

Web Data Connector

Cinchy exposes a Tableau Web Data Connector that provides access to Cinchy Saved Queries as data sources in Tableau. Tableau versions 2019.2+ are supported.

An active internet connection is required in order to use the Web Data Connector.

Pre-requisites

To get started, you must add a record into the Integrated Clients table in the Cinchy domain with the below values.

| Column | Value |
| --- | --- |
| Client Id | tableau-connector |
| Client Name | Tableau |
| Grant Type | Implicit |
| Permitted Login Redirect URLs | https://<your Cinchy URL>/Tableau/Connector |
| Permitted Logout Redirect URLs | https://<your Cinchy URL>/Tableau/Connector |
| Permitted Scopes | Id, OpenId, Email, Profile, Roles |
| Access Token Lifetime (seconds) | 3600 |
| Show Cinchy Login Screen | Checked |
| Enabled | Checked |

Connecting From Tableau

  1. Launch Tableau

  2. Under Connect -> To a Server select the Web Data Connector option

  3. Enter the URL from the Permitted Login Redirect URLs field on the Integrated Clients record created under the Pre-requisites section above

  4. The Cinchy login screen will appear, enter your credentials

  5. Select one or more queries to add to your data set. The result of each query will be available as a Table in Tableau. If a query has parameters, you will be prompted to provide the parameter values before you can add it to your collection.

  6. Click the Load button

The Cinchy query results will now be accessible for you to create your visualization.

Webhook Ingestion

This page describes the Cinchy Webhook ingestion, including a video walkthrough and step-by-step guide

Compatibility: Webhook ingestion was introduced in Cinchy platform v4.21. Note that previous Cinchy versions will not include a Webhooks system table, and will not support this feature.

Context: A webhook uses a trigger event to initiate a data transfer that can then be ingested by another application. Many applications support webhooks to provide data updates in real time with minimal configuration required. Cinchy users can subscribe to and ingest webhooks via configuring a unique API endpoint. When the external application addresses this endpoint, a pre-identified saved query can be run, under an authorized user account, to ingest the data and insert or update it into Cinchy.

The following video walks you through a specific configuration, with a general step-by-step guide below:

Walkthrough Video

Step-by-Step Instructions for Configuring a Webhook

  1. Identify the table in Cinchy that you want the information to be pushed to. (Image 1)

Image 1: Identifying your Table (Step 1)

Tip: Click here for instructions on creating a table within Cinchy.

2. Create your query in Cinchy. This query should take data from the webhook event, and push it into the Cinchy table that you identified in Step 1. (Image 2)

  • Your query will be running under a specific user account that you will assign in the next step. Ensure that whichever user you choose for this purpose has the correct permissions to execute the query, and to insert / update data in the target table.

Image 2: Creating your Query (Step 2)

Tip: Click here for instructions on creating a query within Cinchy.

Note: You will need administrator access to your Cinchy platform to perform Step 3.

3. Navigate to the webhooks table in Cinchy, and populate the following columns (Image 3):

  1. Key: This can be any string (we recommend that you treat this like a password and make it random and unguessable).

  2. Run As: Insert the user who will be running this query here. This should be the same user to whom you gave permissions in Step 2a.

  3. Saved Query: Select the saved query that you created in Step 2.

  4. Forward Payload as Parameter: This column will depend on the manner in which you would like to ingest the webhook payload.

    • If you are configuring individual parameters in the webhook payload (for example: @name, @url, etc.), you may leave this column blank.

    • If you are not configuring individual parameters, as an alternative you can ingest the entire payload under one parameter and specify it in this field. In the below image, we have defined it as “JSON”. This means that the full payload (which happens to be a JSON file in this case) will be parameterized as @JSON and then inserted into a table column named JSON.

Image 3: Configuring your Table (Step 3)

4. In your source application, navigate to the webhook settings and configure the following:

  • URL: This will be your Base URL + /API/callback?key= + the key value that you assigned in Step 3a (see the example call sketched below).

    Example: {baseURL}/API/callback?key=rGh5tfgHK8989J5

Creating your first table

Follow these instructions to create a new table within Cinchy.

From the home screen, select Create in the top left hand corner to get started.

Click Create
You can choose either a standard or spatial table.
For a standard table, you can either create a table from scratch or start with a CSV import.

Standard vs Spatial Table

A spatial table allows you to create geography and geometry column types, as well as geospatial indexes. You will not be able to create partitions on a spatial table.

A standard table will not allow you to create geography and geometry columns. (Any existing tables created before installing Cinchy Platform v4.19.0 are standard tables).

You cannot convert from one type to another and will have to either recreate your table or link to another table with geospatial columns.

Starting from Scratch

Info Tab

Table Name

Mandatory field. Must be unique within the Domain.

I.e. You can have a [Football].[Teams] table and a [Basketball].[Teams] table, but not two tables named [Teams] under the Football domain.

Icon

You can optionally pick an icon, as well as a color, for your table. This will be displayed on the home screen.

Domain

You need to select a Domain your table will reside under. As an admin, you can also create new domains in this screen.

An admin can select the option to create a new domain when creating a new table.

Description

You can give your table a description. This description will be displayed on the home screen to users browsing the marketplace.

Example

Example name, icon, domain and description.
What it looks like in the Marketplace.
What it looks like in your bookmarks.

Columns/Links Tab

Your table must have at least one column to start. See Column Types to decide what type of column you should create and how to create it. Depending on the type, you will create it either under the columns or the links tab.

You can now click Save to create your first table!

Click Save to create the table.

Import a CSV

Domain

You need to select a Domain your table will reside under. As an admin, you can also create new domains in this screen.

File

You must upload a .csv file. The column names must not conflict with System Columns.

Default Settings

When creating a table via Import a CSV, a few settings will be set by default. These can be modified after the table is imported through the Design Table tab.

Default Table Name

The name of the file will be used as the name of the table (a number will be appended if there is a duplicate - ex. uploading Teams.csv when a Teams table already exists will create a table named Teams 1, then Teams 2 if uploaded again). You can always rename the table after it has been created.

Default Table Icon

The icon defaults to a green paintbrush.

Default Column Types

Columns by default will be created as a text field, with a maximum length of the longest value in the column. If a column has only numeric values in it, it will be created as a numeric column.

Default View

When you first create a table, a default view called All Data will be created for you under Manage Data.

You can create additional views or edit the All Data view under Managing Data.

Bookmarks and Marketplace

Once you create a table, it will be added to your bookmarks by default. Other users will see it in the Marketplace if they have permission to (as will you, if you un-star the table from your bookmarks).

You can click the table in the home screen to get back to the Data Management screen for your table.

Commentary

Comments are used in Cinchy to provide context to your data, along with providing a means of collaborating directly with and on the data. This section covers the following:

  • Entering Comments

  • Editing Comments

  • Deleting Comments

  • Archiving Comments

  • System Table: Comments

Entering Comments

Anyone who can view or edit a cell can comment on it.

To add a comment, complete the following:

  1. Locate the desired cell

  2. Right-click and select comment

  3. In the comment window, enter the comment

  4. Click the Comment button

Editing Comments

Comments can be modified only by the individual(s) who created them.

To edit a comment, complete the following:

  1. Hover over the comment

  2. Click the pencil icon

  3. Make the appropriate edit

  4. Click the Submit button to save the change

Deleting Comments

Comments can be deleted only by the individual(s) who created them.

To delete a comment, complete the following:

  1. Hover over the comment

  2. Click the garbage bin icon

Archiving Comments

  • A User with the Approve Select Cell permission has the ability to archive comments on that specific cell

  • A User with the Approve All permission has the ability to archive any cell comments.

  • A User can archive their own comments regardless of approve permissions

To archive all comments in a cell, complete the following:

  1. Hover over the comment

  2. Click the Archive All button

To archive just one comment in a comment thread, click the archive icon for the specific comment you wish to archive.

System Table: Comments

Comments are stored in the [Cinchy].[Comments] table. You can see the structure below.

Multi-Lingual Support

Translate API

POST <Cinchy-URL>/API/Translate

Pass in a list of literal GUIDs, along with a language and region. If translations are found in that language, they will be returned.

Request Body

| Name | Type | Description |
| --- | --- | --- |
| guids | array | Array of strings. GUIDs from the Literals table. |
| language | string | Subtag from the Languages table. User's preferences will be used if not specified. |
| region | string | Subtag from the Regions table. User's preferences will be used if not specified. |
| debug | boolean | Defaults to false if not specified. Debug true will explain why each string was returned as the translation. |

Logic

If the translation exists in the language and region specified, it will be returned.

If the translation exists in the language but not the specified region, it will still be translated and returned.

If the GUID exists but it is not available in the specified language, the default text in the Literals table will return.

If the GUID does not exist or you do not have permission to it, it will return the GUID back as the translation.

Example response:

{
  "data": {
    "button.create": {
      "translation": "Create",
      "language": "en",
      "region": "US",
      "defaultText": false
    },
    "button.cancel": {
      "translation": "Cancel",
      "language": null,
      "region": null,
      "defaultText": true
    },
    "button.favorite": {
      "translation": "Favourite",
      "language": "en",
      "region": "CA",
      "defaultText": false
    },
    "button.delete": {
      "translation": "button.delete",
      "language": null,
      "region": null,
      "defaultText": false
    }
  }
}

System Tables

There are 3 tables in Cinchy to provide language support. [Cinchy].[Literal Groups], [Cinchy].[Literals], and [Cinchy].[Literal Translations].

Literal Groups

This table can optionally be used to group the translations. The default Cinchy strings belong to the Cinchy literal group. We recommend you create one literal group per applet or UI so you can retrieve the full list of GUIDs required for that page/applet easily.

Literals

This table defines all the strings that you want to translate.

Default Text

String that displays if no translation is found for the language specified.

GUID

GUID used to refer to the literal. A UUID will be generated by default, but it can be overridden using the Guid Override field to something more human-readable.

Literal Group

As mentioned above, this can be used to group your strings so they can be easily retrieved. Note that this is a multi-select so you can use a single literal for multiple applets (including using the default Cinchy literals and translations for custom applets).

Literal Translations

This is the table where the translations are stored.

Translated Text

This is the translated string that is returned.

Literal

This is the literal the translation is for.

Language and Region

A language must be specified for a translation. A region can also optionally be specified for region-specific words (ex. color vs colour).

System Properties

System Properties is a table within Cinchy for managing system properties, such as default time zones, system lockout durations, password expirations, password properties, password attempts allowed etc.

Set up

The default System Properties table is set up as follows:

| Property ID | Name | Value (Default) |
| --- | --- | --- |
| 2 | Default Time Zone | Eastern Standard Time |
| 12 | Password Attempts Allowed | 3 |
| 13 | System Lockout Duration (minutes) | 15 |
| 8 | Minimum Password Length | 8 |
| 9 | Password Requires Symbols | 1 |
| 10 | Password Requires Numbers | 1 |
| 11 | Password Expiration (Days) | 90 |
| 15 | Maintenance Enabled | 0 |

Please note that this table is case sensitive.

The System Properties requirements can be changed by an admin user simply by editing the 'Value' column where applicable:

Default Time Zone

Users can set their own time zones in their user profile. If a user does not set one up, the system default time zone will take effect. If this property does not exist or is invalid, the default time zone will default to UTC.

Minimum Password Length

The minimum password length is 8 characters and it will default to 8 if an invalid value is provided. However, this number can be changed in the 'Value' column to require users to have longer or shorter passwords.

Password Requires Symbols

This property specifies whether symbols are required in a user's password. The 'Value' 0 means symbols are not required and 1 means they are required.

Password Requires Numbers

This property specifies whether numbers are required in a user's password. The 'Value' 0 means numbers are not required and 1 means they are required.

For a new password policy to take effect, you can set all users' Password Expiration Timestamps to yesterday. They will need to change their password the next time they attempt to log in.

Password Expiration (Days)

This property specifies how many days until a password expires. Defaults to 90 but can be set to be shorter or longer by changing the number in the 'Value' column.

Password Attempts Allowed

This property specifies how many bad password attempts a user can make before they are locked out of the system. The default is 3 but this can be set to be more or less attempts by changing the number in the 'Value' column.

System Lockout Duration (minutes)

This property specifies how long a user is locked out of the system once they've run out of bad password attempts. The default is 15min but this can be set to be shorter or longer by changing the number in the 'Value' column.

Note that an administrator can also go into the 'Users' table to manually unlock a user by clearing the Locked Timestamp.

Maintenance Enabled

This property, defaulted to 0, is responsible for showing a warning to data owners when they are setting up erasure or compression policies on a table. It is the administrator's responsibility to set up a scheduled maintenance job for performing compression and erasure, and then to change the property to 1 so that the warning no longer appears.

Forbidden Passwords

There is a new Cinchy table called Forbidden Passwords. This table comes with a prepopulated list of passwords from https://www.ncsc.gov.uk/static-assets/documents/PwnedPasswordsTop100k.txt.

You can add more blocked passwords to this list as well, and users will not be able to set their password to any of them (this can be used to add your company's name, or to import other blocked password lists). The check against the list is case insensitive.

Like other password policies, this check occurs when your password changes, so to enforce this you will need to set all passwords to be expired.

Linking Data

When you create a column within Cinchy, you can choose to create a link column. A link column allows you to establish inherent relationships with other tables.‌

Linking is done by the Cinchy ID, which is unique. When you create a link column, you select a column to link to. This is simply a decision on which field to show from the linked record. You should pick a unique field to link on to avoid confusion if possible. Once a record is created, its Cinchy ID never changes. This means that modifying the row of data in the linked table will not change the relationship in your table to that row. This also means that if you did not use a unique column, even though the UI looks the same, you are actually linking to different rows.‌

Choosing Linked Column

‌In general, you should only use unique columns as the linked column. This needs to be balanced with readability in other tables. For example, Full Name might not be unique to every employee, but it is a lot more readable and understandable than Employee ID. In other cases, it makes sense to link via an ID and simply add a display column to show relevant information.‌

Allow Linking

‌To help other builders follow best practices of only linking to unique (or close to unique, such as Full Name) columns, you should un-check the Allow Linking checkbox for non-unique columns so they will not be able to use it for linking.‌

Allow Display in Linked View

‌If this option is unchecked, it prevents users from showing this column in another table. For example, if you have an ID card # within an employees table, you may not want to display it to the rest of the company because it simply would not be relevant when they are linking to employees and want to see additional information (such as department, title, location). Arguably, a lot of these columns are also taken care of by access controls (since most people will not have access to view that column).‌

Generally unchecking this box should be done sparingly, as it does not impact the security of your data, only how convenient it is to see it.‌

Display Columns

‌When you select a record to link to on the Manage Data screen, it can be useful to see additional information about the records to ensure that it is the record you want to link to. You can add additional display columns in the advanced options for link columns.

‌

When you type in the cell, all displayed columns will be searched through, not just the Linked Column. (Green does not have a B in it, but #00B050 does so the Green record shows up)‌

Link Filter

‌The link filter filters out records from the drop down list. This is useful for reducing the options to only the typical use case, for example filtering the drop down to a list of active users or other resources, without preventing someone from entering records that reference inactive resources.‌

Note that this is simply a display filter, it does not prevent other values from being entered as long as they are valid records in the linked table.‌

Relationships

‌You can define 1 to 1, 1 to many, and many to many relationships.‌

1:1 Relationship

‌Generally it is rare to link 1:1 relationships since they should usually be in the same table. For example, you would not have a separate table for Employee Phone Number and Employee Address, they would simply be two columns within the Employees table. However there are cases nonetheless where it makes sense, for example, a Keycard tracking table where each keycard has 1 assigned employee.‌

To enforce a 1:1 relationship within Cinchy, you set the unique constraint and leave it as single-select when creating a link column.‌

1:Many Relationship

‌A common relationship to have is a one to many relationship. For example, one customer can have multiple invoices.‌

To enforce a 1:many relationship within Cinchy, you want to create a link column in the table on the “many” side of the relationship (in the above example, in the invoices table) and leave the link column as single select.‌

Many:Many Relationship

‌You can also have a many to many relationship. For example, you can have multiple customers, and multiple products. Each customer can order multiple products, and each product can be ordered by multiple customers. Another example is books and authors. An author can write multiple books, but a book can also have multiple authors. There are two ways to express many to many relationships.‌

In the case of multiple customers and multiple products, we want to use orders as an intermediary table to create indirect relationships between customers and products. Each order has 1 customer, and each order has multiple products in it. We can derive the relationship between customers and products through the orders table.‌

To create a many:many relationship through a different entity, you want to create a table for orders. Within orders, you want to create a single-select link to customers and a multi-select link to products.‌

In the case of books and authors, it makes sense to create a multi-select link column in the Books table where multiple authors can be selected.‌

To create a multi-select link column in Cinchy, you simply check off the Multi-Select option when you create a new link column.


4.18 Release Notes

Breaking Changes

  • Cinchy Platform 4.18.0+ will require the .NET Core 3.1 Hosting Bundle instead of .NET Core 2.1: https://dotnet.microsoft.com/download/dotnet-core/3.1 (download page), https://dotnet.microsoft.com/download/dotnet-core/thank-you/runtime-aspnetcore-3.1.10-windows-hosting-bundle-installer (direct download link)

New Features

  • Cinchy now expires your session based on inactivity and requires you to log in again if you have been inactive for too long. The default timeout is 30 minutes, but it can be configured in the CinchySSO appsettings.json. User Grants are tracked in the Cinchy.User Grants table.

Enhancements

  • Ability to chain User Defined Function triggers across multiple tables

  • Model loader now respects the order of the columns in your model.

  • Ability to change the issuer via a new app setting in CinchySSO. (see installation guide)

  • Further accessibility enhancements including fixing contrast issues, reordering columns, alerting the user when executing queries or saving data, and labeling objects in My Network

  • Platform returns a better error for unique constraint violations - this does not impact your platform experience since the UI retries each row when there's a unique constraint failure, but it does allow the CLI to sync in all valid rows rather than failing the whole batch.

Bug Fixes

  • Maintenance script was not correct (this would cause an error in running the maintenance command)

  • User Defined Function errors now display in the Query Builder screen (they are intentionally suppressed for calculated columns)

  • Fixed tab characters breaking link display columns

  • Popup for pasting data on Manage Data closes properly

4.18.1

Enhancements

  • Indexes are no longer resaved unnecessarily on table save

Bug Fixes

  • Copy and paste in Data Controls resolves against the correct columns

  • Yes/No filter no longer resets incorrectly

4.18.2

Enhancements

  • New look for Collaboration Log

  • Improve caching and data callback logic

  • Added the /Experiences/Domain/AppletName endpoint

Table Features

There are several table features that can be used to better view, collaborate on, and analyze the data in Cinchy tables. This section will go over the following features:

  • Views

  • Filters

  • Column Display

  • Sorting

  • Scrolling

  • Row Height

  • Freeze Unfreeze Rows

  • Freeze Unfreeze Columns

Views

When you first create a table, a default view called All Data will be created for you under Manage Data. Cinchy builders can create different views for users to manage their data in Cinchy where Views can be filtered and/or sorted according to the requirements.

To switch between views simply select the view from the left navigation toolbar under Manage Data.

Filters

Users can filter data in a view for one or more columns. Filters persist when users navigate from one view to another. The number of filter criteria is identified against the filter icon.

Column Display

Users can add, remove or rearrange the columns in a view based on how they need the data represented in the View.

To add a column to a View, complete the following:

  1. Click Display Columns in the top toolbar

  2. From the ‘Add a Column’ drop down, locate and select the appropriate column.

To remove a column from a View, complete the following:

  1. Click the “X” to the right of the column name to remove.

To rearrange the columns in a View, complete the following:

  1. Drag the column to the appropriate location in the list of visible columns.

Ensure you click ‘Apply’ to save the modified columns displayed.

Please note that Display Columns DO NOT persist. When you move away from the View, any modifications will be lost.

Sorting

Users can sort data in a view for one or more columns. Sorting can be done by clicking on a column to sort in ascending or descending order.

Sorting can also be done by clicking on the Sorting button and selecting the column(s) to be sorted and the order in which the sorting should occur.

Sorting columns DO NOT persist; when you move away from the View, any modifications will be lost.

Scrolling

The Top and Bottom scrolling buttons allow you to jump to the top or bottom of a view without scrolling.

Row Height

Row height can be either Collapsed or Expanded using the Row Height drop-down.

In addition to the row height selection, you can now also manually resize a row (or multiple rows if you select more of them). You can also double click on the default row number on the left to auto expand the row height.

Freeze Unfreeze Rows

Cinchy allows you to freeze and unfreeze a row similar to Excel.

To freeze or unfreeze a row, complete the following:

  1. Select the row for freezing/unfreezing

  2. Right-click on the row and select Freeze/Unfreeze Row from the menu

Freeze Unfreeze Columns

Cinchy allows you to freeze and unfreeze a column similar to Excel.

To freeze or unfreeze a column, complete the following:

  1. Select the column for freezing/unfreezing

  2. Right-click on the column and select Freeze/Unfreeze Column from the menu

SAS

Cinchy’s JDBC Driver can be used with the SAS/ACCESS Interface to JDBC Driver capability to connect to your Cinchy instance. This section will discuss configuration and connection to your Cinchy data experience.

Prerequisite

Cinchy's JDBC Driver must be moved to the "/opt/Cinchy/client/JDBC" directory on the SAS server before attempting to configure the connection.

Configuration

To configure SAS, you only need Cinchy's JDBC driver to be accessible from your SAS session.

The SAS/ACCESS Interface to JDBC LIBNAME statement allows you to assign a library reference to your data source. This feature lets you reference a table directly in a DATA Step or in a PROC Step. This example shows the basic LIBNAME statement to connect to Cinchy.

libname libCinchy jdbc classpath = "/opt/Cinchy/client/JDBC" class = "org.cinchy.jdbc.Driver" URL = "jdbc:cinchy://<<Cinchy SSO URL>>/identity/connect/token?client_secret=<<GUID>>;client_id=<<ClientID>>;username=<<Username>>;password=<<PASSWORD>>;grant_type=password;scope=js_api;";

For more information on the parameters, please see the table below.

| Argument | Description |
| --- | --- |
| Classpath | This option specifies the class path to your JDBC JAR files. |
| Class | This option specifies the class name for the JDBC driver to use for your connection. Here, we use the JDBC driver class for Cinchy, org.cinchy.jdbc.Driver. |
| URL | This option specifies the JDBC connection string to use to connect to your data source. The Cinchy JDBC connection string used here is: jdbc:cinchy://<<Cinchy SSO URL>>/identity/connect/token?client_secret=<<GUID>>;client_id=<<ClientID>>;username=<<Username>>;password=<<PASSWORD>>;grant_type=password;scope=js_api; |


Data Management

There are several ways to work with (enter, update, remove, load and extract) data from Cinchy tables. This section will go over the following:

  • Data Entry

  • Insert / Delete Data Rows

  • Import Data

  • Export Data

  • Approve / Reject Data

  • Collaboration Log

  • Recycling Bin

Data Entry

Users are only able to enter data into Cinchy based on their access. Users can also copy and paste data from external sources.

Insert / Delete Data Rows

Users are only able to insert or delete rows based on their access. If you have the ability to insert and/or delete a row of data it will be visible when right-clicking on a row of data.

Import Data

Importing data allows you to add new rows of data into a table. If you wish to perform a sync instead please refer to the CLI. Importing data acts as a smart copy-and-paste of new data into an existing table.

Importing the first row of your CSV as a header row will match the headers to the column names within your table. Any columns that cannot be matched will be ignored as well as any columns you do not have edit permissions for.

Users can import data from a CSV file to an existing table in Cinchy. Importing data into a Cinchy table only adds records to the table; it does not update or append existing records.

To import data into a table, complete the following:

  1. From within the table, click the Import button on the top toolbar of the table (Image 1).

Image 1: Step 1, Click the Import button

2. Click Choose File to locate and import your file.

3. Validate the imported columns and click next (Image 2).

Image 2: Step 3, Validate your columns and select next

4. Click the Import button

5. Click the OK button on the Import confirmation window

Import Errors

This will return the same errors as you would get in the UI if you were doing a copy and paste.

If there are import errors click the download button next to Rejected Rows on the Import Succeeded with Errors window.

You will get a file back with all the rejected rows, as well as 2 added columns called ‘Cinchy Import Errors' and 'Cinchy Import Original Row Number’.

Cinchy Import Original Row Number

This provides a reference to the row number in the original file you imported in case you need to check it. You can simply fix any errors in your error log and then import the error log, since successful rows are omitted from it.

Export Data

Users are only able to export data in CSV or TSV format. Please note that when data is exported out of the Fabric the data is just a copy and no longer connected to Cinchy.

To export data from a table, complete the following:

  1. From within the table, click the Export button in the table toolbar

  2. Select the Export file type (CSV or TSV)

  3. Open the exported file in Excel to view it

Approve / Reject Data

Cinchy has the ability to have data change approvals turned on (configured by builders) when data is added or removed from a table view. A change approval process can be put into place for the addition or removal of specific data. If you have been identified as an "Approver" of data, you will have the ability to:

  • approve a cell of data

  • approve a row of data

  • reject a row of data

To approve or reject a cell/row of data, complete the following:

  1. Right-click on the desired row/cell

  2. Select Approve row/cell or Reject row/cell

Collaboration Log

Accessing the Collaboration Log

The Collaboration log is accessible from each and every table within Cinchy (including metadata). It allows you to see the version history of ALL changes that have been made to an individual row of data.

To access Cinchy’s Collaboration Log complete the following:

  1. Open table

  2. Locate the desired row > Right Click > View Collaboration Log (Image 1)

Image 1: Step 2, Open the Collaboration Log

Once the Collaboration Log is open you have the ability to view ALL changes with a version history for the row selected within the table.

Users have the ability to revert to a prior version of the record. To do so, click the Revert button for the desired version.

Reverting Data in the Collaboration Log

Please note, there is a potential for a record to have a white coloured Revert button. This indicates that the version record is identical to the current version of the record in the table. Hovering over the Revert button will provide a tool-tip.

Data Erasure and Compression Policies

By default, Cinchy does not delete any data or metadata from within the Data Fabric.

Click here for more information on Data Erasure & Compression Policies in Cinchy

Audit for Data Synchronization

Audit logging of data loaded into Cinchy via Data Synchronization (batch or real-time using the Cinchy CLI), or through data changes made by any Saved Queries exposed as APIs to external clients, is recorded the same way as if the data were entered into Cinchy by a user. All data synced into Cinchy will have corresponding line items in the Collaboration Log, just as when data is entered or modified in Cinchy by a user.

Collaboration Log Performance Considerations

The Collaboration Log data is also stored within Cinchy as data, allowing the logs to be available for use through a query or for any downstream consumers. There are no separate performance considerations needed for the logs as it will rely on the Cinchy platform’s performance measures.

Recycle Bin

All data records that have been deleted are put into Cinchy’s Recycle Bin. Data that resides in the Recycle Bin can be restored if required.

To restore data from the recycle bin, complete the following:

  1. From the left-hand navigation, click Recycle Bin

  2. Locate the row for restoring

  3. Right-click and select Restore Row.

The restored row will now be visible in your table.

Note, if Change Approvals are turned on, that row will need to be approved.

ExecuteCQL

You can execute CQL directly without creating a Saved Query using the following endpoint.

API Method Table

POST: ExecuteCQL

https://<Cinchy Web URL>/API/ExecuteCQL

Query Parameters:

| Name | Data Type | Description |
| --- | --- | --- |
| CompressJSON | boolean | Default is true. Add this parameter and set to false if you want the JSON that is returned to be expanded rather than having the schema returned separately. |
| ResultFormat | string | XML, JSON, CSV, TSV, PSV, PROTOBUF |
| Type | string | QUERY - Query (Approved Data Only); DRAFT_QUERY - Query (Include Draft Changes); SCALAR - Scalar; NONQUERY - Non Query, such as an insert or delete; VERSION_HISTORY_QUERY - Query (Include Version History) |
| ConnectionId | string |  |
| TransactionId | string | When one or more requests share the same TransactionId, they are considered to be within the scope of a single transaction. |
| Query | string | The CQL query statement to execute. |
| Parameters | boolean | See below for the format of the parameters. |
| SchemaOnly | integer | Defaults to false. |
| StartRow | integer | When implementing pagination, specify a starting offset. Combine with RowCount to set the size of the data window. |
| RowCount | integer | When implementing pagination, specify the number of rows to retrieve for the current page. Combine with StartRow to set the paging position. |
| CommandTimeout | string | Use this parameter to override the default timeout (30s) for long-running queries. In seconds. |
| UserId | string |  |

Header Parameters:

| Name | Data Type | Description |
| --- | --- | --- |
| Authorization | string | Bearer <access_token>. See the Authentication section for details. |

Responses:

  • 200 (OK)

Parameters

To pass in parameters in your ExecuteCQL call, you will need to pass in sets of query parameters in the following format. If you have one parameter, you would pass in 3 query parameters beginning with Parameters[0]; if you have a second parameter, you would include an additional 3 query parameters beginning with Parameters[1].

| Query String Parameter Name | Content |
| --- | --- |
| Parameters[n].ParameterName | Name of the parameter that is in your query, including the '@'. Ex. @name |
| Parameters[n].XmlSerializedValue | XML serialized version of the value of that parameter. Ex. "test" |
| Parameters[n].ValueType | Datatype of the value. Ex. System.String |

High Number of Groups in ADFS

If you are syncing someone with a lot of ADFS groups, the server may reject the request for the header being too large. If you are able to login as a user with a few groups in ADFS but run into an error with users with a lot of ADFS groups (regardless of if those ADFS groups are in Cinchy), you will need to make 2 changes.

Server Max Request Header Size

CinchySSO App Settings

In your CinchySSO app settings, you will also need to increase the max size of the request header.

    "AppSettings": {
      ...
      "MaxRequestHeadersTotalSize": {max size in bytes},
      "MaxRequestRequestBufferSize": {max size in bytes, use same as above},
      "MaxRequestBodySize": -1
    }

For more details on the app settings see the app settings section of Configuring ADFS.


Indexing and Partitioning

This page outlines indexing and partitioning when building tables


1. Indexing

Indexing is used to improve query performance on frequently searched columns within large data sets. Without an index, Cinchy begins a data search with the first row of a table and then follows through the entire table sequentially to find all relevant rows. The larger the table(s), the slower the search.

If the table you are searching for has an index for its column(s), however, Cinchy is able to search much quicker.

In the below example, we will set up a query for a Full Name field. When you create an index for that field, an indexed version of your table is created that is sorted sequentially/alphabetically.

When you run your query on this index, that table will be searched using a binary search.

A binary search does not start from the top record. It checks the middle record against your search criteria for a match. If a match is not found, it checks whether the middle value is larger or smaller than the desired value. If the middle value is smaller, the search repeats on the lower half of the data, again checking the median record; if it is larger, the search repeats on the upper half. This continues until your data is found.

1.1 Setting Up an Index

In this example, we have a table with employee names (Image 1). We want to search for "John Smith", using the Full Name column.

  1. To set up your index, select Design Table from the left navigation tab.

2. Click Indexes (Image 2).

3. Select "Click Here to Add" and fill out the following information for a new index. Click save when done (Image 3):

  • Index Name.

  • Select the column(s) to add to your index. For our example we have selected the Full Name column to be indexed.

    • You can select more than one column per index.

  • Select the Included column(s) to add to your index, if applicable.

    • The difference between regular columns and Included columns is that indexes with included columns provide the greatest benefit when covering your query because you can include all columns your query may reference, such as columns with data types, numbers, or sizes not allowed as index key columns.

    • For more on Included Columns, please visit Microsoft's documentation.

4. We can now query our full name column for John Smith and receive our results quicker than if we hadn't set up our index (Image 4).

Note that there is no UI change in the query builder or your results when running a query on an indexed column. The difference will be in the speed of your returned results.

2. Full-Text Indexing

A full-text index is a special type of index that provides index access for full-text queries against character or binary column data. A full-text index breaks the column into tokens and these tokens make up the index data.

2.1 Setting up a Full-Text Index

  1. Click on Design Table > Full-text Index

  2. Add in the desired column(s) and click save when done (Image 5).

3. Partitioning

Partitioning data in a table is essentially organizing and dividing it into units that can then be spread across more than one file in a database. The benefits of this are:

  • Improved efficiency of accessing and transferring data while maintaining its integrity.

  • Maintenance operations can be performed on one or more partitions more efficiently.

  • Query performance is improved based on the types of queries most frequently run.

When creating a partition in Cinchy, you use the values of a specified column to map the rows of a table into partitions.

3.1 Setting up a Partition

In this example we want to set up a partition that divides our employees based on a Years Active column (Image 6). We want to divide the data into two groups: those who have been active for two years or more, and those who have only been active for one year.

  1. Click on Design Table > Partition

  2. Fill in the following information and click save when done (Image 7):

  • Partitioning Column: this is the column value that will be used to map your rows. In this example we are using the Years Active column.

  • Type: Select either Range Left (which means that your boundary will be <=) or Range Right (where your boundary is only <). In this example we want our type to be Range Left. (This mapping is illustrated in the sketch after these steps.)

  • Add Boundary: Add in your boundary value(s). Click the + key to add it to your boundary list. In this example we want to set our boundary to 2.

Once set up, this partition will organize our data into two groups, based on our boundary of those who have a Years Active value of two or above.

3. You can now run a query on your partitioned table (Image 8).

Note that there is no UI change in the query builder or your results when running a query on a partitioned table. The difference will be in the speed of your returned results.

For more information on creating, modifying or managing Partitioning, please visit Microsoft's documentation.

Formatting Rules

You can apply conditional formatting rules. Our first iteration of this is done directly in the Formatting Rules table. A future iteration will add a UI within a table to create them.

Columns

Row Condition

This follows the same syntax as a view filter query.

Ordinal

Order in which the formatting rules will apply on the same table. Ordinal 1 will show up above ordinal 2.

Highlight Color

Color to highlight the cell. If you want to add your own colors you can do so within System Colours and check off highlight under usage.

Table

Table in which to apply the conditional formatting.

Highlight Columns

Columns to apply the conditional formatting rules to. You do not need to include any row condition columns within the highlight columns.

Example

Example formatting rules.
Formatting applied in the table

Single Sign-On (SSO) Integration

This page walks through the integration of an Identity Provider with Cinchy via SAML Authentication

Overview

Cinchy supports integration with any Identity Provider that issues SAML tokens (e.g. Active Directory Federation Services) for authenticating users. It follows an SP Initiated SSO pattern where the SP will Redirect to the IdP and the IdP must submit the SAML Response via an HTTP Post to the SP Assertion Consumer Service. Below is a diagram outlining the flow when a non-authenticated user attempts to access a Cinchy resource.

SAML Authentication Flow for Single Sign-On

Configure SAML Authentication

Cinchy must be registered with the Identity Provider. As part of that process you'll supply the Assertion Consumer Service URL, choose a client identifier for the Cinchy application, and generate a metadata XML file.

The Assertion Consumer Service URL for Cinchy is the base URL for the CinchySSO application followed by "/{AcsURLModule}/Acs", e.g. https://myCinchyServer/CinchySSO/identity/AuthServices/Acs. The {AcsURLModule} value needs to be defined in the appsettings.json file.

To enable SAML authentication within Cinchy, follow the below steps:

  1. You can find the necessary metadata XML from the applicable identity provider you're using to log in against. Place the metadata file in the deployment directory of the CinchySSO web application.

If you are using Azure AD for this process, you can find your metadata XML by following these steps.

If you are using GSuite for this process, you can find your metadata XML by following steps 1-6 here.

If you are using ADFS for this process, you can find your metadata XML at the following link, inputting your own information for <your.AD.server>: https://<your.AD.server>/FederationMetadata/2007-06/FederationMetadata.xml

If you are using Okta for this process, you can find your metadata XML by following these steps.

If you are using Auth0 for this process, you can find your metadata XML by following these steps.

If you are using PingIdentity for this process, you can find your metadata XML by following these steps.

2. Update the value of the below app settings in the CinchySSO appsettings.json file.

  • SAMLClientEntityId - The client identifier chosen when registering with the Identity Provider

  • SAMLIDPEntityId - The entityID from the Identity Provider metadata XML

  • SAMLMetadataXmlPath - The full path to the metadata XML file

  • AcsURLModule - This parameter needs to be configured as per your SAML ACS URL.

    • Example ACS URL: "https://<your Cinchy server>/CinchySSO/identity/AuthServices/Acs"

    • Example parameter value: "/identity/AuthServices"

3. The following parameters pertain to signed SAML IdP requests. Within the IDPSSODescriptor tag of the metadata is the below WantAuthnRequestSigned attribute:

<IDPSSODescriptor WantAuthnRequestsSigned="true" protocolSupportEnumeration="urn:oasis:names:tc:SAML:2.0:protocol">

If the value for this attribute is set to "true" then your Identity Provider is expecting the request to be "Signed". In this case, please enter the following parameters:

  • SAMLSignCertificatePath - This parameter needs to be the path for the PFX file of the certificate. This must match the same certificate in your identity provider.

  • SAMLSignCertificatePassword - This parameter is the password for the above mentioned PFX file. You may choose to encrypt it or not.

  • SAMLSignCertificateMinAlgorithm - This parameter is optional and only needed for PFX files that are generated at different algorithm levels.

    • The possible options for this parameter are:

      • SHA1

      • SHA256

      • SHA384

      • SHA512

      • http://www.w3.org/2000/09/xmldsig#rsa-sha1

      • http://www.w3.org/2000/09/xmldsig#rsa-sha256

      • http://www.w3.org/2000/09/xmldsig#rsa-sha384

      • http://www.w3.org/2000/09/xmldsig#rsa-sha512

4. If the Identity Provider is configured for the request to be encrypted please provide a PFX file, with a non-empty password, for the below attributes:

  • SAMLEncryptedCertificatePath - This parameter needs to be the path for the PFX file of the certificate. This must match the same certificate in your identity provider.

  • SAMLEncryptedCertificatePassword - This parameter is the password for the above mentioned PFX file.

When configuring the Identity Provider, the only required claim is a user name identifier. If you plan to enable automatic user creation, then additional claims must be added to the configuration. Click here for more information.

Once SSO is enabled, the next time a user arrives at the Cinchy login screen they will see an additional button "Single Sign-On". Before a user is able to login through the SSO flow, the user must be set up in Cinchy with the appropriate authentication configuration. See the User Management section below for instructions on how to perform this setup.

User Management

Users in Cinchy are maintained within the Users table in the Cinchy domain. Each user in the system is configured with 1 of 3 Authentication Methods:

  • Cinchy User Account - These are users that are created and managed directly in the Cinchy application. They log into Cinchy by entering their username and password on the login screen.

  • Non Interactive - These accounts are intended for application use.

  • Single Sign-On - These users authenticate through the SSO Identity Provider (configured using the steps above). They log into Cinchy by clicking the "Login with Single Sign-On" link on the login screen.

How to define a new SSO User

Create a new record within the Users table with the Authentication Method set to "Single Sign-On".

The password field in the Users table is mandatory. For Single Sign-On users, the value entered is ignored. You can input "n/a".

How to convert an existing user to SSO User

Change the Authentication Method of the existing user to "Single Sign-On".

Logging in with SSO

Once a user is configured for SSO, they can then click the "Login with Single Sign-On" link on the login page and that will then take them through the Identity Provider's authentication flow and bring them into Cinchy.

If a user successfully authenticates with the Identity Provider but has not been set up in the Users table, then they will see the following error message - "You are not a registered user in Cinchy. Please contact your Cinchy administrator." To avoid this manual step of adding new users, you can consider enabling Automatic User Creation.

Automatic User Creation

Once SSO has been enabled on your instance of Cinchy, by default, any user that does not exist in the Cinchy Users table will not be able to log in, regardless of whether they are authenticated by the Identity Provider.

Enabling Automatic User Creation means that upon login, if the Identity Provider authorizes the user, an entry for this user will automatically be created in the Cinchy Users table if one does not already exist. This means that any SSO authenticated user is guaranteed to be able to access the platform.

In addition to creating a user record, if AD Groups are configured within Cinchy, then the authenticated user will automatically be added to any Cinchy mapped AD Groups where they are a member. See AD Group Integration for additional information on how to define AD Groups in Cinchy.

See below for details on how to enable Automatic User Creation.

Users that are automatically added will not be allowed to create or modify tables and queries. To provision this access, Can Design Tables and Can Design Queries must be checked on the User record in the Cinchy Users table.

Pre-requisites

The Identity Provider configuration must include the following claims in the SAML token response, in addition to the base configuration:

  • First Name

  • Last Name

  • Email

If you plan on using AD Groups and want to enable automatic group assignment for newly created users, also include an attribute that captures the groups the user is a member of (e.g. the memberOf field in AD).

Config setup

Enabling automatic user creation requires the following changes to the appsettings.json file in the CinchySSO web application.

  • Add "ExternalClaimName" attribute values under "ExternalIdentityClaimSection" in appsettings.json file. Do not add the value for "MemberOf" if you don't want to enable automatic group assignment .

  • The ExternalClaimName value must be updated to create a mapping between the attribute name in the SAML response and the required field. (e.g. http://schemas.xmlsoap.org/ws/2005/05/identity/claims/givenname is the name in the SAML response for the FirstName field)

ExternalIdentityClaimSection
"ExternalIdentityClaimSection": {
			"FirstName": {
				"ExternalClaimName": "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/givenname"
			},
			"LastName": {
				"ExternalClaimName": "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/surname"
			},
			"Email": {
				"ExternalClaimName": "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/emailaddress"
			},
			"MemberOf": {
				"ExternalClaimName": "http://schemas.microsoft.com/ws/2008/06/identity/claims/role"
			}
		}

Deployment Planning Guide

Deployment Architecture & Component Integration

The below context diagram provides a high-level overview of:

  1. All of Cinchy's available components and how they integrate with other applications within a customer's organization.

  2. The deployment architecture of the Cinchy platform

  3. Interaction points involving end users

Note that certain components and configurations are optional and dependent upon the usage pattern of the platform; these are called out in the table below the diagram, which provides a description of each component.

Component Overview

1. Cinchy Web Application

This is the primary application for Cinchy, providing both the UI for end users as well as the REST APIs that serve application integration needs. The back-end holds the engine that powers Cinchy's data / metadata management functionality.

Technology Stack: ASP.NET MVC 5

Dependencies:

  • .NET Framework 4.7.2+

  • IIS 7.5+

  • Windows Server 2012 or later

2. Cinchy IdP

This is an OpenID Connect / OAuth 2.0 based Identity Provider that comes with Cinchy for authenticating users.

Cinchy supports user & group management directly on the platform, but can also connect into an existing IdP available in the organization if it can issue SAML tokens. Optionally, Active Directory groups may be integrated into the platform.

When SSO is turned on, this component is responsible for federating authentication to the customer's SAML enabled IdP. This centralized IdP issues tokens to all integrated applications including the Cinchy web app as well as any components accessing the REST based APIs.

Technology Stack: .NET Core 2.1

Dependencies:

  • .NET Framework 4.7.2+

  • IIS 7.5+

  • Windows Server 2012 or later

3. Cinchy Database

All data managed on Cinchy is stored in a MS SQL Server database. This is the persistence layer.

Technology Stack: MS SQL Server Database

Dependencies:

  • Windows Server 2012 or later

  • MS SQL Server 2012 or later

4. Cinchy CLI

This is Cinchy's Command Line Interface that offers utilities to get data in and out of Cinchy.

One of these utilities is a tool to sync data from a source into a table in Cinchy. This is able to operate on large datasets by leveraging an in-built partitioning capability and performs a reconciliation to determine differences before applying changes.

Another commonly used utility is the data export, which allows customers to invoke a query against the Cinchy platform and dump the results to a file for distribution to other systems requiring batch data feeds.

Technology Stack: .NET Core 2.0

Dependencies:

  • .NET Core Runtime 2.0.7+ (on Windows or Linux)

5. ADO.NET Driver

For .NET applications Cinchy provides an ADO.NET driver that can be used to connect into the platform and perform CRUD operations on data.

Technology Stack: .NET Standard 2.0

Dependencies:

  • See implementation support table

6. Javascript SDK

Cinchy's Javascript SDK for front-end developers looking to create an application that can integrate with the Cinchy platform to act as its middle tier and backend.

Technology Stack: Javascript, JQuery

7. Angular SDK

Cinchy's Angular SDK for front-end developers looking to create an application that can integrate with the Cinchy platform to act as its middle tier and backend.

Technology Stack: Angular 5

Deployment Platforms (Virtual Servers vs. Physical Machines vs. Cloud)

The Cinchy application components are all designed to be agnostic to whether a customer chooses to use Virtual Servers, Physical Machines, or a Cloud environment. For the web components, the underlying platform dependency is Windows Server; these servers can be provisioned on any of the 3 platforms, including Cloud when using IaaS. The same is true for the database server.

System Requirements

Minimum Web Server Hardware Recommendations

  • 2 x 2 GHz Processor

  • 8 GB RAM

  • 4 GB Hard Disk storage available

Minimum Database Server Hardware Recommendations

  • 4 x 2 GHz Processor

  • 12 GB RAM

  • Hard disk storage dependent upon use case. Note that Cinchy maintains historical versions of data and performs soft deletes which will add to the overall storage requirements.

Clustering

Clustering considerations are applicable to both the Web and Database tiers in the Cinchy deployment architecture.

The web tier can be clustered by introducing a load balancer and scaling web server instances horizontally. Each node within Cinchy uses an in-memory cache of metadata information, and expiration of cached elements is triggered upon data changes that would impact that metadata. Data changes processed by one node wouldn't immediately be known to other nodes without establishing connectivity between them. For this reason the nodes must be able to communicate over either http or https through an IP based binding on the IIS server that allows cache expiration messages to be broadcast. The port used for this communication is different from the standard port that is used by the application when a domain name is involved. Often for customers this means that a firewall port must be opened on these servers.

The database tier relies on standard MS SQL Server failover clustering capabilities.

Scaling Considerations

The web application is responsible for all interactions with Cinchy be it through the UI or connectivity from an application. It interprets/routes incoming requests, handles serialization/deserialization of data, data validation, enforcement of access controls, and the query engine to transform Cinchy queries into the physical representation for the database. The memory footprint for the application is fairly low as caching is limited to metadata, but the CPU utilization grows with request volume and complexity (e.g. insert / update operations are more complex than select operations). As the user population grows or request volume increases from batch processes / upstream system API calls there may be a need to add nodes.

The database tier relies on a persistence platform that scales vertically. As the user population grows and request volume increases from batch processes / upstream system API calls the system may require additional CPU / Memory. Starting off in an environment that allows flexibility (e.g. a VM) would be advised until the real world load can be profiled and a configuration established. On the storage side, Cinchy maintains historical versions of records when changes are made and performs soft deletes of data which will add to the storage requirements. The volume of updates occurring to records should be considered when estimating the storage size.

Backups

Outside of log files, no data is generated or stored on the web servers by the application, which means backups are generally centered around the database. Since the underlying persistence platform is MS SQL Server, backups rely on standard procedures for that platform.
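
For example, a full backup of the Cinchy database can be taken with the standard T-SQL BACKUP command. This is a minimal sketch only; the backup path is an assumption, and your DBA's existing maintenance plans or backup tooling can equally be used:

-- full backup of the Cinchy database (path is a placeholder)
BACKUP DATABASE Cinchy
TO DISK = N'D:\Backups\Cinchy.bak'
WITH COMPRESSION, CHECKSUM;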

Row Level Entitlements

These are entitlements that apply to specific rows. Used in conjunction with Column Level entitlements, this allows for granular cell level entitlements.

Insert Row

This permission allows a user to create new rows in the table.

Delete Row

This permission allows users to delete rows in the table.

Viewable & Editable Row Filter

This is a CQL fragment that applies a filter to which rows will be viewable or editable. Think of the column entitlements and the fragment as a SQL statement applied to the table.

SELECT {Edit Selected Columns} WHERE {Editable Row Filter}

Examples for Row Filter

Most of these examples use the editable row filter so it is easy to see the underlying data for comparison. However, the same approach applies to viewable row filters as well.

Sample Data

Simple Example

Edit Specific Columns: Age
Editable Row Filter: [Age] > 30
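
Conceptually, this entitlement behaves like the following statement applied to the sample data (a sketch following the SELECT pattern above; the [Sample].[People] table name is an assumption for illustration):

SELECT [Age]
FROM [Sample].[People]
WHERE [Age] > 30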

Example with Viewable Data

View Specific Columns: First Name, Last Name
Viewable Row Filter: [End Date] IS NULL OR [End Date] > GetDate()

Layer On Another Entitlement

View Specific Columns: All
Edit Specific Columns: First Name, Last Name, Age
Viewable Row Filter: [First Name] = 'John'
Editable Row Filter: [First Name] = 'John'

Example for Current User

For the All Users group:
View All Columns: Check
Edit Selected Columns: First Name, Last Name
Editable Row Filter: [User Account].[Cinchy Id] = CurrentUserId()

To allow a user to edit certain fields of their own data, you will need an association from a user to the [Cinchy].[Users] table. You can then use the following function to allow edit for that user, where [...] is the chain of link columns to get to the Users table.

[...].[Cinchy Id] = CurrentUserId()
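
For example, on a hypothetical [HR].[Employees] table with a [User Account] link column pointing to [Cinchy].[Users], the filter could be written directly or through a longer chain of link columns (the table and column names here are assumptions for illustration):

[User Account].[Cinchy Id] = CurrentUserId()
-- or, through a chain of link columns:
[Manager].[User Account].[Cinchy Id] = CurrentUserId()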

Creating your first query

Follow these instructions to create a new query within Cinchy.

1. Creating a Saved Query

  1. From the homepage, select Create > Query (Image 1).

Image 1: Step 1, Click Create

2. Fill out the following information:

1.1 The Info Tab

Under the Info tab, you can fill out information on the query if you wish to save it (Image 2):

Image 2: The Info Tab
  • Query Name: Mandatory field. Must be unique within the Domain.

  • Icon: You can optionally pick a non-default icon, as well as a color, for your query. This will be displayed in My Network.

  • Domain: You need to select a Domain for your query to reside in. As an admin, you can also create new domains in this screen.

  • Description: You can give your query a description. This description will be displayed on the home screen to users browsing the marketplace. It will also be searchable.

  • Return Type: There are six different return types (a short example follows this list):

    • Query Results (Approved Data Only)

      This is the default return type. It returns a table from a select query with only approved data for Maker/Checker-enabled tables, or all data for tables without Maker/Checker enabled. This is generally used for external APIs, as you will want to query approved data rather than drafts.

    • Query Results (Including Draft Data)

      This return type returns a table from a select query with only draft data for Maker/Checker-enabled tables. Use this return type when looking to display results of records that are pending approval.

    • Query Results (Including Version History)

      This return type returns a table from a select query with historical data for all tables, as seen in the Collaboration Log of any record. This data includes all changes that happened to all records within the scope of the select query.

    • Number of Rows Affected

      This return type returns a single string response with the number of rows affected if the last statement in the query is an INSERT, UPDATE, or DELETE statement.

    • Execute DDL Script

      Use this return type when your query contains DDL commands that implement schema changes such as CREATE|ALTER|DROP TABLE, CREATE|ALTER|DROP VIEW, or CREATE|DROP INDEX.

    • Single Value (First Column of First Row)

      This return type returns a result of 1 row x 1 column, irrespective of the underlying result set.
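
As a rough illustration of the last two return types, the statements below use a hypothetical [HR].[Employees] table (the table and column names are assumptions); the first would pair with the Single Value return type and the second with Number of Rows Affected:

-- Single Value (First Column of First Row): returns one scalar
SELECT COUNT(*)
FROM [HR].[Employees]
WHERE [End Date] IS NULL

-- Number of Rows Affected: returns the count of updated records
UPDATE [HR].[Employees]
SET [Status] = 'Inactive'
WHERE [End Date] < GetDate()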

1.2 The Query Tab

In the Query screen, you can modify and run your query.

On the left hand side you have the Object tree, which shows you all the domains, tables, and columns you have access to query within Cinchy. You can search or simply navigate by expanding the domains and tables.

You can drag and drop the columns or table you're looking for into the Query Builder.

Once you are satisfied with your query, you can click save to keep a copy. You can then find your query in the "Saved Queries" table:

You can find saved queries in the Saved Queries table.

Tables

Within Cinchy, you can change the structure of your table through the UI instead of at a database level.

To change the structure of an existing table (i.e. Rename columns, add new columns, change data type) go to Design Table in the left navigation.

To create a new table, please see creating your first table.

Saved Queries


1. Overview

See the Saved Queries section for details on how to create a saved query. The URL for a saved query can be found on the Execute Query screen.

Use Access Token to Call Cinchy APIs

GET https://<Cinchy Web URL>/API/MyDomain/MyQuery

To access any Cinchy Saved Query API, pass the access token received from the Bearer Token Request in the Authorization header, prefixed by "Bearer".

Query Parameters

  • WrapSingleRecordInArray (boolean) - Default is true. Add this parameter and set it to false if you want single record results returned as an object instead of within an array.

  • @param (string) - If you have parameters in your query, you pass them in as query parameters (prefix the parameter name with %40, the URL-encoded @).

  • CompressJSON (boolean) - Default is true. Add this parameter and set it to false if you want the JSON that is returned to be expanded rather than having the schema returned separately.

Headers

  • Authorization (string) - Bearer <token goes here>

2. Example

%40 is the URL-encoded version of @. If you are passing parameters in the request URL, you will need to include %40 in front of each parameter name.
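
For example, a request to a saved query that declares a parameter named @name might look like the following (the domain, query, and parameter names are placeholders; CompressJSON is optional, as described under Query Parameters above):

GET https://<Cinchy Web URL>/API/MyDomain/MyQuery?%40name=John&CompressJSON=false
Authorization: Bearer <token goes here>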

Use Basic Authentication to Call Cinchy APIs

https://<Cinchy Web URL>/BasicAuthAPI/MyDomain/MyQuery

3. Anonymous Access

The following instructions detail how to allow anonymous access to your saved query API endpoint.

  1. Navigate to the table your query will be referencing. In this example, it is the Accessibility Assessments table (Image 1).

  2. Navigate to Data Controls > Entitlements.

  3. On a new row, add the Anonymous user and ensure that either "View All Columns" (to expose all the data) or "View Selected Columns" (to select individual columns) is checked off (Image 1).

Clicking on inline images in Gitbook will open a larger version.

4. Design your query (Image 2). For more information on creating new saved queries, see the Saved Queries section.

5. Once you have written your query, navigate to Design Query > Info, on the left navigation bar.

6. Change your API Result Format to JSON (Image 3).

7. Navigate to Design Controls from the left navigation bar.

8. To ensure that anonymous users have the correct permission needed to execute the query that generates the API response, add the "Anonymous" user to the users permission group under "Who can execute this query?" (Image 4).

9. Navigate to "Execute Query" from the left navigation bar.

10. Copy your REST API endpoint URL (Image 5).

11. To confirm that anonymous access has been successfully set up, paste the URL into an incognito/private browser (Image 6).

3.1 Troubleshooting

  • 401 errors likely mean you have not added the Anonymous user to the list of users (not "Groups") that can execute the query.

  • 400 errors likely mean that you have not added the Anonymous user to the list of users (not "Groups") that can view column data from the tables that your query uses.

AD Group Integration

This page contains information on how to leverage Active Directory groups within Cinchy.

Group Management

Cinchy Groups

Cinchy Groups are containers that hold Users and other Groups as members, and are used to provision access controls throughout the platform. Cinchy Groups enable centralized administration of access controls.

Groups are defined in the "Groups" table within the Cinchy domain. By default this table can only be managed by members of the Cinchy Administrators group. Each group has the following attributes:

  • Name - Group name, this must be unique across all groups within the system

  • Users - Users which are members of the group

  • User Groups - Groups which are members of the group

  • Owners - Users which are able to administer the membership of this group. By default Owners are also members of the group (i.e. they do not need to also be added into Users).

  • Owner Groups - Groups whose members are able to administer the membership of this group. By default, members of Owner Groups are also members of the group (i.e. they do not need to also be added into Users or User Groups).

  • Group Type - Either "Cinchy Group" or "AD Group". If this is a "Cinchy Group", this means that membership is maintained directly in Cinchy. If this is an "AD Group", then a sync process will be leveraged to maintain the membership and overwrite the Users.

How to define a new AD Group

  1. Create a new record within the Groups Table with the same name as the AD Group (use the cn attribute).

  2. Set the Group Type = "AD Group".

How to convert an existing group to sync with AD

  1. Update the Name attribute of the existing group record to match the AD Group (use the cn attribute).

  2. Set the Group Type to "AD Group".

Group Membership Sync

AD Groups defined in Cinchy have their members sync'ed from AD through a batch process that leverages the Cinchy Command-line-interface (CLI).

Execution Flow

The sync operation performs the following high-level steps:

  1. Fetches all Cinchy registered AD Groups using a Saved Query.

  2. Retrieves the usernames of all members of each AD Group. The default attribute retrieved for the username is "userPrincipalName", but this is configurable as part of the sync process.

  3. For each AD Group, loads the users that are both a member in AD and exist in the Cinchy Users table (matched on the Username) into the "Users" attribute of the Cinchy Groups table.

Dependencies

  1. Cinchy CLI Model must be installed in your instance of Cinchy (installation steps are covered in the Cinchy CLI Reference)

  2. An instance of the Cinchy CLI must be available to execute the sync

  3. A task scheduler is required to perform the sync on a regular basis (e.g. Autosys)

Configuration Steps

Create a Saved Query to retrieve AD Groups from Cinchy

Create a new query within Cinchy with the below CQL to fetch all AD Groups from the Groups table. The domain & name assigned to the query will be referenced in the subsequent step.

Create the Sync Config

Copy the below XML into a text editor of your choice and update the attributes listed in the table below the XML to align to your environment specific settings. Once complete, create an entry with the config in your Data Sync Configurations table (part of the Cinchy CLI model).

If the userPrincipalName attribute in Active Directory does not match what you expect to have as the Username in the Cinchy Users table (e.g. if the SAML token as part of your SSO integration returns a different ID), then you must replace userPrincipalName in the XML config with the expected attribute.

The userPrincipalName appears twice in the XML, once in the LDAPDataSource Columns and once in the CinchyTableTarget ColumnMappings.

Sync Execution & Scheduling

The below CLI command (see the Syncdata Command section of the Cinchy CLI Reference for additional information) should be used to execute the sync. Update the command parameters (described in the table below) with your environment specific settings. Execution of this command can be scheduled at your desired frequency using your scheduler of choice.

The user account credentials provided in the above CLI syncdata command must have View/Edit access to the Cinchy Groups table.

References

Best Practices

Naming Convention

Cinchy is built as a simple, business user friendly application. This means that you should use business friendly terms to name your tables and columns. For example, you want to name a column “Full Name” rather than full_name, fullName, fName etc.

Domains

Domains essentially act as folders to be able to organize your data. Generally you will want to split domains by business lines (ex. Sales, Marketing, Human Resources, Product Development). The key thing is to keep it consistent so users have a general idea where to go to find information.

Descriptions

You can add descriptions to your tables and columns. Descriptions allow other users to use data in a more self-serve fashion, and also help prevent misunderstandings of the meaning of your data.

Table descriptions are shown in the My Network screen, and will show up in search as well.

Column descriptions show up when you hover on the column in the Manage Data screen.

SavedQuery
SELECT [Name]
FROM [Cinchy].[Cinchy].[Groups]
WHERE [Group Type] = 'AD Group' 
AND [Deleted] IS NULL
ConfigXML
<?xml version="1.0" encoding="utf-16" ?>
<BatchDataSyncConfig name="AD_GROUP_SYNC" version="1.0.0" xmlns="http://www.cinchy.co">
  <Parameters />
  <LDAPDataSource objectCategory="group" ldapserver="LDAP:\\activedirectoryserver.domain.com" username="encryptedUsername" password="encryptedPassword" >
    <Schema>
      <Column name="cn" ordinal="1" dataType="Text" maxLength="5000" isMandatory="false" validateData="false" trimWhitespace="true" description=""/>
      <Column name="member.userPrincipalName" ordinal="2" dataType="Text" maxLength="200" isMandatory="false" validateData="false" trimWhitespace="true" description=""/>
    </Schema>
    <Filter>
      lookup('Domain Name','Query Name')
    </Filter>
  </LDAPDataSource>
  <CinchyTableTarget model="" domain="Cinchy" table="Groups" suppressDuplicateErrors="false">
    <ColumnMappings>
      <ColumnMapping sourceColumn="cn" targetColumn="name" />
      <ColumnMapping sourceColumn="member.userPrincipalName" targetColumn="Users" linkColumn="Username" />
    </ColumnMappings>
    <SyncKey>
      <SyncKeyColumnReference name="name" />
    </SyncKey>
	<ChangedRecordBehaviour type="UPDATE" />
    <DroppedRecordBehaviour type="IGNORE" />
  </CinchyTableTarget>
</BatchDataSyncConfig>

XML Tag: LDAPDataSource

  • ldapserver - The LDAP server url (e.g. LDAP:\\activedirectoryserver.domain.com)

  • username - The encrypted username to authenticate with the AD server (generated using the CLI's encrypt command - dotnet Cinchy.CLI.dll encrypt -t "Domain/username")

  • password - The encrypted password to authenticate with the AD server (generated using the CLI's encrypt command - dotnet Cinchy.CLI.dll encrypt -t "password")

XML Tag: LDAPDataSource -> Filter

  • Domain Name - The domain of the Saved Query that retrieves AD Groups

  • Query Name - The name of the Saved Query that retrieves AD Groups

dotnet Cinchy.CLI.dll syncdata -s cinchyAppServer -u username -p "encryptedPassword" -m "Model" -f "AD_GROUP_SYNC" -d "TempDirectory" 

Options

Description

-s, --server

Required. The full path to the Cinchy server without the protocol

(e.g. cinchy.co/Cinchy).

-u, --userid

Required. The user id to login to Cinchy.

This account must have access to edit the Groups table

-p, --password

Required. The encrypted password of the specified user

(generated using the CLI's encrypt command -

dotnet Cinchy.CLI.dll encrypt -t "password").

-m, --model

Required. The Cinchy model to use for retrieval of batch configuration information and persistence of the execution log.

-d, --tempdirectory

Required. The path to a directory that the CLI can use for storing temporary files to support the sync (e.g. partitioned data).




< HTTP/2 200 
< cache-control: private, s-maxage=0
< content-type: application/json; charset=utf-8
< server: Microsoft-IIS/10.0
< x-aspnetmvc-version: 5.2
< access-control-allow-origin: *
< x-aspnet-version: 4.0.30319
< x-powered-by: ASP.NET
< date: Wed, 1 Aug 2020 17:40:13 GMT
< content-length: 2985
--given the following CQL used within a saved query:

IF (ISNULL(@Int,0) = 0)
BEGIN
  RAISERROR('Invalid parameter: @Int cannot be NULL or zero',18,0)
  RETURN
END
IF (ISNULL(@String,'') = '' OR ISNULL(@String,'') = 'X')
BEGIN
  RAISERROR('Invalid parameter: @String cannot be NULL or empty or X',18,0)
  RETURN
END

--note "X-Cinchy-Error" in the sample response:

< HTTP/1.1 400 Bad Request
< Cache-Control: private
< Content-Type: text/html; charset=utf-8
< Server: Microsoft-IIS/10.0
< X-AspNetMvc-Version: 5.2
< Access-Control-Allow-Origin: *
< X-Cinchy-Error: Invalid parameter: @Int cannot be NULL or zero
< X-AspNet-Version: 4.0.30319
< X-Powered-By: ASP.NET
< Date: Wed, 1 Aug 2020 17:43:04 GMT
< Content-Length: 4517
< HTTP/1.1 401 Unauthorized
< Cache-Control: private
< Content-Type: text/html; charset=utf-8
< Server: Microsoft-IIS/10.0
< X-AspNetMvc-Version: 5.2
< X-AspNet-Version: 4.0.30319
Image 1: The Entitlements Table
Image 2: Design your Query
Image 3: Change your API Result Format to JSON
Image 4: Ensuring the anonymous user has access
Image 5: Copy your REST API endpoint URL
Image 6: Testing your API

Compression

If you need to manage space within your database, you can set a data compression policy. Currently we allow you to permanently delete versions in the collaboration log. Be aware that the current version of compression is a LOSSY process (data will be permanently deleted). Take that into consideration when configuring a policy.

We err on the side of keeping more versions rather than fewer. For example, a policy might be: keep any versions newer than 180 days, and keep the 50 most recent versions. As long as a version satisfies at least one of the two keep conditions, we keep it. Using that example:

  • A version that’s from 205 days ago but is amongst the 50 most recent versions (e.g. version 22 of 60) will be kept, because it satisfies at least one condition: being in the 50 most recent versions.

  • A version that’s from 163 days ago but is version 65 of 80 will be kept, because it satisfies at least one condition: being less than 180 days old.

  • A version that’s from 185 days ago and is version 65 of 80 will be deleted, because it doesn’t satisfy either condition.

The actual compression of data happens during the maintenance window. Please check with your system administrators to confirm when maintenance is scheduled.

Change Approval Enabled Tables

The number of versions kept is based on the major version, not the minor version. This means that for a record on version 35.63 with a policy of keeping the 10 most recent versions, all versions from 26.0 onwards will be kept, rather than only versions 35.44 and later.

4.13 Release Notes

Password Policies

You can now customize your password policies within Cinchy.

Default properties

See System Properties for more details.

4.13.1 Patch

Fixed an issue where you were unable to run a query referencing a Table Valued UDF concurrently under the same user account.

4.13.2 Patch

  • Fixed a bug relating to automatic group sync for Single-Sign On users on login.

  • Addressed a few bugs that caused limitations with certain SQL functions in CQL.

  • Fixed a bug where queries with parameters fail in Tableau.

  • Fixed a bug where Tableau was not working due to SameSite configurations.

  • Fixed an issue with single record updates in Cinchy that was causing resolvelink issues with the CLI.


Configuring ADFS

The following outlines the configuration required in Active Directory Federation Services (ADFS) to enable Single Sign-On (SSO).

ADFS Configuration

On your ADFS Server, Open AD FS Management.

Right-click on Relying Party Trusts and select Add Relying Party Trust. This will launch the Add Relying Party Trust Wizard.

ADFS Wizard

Welcome

Select Claims aware. Click Start.

Select Data Source

Choose Enter data about the relying party manually. Click Next.

Specify Display Name

Enter a Display Name of your choice.

Configure Certificate

Do not choose any certificates.

Configure URL

Select Enable support for the SAML 2.0 WebSSO protocol.

Enter your login URL in the following format:

Configure Identifiers, Choose Access Control Policy, Ready to Add Trust, Finish

Choose an Identifier and click Next through the remaining screens until the wizard is complete.

Claim Issuance Policy

  1. Right-click on the Relying Party Trust you just created (look for the Display Name) and click Edit Claim Issuance Policy.

  2. Click on the Add Rule... and choose Claim Rule as Send LDAP Attributes as Claims.

  3. Add a Claim rule name, choose Active Directory under Attribute store, and map the LDAP attributes to outgoing claim types as shown in the table further below:

4. Click Finish.

5. Click on Edit Rule...

6. Click on View Rule Language and copy out the claim URLs for the claims defined. This information will be needed in a later step to configure your Cinchy appsettings.json. It will look something like the rule language sample shown further below:

7. Click Ok to save the rule.

8. Right-click on your Relying Party Trust again and click Properties.

9. Go to the Advanced tab and set the secure hash algorithm to SHA-256.

Cinchy Configuration

Everything below is case sensitive and must match whatever is specified in your SAML IdP configuration.

  1. Open https://<your.AD.server>/FederationMetadata/2007-06/FederationMetadata.xml in a browser and save the XML file in the cinchysso folder.

  2. Open IIS Manager and create an HTTPS binding on the Cinchy site (if necessary).

  3. Go to the CinchySSO site and bind HTTPS to it. Make sure to use the same port as the login URL above, if one was specified.

Cinchy appsettings.json

AppSettings Section

External Identity Claim Section

You will need the Rule Language URLs you copied out from the ADFS Configuration above. Using the same example as above, we would get the following (replace with your own URLs).

Web.config

Add the following 3 lines to your web.config within the appSettings section:

Setting Up Experiences

Previously called Applets

Integrated Client Setup

Applet Setup

Column Types

All possible column types in Cinchy.

Common Fields

Name

Each column must have a unique name. They must also not conflict with system columns (even if you are not using Maker/Checker on the table).

Data Security Classification

Each column has a data security classification. This defaults to blank, and can be set to one of 4 pre-configured settings (Public, Internal, Restricted, Confidential) or additional options created in the [Cinchy].[Data Security Classifications] table by an administrator.

Currently there is no functionality tied directly to Data Security Classification - the tagging is just for internal auditing purposes. Future security settings will be tied to Data Security Classifications, rather than simply done at a column level.

Description

Each column can optionally have a description. The description is displayed when you hover on the column header in Data Management.

Common Parameters

Add to Default View

Checked by default. After saving your changes this will add the column to be displayed in the default table view (All Data by default). Generally this should remain checked, since there should be a view where all columns are displayed.

If you need to hide a column from certain users or groups you can do so in table controls. It is usually best to have a view where all table columns are displayed.

Mandatory

Makes the column a mandatory field. You will not be able to save or alter a record in a state where a mandatory field is blank.

Unique

Requires all values in the column to be unique. Adding a new record or modifying a previous record into a state where it is a duplicate of another record will cause an error and cannot be saved.

If you need uniqueness across multiple columns instead (ex. First Name does not need to be unique, but First Name + Last Name needs to be unique), you can create an index in Design Table, add those columns and set the index to unique. If it needs to be more complicated, you can also create a calculated column and set that column to unique.
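
For the calculated-column approach, the expression can simply combine the fields that together must be unique; a sketch using the CONCAT style shown later under Calculated columns (the column names are hypothetical):

CONCAT([First Name], ' ', [Last Name])

Marking that calculated column as Unique then enforces uniqueness across the combination.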

Multi-Select

Some fields can also be set to multi-select.

For example, the column Players in [Football].[Teams] can be a multi-select field since each team will have multiple players.

Allow Linking

Checked by default. This allows other tables to use the column as a link/relationship.

See Linking Data to get more context on how they are used.

You want to pick identifying columns for linking, such as IDs or Name. Generally you want to use unique columns, but in some cases it is a better user experience to pick an almost unique field for readability.

For example, Full Name may not be unique, but it is much easier to understand than Employee ID.

Allow Display in Linked Views

Checked by default. Some columns may not make sense for linking but can be useful to display when someone is choosing an option.

See Linking Data to get more context and tips.

Encrypt

If Data At Rest Encryption is enabled, you will see the option to Encrypt columns. If this is checked, the column will be encrypted within the database. This is useful for hiding sensitive information so that people with direct access to the database do not see these fields.

There is no difference in user experience within the Cinchy platform; the data is still displayed in plain text in the UI and via the query APIs.

Regular Columns

Text

Text columns have a maximum length, set to 500 by default.

These are equivalent to VARCHAR(n) data type in SQL.

Number

You can choose from 3 display formats for number - regular, currency, and percentage. You can also decide how many decimal places to display (0 by default). Note that these are both display settings, and will not affect how the number is stored.

These are equivalent to FLOAT(53) data type in SQL (default, 8-byte field).

Date

There are several Date column type display format options available in Cinchy:

  • MMM DD, YYYY (e.g. Oct 31, 2016)

  • YY-MM-DD (e.g. 16-10-31)

  • DD-MM-YYYY (e.g. 31-10-2016)

  • DD-MMM-YY (e.g. 31-Oct-16)

  • Custom Format

Custom data formatting provides additional flexibility in how dates (and times) can be displayed within a Date column type:

Please Note: the "Default Value" field is not mandatory and should be left blank (best practice). However, if populated you will not be able to clear the default date value provided to a "blank" data (no date). You will only be able to overwrite it with another date value.

These are equivalent to DATE() data type in SQL.

Yes/No

You must select a default value of yes (1) or no (0) for yes/no fields.

These are equivalent to the bit data type in SQL.

Calculated

A calculated column is evaluated using other fields on the record. It will also have a result type - which is the form in which the calculated results will be stored.

For example, you can have a column [Full Name] that is CONCAT([First Name], ' ', [Last Name]).

These are equivalent to computed columns in SQL.

Choice

You can create a choice column (single or multi-select) in Cinchy. In this scenario, you specify all your choices (1 per newline) in the table design. A user will only be able to select from the options provided.

Geospatial Columns

If you created a spatial table, you will have access to the geography and geometry column types. These columns also have the option to be indexed via Index in the advanced settings on the column itself.

Geometry

In the UI, this takes a well-known text (WKT) representation of a geometry object. You can modify or paste the WKT representation directly in the editor on the UI. Geometric functions can be performed on this column through CQL and calculated columns.

Geography

In the UI, this takes a well-known text (WKT) representation of a geography object. You can modify or paste the WKT representation directly in the editor on the UI. Geographic functions can be performed on this column through CQL and calculated columns.

Link Columns

Link columns allow you to establish inherent relationships with other records in other tables. See Linking Data for more details.

Hierarchy Columns

Hierarchy columns are simply link columns referencing the current table. Some example uses of hierarchies:

  • Related Software Changes

  • Manager

Column Name

Description

Client Id

A unique identifier for each client.

Client Name

A friendly name for the client to help users maintaining this record.

Grant Type

The OAuth 2.0 flow that will be used during authentication. "Implicit" should be selected for API calls.

Permitted Login Redirect URLs

Add all URLs of an applet (separated by semicolons) that can initiate login

Permitted Logout Redirect URLs

Add all URLs of an applet (separated by semicolons) that can be used as a post-logout URL

Permitted Scopes

The list of permitted OAuth scopes; please check all available options.

Access Token Lifetime (seconds)

The time after which the token expires. If left blank, the default is 3600 seconds.

Show Cinchy Login Screen

Uncheck if you want to have SSO as default authentication and skip the Cinchy login screen

Enabled

This checkbox is used to enable or disable a client

Guid

This is a calculated field that will auto-generate the client secret

Column Name

Value

Domain

Select a domain for the applet to belong to.

Name

This is the name that will display for the applet in My Network

Full Name

This is a calculated field Domain.Name

Icon

Select a system icon for the applet, this will show in My Network.

Icon Colour

Select a system color for the icon above.

Description

Similar to table or query description. This field is viewable and searchable in My Network.

Target Window

When someone clicks the applet, this is the default way it will open.

Existing Window (Redirect) - this will redirect the user in the current window

Existing Window (Embedded) - this will open the applet embedded in Cinchy, the Cinchy header will be visible

New Window - the applet will open in a new window/tab.

Application Url

This is the URL where the applet resides.

Users

Users who can see this applet in the marketplace.

Groups

Groups who can see this applet in the marketplace.

Integrated Client

The integrated client for the applet.

Guid

This is a calculated field that is automatically generated for the applet.

https://<cinchy-sso-URL>/Saml2/Acs

LDAP Attribute -> Outgoing Claim Type

  • User-Principal-Name -> Name ID

  • SAM-Account-Name -> sub (sub will need to be typed manually; make sure it does not autocomplete to something else like subject)

  • Given-Name -> Given Name (necessary for Automatic User Creation)

  • Surname -> Surname (necessary for Automatic User Creation)

  • E-Mail-Address -> E-Mail Address (necessary for Automatic User Creation)

  • Is-Member-Of-DL -> Role (necessary for Automatic User Creation)

c:[Type == "http://schemas.microsoft.com/ws/2008/06/identity/claims/windowsaccountname", Issuer == "AD AUTHORITY"]
  => issue(store = "Active Directory",
           types = ("http://schemas.xmlsoap.org/ws/2005/05/identity/claims/nameidentifier",
                    "sub",
                    "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/givenname",
                    "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/surname",
                    "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/emailaddress",
                    "http://schemas.microsoft.com/ws/2008/06/identity/claims/role"),
           query = ";userPrincipalName,sAMAccountName,givenName,sn,mail,memberOf;{0}",
           param = c.Value);

Attribute

Value

CinchyLoginRedirectUri

https://<cinchy-sso-URL>/Account/LoginRedirect

CinchyPostLogoutRedirectUri

https://<Cinchy-Web-URL>

CertificatePath

<Path to cinchysso>\\cinchyidentitysrv.pfx

SAMLClientEntityId

Relying party identifier from Relying Party Trust above

SAMLIDPEntityId

http://<AD-Server>/adfs/services/trust

Your FederationMetadata.xml will have this near the beginning. Note that this is the entityID, not the ID.

SAMLMetadataXmlPath

<Path to cinchysso>\\FederationMetadata.xml

This is the location where you placed the FederationMetadata.xml in step 1.

SAMLSSOServiceURL

In Domain controller, in-service endpoints, look for type Saml 2, URL path: https://<AD-Server>/Saml2/Acs

Same as the login URL provided to the wizard in the ADFS Configuration

AcsURLModule

/Saml2

MaxRequestHeadersTotalSize

Integer

The number of bytes to set the max request header to. If the default (likely 32 KB) does not work, you may have to set this larger to accommodate a large number of groups (see Microsoft's IIS documentation on HTTP 400 error responses for details).

MaxRequestBufferSize

Integer

This should be equal to or larger than your header's total size above.

MaxRequestBodySize

Integer

If any of these values are -1 they will use the default. It is not necessary to change the body size.

{
  "AppSettings": {
    ...
    },
  "ExternalIdentityClaimSection": {
    "FirstName": {
      "ExternalClaimName": "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/givenname"
    },
    "LastName": {
      "ExternalClaimName": "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/surname"
    },
    "Email": {
      "ExternalClaimName": "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/emailaddress"
    },
    "MemberOf": {
      "ExternalClaimName": "http://schemas.microsoft.com/ws/2008/06/identity/claims/role"
    }
  }
}
<appSettings>
  ...
  <add key="UseHttps" value="true" />
  <add key="StsAuthorityUri" value="https://<your.cinchy.url>" />
  <add key="StsRedirectUri" value="https://<your.cinchysso.url>/Account/LoginRedirect" />
  ...
</appSettings>
This will be your SAMLClientEntityId in your appsettings.json
Control Panel > System and Security > Administrative Tools > AD FS Management
Enter the description in Design Table.
Description displayed on hover over column header.
Default Settings
Check off Index to create an index on a geospatial column

Installation

1. Create a Database

On your SQL Server 2012+ instance, create a new database named Cinchy (or any other name you prefer). If you choose an alternate name, in the remaining instructions wherever the database name is referenced, replace the word Cinchy with the name you chose.

A single user account with db_owner privileges is required for the Cinchy application to connect to the database. If you choose to use Windows Authentication instead of SQL Server Authentication, the account that is granted access must be the same account under which the IIS Application Pool runs.
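
If you are using SQL Server Authentication, a minimal sketch of this setup in T-SQL is shown below; the login name cinchy mirrors the sample connection string later in this guide, and the password is a placeholder:

-- create the database and a SQL login with db_owner access
CREATE DATABASE Cinchy;
GO
CREATE LOGIN cinchy WITH PASSWORD = '<strong password here>';
GO
USE Cinchy;
GO
CREATE USER cinchy FOR LOGIN cinchy;
GO
ALTER ROLE db_owner ADD MEMBER cinchy;
GO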

2. Create an IIS Application Pool

On the Windows Server machine, launch an instance of PowerShell as Administrator.

Run the below commands to create the application pool and set its properties.

Import-Module WebAdministration
$applicationPoolNameSSO="CinchySSO"
$applicationPoolNameWeb="CinchyWeb"
New-WebAppPool -Name $applicationPoolNameSSO
$appPath = "IIS:\AppPools\"+ $applicationPoolNameSSO
$appPool = Get-IISAppPool $applicationPoolNameSSO
$appPool.managedRuntimeVersion = ""
Set-ItemProperty -Path $appPath -Name managedRuntimeVersion $appPool.managedRuntimeVersion
Set-ItemProperty "IIS:\AppPools\$applicationPoolNameSSO" -Name Recycling.periodicRestart.time -Value 0.00:00:00
Set-ItemProperty "IIS:\AppPools\$applicationPoolNameSSO" -Name ProcessModel.idleTimeout -Value 1.05:00:00
New-WebAppPool -Name $applicationPoolNameWeb
Set-ItemProperty "IIS:\AppPools\$applicationPoolNameWeb" -Name Recycling.periodicRestart.time -Value 0.00:00:00
Set-ItemProperty "IIS:\AppPools\$applicationPoolNameWeb" -Name ProcessModel.idleTimeout -Value 1.05:00:00

If you chose to use Windows Authentication in the database or want to run the application under a different user account, execute the below commands to change the application pool identity.

$credentials = (Get-Credential -Message "Please enter the Login credentials including your Domain Name").GetNetworkCredential()
$userName = $credentials.Domain + '\' + $credentials.UserName
Set-ItemProperty "IIS:\AppPools\$applicationPoolNameWeb" -name processModel.identityType -Value SpecificUser
Set-ItemProperty "IIS:\AppPools\$applicationPoolNameWeb" -name processModel.userName -Value $username
Set-ItemProperty "IIS:\AppPools\$applicationPoolNameWeb" -name processModel.password -Value $credentials.Password
Set-ItemProperty "IIS:\AppPools\$applicationPoolNameSSO" -name processModel.identityType -Value SpecificUser
Set-ItemProperty "IIS:\AppPools\$applicationPoolNameSSO" -name processModel.userName -Value $username
Set-ItemProperty "IIS:\AppPools\$applicationPoolNameSSO" -name processModel.password -Value $credentials.Password

You may use alternate application pool names (i.e. instead of CinchyWeb and CinchySSO) if you prefer.

3. Create the Application Directories

  1. Unzip the application package on your C drive. This will create 2 directories, C:\Cinchy and C:\CinchySSO. Ensure your application pool accounts have read and execute access to these directories (the default accounts are IIS AppPool\CinchyWeb and IIS AppPool\CinchySSO).

  2. Run the below commands in the Administrator instance of PowerShell to create directories for the application logs. Ensure your application pool account has write access to these directories.

md C:\CinchyLogs\Cinchy
md C:\CinchyLogs\CinchySSO
md C:\CinchyErrors

4. Update the CinchySSO appsettings.json

Open the C:\CinchySSO\appsettings.json file in a text editor and update the values below.

Config Settings

App Settings

Under AppSettings section, update the values outlined in the table. Wherever you see <base url> in the value, replace this with the actual protocol (i.e. http or https) and the domain name (or ip address) you plan to use. e.g. if you're using https with the domain app.cinchy.co, <base url> should be replaced with https://app.cinchy.co

Key

Value

CinchyLoginRedirectUri

<base url>/Cinchy/Account/LoginRedirect

CinchyPostLogoutRedirectUri

<base url>/Cinchy

CertificatePath

C:\\CinchySSO\\cinchyidentitysrv.pfx

StsPublicOriginUri

Base URL used by the .well-known discovery. If left blank, it will match the request URL.

<base url>/cinchysso

IssuerUrl

The URL of the issuer. This value defaults to the StsPublicOriginUri and will be used as the issuer of tokens issued by CinchySSO.

<base url>/cinchysso

CinchyAccessTokenLifetime

Duration for the Cinchy Access Token. Timespan, defaults to "0.00:30:00"

4.18.0+ includes session expiration based on the CinchyAccessTokenLifetime. So for the default of "0.00:30:00", this means that if you have been inactive in Cinchy for 30 minutes, your session will expire and you will need to log in again.

The below values are only required for SSO; otherwise leave them blank.

Key

Value

SAMLClientEntityId

Client Entity Id

SAMLIDPEntityId

Identity Provider Entity Id

SAMLMetadataXmlPath

Identity Provider metadata XML file path

SAMLSSOServiceURL

Configure service endpoint for SAML authentication

AcsURLModule

This parameter needs to be configured as per your SAML ACS URL. For example, if your ACS URL looks like this - "https:///CinchySSO/identity/AuthServices/Acs", then the value of this parameter should be "/identity/AuthServices"

Connection String

Under the "ConnectionStrings" section you'll see

"SqlServer" : ""

The "SqlServer" value needs to be set for the application to connect to the database. If you're using SQL Server Authentication you can use the below as a reference and update the Server, User Id, and Password properties. If you chose a different database name earlier, you'll need to update that as well.

"SqlServer" : "Server=MyServer;Database=Cinchy;User ID=cinchy;Password=password;Trusted_Connection=False;Connection Timeout=30;Min Pool Size=10;"

If you're using Windows Authentication, then use the below as a reference and update the Server property (and Database if required).

"SqlServer" : "Server=MyServer;Database=Cinchy;Trusted_Connection=True;Connection Timeout=30;Min Pool Size=10;"

External Identity Claim Section

Under the "ExternalIdentityClaimSection" section you'll see, these values are used for SAML SSO. If you are not using SSO, keep these values as blank

Key

Value

ExternalIdentityClaim -> FirstName -> ExternalClaimName

ExternalIdentityClaim -> LastName -> ExternalClaimName

ExternalIdentityClaim -> Email -> ExternalClaimName

ExternalIdentityClaim -> MemberOf -> ExternalClaimName

Log4net.config and Web.config

The log folder must be configured in the log4net.config and web.config files. Please make sure the identity under which the application pool is running has access to the log and certificate folders as configured.

Under the log4net.config, you'll see a RollingLogFileAppender section, and within that you need to update the value of <file> tag as below

<file value="C:\CinchyLogs\CinchySSO\log.txt" />

Under web.config, update "stdoutLogFile" value to "C:\CinchyLogs\CinchySSO\stdout" under "aspNetCore" tag. Also, update the value of "ASPNETCORE_ENVIRONMENT" to "Production".

5. Update the Cinchy Web.Config

Open the C:\Cinchy\Web.config file in a text editor and update the sections outlined below.

Connection String

Under the <connectionStrings> section you'll see

<add name="SqlServer" connectionString="" />

Replace this with the same connection string value you set in the C:\CinchySSO\appsettings.json file.

Application Settings

Under the <appSettings> section, update the values outlined in the table. Wherever you see <base url> in the value, replace this with the actual protocol (i.e. http or https) and the domain name (or ip address) you plan to use. e.g. if you're using https with the domain app.cinchy.co, <base url> should be replaced with https://app.cinchy.co

Key

Value

SSOLogPath

C:\CinchyLogs\CinchySSO\log.txt

UseHttps

true or false (based on whether you are using https in your base url)

StsAuthorityUri

Should match the StsPublicOriginUri value specified in the SSO appsettings above.

<base url>/cinchysso

StsRedirectUri

<base url>/Cinchy/Account/LoginRedirect

For StsAuthorityUri - please make sure the site name and cinchysso are in lower case. The same URL will be used for the applet's authority config.

Log File Path

Under the <log4net> section you'll see a RollingLogFileAppender, and within that is the following line

<file value="logs\log.txt" />

Replace the value attribute with the target log file location:

<file value="C:\CinchyLogs\Cinchy\log.txt" />

Under the <elmah> section you'll see

<errorLog type="Elmah.XmlFileErrorLog, Elmah" logPath="~/App_Data" />

Replace the logPath attribute with the target error log location:

<errorLog type="Elmah.XmlFileErrorLog, Elmah" logPath="C:\CinchyErrors" />

6. Create the IIS Applications

In the Administrator instance of PowerShell, execute the below commands to create the IIS applications and enable anonymous authentication (required to allow authentication to be handled by the application).

New-WebApplication -Name Cinchy -Site 'Default Web Site' -PhysicalPath C:\Cinchy -ApplicationPool CinchyWeb
New-WebApplication -Name CinchySSO -Site 'Default Web Site' -PhysicalPath C:\CinchySSO -ApplicationPool CinchySSO
Set-WebConfigurationProperty -Filter "/system.webServer/security/authentication/anonymousAuthentication" -Name Enabled -Value True -PSPath IIS:\ -Location "Default Web Site"

To enable HTTPS, the server certificate must be loaded and the standard IIS configuration completed at the Web Site level to add the binding.

7. Test the Application

Access the <base url>/Cinchy (e.g. http://app.cinchy.co/Cinchy) through Google Chrome. The login screen should appear. The default username is admin and the password is cinchy. You will be prompted to change your password the first time you log in.

To avoid users having to access the application at a URL that contains /Cinchy, you can use a downloadable IIS extension called URL Rewrite to remap requests hitting the <base url> to <base url>/Cinchy. The extension is available here.


Migration from 1.x

Migration prerequisites when upgrading to Cinchy 2.x and later versions from 1.x

Pre-Deployment Steps

  • As "Integrated Apps" table will get modified to "Applets" table after 2.0 upgrade, please save "Integrated Apps" table data by running a select statement on this table. This data will be required to populate new table named "Integrated Clients" after deployment.

  • Please run the below UPDATE statement to change "Integrated App" to "Applet" in the "Launcher Objects" table

Pre-Deployment CQL Script
UPDATE [Cinchy].[Cinchy].[Launcher Objects]
SET [Type] = 'Applet'
WHERE [Type] = 'Integrated App'
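
For the first pre-deployment step above, a select along the following lines can be used to capture the existing Integrated Apps data (a sketch; adjust the column list as needed for your environment):

-- capture the existing Integrated Apps data before the upgrade
SELECT *
FROM [Cinchy].[Cinchy].[Integrated Apps]
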
  • Install .net core Hosting bundle Version 2.1 - https://www.microsoft.com/net/download/dotnet-core/2.1

  • Install .Net Framework 4.7.2 on the server

  • Create a new application pool in IIS Manager configured with "No Managed Code" and "Integrated" pipeline mode.

  • Take backups of the Cinchy DB, Cinchy Web, and Cinchy SSO

Deployment

  • Deploy binaries

  • Assign newly created application pool in Pre-Deployment steps to the Cinchy SSO application

  • Configure the appsettings.json file under Cinchy SSO as described below; most of these configurations can be taken from the web.config of the previous version of Cinchy SSO

Web.config (Previous location)

appsettings.json (New Location)

appsettings -> CinchyLoginRedirectUri

AppSettings -> CinchyLoginRedirectUri

appsettings -> CinchyPostLogoutRedirectUri

AppSettings -> CinchyPostLogoutRedirectUri

appsettings -> CertPath

AppSettings -> CertificatePath

connectionStrings -> SqlServer

ConnectionStrings -> SqlServer

The below configurations are only required for external login authentication; otherwise they can be left blank.

Web.config (Previous location)

appsettings.json (New Location)

appsettings -> SAMLClientEntityId

AppSettings -> SAMLClientEntityId

appsettings -> SAMLIDPEntityId

AppSettings -> SAMLIDPEntityId

appsettings -> SAMLMetadataXmlPath

AppSettings -> SAMLMetadataXmlPath

ExternalIdentityClaimSection -> FirstName -> ExternalClaimName

ExternalIdentityClaim -> FirstName -> ExternalClaimName

ExternalIdentityClaimSection -> LastName -> ExternalClaimName

ExternalIdentityClaim -> LastName -> ExternalClaimName

ExternalIdentityClaimSection -> Email -> ExternalClaimName

ExternalIdentityClaim -> Email -> ExternalClaimName

ExternalIdentityClaimSection -> MemberOf -> ExternalClaimName

ExternalIdentityClaim -> MemberOf -> ExternalClaimName

  • SAMLSSOServiceURL - Configure service endpoint for SAML authentication.

  • AcsURLModule - This parameter needs to be configured as per your SAML ACS URL. For example, if your ACS URL looks like this - "https:///CinchySSO/identity/AuthServices/Acs", then the value of this parameter should be "/identity/AuthServices"

  • In Cinchy SSO, the log folder needs to be configured in the log4net.config and web.config files. Please make sure that the identity under which the application pool runs has access to the configured logs and certificate folders.

  • In the Cinchy Web application web.config file, modify the "StsAuthorityUri" parameter to remove the "identity" keyword from the URL.

    • The URL will change from "https://<your server URL>/SiteName/CinchySSO/identity" to "https://<your server URL>/sitename/cinchysso"

    • Please make sure sitename and cinchysso are in lower case

Post Deployment Steps

  • Log in to Cinchy and follow the steps below to update the Launcher Objects table:

    • Go to Design Table

    • Update "Integrated App" link column name to "Applet".

    • Update "Type" choice column's choice from "Integrated App" to "Applet"

  • Populate the "Integrated Clients" table with the data taken from the "Integrated Apps" table in the pre-deployment steps, mapping the columns as shown below

Integrated Apps (Old Table) → Integrated Clients (New Table)

  • Id → Client Id

  • Login Redirect Url → Permitted Login Redirect URLs

  • Logout Redirect Url → Permitted Logout Redirect URLs

  • Please check off Query permissions under Design controls of "Users" table for "All Users" row.

  • Upload your organization logo on Admin screen - https://<your server URL>/Cinchy/Admin/Index


Custom Data Network Visualizer

Instructions on how to set up your own custom data network visualization.

Nodes

The nodes query defines the nodes in the network.

Edges

The edges query defines the relationships between the nodes.

Node Groups

Node groups are an optional query you can provide to group your nodes.

Timeline

If no start or end date is specified, the data network is just shown as is. If there's a start or end date, the other CQLs need to have a @date parameter and that will be used to render the data network at a point in time.

You can use @date between [Modified] and [Replaced] with a version history query to see data at a point in time. You can also simply use @date > [Created] if it's an additive system.

Timeline Start Date

This CQL should return a date value as 'startDate'.

Timeline End Date

This CQL should return a date value as 'endDate'.

Slicers

To use slicers, you need to define the slicers in the [Slicers] column and add the additional attributes to the nodes query.

Attribute is the column name from the nodes query, displayName is what shows up in the visualizer.

System Tables

All the information above is entered into the [Cinchy].[Networks] table. To access the network, go to

<Cinchy URL>/Cinchy/apps/datanetworkvisualizer?network=<NAME>

Alternatively you can go to My Data Network and then add ?network=<NAME> to the end of it.

It is highly recommended to add a new applet for each custom data network visualizer for ease of access.

Example Network

For ease of testing, save the Node Groups, Nodes, and Edges CQL (sample queries are provided further down this page) as saved queries, and then in the Networks table simply add exec [Domain].[Saved Query Name] as the CQLs.

Authentication

Authentication Methods

The APIs in Cinchy use bearer token based authentication. This token is issued by the Cinchy SSO using the OAuth 2.0 Resource Owner Password Flow and can be retrieved for any Cinchy User Account or SSO Account. API calls made using a bearer token will run under the privileges of the authenticated user, and are driven by the configured data level access controls. You must include the token in each request in the Authorization header.

APIs that are dynamically generated through a Saved Query in Cinchy also allow for basic authentication. In this case, the URL to the saved query is different; it will be:

https://<Cinchy Web URL>/BasicAuthAPI/MyDomain/MyQuery
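
As a hedged illustration (the base URL, domain, query name, and credentials below are placeholders rather than values from this documentation), such an endpoint could be called from PowerShell with basic authentication as follows:

# Hypothetical example: call a saved query exposed via the BasicAuthAPI endpoint.
# Replace the URL, domain, query name, and credentials with your own values.
$baseUrl  = "https://<Cinchy Web URL>"
$username = "admin"
$password = "cinchy"

# Build the Basic authentication header from the username and password.
$pair    = "{0}:{1}" -f $username, $password
$basic   = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes($pair))
$headers = @{ Authorization = "Basic $basic" }

# The shape of the response depends on the saved query's API Result Format (e.g. JSON).
Invoke-RestMethod -Uri "$baseUrl/BasicAuthAPI/MyDomain/MyQuery" -Headers $headers -Method Get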

The Resource Owner Password Flow uses a combination of a client id, client secret, username, and password to authenticate both the calling application and the user. To get started, you must register a client in Cinchy. You should use a different client id for each calling application to distinguish activity from each source.

Registering a New Client

Clients are managed in the Integrated Clients table within the Cinchy domain. To register a client, create a new record in this table. In a fresh install, only members of the Cinchy Administrators group will have access to perform this function.

Below is a description of the value that should be used for each column in the Integrated Clients table.

API Method Table

POST: Bearer Token Request

https://<Cinchy SSO URL>/identity/connect/token

The Post Request will return an access token which can be used to access Cinchy APIs.

The header parameters, body parameters, and example responses for this request are listed below.

Responses:

  • 200: The request is successful

  • 400: For invalid parameters, a 400 error will be returned with a JSON response describing the error (see the example error response below).

To get a bearer token from Cinchy, you can provide either:

  • Username and password (username, password), or

  • SAML token (token)

Failure to provide a valid set of one of the above will not return a token.

Integrated Clients table columns:

  • Client Id - A unique identifier for each client. The client will use this identifier when retrieving a bearer token.

  • Client Name - A friendly name for the client to help users maintaining this record.

  • Grant Type - The OAuth 2.0 flow that will be used during authentication. "Resource Owner Password" should be selected for API calls.

  • Permitted Login Redirect URLs - N/A for the Resource Owner Password flow; leave this blank.

  • Permitted Logout Redirect URLs - N/A for the Resource Owner Password flow; leave this blank.

  • Permitted Scopes - The list of permitted OAuth scopes; please check all available options.

  • Access Token Lifetime (seconds) - The time after which the token expires. If left blank, the default is 3600 seconds.

  • Show Cinchy Login Screen - N/A for the Resource Owner Password flow.

  • Enabled - This is used to enable or disable a client.

  • Guid - This is a calculated field that will auto-generate the client secret.

Header Parameters:

  • Content-Type (string) - application/x-www-form-urlencoded

Body Parameters:

  • token (string) - You can pass in your base64 encoded SAML token instead of your Cinchy username and password.

  • client_id (string) - Client Id value from the Integrated Clients table.

  • client_secret (string) - Guid value from the Integrated Clients table.

  • username (string) - Username of the Cinchy user.

  • password (string) - Password for the Cinchy user in plain text.

  • grant_type (string) - Set as "password" for username/password authentication. Set as "saml2" for SAML token authentication.

  • scope (string) - Set as "js_api".

Example 200 response:

{
    "access_token": "eyUzI1NiIsImtpZCI6IkE4M0UwQTFEQTY1MzE0NkZENUQxOTFDMzRDNTQ0RDJDODYyMzMzMzkiLCJ0eXAiOiJKV1QiLCJ4NXQiOiJxRDRLSGFaVEZHX1YwWkhEVEZSTkxJWWpNemsifQ.eyJuYmYiOjE1NTQxMzE4MjAsImV4cCI6MTU1NDEzNTQyMCwiaXNzIjoiaHR0cDovL2xvY2FsaG9zdDo4MDgxIiwiYXVkIjpbImh0dHA6Ly9sb2NhbGhvc3Q6ODA4MS9yZXNvdXJjZXMiLCJqc19hcGkiXSwiY2xpZW50X2lkIjoiYWJjIiwic3ViIjoiMSIsImF1dGhfdGltZSI6MTU1NDEzMTgyMCwiaWRwIjoibG9jYWwiLCJwcm9maWxlIjoiQWRtaW5pc3RyYXRvciIsImVtYWlsIjoiYWRtaW5AY2luY2h5LmNvIiwicm9sZSI6IkNpbmNoeSBVc2VyIEFjY291bnQiLCJpZCI6ImFkbWluIiwic2NvcGUiOlsianNfYXBpIl0sImFtciI6WyJjdXN0b20iXX0.N7drAlvtFiQoN4njs1rd5ZnTvJ_x8ZEnUEi6G1GjR4FS5FyS4hC6xdsT-Zhn1yRJQMkI2HA7HMPWwjsfkZ0IlBwuC25ECkGhbjv7DlK6baHQIkqeB0aTB9aDZSxWfDhV66O0dhby6EIEa4YuGspyjQMsDpx_LimmE9alfsUU-608944ZZkS6lBJlJ9LFCC5hYKARQIMZavrftz0tFUBsDU0T2fHpLNGo5GGwG1f9jUZTWTu7s3C05EsgboW3scUfDzjS_Wf55ExwhopIg9SD6ktHYYNRaCPtfMhU-e43l6a2LH-XrmP7OfoxJP2bvTMcvQCQWUEizKHuxKLl-ehWBw",
    "expires_in": 3600,
    "token_type": "Bearer"
}
Example 400 response:

{
  "error": "invalid_grant",
  "error_description": "Invalid username or password"
}
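
As a sketch of the request above (the SSO URL, client id, client secret, and user credentials are placeholders, and the follow-up call assumes a saved query API endpoint of the form /API/<Domain>/<Query>), the token can be requested and then used from PowerShell like this:

# Hypothetical example: request a bearer token via the Resource Owner Password flow.
$ssoUrl = "https://<Cinchy SSO URL>"

$body = @{
    client_id     = "<client id>"       # Client Id from the Integrated Clients table
    client_secret = "<client secret>"   # Guid from the Integrated Clients table
    username      = "<username>"        # Cinchy user
    password      = "<password>"        # Cinchy user's password
    grant_type    = "password"          # use "saml2" together with a token parameter for SAML
    scope         = "js_api"
}

# The body is sent as application/x-www-form-urlencoded.
$tokenResponse = Invoke-RestMethod -Uri "$ssoUrl/identity/connect/token" `
                                   -Method Post `
                                   -ContentType "application/x-www-form-urlencoded" `
                                   -Body $body

# Use the access token in the Authorization header of subsequent API calls
# (the endpoint below is a placeholder for whichever Cinchy API you are calling).
$headers = @{ Authorization = "Bearer " + $tokenResponse.access_token }
Invoke-RestMethod -Uri "https://<Cinchy Web URL>/API/MyDomain/MyQuery" -Headers $headers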

Nodes query parameters:

  • id - Id for the node. This will be used by the edges to define the relationships.

  • title - This is the text that is displayed when hovering on a node.

  • label - The label that is shown below the node.

  • value - The visual size of the node relative to other nodes.

  • mass - The gravitational pull of a node. Unless you really want to customize the visualizer, it is recommended to keep this the same value as the value.

  • group - Optionally you can associate a node with a group.

  • color - Optional hex code for the color of a node. The node will take the color of the group if a color is not specified for the node.

  • description - The description shows up in the top right hand corner when you click a node.

  • nodeURL - Page to display when you click the open button next to the description.

Edges query parameters:

  • id - Id for the edge.

  • label - Label that shows up on the edge.

  • from - Originating node id.

  • to - Target node id. It can be the same as the from node; this will show a loop back into the same node.

  • showArrowTo - Set this to True if you want to show the direction of the relationship.

  • showArrowFrom - Generally should only be used for bi-directional relationships along with the arrow to.

Node groups query parameters:

  • sub network - Name for the group.

  • color - Hex value for the color of the group.

Example [Slicers] column value:

[
    {
        "attribute": "slice1",
        "displayName": "First Slicer"
    },
    {
        "attribute": "slice2",
        "displayName": "Second Slicer"
    }
]
-- Sample Data
SELECT
  [Sub Network] = 'Blue Network'
, [Color] = '#03a9f4'
INTO #TEMP
UNION SELECT
  [Sub Network] = 'Green Network'
, [Color] = '#8bc34a'
-- Node Groups CQL
SELECT
  t.[Sub Network] as 'sub network'
, t.[Color] as 'color'
FROM #TEMP t
-- Sample Data
SELECT
   [Id] = 'N01'
,  [Title] = 'Hover text on the node.'
,  [Label] = 'Purple Node 1 (Ax)'
,  [Value] = 5
,  [Mass] = 5
-- optional
,  [Group] = 'Green Network'
,  [Color] = '#ab47bc'
,  [Description] = 'This description shows up in the top right hand corner when you select a node.'
,  [Node URL] = 'https://www.cinchy.com'
,  [Slicer 1] = 'A'
,  [Slicer 2] = 'x'
INTO #TEMP
UNION SELECT
   [Id] = 'N02'
,  [Title] = 'Hover text on node 2.'
,  [Label] = 'Node 2 (Ay)'
,  [Value] = 10
,  [Mass] = 10
-- optional
,  [Group] = 'Green Network'
,  [Color] = ''
,  [Description] = 'This description shows up in the top right hand corner when you select a node.'
,  [Node URL] = ''
,  [Slicer 1] = 'A'
,  [Slicer 2] = 'y'
UNION SELECT
   [Id] = 'N03'
,  [Title] = 'Hover text on node 3.'
,  [Label] = 'Node 3 (Bx)'
,  [Value] = 1
,  [Mass] = 1
-- optional
,  [Group] = 'Blue Network'
,  [Color] = ''
,  [Description] = ''
,  [Node URL] = ''
,  [Slicer 1] = 'B'
,  [Slicer 2] = 'x'
UNION SELECT
   [Id] = 'N04'
,  [Title] = 'Hover text on node 4.'
,  [Label] = 'Minimum Node 4 (y)'
,  [Value] = 3
,  [Mass] = 3
-- optional
,  [Group] = ''
,  [Color] = ''
,  [Description] = ''
,  [Node URL] = ''
,  [Slicer 1] = ''
,  [Slicer 2] = 'y'
UNION SELECT
   [Id] = 'N05'
,  [Title] = 'Hover text on node 4.'
,  [Label] = 'Orphan Node 5 (By)'
,  [Value] = 2
,  [Mass] = 2
-- optional
,  [Group] = ''
,  [Color] = ''
,  [Description] = ''
,  [Node URL] = ''
,  [Slicer 1] = 'B'
,  [Slicer 2] = 'y'
-- Nodes CQL
SELECT
   t.[Id] as 'id'
,  t.[Title] as 'title'
,  t.[Label] as 'label'
,  t.[Value] as 'value'
,  t.[Mass] as 'mass'
,  t.[Group] as 'group'
,  t.[Color] as 'color'
,  t.[Description] as 'description'
,  t.[Node URL] as 'nodeURL'
,  t.[Slicer 1] as 'slice1'
,  t.[Slicer 2] as 'slice2'
FROM #TEMP t
-- Sample Data
SELECT
  [Id] = 'E01'
, [Label] = 'Node 1 to Node 2, Double Arrows'
, [From] = 'N01'
, [To] = 'N02'
-- optional
, [Show Arrow To] = 'True'
, [Show Arrow From] = 'True'
INTO #TEMP
UNION SELECT
  [Id] = 'E02'
, [Label] = 'Node 2 to Node 3, To Arrow'
, [From] = 'N02'
, [To] = 'N03'
-- optional
, [Show Arrow To] = 'True'
, [Show Arrow From] = 'False'
UNION SELECT
  [Id] = 'E03'
, [Label] = 'Node 1 to Node 4, From Arrow'
, [From] = 'N01'
, [To] = 'N04'
-- optional
, [Show Arrow To] = ''
, [Show Arrow From] = 'True'
UNION SELECT
  [Id] = 'E04'
, [Label] = 'Node 3 to Node 3, No Arrow'
, [From] = 'N03'
, [To] = 'N03'
-- optional
, [Show Arrow To] = ''
, [Show Arrow From] = ''

-- Edges CQL
SELECT
   t.[Id] as 'id'
,  t.[Label] as 'label'
,  t.[From] as 'from'
,  t.[To] as 'to'
,  t.[Show Arrow To] as 'showArrowTo'
,  t.[Show Arrow From] as 'showArrowFrom'
FROM #TEMP t
Not all nodes need to be in a sub network

Frequently Asked Questions

How do I know which version of Cinchy I am on?

Navigate to <baseURL>/healthcheck

(ex. if your current URL is https://cinchy.mycompany.com/Tables/123?viewId=0 then you would navigate to https://cinchy.mycompany.com/healthcheck)

The response looks like this:

{ 
  "component": "Cinchy", 
  "version": "4.14.0.0", 
  "ipAddress": [ 
      "172.31.14.171", 
      "172.19.64.1" 
  ], 
  "systemTime": "2020-06-18T19:43:54.1692859Z", 
  "status": "Green", 
  "healthChecks": [ 
    { 
      "name": "Database Connectivity", 
      "description": "Validates that the application can connect to the database",
      "status": "Green" 
   } 
 ] 
}

In this case your Cinchy version is 4.14.0.0

If you would like to use the healthcheck link for monitoring of the Cinchy application, you can add ?return503OnFailure=true to the URL.
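
As a minimal monitoring sketch (the base URL is a placeholder), the version and status can be read from the healthcheck response, and the return503OnFailure flag can be used to detect failures by HTTP status code:

# Hypothetical example: read the healthcheck response and monitor for failures.
$baseUrl = "https://cinchy.mycompany.com"

# The healthcheck JSON includes the component version and overall status.
$health = Invoke-RestMethod -Uri "$baseUrl/healthcheck"
Write-Host "Cinchy version:" $health.version "| status:" $health.status

# With ?return503OnFailure=true a failing health check returns HTTP 503,
# which Invoke-WebRequest raises as an exception.
try {
    Invoke-WebRequest -Uri "$baseUrl/healthcheck?return503OnFailure=true" -UseBasicParsing | Out-Null
    Write-Host "Health check passed"
}
catch {
    Write-Host "Health check failed:" $_.Exception.Message
}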

How can I enter a new line into a field in Manage Data?

You can add line breaks in a cell on the UI, the same way as in Excel, by typing Alt+Enter. If you use the expanded row heights option, or manually expand the row, it will show the line breaks.

How can I automatically check if a CLI data sync was successful or failed?

You can check if a data sync was successful by its exit code. Below is sample PowerShell code to check the exit code and what each code means.

Invoke-Expression $CLICommand
switch ($LASTEXITCODE) {
  0 { Write-Host "Completed without errors" }
  1 { Write-Host "Execution failed" }
  2 { Write-Host "Completed with validation errors" }
}

From the command prompt the following will also return the error code:

echo %ErrorLevel%

My Insert/Update statement is making multiple changes instead of 1

A query like the following will cause multiple inserts instead of one if your result type is set to Query Results instead of # of Rows Affected.

INSERT INTO [Customer].[Tickets] ( [Ticket Id], [Subject] )
VALUES ( 1900, 'This is a Test' );
SELECT [Cinchy Id],
       [Ticket Id],
       [Subject]
FROM [Customer].[Tickets]
WHERE [Deleted] IS NULL

The same applies to UPDATE statements.

If you need to perform inserts and updates in a query and want to return data at the end, another option is to use the "Single value (First Column of First Row)" return type, which will only be able to return a single value.

Does a row filter restrict access for a Cinchy administrator?

Currently Cinchy Administrators have access to view/edit/approve all data in the platform. There is no way currently to restrict access for Cinchy administrators.

A workaround is to create a separate "administrators" group which has edit access to all Cinchy system tables, and just leave the "admin" user account or superadmins as "Cinchy administrators."

My query parameter is not working

When I pass a value to the following query, the result is empty.

DECLARE @nbdays AS INT;
SELECT @nbdays;

The query works without the DECLARE statement. When the DECLARE statement is present, the input parameter is ignored and the variable needs to be SET. In order to still take the value from the input, a second variable is needed.

DECLARE @nbdays AS INT;
SET @nbdays = @inputDays;
SELECT @nbdays;

I cannot disable Change Approval

This is caused by records in Draft status. To retrieve these records, run a query with return type Query Results (Including Draft Data).

SELECT T.*
FROM [Your Domain].[Your Table] T
WHERE T.[Approval State] <> 'Approved'

After approving these records, you will be able to disable change approval.

You may have to restore cancelled records, approve them, and delete them so that everything is approved.

How do I get the change history through CQL?

You write the query for the records for which you want the change history, including system columns like [Version], [Created], ... and the columns for which you would like to see the changes.

You can add an ORDER BY [Version] (either ASC or DESC).

Then change the query return type to "Query Results (Including Version History)".

The following query will show when the Cinchy instances were upgraded.

SELECT [Version], [Modified], [Model Version]
FROM [Cinchy].[Models]
WHERE [Deleted] IS NULL AND [Name]='Cinchy'
ORDER BY [Version] DESC

Some of the columns have been re-arranged under the default "All Data" view

The default All Data view displays the columns in the same order as in Design Table. But you can create a view and change the columns displayed and their order.

Can I get a record count from a delimited file before running the CLI

You can use PowerShell to count the lines in a delimited file and based on the result decide if you will run the CLI.

$row_count=(get-content sample_150k.csv).length
write-host $row_count

If ($row_count -lt 50000)
{
     exit
}
else {
      Write-host "run CLI"
}

There is currently no way to find out how many records would be inserted, updated, or deleted by a CLI sync without actually performing the sync.

How to prevent wrong data from being loaded from external applications

The best way to load data from external sources into Cinchy is by using Cinchy's data sync.

You can do the following to preview your changes:

  • Create staging tables to validate the data first.

  • Use formatting rules in Cinchy, to highlight data that is not valid.

  • Configure a CLI using a Cinchy Query source to move the data from the staging tables to the permanent tables.

Multi-select option on Link column is disabled

Once a link column is added to a table and saved, the multi-select checkbox is disabled. If you need to change the option, rename the existing column and create a new link column with the correct option.

How to remove the leading 0 from an incoming field using the CLI

This can be done by using Transformations in the sync configuration of a column. Here is an example:

<Column name="Value 2" dataType="Text" >
      <Transformations>
            <StringReplacement pattern="^0*" replacement="" />
      </Transformations>
</Column>

The pattern contains a regular expression:

^ - anchor for the beginning of the string

0 - the string to replace

* - quantifier to be applied to 0 or more occurrences

What permissions are needed for a user to be able to create and edit views?

The user needs to have "Design Table" permissions granted for the table where they will create or edit views, and also needs to have "Can Design Tables" checked in the [Cinchy].[Users] table.

Is it possible to correct/replace a table or column's GUID?

It can be done. It is very unlikely that the GUID you want to change to is already allocated, but you should still check. Filter the [Cinchy].[Table Columns] for the new GUID. You should not find it. Then replace it in two places:

  • the json field in [Cinchy].[Tables] - replace it in the column definition

  • the GUID field in [Cinchy].[Table Columns]

To replace the table GUID, replace it in the json in [Cinchy].[Tables] and in the GUID field in [Cinchy].[Tables].

When you are done, restart the Cinchy UI. ​

Cloning a Cinchy table

  1. Export the Model to XML from the Design Table info tab

  2. Open the exported model in an editor and change the name of the model

  3. Change the name of the table

  4. Remove the GUIDs from the table in the model and save the file

  5. Use the modelloader at <cinchy base URL>/apps/modelloader to upload the modified model

  6. Export the data from the Manage Data screen of the initial table and import it in the new table

Cloning a domain

If you just have a group of tables, see the instructions below. If you have tables and queries, want to port the permissions, etc., you can use this: Data Experience Deployment - Cinchy Platform Documentation

Table only instructions:

1. Create a dummy Data Experience and add all your tables from your domain to it.

2. Hit this endpoint with the GUID in your row

<CinchyURL>/api/createdxversion?guid=<GUID>
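
As a hedged sketch only (the base URL, GUID, bearer token, and output file name below are placeholders, and it is assumed the endpoint accepts a bearer token and returns the generated model in the response body), this endpoint could be called from PowerShell like this:

# Hypothetical example only - adjust to your environment.
$baseUrl = "https://<Cinchy URL>"
$guid    = "<GUID>"              # GUID from your Data Experience row
$token   = "<bearer token>"      # see the Authentication section of this documentation

# Assumption: the response body is the generated model, saved locally so it can
# be uploaded again through the model loader (/apps/modelloader).
Invoke-WebRequest -Uri "$baseUrl/api/createdxversion?guid=$guid" `
                  -Headers @{ Authorization = "Bearer $token" } `
                  -UseBasicParsing `
                  -OutFile "dx-model.xml"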

3. Use the model loader to load it back in the system (/apps/modelloader).

4. For each table, export and import the data via the UI.

How do I map a parameter's value to one of my target columns?

You create a calculated column in the source and give it the value of the parameter.

<Parameters>
    <Parameter name="snapshotDate" />
</Parameters>
...
<Schema>
...
    <CalculatedColumn name="Snapshot Date" formula="@snapshotDate" dataType="Date" />
</Schema>

Then map the calculated source column to the target. The order of the columns in the source is important. If your source is a file, put the calculated columns at the end in the source, after all the actual columns in the file.

How do I insert, update and delete links in a multi-select link field using CQL?

Removing and updating a multi-select link is the same as setting the link field. The field needs to be updated with the full list of values.

The value is a concatenated string of '[Cinchy Id],[Version],[Cinchy Id],[Version],[Cinchy Id],[Version]' from the lookup values.

UPDATE T
SET T.[Multi-Link Field Name] = '1,1,2,1,3,1'
FROM [Domain].[Table Name] T
WHERE T.[Deleted] IS NULL AND ...

In this example it would set [Multi-Link Field Name] to values with [Cinchy Id] 1, 2, and 3. The version after each Cinchy Id should be 1.

You must provide the full list of multi-select values. If your field was '1,2' and you update it with '3,1' it will end up as '3', not '1,2,3'.

Null values are not updating correctly in Salesforce using the Data Sync

When performing a data sync with a Salesforce target, you need to replace nulls with '#N/A' in the source. You can use ISNULL([Column],'#N/A') in the source query. The following is a link to the Salesforce documentation related to this topic:

https://help.salesforce.com/articleView?id=000328822&language=en_US&type=1&mode=1

I cannot find the column I want to link to in a link column, but the column exists in the table.


Columns do not "Allow Linking" by default. Check the properties of the column in the original table and make sure that in “Show Advanced” the “Allow Linking” checkmark box is selected. If you do not have Design Table access to that table, you will need to ask someone who does to do it.

How can I see who has modified my data?

Right-click on the row for which you want additional information and select Collaboration Log.

You can also add the "Modified By" and "Modified" columns into the current view/to your query if you want to see it for multiple rows at once.

How do I create a Cinchy user with a set password?

One Time set up:

  1. Open the Users table

  2. Create a user called "defaultuser" (if it does not already exist). For the password of this user, copy the admin user's password and paste it into the Password field of "defaultuser".

  3. Set the Password Expiration Timestamp to today

  4. In an Incognito browser, navigate to the Cinchy website

  5. Sign in as defaultuser with the admin user password

  6. Cinchy will ask you to change the password for defaultuser, change it to a default password you will give out every time you create an account.

  7. In the original session window, refresh the Users table and remove the Password Expiration Timestamp for defaultuser

Each time, for new users:

  1. Open the Users table

  2. Create the new user, for example "sandip"

  3. For the password of this user, copy the "defaultuser" password and paste it into the Password field of "sandip".

  4. Set the Password Expiration Timestamp to today

  5. Give the user their username and the default password created in step 6 of the one-time setup above. They will be asked to change their password on first sign in.

Can I restore my deleted data?

If the record is still in the table, but has been overwritten by mistake, access your Collaboration Log for the row, and restore back to the correct version.

If your row has been deleted by mistake, access your Recycling Bin, locate the row and restore it.

The only way to truly delete data on the platform is through Data Erasure and Data Compression.

I have access to the table but I cannot see any rows

There are a couple reasons why you may not be able to see any rows:

  • View Filter

  • Data access controls

  • Error with the View or Table

View Filter

Check the All Data view and see if there is data there. If there is data in All Data but a particular view has no rows, there could be a filter on the view. For example, if there is a "Due Soon" or "My Actions" view, it could just be that there are no records assigned to you that require actioning.

Data Access Controls

Access controls set on the table could cause you to have access to 0 records. Since you are able to set row-level filters in Cinchy, it may be the case that the permissions of the table have not changed, but the data has changed such that you no longer have access to it (or vice versa).

Error

There may be an error on the view. If the bottom of the page does not show 0 records then there may be an error on the page.

I cannot find the [Cinchy].[Table Access Control] table

The [Cinchy].[Table Access Control] table does not show in the Market Place, but you can query for the data in the table.

SELECT *
FROM [Cinchy].[Table Access Control]
WHERE [Deleted] IS NULL AND [Table]='HR.Employees'

How do I parse a pipe delimited file using the CLI?

Set the delimiter to "|".

<DelimitedDataSource delimiter="|" textQualifier="&quot;"  headerRowsToIgnore="2" path="@filePath" encoding="UTF8">

Cinchy dates are being saved as 1900-01-01 when updating using a variable

When updating a date field using a variable, and no value is entered for that variable, the date field will be 1900-01-01. To avoid this, use a case statement to replace the empty string with NULL, as shown in the following example:

UPDATE E
SET E.[Date Hired]=CASE WHEN @dhired<>'' THEN @dhired ELSE NULL END
FROM [HR].[Employees] E
WHERE E.[Deleted] IS NULL AND [Employee ID]=@empid

How can I check for platform errors?

First check <base URL>/elmah, which stores web-related errors.

Then check the logs, which can be accessed from <base URL>/admin/index.

Cinchy logs will contain all exceptions thrown by the Cinchy Web application. This includes failed queries, stack overflows and much more.

CinchySSO logs will contain IDP errors

Unable to use COALESCE with a link column in a calculated column

If [Person 1] and [Person 2] are Link columns and [Member] is a Text column, a calculated column with the following expression

COALESCE([Person 1],[Person 2],[Member]) will fail to save.

Please cast the link columns to VARCHAR

COALESCE(CAST([Person 1] AS VARCHAR(50)),CAST([Person 2] AS VARCHAR(50)),[Member])

Passing Parameters to a Query Called With EXEC

Declare and set the parameters before invoking the query:

DECLARE @dep AS VARCHAR ( 500 );
SET @dep = 'Accounting';
EXEC [HR].[Employees and Departments]

[HR].[Employees and Departments] is:

SELECT [Employee ID],
       [Full Name],
       [Date Hired],
       [Department]
FROM [HR].[Employees]
WHERE [Deleted] IS NULL
      AND [Department] = @dep

Can I send multiple comma-delimited values to a query parameter? (e.g. 4,10,15 to be used in WHERE [Id] IN (@param))

This can be done by using parameters in {}, such as {0},{1},...

These will be replaced with the exact text when running the query.

For example: ​ SELECT * FROM [HR].[Employees] WHERE [Deleted] IS NULL AND [Employee ID] IN ({0})

Data Experience Deployment

Introduction to Cinchy DXD

CinchyDXD is a utility (downloadable) used to move Data Experiences (DX) from one environment to another. This includes any and all objects and components that have been built for or are required in support of the Data Experience.

The following sections in this document will outline the basics of how to build, export, and install a DX.

Items of note moving forward in this document:

  • Source Environment - is the environment in which the DX is built

  • Target Environment - is the environment in which the DX will be installed

  • The example DX is a simple Currency Converter DX that consists of

    • One (1) table

    • One (1) query

  • This example does not include the following:

    • NO applets

    • NO integrated clients

    • NO Data Sync Configurations

    • NO Reference Data

    • NO Models

    • NO Groups

    • NO System Colours

    • NO Formatting Groups

    • NO Literal Groups

Future iterations of this document will add to this example's complexity level.

Build the Data Experience (DX)

Table Creation

Create your data experience (DX) in a virtual data fabric.

  1. Logon to Cinchy URL: <cinchy source URL> User ID: <source user id> Password: <source password>

  2. From “My Network”, click the Create button

  3. Select Table

  4. Select From Scratch

  5. Create the table with the following properties

Table Details:

  • Table Name: Currency Exchange Rate

  • Icon + Colour: Choose your own icon

  • Domain: Sandbox (if the domain does not exist, create it)

  • Description: This table is a test table for building and deploying a data experience for currency conversion

To create a domain on the fly:

  1. Enter the domain name in the Domain field

  2. Hit Enter on your keyboard

  3. On the Confirm Domain window, click Yes

6. Click Columns in the left hand navigation to create the columns for the table

7. Click the “Click Here to Add” a column tab to add a column

Column Details:

  • Column 1 - Column Name: Currency 1; Data Type: Text; Advanced Settings: select Mandatory, leave all other defaults

  • Column 2 - Column Name: Currency 2; Data Type: Text; Advanced Settings: select Mandatory, leave all other defaults

  • Column 3 - Column Name: Rate; Data Type: Number; Advanced Settings: set Decimal Places to 4, select Mandatory, leave all other defaults

8. Click the Save button to save your table

Enter Sample Data

  1. In your newly created table, enter the following sample data:

Currency 1 | Currency 2 | Rate

CAD | USD | 0.71

USD | CAD | 1.40

Create Query

Create a simple query that pulls information from the Currency Exchange Rate table that will allow a simple currency exchange calculation.

  1. From “My Network”, click the create button

  2. Select Query

3. In the query builder, locate the Currency Exchange Rate table and drag it to the “FROM” line. Hint: you will find the Currency Exchange Rate table in the “Sandbox” domain; to expand the “Sandbox” domain, click on the gray arrow (or double click).

4. In the “SELECT” line, drag and drop the “Rate” column and enter the following: SELECT [Rate] * @Amount AS 'Converted Amount'. Hint: you will find the Rate column by expanding the Currency Exchange Rate table, similarly to expanding the “Sandbox” domain.

5. Enter in the following for the WHERE clause

WHERE [Deleted] IS NULL AND [Currency 1] = @Currency_1 AND [Currency 2] = @Currency_2

6. Click the Execute (or play) icon to run the query

7. Test the query by entering in the following and clicking the submit button

@Amount: 100 @Currency_1: CAD @Currency_2: USD

8. Save the Query by clicking on the Info tab (left navigation)

9. Enter the following details for the query

Query Details:

  • Query Name: Currency Converter

  • Icon + Colour: Choose your own icon

  • Return: Query Results (Approved Data Only)

  • Domain: Sandbox

  • API Result Format: JSON

  • Description: This query is a test query for building and deploying a data experience for currency conversion

10. Click the Save button

Package the Data Experience (DX)

Download CinchyDXD Utility

The CinchyDXD utility is used to take all of the components (e.g. tables, queries, views, formatting rules, UDF’s etc…) of a DX and package them up so they can be moved from one environment to another.

The CinchyDXD utility is only required (made accessible) for the environment that is packing up the data experience. It is not required for the destination (or target) environment.

For CinchyDXD to work, you must have CinchyCLI installed. For further installation instructions please refer to CLI (https://cli.docs.cinchy.com/) documentation

To access the Data Experience Deployment utility please contact Cinchy support ([email protected]).

To download the Utility:

  1. Login to Cinchy

  2. Navigate to the Versions Table

  3. Select the Experience Deployment Utility View

  4. Locate and download the utility (e.g. Cinchy DXD v1.3.1.zip)

Note: the CinchyDXD utility is only upwards compatible with Cinchy version 4.6+

5. Unzip the utility and place the folder at any location on a computer that also has CinchyCLI installed

6. Create a new folder in the same directory that will hold all of the DX exports generated (e.g. CinchyDXD_Output)

This folder will then hold all of your deployment packages.

7. Launch a Powershell console window

8. From the console, navigate to the CinchyDXD directory

Tip: From within your file explorer window (folder: Cinchy DXD v.X) type “Powershell” into the file path. It will launch a Powershell window already at the folder path.

One Time Setup: Powershell

There is a one (1) time powershell setup that is required when using CinchyDXD.

  1. From your Powershell window type cin

  2. Hit Tab on your keyboard

3. Hit Enter on your keyboard

You will get an error message that CinchyDXD.ps1 cannot be loaded because running scripts is disabled on this system. To resolve this error:

4. From your start menu, search for Powershell and select Run as Administrator

5. When prompted “if you want to allow this app to make changes on your device”, select Yes.

6. In your Powershell Administrator window enter Set-ExecutionPolicy RemoteSigned

7. Hit Enter on your keyboard

8. When prompted with the Execution Policy Changes, enter A for “Yes to All”

9. Close the Powershell Administrator window

10. Navigate back to your Powershell window for the CinchyDXD v.X folder

11. From your Powershell window type cin

12. Hit Tab and then Enter on your keyboard

The basic CinchyDXD instructions will be displayed. You will be able to execute commands such as exporting and installing a Data Experience.
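
As a recap of the one-time setup above (the folder path is a placeholder), the commands involved are:

# Run once from a Powershell window opened with "Run as Administrator".
Set-ExecutionPolicy RemoteSigned

# Then, from a regular Powershell window opened in the CinchyDXD folder:
cd "C:\Cinchy DXD v1.3.1"   # placeholder path to the unzipped utility
.\CinchyDXD.ps1             # displays the basic CinchyDXD instructions and available commands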

Cinchy DXD Tables Overview

There are four (4) tables in Cinchy that are used for packing up and deploying a Data Experience.

Note: The Data Experience is defined and packed in what will be referred to moving forward as the “Source Environment”, while the environment that the Data Experience will be deployed to will be referred to as the “Target Environment”.

  1. Data Experience Definition Table - this is where the data experience is defined (e.g. tables, queries, views, formatting rules, UDF’s etc.)

  2. Data Experience Reference Data Table - this is where we define any data that needs to move with the Data Experience for the experience to work (e.g. lookup values, static values that may need to exist in tables - it typically would not be the physical data itself)

  3. Data Experience Releases Table - once a Data Experience is exported, an entry is created in this table for the export containing:

    • Version Number

    • Release Binary - the location where you can archive/back up your release history in Cinchy. Please note: if you have your own release management system, you have the option to opt out of archiving the releases in Cinchy and check the release into your own source control

    • Release Name

    • Data Experience

  4. Data Experience Release Artifact Table - stores all of the files that are part of the Data Experience package as individual records along with all of the binary for each record

STEP 1: Define the Data Experience

When setting up a Data Experience definition, you will need one (1) definition for each Data Experience you wish to package and deploy to a given number of Target Environments.

  1. Locate and open the Data Experience Definitions table

Columns in the Data Experience Definitions table:

  • GUID - This value is calculated; please note this value will be required as one of your export parameters in Powershell.

  • Name - This is the name of your Data Experience.

  • Tables - Select all tables that are part of the Data Experience.

  • Integrated Clients - Select any integrated clients (e.g. Tableau, PowerBI, custom integrations) that are part of the Data Experience.

  • Data Sync Configurations - Select any data syncs (e.g. CLIs the experience needs in order to work) that are part of the Data Experience.

  • Reference Data - Select any reference data that is part of the Data Experience. Please note that the setup of the reference data is done in the table called Data Experience Reference Data (see Step 2 below for setup details).

  • User Defined Functions - Select any user defined functions (e.g. validate phone, validate email) that are part of the Data Experience.

  • Models - Select any custom models that override columns or tables in your Data Experience; if there are none, leave blank.

  • Groups - Select any groups that are part of the Data Experience (when moving groups, it will also move all table access [design] controls).

  • System Colours - Select a system colour (if defined) for the Data Experience.

  • Saved Queries - Select any queries that are part of the Data Experience.

  • Applets - Select any applets that are part of the Data Experience.

  • Formatting Rules - Select any formatting rules that are part of the Data Experience.

  • Literal Groups - Select any literals that are associated to the Data Experience (e.g. key values with English and French definitions).

  • Builder - Select the builder(s) who have permission to export the Data Experience.

  • Builder Groups - Select the builder group(s) that have permission to export the Data Experience. Note: best practice is to use a Group over a User. Users within groups can fluctuate, whereas the Group (or Role) will remain. This will require less maintenance moving forward.

  • Sync GUID - Leave this column blank.

2. Enter/select the following:

  • Name: Currency Converter

  • Tables: Currency Exchange Rate (Sandbox)

  • Saved Queries: Currency Converter

  • Builder Groups: Currency Converters

Note: If you make changes to the DX in the future, you are NOT required to build a new Data Experience Definition in this table; you will update the existing definition. If you need to review what the definition looked like historically, you can view it via the Collaboration Log.

STEP 2: Define the Reference Data

When setting up a Data Experience Reference Data definition, you will need one (1) definition for each Reference Data table you wish to package and deploy with your Data Experience to the Target Environment.

Note: this table set up will be similar to how you would set up a CLI.

  1. Locate and open the Data Experience Reference Data table

Columns in the Data Experience Reference Data table:

  • Name - This is the name of your Reference Data Table; note this name can be anything and does not have to replicate the actual table name.

  • Ordinal - The ordinal number assigned will identify the order in which the data is loaded, based on dependencies within the data experience. For example, if you have tables that contain hierarchies, you will need to load the parent records first and then load your child records, which would then resolve any links in the table.

  • Filter - This is where a WHERE clause would be required. For example, if you have a table that has hierarchies, you would require two rows within the Data Experience Reference Data table: one to load the parent data and one to load the children data. In the parent record, the filter column would contain a WHERE clause that selects all parent records; in the second record, it would contain a WHERE clause that selects the children records.

  • New Records - Identify the behaviour of a new record (e.g. insert, update, delete, ignore).

  • Change Records - Identify the behaviour of a changed record (e.g. insert, update, delete, ignore).

  • Dropped Records - Identify the behaviour of a dropped record (e.g. insert, update, delete, ignore).

  • Table - Identify the table that you are exporting data from.

  • Sync Key - Required (need definition).

  • Expiration Timestamp Field - If Dropped Records is set to “Expire”, then a timestamp column is required.

Based on the configuration set up in this table, Cinchy will export the data and create CSV and CLI files.

Please note in this example we do not have Reference Data as part of our Data Experience.

STEP 3: Export the Data Experience

Using Powershell you will now export the Data Experience you have defined within Cinchy.

  1. Launch Powershell and navigate to your CinchyDXD folder

Reminder: you can launch Powershell right from your file explorer window in the CinchyDXD folder by typing “powershell” into the folder path and hitting Enter on your keyboard, saving you the extra step of navigating to the CinchyDXD folder manually in Powershell.

2. In the Powershell window type in cin and hit tab on your keyboard

3. Hit Enter on your keyboard, you will see a list of commands that are available to execute.

4. In the Powershell command line hit your “up” arrow key to bring back the last command and type export next to it.

5. Hit Enter on your keyboard

The Powershell window will provide you with the required and optional components to export the data experience.

6. Let’s now set up our mandatory export parameters

NOTE: the parameters can be passed on a single line in Powershell; however, for legibility (below) the parameters have been put on separate lines. If you put your parameters on separate lines, you will be required to use the backtick ` line-continuation character for the parameters to execute.

NOTE: please treat the command below only as a sample. You will be required to provide values that correspond to:

  • the URL for the source environment

  • the User ID for the user who is performing the export

  • the Password for the user who is performing the export

  • your folder path for where CLI is stored

  • your folder path for where the CLI output files are written to

  • the GUID for the Data Experience that is generated in the Data Experience Definition table

  • your own version naming convention

  • your folder path for where your CinchyDXD output files are written to

Sample:

.\CinchyDXD.ps1 export `
-s "<cinchy source url>" `
-u "<source user id>" `
-p "<source password>" `
-c "C:\Cinchy CLI v4.0.2" `
-d "C:\CLI Output Logs" `
-g "8C4D08A1-C0ED-4FFC-A695-BBED068507E9" `
-v "1.0.0" `
-o "C:\CinchyDXD_Output"

7. Enter the export parameters into the Powershell window

8. Hit Enter on your keyboard to run the export command

Powershell will begin to process the export. Once the export is complete, Powershell will provide you with an export complete message.

STEP 4: Validate Export

Now that the export is completed, be sure to validate the following:

  1. DXD Export Folder is populated

2. Data Experience Release table is populated in the source environment

3. Data Experience Release Artifacts table is populated in the source environment

Install the Data Experience (DX)

The install of a Data Experience is executed in a different environment than that of the export. Please ensure that before moving forward with the following instructions you have an environment to install the data experience into. The install of a data experience MUST be done on the same Cinchy version, i.e. your source and target environment versions MUST be the same (e.g. Source Version = 4.11 | Target Version = 4.11).

Below are the details that will be required for the installation environment

  • Source: <cinchy target url>

  • UserID: <target user id>

  • Password: <target password>

STEP 1: Install the Data Experience

Using Powershell you will now install the Data Experience you have exported out of Cinchy.

1. Open File Explorer, navigate to the DX exported folder (see Step 4: Validate Export [e.g. Currency Converter folder])

2. In the folder path URL for the exported data experience type in powershell (to launch Powershell for that path).

3. Hit Enter on your keyboard, the powershell window will appear

4. In the Powershell window type in cin and hit tab on your keyboard

5. In the Powershell command line, next to .\CinchyDXD.ps1, type install

6. Hit Enter on your keyboard

The Powershell window will provide you with the required and optional components to install the DX.

7. Let’s now set up our mandatory install parameters

NOTE: the parameters can be passed on a single line in Powershell; however, for legibility (below) the parameters have been put on separate lines. If you put your parameters on separate lines, you will be required to use the backtick ` line-continuation character for the parameters to execute.

Sample:

.\CinchyDXD.ps1 install `
-s "<target Cinchy url>" `
-u "<target user id>" `
-p "<target password>" `
-c "C:\Cinchy CLI v4.0.2" `
-d "C:\CLI Output Logs"

Note: be sure that the user(s) and group(s) required to install a DX are existing in your target environment. If they are not, Powershell will generate an error message when you attempt to install the DX.

8. Enter the install parameters into the Powershell window

9. Hit Enter on your keyboard to run the install command. Once the Data Experience has been installed you will get a message in Powershell that the install was completed.

STEP 2: Validate Install

Now that the install is completed, be sure to validate the following:

  1. Models table is populated in the Target environment with the model that was installed

2. Currency Exchange Rate table exists in the Target environment

3. Currency Converter query exists in the Target environment

4. Data Experience Definitions table is populated with the DX parameters that were setup in the Source environment

5. Data Experience Releases table in the target environment is populated

Update the Data Experience (DX)

There are a few updates that are required in the Data Experience that has been created in our source environment. We do not want to have to repeat the updates in both the source and target environments. In the upcoming sections we will look at how to update the data experience in the source environment, re-package (re-export) the data experience and reinstall the data experience in the target environment.

Table Updates

  1. Log back into the source environment using the following: URL: <cinchy source url> User ID: <source user id> Password: <source password>

  2. Make the following changes to the Currency Exchange Rate Table:

Column Details:

  • Column 1 - Current Column Name: Currency 1; New Column Name: From Currency; all other settings remain the same

  • Column 2 - Current Column Name: Currency 2; New Column Name: To Currency; all other settings remain the same

Be sure to save your change before leaving the table.

Query Updates

  1. Update the Currency Converter query to reflect column name changes that were made in the Table Updates section (above)

Reminder: Be sure to update the @Currency_1 and @Currency_2 labels to better reflect the input fields

2. Test the query to validate that it is still functioning

3. Be sure to save your query

Re-Package the Data Experience (DX)

Now that you have made any necessary changes to the DX it is time to re-export the package out of our source environment and re-install it into our target environment.

It is important to note that you should not create a new record in the Data Experience Definition table through the evolution of your DX; the initial record will be used to expand the definition of the DX. For example, if new objects (tables, queries, applets), integrations, and UDFs are added to the DX, the DX definition will need to be updated. To see the historical definitions of the DX, please see the Collaboration Log.

STEP 1: Update Data Experience Table

If you have added or removed any of the following you will need to update the Data Experience Definition table:

  • Name

  • Tables

  • Integrated Clients

  • Data Sync Configurations

  • Reference Data

  • User Defined Functions

  • Models

  • Groups

  • System Colours

  • Saved Queries

  • Applets

  • Literal Groups

  • Builders

  • Builder Groups

  • Sync GUID

STEP 2: Update Reference Data Table

If you have added or removed any of the following you will need to update the Data Experience Reference Data table:

  • Name

  • Ordinal

  • Filter

  • New Records

  • Changed Records

  • Dropped Records

  • Table

  • Sync Key

  • Expiration Timestamp Field

  • Sync GUID

STEP 3: Re-Run CinchyDXD Export

Using Powershell you will now export the Data Experience you have defined within Cinchy.

  1. Launch Powershell and navigate to your CinchyDXD folder

Reminder: you can launch Powershell right from your file explorer window in the CinchyDXD folder, saving you the extra step of navigating to the CinchyDXD folder manually in Powershell.

2. In the Powershell window, type cin and hit Tab on your keyboard

3. In the Powershell command line, next to .\CinchyDXD.ps1, type export

4. Hit Enter on your keyboard

Reminder: If you do not remember the mandatory parameters, you can hit Enter on your keyboard after typing .\CinchyDXD.ps1 export; Powershell will provide you with the required and optional components to export the data experience.

5. Let’s now enter the mandatory export parameters

NOTE: the parameters can be passed on a single line in Powershell; however, for legibility (below) the parameters have been put on separate lines. If you put your parameters on separate lines, you will be required to use the backtick ` line-continuation character for the parameters to execute.

NOTE: you will need to update your version number.

Sample:

.\CinchyDXD.ps1 export `
-s "<source Cinchy url>" `
-u "<source user id>" `
-p "<source password>" `
-c "C:\Cinchy CLI v4.0.2" `
-d "C:\CLI Output Logs" `
-g "8C4D08A1-C0ED-4FFC-A695-BBED068507E9" `
-v "2.0.0" `
-o "C:\CinchyDXD_Output"

6. Enter the export parameters into the Powershell window

7. Hit Enter on your keyboard to run the export command

Powershell will begin to process the export. Once the export is complete, Powershell will provide you with an export complete message.

STEP 4: Validate Export

Now that the export is completed, be sure to validate the following:

  1. DXD Export Folder is populated

2. Data Experience Release table is populated in the source environment

3. Data Experience Release Artifacts table is populated in the source environment

Re-Install the Data Experience (DX)

STEP 1: Re-Run CinchyDXD Install

Using Powershell you will now install the Data Experience you have exported out of Cinchy.

  1. Open File Explorer, navigate to the exported folder (see Step 4: Validate Export)

2. In the folder path URL for the exported data experience type in powershell (to launch Powershell for that path)

3. Hit Enter on your keyboard, the powershell window will appear

4. In the Powershell window type in cin and hit tab on your keyboard and type install

5. Enter the install parameters into the Powershell window

NOTE: the parameters can be passed on a single line in Powershell; however, for legibility (below) the parameters have been put on separate lines. If you put your parameters on separate lines, you will be required to use the backtick ` line-continuation character for the parameters to execute.

Sample:

.\CinchyDXD.ps1 install `
-s "<target Cinchy url>" `
-u "<target user id>" `
-p "<target password>" `
-c "C:\Cinchy CLI v4.0.2" `
-d "C:\CLI Output Logs"

6. Hit Enter on your keyboard to run the install command. Once the Data Experience has been installed you will get a message in Powershell that the install was completed.

STEP 2: Validate Install

Now that the install is completed, be sure to validate the following in the target environment.

  1. Models table is populated in the Target environment with the model that was installed

2. Currency Exchange Rate table exists in the Target environment with the new column names

3. Currency Converter query exists in the Target environment with the new column names and labels

4. Data Experience Definitions table should not change unless you have added or removed column details within this table

5. Data Experience Releases table in the target environment is populated with the new release version number from the install (e.g. 2.0.0)