Cinchy Platform Documentation
Cinchy v5.8

OpenSearch dashboards



Overview

When deploying Cinchy v5 on Kubernetes, Cinchy recommends using OpenSearch Dashboards for your logging. OpenSearch is a community-driven fork of Elasticsearch created by Amazon, and it captures and indexes all your logs into a single, accessible dashboard location. These logs can be queried, searched, and filtered, and Correlation IDs mean that they can also be traced across various components. These logging components take advantage of persistent storage.

You can view the OpenSearch documentation here:

  • Introducing OpenSearch
  • General OpenSearch Documentation
  • Using DQL (Dashboards Query Language)
  • Troubleshooting and Common Errors
  • Alerts
  • Anomaly Detection

Get started with OpenSearch Dashboards

These sections guide you through setting up your first Index, Visualization, Dashboard, and Alert.

OpenSearch comes with sample data that you can use to get a feel for the various capabilities. You will find this on the main page upon logging in.

Define your log level

  1. Navigate to your cinchy.kubernetes/environment_kustomizations/instance_template/worker/kustomization.yaml file.

  2. In the below code, copy the Base64 encoded string in the value parameter.

patch: |-
  - op: replace
    path: /data/appsettings.json
    value: wcxJItEmCWQJQPZidpLUuV6Ll79ZUr8BimlMJysLwcxJItEmCWQJQPZidpLUuV6Ll79ZUr8BimlMJysL
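The value can be decoded and re-encoded with standard tooling; a minimal sketch, using a toy placeholder string rather than a real Cinchy configuration (`base64 -w 0` is GNU coreutils syntax; macOS `base64` omits the `-w` flag):

```shell
# Decode the appsettings.json value from the patch (toy placeholder string)
echo 'eyJTZXJpbG9nIjp7fX0=' | base64 -d > appsettings.json
cat appsettings.json        # {"Serilog":{}}

# After editing the Serilog settings, re-encode the file and paste the
# single-line output back into the "value" parameter of the patch
base64 -w 0 appsettings.json
```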
  3. Decode the value to retrieve your AppSettings.

  4. Navigate to the Serilog section of the decoded AppSettings and update the "Default" parameter as needed to set your log level. The options are:

    • Verbose: The noisiest level, rarely (if ever) enabled for a production app.

    • Debug: Used for internal system events that aren't necessarily observable from the outside, but useful when determining how something happened. This is the default setting for Cinchy.

    • Information: Information events describe things happening in the system that correspond to its responsibilities and functions. Generally these are the observable actions the system can perform.

    • Warning: Used when service is degraded, endangered, or may be behaving outside of its expected parameters.

    • Error: Used when functionality is unavailable or expectations are broken.

    • Fatal: The most critical level; Fatal events demand immediate attention.

"Serilog": {
    "MinimumLevel": {
      "Default": "Debug",
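For context, the "Default" parameter sits inside a standard Serilog configuration block; a sketch of the surrounding structure (the Override entry is illustrative Serilog syntax, not a Cinchy-specific setting):

```json
{
  "Serilog": {
    "MinimumLevel": {
      "Default": "Debug",
      "Override": {
        "Microsoft": "Warning"
      }
    }
  }
}
```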
  5. Ensure that you commit your changes.

  6. Navigate to ArgoCD > Worker Application and refresh.

Common log search patterns

The following are some common search patterns when looking through your OpenSearch Logs.

  • If an HTTP request to Cinchy Web/IDP fails, check the page's requests and the relevant response headers to find the "x-correlation-id" header. That header value can be used to search and find all logs associated with the HTTP request.

  • When debugging batch syncs, filter the "ExecutionId" field in the logs for your batch sync execution ID to narrow down your search.

  • When debugging real-time syncs, search for your data sync config name in the Event Listener or Worker logs to find all the associated logging information.
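In DQL (Dashboards Query Language), the patterns above might look like the following; the field names and values here are illustrative and depend on your index mappings:

```
CorrelationId: "4f2a-example-id"
ExecutionId: 1234
"My Data Sync Config"
```

The first query matches all logs sharing one HTTP request's correlation ID, the second narrows to a single batch sync execution, and the quoted string performs a free-text search for a sync config name.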

Set up an index

The first step to utilizing the power of OpenSearch Dashboards is to set up an index to pull data from your sources. An Index Pattern identifies which indices you want to explore. An index pattern can point to a specific index, for example, your log data from yesterday, or all indices that contain your log data.

Log in to OpenSearch. You would have configured the access point during your deployment installation; traditionally it's found at <baseurl>/dashboard.

If this is your first time logging in, the username and password will be set to admin/admin. We highly recommend you update the password as soon as possible.

  1. Navigate to the Stack Management tab in the left navigation menu (Image 1).

  2. From the left navigation, click on Index Patterns (Image 2).

  3. Click on the Create Index Pattern button.

  4. To set up your index pattern, you must define the source. OpenSearch will list the sources available to you on the screen below. Input your desired source(s) in the text box (Image 3).

You can use the asterisk (*) to match multiple sources.

  5. Configure your index pattern settings (Image 4).

  • Time field: Select a primary time field to use with the global time filter.

  • Custom index pattern ID: By default, OpenSearch gives a unique identifier to each index pattern. You can use this field to optionally override the default ID with a custom one.

  6. Once created, you can review your Index Patterns from the Index Patterns page (Image 5).

  7. Click on your Index Pattern to review your fields (Image 6).

Create a visualization

You can pull out any data from your index sources and view them in a variety of visualizations.

  1. From the left navigation pane, click Visualize (Image 7).

  2. If you have any Visualizations, they will appear on this page. To create a new one, click the Create Visualization button (Image 8).

  3. Select your visualization type from the populated list (Image 9).

  4. Choose your source (Image 10). If the source you want to pull data from isn't listed, you will need to set it up as an index first.

  5. Configure the data parameters that appear in the right-hand pane of the Create screen. These options will vary depending on the visualization type you chose in step 3. The following example uses a pie chart visualization (Image 11):

  • Metrics

    • Aggregation: Choose how you want your data aggregated. This example uses Count.

    • Custom Label: You can use this optional field for custom labelling.

  • Buckets

    • Aggregation: Choose how you want your data aggregated. This example uses Split Slices > Terms.

    • Field: This drop-down is populated based on the index source you chose. Select which field you want to use in your visualization. This example uses machine.os.keyword.

    • Order By: Define how you want your data to be ordered. This example uses Metric: Count, in descending order of size 10.

    • Choose whether to group other values in a separate bucket. If you toggle this on, you will need to label the new bucket.

    • Choose whether to show missing values.

  • Advanced

    • You can optionally provide a JSON input. It will be merged with the OpenSearch aggregation definition.

  • Options

    • The variables in the Options tab can be used to configure the UI of the visualization itself.

  6. Use DQL to search your index data (Image 12). You can also save any queries you write for easy access by clicking on the save icon.

  7. You can also further focus your visualization:

  • Add a filter on any of your fields (Image 13).

  • Update your date filter (Image 14).

  8. Click Save when finished with your visualization.

Create a dashboard

Once you have created your visualizations, you can combine them together on one Dashboard for easy access.

You can also create new visualizations from the Dashboard screen.

  1. From the left navigation pane, click on Dashboards (Image 15).

  2. If you have any Dashboards, they will appear on this page. To create a new one, click the Create Dashboard button (Image 16).

  3. The "Editing New Dashboard" screen will appear. Click on Add an Existing object (Image 17).

  4. Select any of the visualizations you created and it will automatically be added to your Dashboard (Image 18). Repeat this step for as many visualizations as you'd like to appear.

  5. Click Save to finish (Image 19).

Update your OpenSearch password

This capability was added in Cinchy v5.4.

Your OpenSearch password can be updated in your deployment.json file (you may have renamed this during your original deployment).

  1. Navigate to cluster_component_config > OpenSearch.

  2. OpenSearch has two users that you can configure passwords for: Admin and Kibana Server. Kibana Server is used for communication between the OpenSearch Dashboard and the OpenSearch server. The default password for both is set to "password". To update this, you will need a machine with Docker available.

  3. Update your Admin password:

    1. Your password must be hashed. You can do so by running the following command on a machine with docker available, inputting your new password where noted:

    docker run -it opensearchproject/opensearch /usr/share/opensearch/plugins/opensearch-security/tools/hash.sh -p <<newpassword>>
    2. Navigate to "opensearch_admin_user_hashed_password" and input your hashed password.

    3. You must also provide your password in a Base64-encoded format. Navigate to "opensearch_admin_user_password_base64" and input your encoded password.
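The Base64-encoded value for the sub-steps above can be produced with standard tooling; a minimal sketch using a placeholder password:

```shell
# Base64-encode the new admin password for the
# "opensearch_admin_user_password_base64" parameter (placeholder password)
printf '%s' 'MyNewPassword!' | base64
# → TXlOZXdQYXNzd29yZCE=
```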

  4. Update your Kibana Server password:

    1. Your password must be hashed. You can do so by running the following command on a machine with docker available, inputting your new password where noted:

    docker run -it opensearchproject/opensearch /usr/share/opensearch/plugins/opensearch-security/tools/hash.sh -p <<newpassword>>
    2. Navigate to "opensearch_kibanaserver_user_hashed_password" and input your hashed password.

    3. You must also provide your new password in cleartext. Navigate to "opensearch_kibanaserver_user_password" and input your cleartext password.

  5. Run the below command in the root directory of your devops.automations repo to update your configurations. If you have changed the name of your deployment.json file, make sure to update the command accordingly.

    dotnet Cinchy.DevOps.Automations.dll "deployment.json"
  6. Commit and push your changes.

  7. If your environment isn't set up to automatically apply configuration changes, navigate to the ArgoCD portal and refresh your component(s). If that doesn't work, re-sync.


Image 1: Select Stack Management
Image 2: Select Index Patterns
Image 3: Define your sources
Image 4: Configure your index pattern settings
Image 5: Review your Index Patterns
Image 6: Reviewing your Index Pattern fields
Image 7: Click Visualize
Image 8: Click Create New
Image 9: Select your Visualization type
Image 10: Select your Source
Image 11: Creating your Visualization
Image 12: Use a query on your Visualization
Image 13: Add a filter on any of your fields
Image 14: Update your date filter
Image 15: Click Dashboards
Image 16: Click Create Dashboard
Image 17: Click Add An Existing
Image 18: Add as many visualizations as you'd like
Image 19: Click Save.