
Apache AVRO data format


Introduction

Apache AVRO was added as an inbound data format in Cinchy v5.3. It's a data format with added integration with the Kafka Schema Registry, which helps enforce data governance within a Kafka architecture.

Avro is an open-source data serialization system that helps with data exchange between systems, programming languages, and processing frameworks. It stores both the data definition and the data together in one message or file. The data definition is stored in JSON format, making it easy to read and interpret, while the data itself is stored in binary format, making it compact and efficient.
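
For illustration, here is a minimal sketch of an Avro schema (the JSON data definition); the record and field names are hypothetical placeholders, not part of any Cinchy configuration:

{
  "type": "record",
  "name": "SampleEvent",
  "namespace": "com.example.events",
  "fields": [
    { "name": "id", "type": "string" },
    { "name": "amount", "type": "double" },
    { "name": "createdAt", "type": { "type": "long", "logicalType": "timestamp-millis" } }
  ]
}

Messages published to the topic are serialized in Avro's compact binary encoding against a schema like this, which the Kafka Schema Registry stores and versions.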

Some of the benefits of using AVRO as a data format are:

  • It's compact.

  • It has a direct mapping to and from JSON.

  • It's fast.

  • It has bindings for a wide variety of programming languages.

For more about AVRO and Kafka, refer to the official documentation.

Setting up the Connection to a Kafka Schema Registry

To set up the Apache AVRO connection to a Kafka Schema Registry, configure your Listener Config table with the attributes specified below.

Listener Config Attributes

Topic Column

  • "topicName": Mandatory. The name of the Kafka topic to listen for messages on.

  • "messageFormat": Set this to "AVRO" if your messages are serialized in AVRO.

Example:

{
  "topicName": "sampleavro",
  "messageFormat": "AVRO"
}

Connection Attributes Column

  • "bootstrapServers": Mandatory. A comma-separated list of Kafka bootstrap servers, each in the form host:port.

  • "url": Required if your data follows a schema when serialized in AVRO. A comma-separated list of URLs for the schema registry instances used to register or look up schemas.

  • "basicAuthCredentialsSource": Specifies the Kafka configuration property "schema.registry.basic.auth.credentials.source" that provides the basic authentication credentials. This can be "UserInfo" or "SaslInherit".

  • "basicAuthUserInfo": Basic Auth credentials, specified in the form username:password.

  • "sslKeystorePassword": The client keystore (PKCS#12) password.

Example:

{
  "bootstrapServers": "host:port",
  "schemaRegistrySettings": {
    "url": "URL",
    "basicAuthCredentialsSource": "UserInfo",
    "basicAuthUserInfo": "username:password",
    "sslKeystorePassword": "password"
  }
}
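
As a further sketch, assuming two schema registry instances and credentials inherited from the broker's SASL configuration, the same Connection Attributes might look like this (all host names and ports are placeholders):

{
  "bootstrapServers": "broker1:9092,broker2:9092",
  "schemaRegistrySettings": {
    "url": "http://registry1:8081,http://registry2:8081",
    "basicAuthCredentialsSource": "SaslInherit"
  }
}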