Kafka Topic

This page describes the required listener config parameters for setting up a sync with a Kafka stream source.

1. Kafka Topic Listener Config

The following parameters should be included in your listener configuration when setting up a real-time sync with a Kafka stream source. This process currently supports messages in JSON string or AVRO-serialized format.

To listen on multiple topics, you will need to configure a separate listener config for each topic.

1.1 Topic Column

"topicName": Mandatory. The Kafka topic name to listen for messages on.

"messageFormat": Optional. Put "AVRO" if your messages are serialized in AVRO.

{
  "topicName": "<(mandatory) kafka topic name to listen messages on>",
  "messageFormat": "<(optional) Put 'AVRO' if the messages are serialized in AVRO>"
}
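
For illustration, here is a sketch of a populated Topic column. The topic name invoice-events is a hypothetical placeholder, and the messages are assumed to be AVRO-serialized:

{
  "topicName": "invoice-events",
  "messageFormat": "AVRO"
}

If your messages are plain JSON strings, omit the "messageFormat" property.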

1.2 Connection Attributes Column

"bootstrapServers": Mandatory. A comma-separated list of Kafka bootstrap servers, each in the form host:port.

"saslMechanism": This will be either PLAIN, SCRAM-SHA-256, or SCRAM-SHA-512. SCRAM-SHA-256 must be entered as SCRAMSHA256, and SCRAM-SHA-512 as SCRAMSHA512.

"saslPassword": The password for your chosen SASL mechanism.

"saslUsername": The username for your chosen SASL mechanism.

"url": Optional. This is required if your data follows a schema when serialized in AVRO. It is a comma-separated list of URLs for the schema registry instances that are used to register or look up schemas.

"basicAuthCredentialsSource": Optional. Specifies the Kafka configuration property "schema.registry.basic.auth.credentials.source" that provides the basic authentication credentials. This can be "UserInfo" or "SaslInherit".

"basicAuthUserInfo": Optional. Basic Auth credentials specified in the form username:password.

"sslKeystorePassword": Optional. The client keystore (PKCS#12) password.

"securityProtocol": Kafka supports cluster encryption and authentication, which can encrypt data in transit between your applications and Kafka. Use this field to specify which protocol will be used for communication between client and server. Cinchy currently supports Plaintext (unauthenticated, non-encrypted), SaslPlaintext (SASL-based authentication, non-encrypted), and SaslSsl (SASL-based authentication, TLS-based encryption). If no value is specified, this defaults to Plaintext.

{
  "bootstrapServers": "<(mandatory) kafka bootstrap servers in a comma-separated list in the form of host:port>",
  "saslMechanism": "<PLAIN|SCRAMSHA256|SCRAMSHA512>",
  "saslPassword": "",
  "saslUsername": "",
  "schemaRegistrySettings": {
    "url": "<(optional) Required if your data follows a schema when serialized in AVRO. A comma-separated list of URLs for schema registry instances that are used to register or look up schemas.>",
    "basicAuthCredentialsSource": "<(optional) Specifies the Kafka configuration property 'schema.registry.basic.auth.credentials.source' that provides the basic authentication credentials, this can be 'UserInfo' | 'SaslInherit'>",
    "basicAuthUserInfo": "<(optional) Basic Auth credentials specified in the form of username:password>",
    "sslKeystorePassword": "<(optional) The client keystore (PKCS#12) password>"
  },
  "securityProtocol": "<Plaintext|SaslPlaintext|SaslSsl>"
}
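
For illustration only, the following sketch shows a Connection Attributes value for a hypothetical SASL/SCRAM-SHA-512 cluster with TLS encryption and an AVRO schema registry; all host names, URLs, and credentials below are placeholders. Note that the mechanism is written as SCRAMSHA512, following the formatting rule described above:

{
  "bootstrapServers": "broker-1.example.com:9093,broker-2.example.com:9093",
  "saslMechanism": "SCRAMSHA512",
  "saslPassword": "examplePassword",
  "saslUsername": "exampleUser",
  "schemaRegistrySettings": {
    "url": "https://schema-registry.example.com:8081",
    "basicAuthCredentialsSource": "UserInfo",
    "basicAuthUserInfo": "exampleUser:examplePassword"
  },
  "securityProtocol": "SaslSsl"
}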