v5.0 - v5.5 Data Sync Documentation
Overview

You are currently browsing the Cinchy v5.0-v5.5 Data Sync documentation. For the most up-to-date documentation, please navigate to https://platform.docs.cinchy.com/data-syncs/getting-started-with-data-syncs.

Cinchy provides two methods of synchronizing data: batch sync and real-time sync. You can run a batch sync as a one-time data load operation, or schedule it to run periodically. Real-time sync pushes individual events to the target system as they occur, through the Cinchy listener and worker processes.
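Both methods are driven by a data sync configuration that defines the source, the target, the sync key used to match records, and the behaviour for new, changed, and dropped records. The XML below is a minimal sketch of a batch sync from a delimited file to a Cinchy table, following the patterns in the Connections Experience & XML Config Reference; the file path, domain, table, and column names are placeholders.

    <?xml version="1.0" encoding="utf-16"?>
    <BatchDataSyncConfig name="Contacts Sync" version="1.0.0" xmlns="http://www.cinchy.co">
      <!-- Source: a comma-delimited file with one header row -->
      <DelimitedDataSource source="PATH" path="C:\Data\contacts.csv" delimiter="," headerRowsToIgnore="1" encoding="UTF8">
        <Schema>
          <Column name="Email" dataType="Text" />
          <Column name="Name" dataType="Text" />
        </Schema>
      </DelimitedDataSource>
      <!-- Target: a Cinchy table, with source-to-target column mappings -->
      <CinchyTableTarget domain="Sandbox" table="Contacts" suppressDuplicateErrors="false">
        <ColumnMappings>
          <ColumnMapping sourceColumn="Email" targetColumn="Email" />
          <ColumnMapping sourceColumn="Name" targetColumn="Name" />
        </ColumnMappings>
        <!-- Source and target records are matched on this key -->
        <SyncKey>
          <SyncKeyColumnReference name="Email" />
        </SyncKey>
        <!-- What to do with unmatched and changed records -->
        <NewRecordBehaviour type="INSERT" />
        <ChangedRecordBehaviour type="UPDATE" />
        <DroppedRecordBehaviour type="DELETE" />
      </CinchyTableTarget>
    </BatchDataSyncConfig>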

High-Level Architecture

Batch Execution Flow

At a high level, running a syncdata operation performs these steps (an example CLI invocation follows the list):

  1. Connects to Cinchy and creates an entry in the Execution Log table with a status of running.

  2. Streams the source and target into the CLI. Any malformed records or duplicate sync keys are written to the source and target error CSVs in the temp directory.

  3. Compares the sync keys to match up source and target records.

  4. Checks for changes between the matched records.

  5. Groups the records with changes into insert, update, and delete batches.

  6. Sends the batches to the target, recording failures in the sync errors CSV and the Execution Errors table.

  7. Once complete, updates the Execution Log entry with the final status and execution output.
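For example, a batch sync can be started with the CLI's syncdata command. The sketch below assumes the flag conventions documented in the CLI Command List section; the server, credentials, configuration name, and temp directory are placeholders.

    dotnet Cinchy.CLI.dll syncdata -s cinchy.mycompany.com -u syncuser -p "********" -f "Contacts Sync" -d "C:\Temp"

Here -s is the Cinchy server, -u and -p are the credentials of the account running the sync, -f is the name of the data sync configuration, and -d is the temp directory where the error CSVs from step 2 are written.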

Real-Time Execution Flow

At a high level, a real-time message goes through the following process. This assumes a listener and worker are already set up (an example listener topic follows the steps).

  1. The listener is subscribed to the streaming source and waits for events.

  2. The listener receives a message from the streaming source and pushes it to SQL Server Broker.

  3. The worker picks up the message from SQL Server Broker.

  4. The worker fetches the matching record from the target, based on the sync key.

  5. If changes are detected, the worker pushes them to the target system, logging successes and failures in the worker's log file.
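The subscription in step 1 is defined by a listener config, whose topic tells the listener what to subscribe to. As an illustrative sketch only, a topic for the Cinchy Event Broker/CDC stream source takes roughly this shape; the table GUID and column names are placeholders, and the exact topic schema for each stream source is covered under Subscribing to Event Streams.

    {
      "tableGuid": "00000000-0000-0000-0000-000000000000",
      "fields": [
        { "column": "Email" },
        { "column": "Name" }
      ]
    }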
