v5.0 - v5.5 Data Sync Documentation
Snowflake Table

1. Overview

Snowflake is a fully managed SaaS that provides a single platform for data warehousing, data lakes, data engineering, data science, data application development, and secure sharing and consumption of real-time/shared data.

Cinchy introduced Snowflake as a source and target Connector in v5.2 of the platform.

How Connections Loads Data into Snowflake

For batch syncs of 10 records or fewer, individual Insert/Update/Delete statements are executed against the target Snowflake table.

For batch syncs exceeding 10 records, the operations are performed in bulk.

The bulk operation process consists of:

  1. Generating a CSV containing a batch of records

  2. Creating a temporary table in Snowflake

  3. Copying the generated CSV into the temp table

  4. If needed, performing Insert operations against the target Snowflake table using the temp table

  5. If needed, performing Update operations against the target Snowflake table using the temp table

  6. If needed, performing Delete operations against the target Snowflake table using the temp table

  7. Dropping the temporary table

For real-time syncs, records are batched dynamically, up to a configurable threshold.
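The bulk flow described above can be sketched as illustrative SQL. The stage, temp-table name, and column list below are hypothetical, and the connector's actual statements are not published; this is only a sketch of the sequence of steps.

```python
# Illustrative sketch of the bulk-load flow: create a temp table, copy the
# CSV batch into it, apply operations against the target, then drop it.
# All object names here (EMPLOYEES, my_stage, etc.) are hypothetical.

def bulk_sync_statements(target: str, temp: str, columns: list, id_col: str) -> list:
    cols = ", ".join(columns)
    return [
        # Step 2: create a temporary table shaped like the target
        f"CREATE TEMPORARY TABLE {temp} LIKE {target}",
        # Step 3: copy the generated CSV batch into the temp table
        f"COPY INTO {temp} FROM @my_stage/batch.csv FILE_FORMAT = (TYPE = CSV)",
        # Step 4: insert rows present only in the temp table
        f"INSERT INTO {target} ({cols}) "
        f"SELECT {cols} FROM {temp} t "
        f"WHERE NOT EXISTS (SELECT 1 FROM {target} x WHERE x.{id_col} = t.{id_col})",
        # Steps 5-6 (Update/Delete) would join target to temp on the ID column
        # Step 7: drop the temporary table
        f"DROP TABLE {temp}",
    ]

stmts = bulk_sync_statements("EMPLOYEES", "EMPLOYEES_TMP", ["ID", "NAME"], "ID")
```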

2. Considerations

  • The temporary table generated in the bulk flow process for high-volume scenarios converts all columns of data type Number to type NUMBER(38, 18). This may cause precision loss if the scale of the target column is higher than 18.
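A quick way to see this effect is with Python's decimal module: a value with 20 digits of scale loses its last two digits when forced to 18 digits of scale, as NUMBER(38, 18) does.

```python
from decimal import Decimal, ROUND_HALF_UP

# NUMBER(38, 18) keeps at most 18 digits after the decimal point.
# A value with a higher scale (here, 20) is rounded when it passes
# through the temp table, losing the trailing digits.
original = Decimal("1.12345678901234567891")  # scale 20
staged = original.quantize(Decimal("1." + "0" * 18), rounding=ROUND_HALF_UP)
print(staged)  # 1.123456789012345679 -- the trailing "91" is gone
```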

3. Load Metadata

In the Load Metadata pop-up (Image 1), enter the following information and click Load:

Connection String: The encrypted connection string. Unencrypted example: account=wr38353.ca-central-1.aws;user=myuser;password=mypassword;db=CINCHY;schema=PUBLIC. You can review Snowflake's Connection String guide for descriptions of each parameter.

Table: The name of your target table. Ex: dbo.employees
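An unencrypted Snowflake connection string is a semicolon-delimited list of key=value pairs. A minimal sketch of splitting the example string from this page into its parts:

```python
# Parse the unencrypted example connection string into key/value pairs.
# (Connections expects the *encrypted* form; this only shows the structure.)
conn = "account=wr38353.ca-central-1.aws;user=myuser;password=mypassword;db=CINCHY;schema=PUBLIC"

params = dict(part.split("=", 1) for part in conn.split(";") if part)
print(params["account"])  # wr38353.ca-central-1.aws
print(params["schema"])   # PUBLIC
```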

4. Basic Parameters

  1. Add in information about your ID Column, if applicable (Image 2).

Note that if you want to use the "Delete" action in your Sync Behaviour configuration, you must enter a value for both ID Column and ID Column Data Type.

ID Column: The name of the identity column that exists in the target (or a single column that is guaranteed to be unique and automatically populated for every new record).

ID Column Data Type: One of Text, Number, Date, Bool, Geography, or Geometry.

5. Column Mappings

Add in your applicable column(s) (Image 3). When specifying the Target Column in the Column Mappings section, all names are case-sensitive. See the Column Mappings documentation for further details on each column type.
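Because target column names are case-sensitive, a mapping that differs from the target only in case will not match. A small illustration (the set of target columns below is hypothetical, not the Connections schema):

```python
# Target column names are case-sensitive: "Name" and "NAME" are different.
# (The column set below is illustrative only.)
target_columns = {"ID", "NAME", "HIRE_DATE"}

def mapping_is_valid(target_column: str) -> bool:
    # Exact, case-sensitive comparison -- normalizing case would hide errors.
    return target_column in target_columns

print(mapping_is_valid("NAME"))  # True
print(mapping_is_valid("Name"))  # False
```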

6. Filters

You may choose to use CQL to create a filter (Image 4). See the Target Destination Filter documentation for more on filters.

Image 1: Load your Metadata
Image 2: Basic parameters
Image 3: Column Mappings
Image 4: Adding a Source Filter