v5.0 - v5.5 Data Sync Documentation
Binary File

This page describes how to use a binary file as a data source in Connections.


1. Considerations

  • Binary File does not support Geometry or Geography data types.

2. Basic Parameters

To connect a Binary File as a data source, fill in the following parameters (Image 1):

  • Source - The location of the source file: a Local upload, Amazon S3, or Azure Blob Storage. The following authentication methods are supported per source: Amazon S3: Access Key ID/Secret Access Key; Azure Blob Storage: Connection String.

  • Header Lines to Ignore - The number of records from the top of the file to ignore before the data starts (includes the column header).

  • Footer Lines to Ignore - The number of records from the bottom of the file to ignore.

  • Encoding - The encoding of the file. This defaults to UTF8, but UTF8_BOM, UTF16, and ASCII are also supported.

  • Path - The path to the source file to load. To upload a local file, you must first insert a Parameter in the Info tab of the connection (Ex: filepath). Then, reference that same value in this field (Ex: @Filepath). This will trigger a File Upload option to import your file.

  • Auth Type - This field defines the authentication type for your data sync. Cinchy supports "Access Key" and "IAM role". When selecting "Access Key", you must provide the Access Key ID and Secret Access Key. When selecting "IAM role", a new field will appear for you to paste in the role's Amazon Resource Name (ARN). You must also ensure that:

    • The role is configured to have at least read access to the source.

    • The Connections pods' role has permission to assume the role specified in the data sync config.

Note: This field was added in Cinchy v5.6.
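For intuition, here is a minimal sketch (plain Python, not Connections code) of how the ignore/encoding parameters above are typically applied to a newline-delimited file. The `read_records` helper and the codec mapping are illustrative assumptions, not part of the product:

```python
# Illustrative sketch only: mimics how "Header Lines to Ignore",
# "Footer Lines to Ignore", and "Encoding" are commonly interpreted.
def read_records(data: bytes, header_lines: int = 0, footer_lines: int = 0,
                 encoding: str = "UTF8") -> list[str]:
    # Map the documented encoding names onto Python codecs (assumed mapping).
    codec = {"UTF8": "utf-8", "UTF8_BOM": "utf-8-sig",
             "UTF16": "utf-16", "ASCII": "ascii"}[encoding]
    lines = data.decode(codec).splitlines()
    # Header Lines to Ignore: skip records from the top (includes column header).
    # Footer Lines to Ignore: skip records from the bottom.
    end = len(lines) - footer_lines
    return lines[header_lines:end]

raw = b"col_header\nrow1\nrow2\ntrailer"
print(read_records(raw, header_lines=1, footer_lines=1))  # ['row1', 'row2']
```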

3. Schema Columns

Geometry and geography data types are not supported for binary files.

Binary File sources have a unique parameter for Standard Columns: Parse Content By. Choose from the following three options to define how you want to parse your content:

  • Byte Length - The content length, in number of bytes.

  • Trailing Byte Sequence - The trailing sequence, in base64, that indicates the end of the field.

  • Succeeding Byte Sequence - The trailing sequence, in base64, that indicates the start of the next field, and thus the end of this one.
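The three options can be sketched as follows (illustrative Python, not Connections internals; in particular, treating the "succeeding" sequence as belonging to the next field is one plausible reading of the description above):

```python
# Illustrative sketch of the three "Parse Content By" options; not Connections code.
import base64

def parse_fields(data: bytes, mode: str, value: str) -> list[bytes]:
    if mode == "Byte Length":
        # Fixed-size fields: split the content every `value` bytes.
        n = int(value)
        return [data[i:i + n] for i in range(0, len(data), n)]
    seq = base64.b64decode(value)  # the byte sequence is supplied in base64
    if mode == "Trailing Byte Sequence":
        # The sequence ends each field and is consumed.
        fields = data.split(seq)
        return fields[:-1] if fields and fields[-1] == b"" else fields
    if mode == "Succeeding Byte Sequence":
        # Assumed reading: the sequence starts the next field,
        # so it stays attached to the field that follows it.
        parts = data.split(seq)
        return [parts[0]] + [seq + p for p in parts[1:]]
    raise ValueError(f"unknown mode: {mode}")

# "WA==" is base64 for the single byte b"X".
print(parse_fields(b"aaXbbXcc", "Trailing Byte Sequence", "WA=="))
# -> [b'aa', b'bb', b'cc']
```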

Add in your applicable columns (Image 2). Review the documentation here for information on the column types available.

4. Filter

Adding a filter section allows you to enter a CQL filter statement for your results (Image 3). See here for more information on adding a Filter.
Image 1: Binary File parameters
Image 2: Schema Columns
Image 3: Adding a Filter Section