Delimited File


1. Considerations

  • Geometry, Geography, and Binary data types are not supported for Delimited Files.

2. Load Metadata

[Image 1: Load the Metadata]

Source

The source of your file. This can be a local upload, Amazon S3, or Azure Blob Storage.

delimiter

The delimiter character. Use U+#### syntax (e.g., U+0001) for Unicode characters.

textQualifier

The text qualifier character, e.g., " (which would be represented as "").

headerRowsToIgnore

The number of records to ignore from the top of the file before the data starts (this count includes the column header row). See the note below for more on this parameter.

encoding

The encoding of the file. Defaults to UTF8 if not specified; UTF8_BOM, ASCII, and UTF16 are also supported.

path

The path to the source file to load. To upload a local file, first insert a parameter in the Info tab of the connection (e.g., filepath), then reference that value here (e.g., @Filepath). This triggers a File Upload option for importing your delimited file.

Auth Type

This field defines the authentication type for your data sync. Cinchy supports "Access Key" and "IAM role". When selecting "Access Key", you must provide the key and key secret. When selecting "IAM role", a new field will appear for you to paste in the role's Amazon Resource Name (ARN). You must also ensure that:

  • The role is configured to have at least read access to the source
  • The Connections pods' role has permission to assume the role specified in the data sync config

Note: This field was added in Cinchy v5.6.

Note: If you use both useHeaderRecord="true" and headerRowsToIgnore="1", two rows will be ignored in total. Refer to the following to ensure you get the results you want:

  • One row as headers: useHeaderRecord="true" and headerRowsToIgnore="0"
  • Two rows as headers: useHeaderRecord="true" and headerRowsToIgnore="1"
  • Three rows as headers: useHeaderRecord="true" and headerRowsToIgnore="2"
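
As a reference point, the sketch below shows how these parameters might appear together in a data sync XML config. The <DelimitedDataSource> element name and the @Filepath parameter are illustrative assumptions; see the Delimited File Data Source XML Example page for the authoritative format.

<!-- Sketch only: element name and @Filepath parameter are assumptions -->
<DelimitedDataSource
    path="@Filepath"
    delimiter=","
    textQualifier="&quot;"
    encoding="UTF8"
    useHeaderRecord="true"
    headerRowsToIgnore="0">
    <!-- Schema columns go here; see the Schema section below -->
</DelimitedDataSource>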

3. Schema

[Image 2: Schema]

Add in your applicable column(s) (Image 2). See the Schema Columns documentation for further details on each column type.

Note: Geometry, Geography, and Binary data types are not supported for Delimited Files.
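
As a hedged sketch, schema columns in the XML config generally follow the pattern below. The column names here are hypothetical examples; the full attribute set is documented on the Schema Columns page.

<!-- Sketch only: column names are hypothetical examples -->
<Schema>
    <Column name="EmployeeId" dataType="Number" />
    <Column name="FullName" dataType="Text" />
</Schema>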

4. Filter

[Image 3: Adding a Filter]

You may choose to use CQL to create a filter on the source (Image 3). Review the Source Filter documentation for more on filters.
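
For illustration, a source filter is a CQL condition applied to incoming records. A minimal sketch follows, assuming a hypothetical [Status] column; confirm the exact element placement on the Source Filter page.

<!-- Sketch only: [Status] is a hypothetical column -->
<Filter>[Status] = 'Active'</Filter>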