Fixed Width File


1. Considerations

  • Geometry, Geography, and Binary data types are not supported for Fixed Width Files.

2. Parameters

  • headerRowsToIgnore: The number of records to ignore at the top of the file before the data starts (this count includes the column header row).

  • footerRowsToIgnore: The number of records to ignore at the bottom of the file.

  • path: The path to the source file to load. To upload a local file, you must first insert a Parameter in the Info tab of the connection (for example, filepath). You would then reference that same value here (for example, @Filepath), which triggers a File Upload option so you can import your file.

  • encoding: The encoding of the file. Defaults to UTF8 if not specified; UTF8_BOM, ASCII, and UTF16 are also supported.

  • maxBufferSize: The maximum buffer size.

  • Auth Type: This field defines the authentication type for your data sync. Cinchy supports "Access Key" and "IAM role". When selecting "Access Key", you must provide the access key and key secret. When selecting "IAM role", a new field appears for you to paste in the role's Amazon Resource Name (ARN). You must also ensure that:
    • The role is configured to have at least read access to the source.
    • The Connections pods' role has permission to assume the role specified in the data sync config.
    The sketch following this list illustrates what these two requirements imply in practice.

Note: The Auth Type field was added in Cinchy v5.6.
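
The two role requirements above can be checked outside of Cinchy. Below is a minimal sketch, assuming a hypothetical role ARN, bucket, and key (none of these come from the product); it uses AWS STS to assume the data sync role and then reads the source object, which is exactly what the roles involved must be permitted to do.

import boto3

# Hypothetical values for illustration only.
SOURCE_ROLE_ARN = "arn:aws:iam::123456789012:role/datasync-source-reader"
BUCKET = "example-source-bucket"
KEY = "exports/fixed-width/Sample.txt"

# 1. The Connections pods' role must be allowed to assume the data sync role.
sts = boto3.client("sts")
creds = sts.assume_role(
    RoleArn=SOURCE_ROLE_ARN,
    RoleSessionName="connections-fixed-width-sync",
)["Credentials"]

# 2. The assumed role must have at least read access to the source file.
s3 = boto3.client(
    "s3",
    aws_access_key_id=creds["AccessKeyId"],
    aws_secret_access_key=creds["SecretAccessKey"],
    aws_session_token=creds["SessionToken"],
)
body = s3.get_object(Bucket=BUCKET, Key=KEY)["Body"].read()
print(f"Read {len(body)} bytes from the source file.")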

In the schema's column definitions, add the length="number" attribute to denote the width of each column.

<FixedWidthDataSource
    headerRowsToIgnore="1"
    footerRowsToIgnore="0"
    path="C:\Users\Cinchy\Sample.csv"
    encoding="UTF8"
    maxBufferSize="10">
    <Schema>
        <Column name="Name" dataType="Text" maxLength="10" length="10" trimWhitespace="true" />
        <Column name="Description" dataType="Text" maxLength="50" length="50" trimWhitespace="true" />
    </Schema>
    <Filter/>
</FixedWidthDataSource>

You can use trimWhitespace to clean up a fixed width file; trimming occurs after the columns have been parsed.
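
To make the relationship between the schema and the file contents concrete, here is a minimal standalone sketch (not the Connections implementation) that mimics the example above: it skips one header row, slices each remaining line into a 10-character Name and a 50-character Description based on the length values, and trims whitespace only after the columns have been parsed.

# Illustrative only: mirrors headerRowsToIgnore="1" and the length/trimWhitespace
# settings from the FixedWidthDataSource example above.
COLUMNS = [("Name", 10), ("Description", 50)]  # (column name, width)
HEADER_ROWS_TO_IGNORE = 1

sample_lines = [
    "Name      Description",              # header row, ignored
    "Widget    A short description of the widget record       ",
    "Gadget    Another example record with trailing spaces     ",
]

for line in sample_lines[HEADER_ROWS_TO_IGNORE:]:
    record, offset = {}, 0
    for name, width in COLUMNS:
        raw = line[offset:offset + width]   # fixed-width slice of the line
        record[name] = raw.strip()          # trimming happens after parsing
        offset += width
    print(record)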
