Monitoring

If your data sync configuration has failed, here are a few items to check:

  • Have your credentials changed in either the source or target (e.g. expired password)?

  • Is your sync key unique in your source and target?

  • Is the configuration entered in the [Cinchy].[Data Sync Configurations] table?

  • If the source is a file, does it exist at the specified location? (A quick check is sketched below this list.)
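
As a quick check for the last item, a short PowerShell sketch like the one below can confirm the source file exists before the sync runs. The path is a placeholder, not a value from your environment:

$sourceFile = "C:\Data\source.csv"   # placeholder path to the sync's source file

# Confirm the file the sync expects to read is actually present
if (Test-Path $sourceFile) {
    Write-Host "Source file found: $sourceFile"
} else {
    Write-Host "Source file is missing; the data sync will not be able to read its source"
}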

Cinchy Tables

When running a data sync interactively, the output screen displays the result of the job on the first line. There are two (2) potential outcomes:

  • Data sync completed successfully

  • Data sync completed with errors (see <temp folder> for error logs)

If the data sync runs on a schedule, there are two (2) tables in the Cinchy domain that can be reviewed to determine the outcome:

  1. Execution Log Table - this is where you can find the output and status of any job executed

    Please note, you can see when the job ran from the Created timestamp on the record (Display Columns -> add the Created column to the view).

  2. Execution Errors Table - this table may have one or more records for a job that failed with synchronization or validation errors

Execution Log Table Schema

  • Execution ID - The number assigned to the executed job; it is incremented by one (1) for each subsequent job.

  • Command - The CLI command that was executed (e.g. Data Sync, Data Export, etc.).

  • Server Name - The name of the server where the CLI was executed. If you run the CLI from a personal computer, this is the name of your computer.

  • File Path - For a Data Sync, if the source is a file, this field contains a link to the file. For a Data Export, this field is a link to the file created by the export. Note that these are paths local to the server where the CLI was executed.

  • Parameters - Any parameters passed on the command line.

  • State - The state of the job (e.g. Succeeded, Failed or Running).

  • Execution Output - The output that would have been displayed if the job had been executed from the command prompt.

  • Execution Time - How long it took to execute the job.

  • Data Sync Config - A link to the name of your configuration file.

Execution Errors Table Schema

  • Error Class - The category of error that was generated (e.g. Invalid Row, Invalid Column, Invalid File, etc.).

  • Error Type - The reason for the error (e.g. Unresolved Link, Invalid Format Exception, Malformed Row, Max Length Violation, etc.).

  • Column - The name of the column that generated the error.

  • Row - The row number(s) of the source records that generated the error.

  • Row Count - The number of records affected by this error.

  • Execution ID - A link that ties the error back to the Execution Log.

Tip

To check from a script whether the job was successful, the job returns one of three (3) exit codes:

  • 0 - Completed without errors

  • 1 - Execution failed

  • 2 - Completed with validation errors

Sample Code

$CLICommand = "dotnet C:\CinchyCLI\Cinchy.CLI.dll syncdata -q ..."
Invoke-Expression $CLICommand
switch ($LASTEXITCODE) {
    0 { Write-Host "Completed without errors" }
    1 { Write-Host "Execution failed" }
    2 { Write-Host "Completed with validation errors" }
}
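
If the script is launched by a scheduler (e.g. Windows Task Scheduler), it can also help to propagate the CLI's exit code so failed runs show up in the scheduler's history. A minimal sketch that could follow the switch block above (the message text is illustrative, not Cinchy output):

# Surface the CLI exit code to the calling scheduler
if ($LASTEXITCODE -ne 0) {
    Write-Host "Data sync did not complete cleanly (exit code $LASTEXITCODE)"
}
exit $LASTEXITCODE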

Logs

The syncdata command uses the folder specified by the -d parameter on the command line to create and store temporary files. If the data sync is successful, all temporary files are automatically purged. However, if there is an error, the following CSV files will remain (a sketch for locating them follows the list):

  • ExecutionLogID_SourceErrors.csv

  • ExecutionLogID_SyncErrors.csv

  • ExecutionLogID_TargetErrors.csv
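
To spot these files after a failed run, you can list the temporary folder from PowerShell. A minimal sketch, assuming the -d parameter pointed at C:\CinchyCLI\temp (an example path, not a default):

$tempFolder = "C:\CinchyCLI\temp"   # assumed value of the -d parameter

# Any remaining *Errors.csv files indicate a failed or partially failed sync
Get-ChildItem -Path $tempFolder -Filter "*Errors.csv" |
    Sort-Object LastWriteTime -Descending |
    Select-Object Name, LastWriteTime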

For more information, refer to the documentation on detailed error messages.
