Error handling and recoverability are important aspects of any ETL tool. Some ETL tools have in-built error-handling and automatic recovery mechanisms in place.

Alternatively, we usually design our ETL framework to handle runtime errors or exceptions. The three main goals of a robust ETL framework are error handling and logging, recoverability, and restartability. If a job does not complete properly, we must fix the problems that prevented its successful execution and run the job again. However, during the failed execution, some dataflows in the job may have completed and some tables may have been loaded, partially loaded, or altered. Therefore, we need to design ETL jobs to be recoverable, i.e. we can rerun the job and retrieve all the data without duplicate or missing rows. Various techniques can be used to recover from unsuccessful job executions; a minimal, tool-agnostic sketch of one such pattern follows.
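
As an illustration of the recoverability idea (a generic pattern, not a Data Services feature), the Python sketch below records each completed load step in a hypothetical checkpoint table and uses delete-and-reload so that a rerun after a failure neither duplicates nor misses data. All table, step and function names are assumptions made for the example.

    # A minimal, tool-agnostic sketch of a restartable load (hypothetical
    # table and step names; SAP Data Services provides its own automatic
    # recovery and recovery-unit options instead of hand-written code).
    import sqlite3

    def run_step(conn, step_name, load_fn):
        """Run a load step only if it has not already completed successfully."""
        done = conn.execute(
            "SELECT 1 FROM etl_checkpoint WHERE step = ? AND status = 'OK'",
            (step_name,),
        ).fetchone()
        if done:
            return  # step finished in an earlier run; skipping keeps the rerun idempotent
        load_fn(conn)
        conn.execute(
            "INSERT INTO etl_checkpoint (step, status) VALUES (?, 'OK')",
            (step_name,),
        )
        conn.commit()

    def load_customers(conn):
        # delete-and-reload keeps the step rerunnable without duplicate rows
        conn.execute("DELETE FROM customers_stg")
        conn.executemany(
            "INSERT INTO customers_stg (id, name) VALUES (?, ?)",
            [(1, "Alice"), (2, "Bob")],
        )

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE etl_checkpoint (step TEXT, status TEXT)")
    conn.execute("CREATE TABLE customers_stg (id INTEGER, name TEXT)")
    run_step(conn, "load_customers", load_customers)  # first run loads the table
    run_step(conn, "load_customers", load_customers)  # a rerun is a no-op

Within Data Services itself, the comparable built-in choice is to enable automatic recovery on the batch job, or to design dataflows so that a complete rerun produces the same result.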

Let us check the in-built features available in SAP Data Services. Recovery mechanisms are available in SAP Data Services for batch jobs only.

Data Services Error Log

The Error log helps us determine the reason for a job execution failure. It lists errors generated by Data Services, by the source or target DBMS, or by the operating system. If the execution completed without error, the Error log is blank. Trace logs, on the other hand, are used to determine where an execution failed. Let us check the significance of the columns in an Error log.

  • Pid - The process identification number of the executing thread.
  • Tid - The thread identification number of the executing thread.
  • Number - An error number prefix followed by a number.
  • TimeStamp - The date and time when the thread generated the message.
  • Message - A description of the error that occurred while the thread was executing.

Now let us check the significance of the Error number prefixes:

  • ADM - Administration errors.
  • BAP - BAPI errors.
  • BIW - SAP BW errors.
  • CON - Connection errors. The connection indicated could not be initialized or failed during execution.
  • DBS - Database management system errors.
  • EML - Email errors.
  • FIL - Filespec errors.
  • OPT - Optimization errors.
  • PAR - Parser errors.
  • R3C - SAP connectivity errors.
  • R3S - SAP syntax errors.
  • REP - Repository errors.
  • RES - Resolver errors.
  • RUN - Runtime errors.
  • SCH - Job launcher errors.
  • SRV - Job Server errors.
  • SYS - System exceptions.
  • USR - User function errors.
  • VAL - Validator errors.
  • XRN - Transform errors.
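
To make the column meanings and prefixes concrete, here is a small, purely illustrative Python sketch that splits a single error-log entry into the fields described above and classifies it by its prefix. The pipe delimiter, the sample error number and the sample message are assumptions made for the example, not the actual on-disk log layout.

    # Illustrative only: field order mirrors the Error log columns described
    # above (Pid, Tid, Number, TimeStamp, Message); the "|" delimiter and the
    # sample values are assumptions, not the real log file layout.
    ERROR_PREFIXES = {
        "CON": "Connection errors",
        "DBS": "Database management system errors",
        "FIL": "Filespec errors",
        "RUN": "Runtime errors",
        "SYS": "System exceptions",
        # ... the remaining prefixes from the list above can be added the same way
    }

    def parse_error_entry(entry):
        pid, tid, number, timestamp, message = entry.split("|", 4)
        prefix = number.split("-")[0]  # e.g. "DBS-070401" -> "DBS"
        return {
            "pid": int(pid),
            "tid": int(tid),
            "number": number,
            "category": ERROR_PREFIXES.get(prefix, "Unknown prefix"),
            "timestamp": timestamp,
            "message": message,
        }

    sample = "5204|1361|DBS-070401|2024-01-15 10:42:07|ODBC call to the target datastore failed"
    print(parse_error_entry(sample))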

Error Handling for File Formats

During job execution, Data Services processes flat-file sources one row at a time. We can configure the File Format Editor to identify rows in flat-file sources that contain the following types of errors:

  • Data Conversion Errors — For example, a field might be defined in the File Format Editor as having a data type of integer, but the data encountered is actually varchar.
  • Row Format Errors — In the case of a fixed-width file, the software identifies a row that does not match the expected width value. (A small illustrative check for both error types follows this list.)
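
Purely as an illustration of what these two checks mean (the real checks run inside Data Services while it reads the flat file), the sketch below flags a delimited row either for a wrong column count, the delimited-file analogue of a width mismatch, or for a value that cannot be converted to integer. The delimiter, column list and sample rows are assumptions.

    # Illustrative only: mimics the two flat-file error types; Data Services
    # performs these checks itself while reading the source file.
    def check_row(raw_line, delimiter, expected_cols, int_columns):
        """Return an error description for the row, or None if the row is valid."""
        fields = raw_line.rstrip("\n").split(delimiter)
        if len(fields) != expected_cols:
            return f"row-format error: expected {expected_cols} columns, got {len(fields)}"
        for idx in int_columns:
            try:
                int(fields[idx])
            except ValueError:
                return f"data-conversion error in column {idx + 1}: {fields[idx]!r} is not an integer"
        return None

    print(check_row("101;Alice;42", ";", 3, [0, 2]))          # None -> valid row
    print(check_row("102;Bob;not_a_number", ";", 3, [0, 2]))  # data-conversion error
    print(check_row("103;Carol", ";", 3, [0, 2]))             # row-format error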

When enabled, the Error Handling set of properties in the File Format Editor performs the following actions:

  • Log data-type conversion or row-format warnings to the Error log.
  • Limit the number of warnings to log without stopping the job.
  • Capture or write the errors to an error file.
  • Check for either of the two types of source flat-file errors (data conversion and row format).
  • Write the invalid rows to a specified error file on the Job Server.
  • Stop processing the source file after reaching a specified number of invalid rows.
  • The error file format is a semicolon-delimited text file.
  • The contents of the error file are:
  1. Source file path and name, since multiple input source files can be processed using the same file format in the dataflow.
  2. Row number of the error record in the source file.
  3. Detailed error description for the rejected record.
  4. Column number where the error occurred.
  5. All column values of the invalid row, separated by semicolons.
  • If the file format's Parallel process threads option is set to any value greater than 0 or {none}, the row number in source file value will be -1 for the invalid records. A short sketch of reading such an error file follows this list.
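
The Python sketch below reads such an error file; the field order follows the numbered list above, while the file name in the commented usage lines and the assumption that values are plain semicolon-separated tokens (no quoting) are illustrative assumptions.

    # Reads a semicolon-delimited error file; field order follows the list
    # above: source file, row number, error description, column number, then
    # the rejected row's own column values.
    def read_error_file(path):
        rejects = []
        with open(path, newline="") as fh:
            for line in fh:
                fields = line.rstrip("\n").split(";")
                if len(fields) < 5:
                    continue  # skip blank or malformed lines
                rejects.append({
                    "source_file": fields[0],
                    "row_number": int(fields[1]),   # -1 when parallel process threads are used
                    "error": fields[2],
                    "column_number": fields[3],
                    "row_values": fields[4:],       # one entry per column of the invalid row
                })
        return rejects

    # Hypothetical usage:
    # for record in read_error_file("customer_src_error_file.txt"):
    #     print(record["row_number"], record["error"])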


