In this article we will cover the common scenarios related to flat files in SAP Data Services.

Requirement 1: Generate a separate flat file containing employee information for each department. In other words, for each department present in our employee database table we will generate a flat file dynamically.

  1. First of all we will create a Job in Data Services. The main objective is to generate flat files dynamically using the While Loop in Data Services. This is how the final implementation of the Job will look:

    Job to generate dynamic flat files

  2. First we will design a dataflow that populates a temporary template table with the distinct set of departments present in the EMP table. This table will contain the list of values (LOV) of the departments together with a sequence number. We will use this sequence number to loop through the available LOVs using the While Loop.

    Dataflow to populate LOV

  3. Select the EMP database table as Source. Next place a Query transform to select the DISTINCT department numbers present in the EMP table.

    QUERY DISTINCT Department

  4. Next we will generate a sequence number for each of the distinct departments using the built-in Data Services function gen_row_num().

    QUERY Generate Row Number
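As a sketch of this step: assuming the output column is named SEQ_NO (a name chosen here for illustration), its mapping in the Query transform is simply the built-in function:

```
SEQ_NO:  gen_row_num( )
```

gen_row_num() returns 1 for the first row processed and increments by 1 for each subsequent row, so the distinct departments end up numbered 1 to N.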

  5. Lastly we place a template table as the target table to load the distinct set of departments from the employee table.

    Target Template table

  6. Next we define the following variables and initialize their values to be used in the While Loop.

    Data Services Variables

  7. Next we set the variable $INIT_CNT to 0. This variable will be used as the counter of the While Loop. The variable $LOOP_CNT is set to the maximum sequence number, i.e. the number of distinct departments in the employee table. This is the number of times the While Loop will execute, which in turn generates an employee flat file for each department.

    Variable Initialization Script
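The initialization script can be sketched as below in the Data Services scripting language. The datastore name DS_STAGE and the LOV table name LOV_DEPT are assumptions for illustration; substitute your own names:

```
# Reset the While Loop counter
$INIT_CNT = 0;

# Number of iterations = highest sequence number in the LOV table,
# i.e. the count of distinct departments
$LOOP_CNT = sql('DS_STAGE', 'SELECT MAX(SEQ_NO) FROM LOV_DEPT');
```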

  8. Now we take a look inside the While Loop to see the final implementation logic that generates a flat file for each department. The logic of the While Loop is simple: it iterates once for each distinct department.

    WHILE Loop
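The condition of the While Loop simply compares the counter against the total number of departments, e.g.:

```
$INIT_CNT < $LOOP_CNT
```

Since the counter is incremented inside the loop body, this condition stops the loop after exactly $LOOP_CNT iterations.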

  9. The below script sets the department number and the name of the file to be generated by the following dataflow. Here the counter variable is incremented by 1. Next the variable $DEPTNO is set to the department value whose sequence number matches the counter, as present in the LOV department table we loaded earlier. The corresponding filename to be generated is set in the variable $FILENAME. So with each iteration of the While Loop, the script sets the department for which data is to be extracted from the employee table, along with the name of the corresponding department file.

    Script to set Department
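A sketch of this script, again using the assumed datastore DS_STAGE and LOV table LOV_DEPT, and an assumed file-name pattern:

```
# Move to the next department in the LOV table
$INIT_CNT = $INIT_CNT + 1;

# Pick the department whose sequence number matches the counter;
# [$INIT_CNT] substitutes the variable value into the SQL string
$DEPTNO = sql('DS_STAGE', 'SELECT DEPTNO FROM LOV_DEPT WHERE SEQ_NO = [$INIT_CNT]');

# Build the output file name for this department, e.g. DEPT10.txt
$FILENAME = 'DEPT' || $DEPTNO || '.txt';
```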

  10. Next we come to the final dataflow, which extracts the data from the EMP table and generates a flat file for each department. In order to pass the variable values we set earlier into the dataflow, we define the following input parameters at the dataflow level and assign the variable values to these parameters at the workflow level.

    Define Parameters

    Assign Parameters values

  11. Now the final dataflow looks like below. We have the EMP table as source followed by a Query transform and lastly the flat file target.

    Dataflow to generate files

  12. In the Query transform we filter the employee records for the department number set in the dataflow parameter. Additionally, we generate a serial number for the employees in each of the flat files.

    QUERY transform to filter DEPTNO
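As a sketch, with an assumed parameter name $P_DEPTNO and an assumed serial-number column SRL_NO, the Query transform would use:

```
# WHERE clause: keep only the rows for the current department
EMP.DEPTNO = $P_DEPTNO

# Mapping of the serial-number column
SRL_NO:  gen_row_num( )
```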

  13. Lastly we place the file format of the generated employee file for each department. Note that we need to set the File name property of the file format to the dataflow parameter we assigned earlier.

    Target Flat file Format

We are done with our first requirement.

Requirement 2: Now we try the reverse scenario, i.e. load the database employee table with the information received from source systems in the form of flat files. These flat files contain employee information for each department, and the number of files or departments is not known in advance. Note that part of the flat-file name bears the department information, which also needs to be mapped to the DEPTNO column of the employee table.

Here we go:

  1. We will design a dataflow to read all the input flat files containing employee information, extract the department number from the filename, and populate the target relational table. Below is how the dataflow looks:

    Dataflow to read multiple Flat file Format

  2. First of all we place the file format as our source. Note the filename uses a wildcard pattern, i.e. DEPT*.txt, which means it will read all files in the given root directory whose names start with DEPT and have the extension .txt. Next, the Include file name column property in the Source Information of the file format is set to Yes, with the column name FILENAME. This makes the filename available as a column we can use in the dataflow transforms.

    Source Flat file Format

    Source Information Flatfile Format

  3. Next we use a Query transform to format the dataset. We will extract the department number from the filename itself, as shown below:

    Query Format
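One way to derive the department number, assuming file names like DEPT10.txt and that the FILENAME column holds the file name only (both assumptions based on the set-up above), is to strip the fixed prefix and extension and convert the remainder to a number:

```
# DEPT10.txt -> '10' -> 10
DEPTNO:  to_decimal( replace_substr( replace_substr( FILENAME, 'DEPT', '' ),
                                     '.txt', '' ), '.', ',', 0 )
```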

  4. Lastly we place the Target Template table.

    Target Template table

Hence our second requirement is also done.

