In this tutorial we will discuss some scenario-based questions and their solutions using SAP Data Services. This article is meant mainly for Data Services beginners.

Consider the following Source data in a flat file:


Scenario 1: Let's try to load the cumulative sum of the department salaries into the target table. The target table data should look like below:
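To make the expected result concrete, here is a quick illustration with hypothetical salary data (the actual tutorial data comes from the flat-file source): the cumulative-sum column simply carries a running total of SALARY across the rows in processing order.

```python
# Hypothetical source rows (DEPTNO, SALARY) -- not the tutorial's actual file data.
rows = [(10, 1000.0), (20, 2000.0), (10, 1500.0), (30, 3000.0)]

cume_sum = 0.0
target = []
for deptno, salary in rows:
    cume_sum += salary                    # running total across all processed rows
    target.append((deptno, salary, cume_sum))

for r in target:
    print(r)
```

Each target row keeps the source columns and adds the running total up to and including that row.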



1. Let us first define the Source File Format. The same file format will be reused for the next set of scenario questions.

File Format

2. Next we create a new Batch Job, say JB_SCENARIO_DS. Within the Job we create a Data Flow, say DF_SCENARIO_1.

3. At the Data flow level, i.e. Context DF_SCENARIO_1, we insert a new Parameter using the Definitions tab. Let's name it $PREV_SAL, with Data type decimal(10,2) and Parameter type Input.

Parameters- Data flow

Parameter Properties

At the Job level, i.e. Context JB_SCENARIO_DS, we initialize the Parameter $PREV_SAL using the Calls tab. We set the Argument value to 0.00.

Parameters- Job

Parameter Value

4. Next we create a new Custom Function from the Local Object Library. Let's name it CF_CUME_SUM_SAL.

Custom Function

Within the Custom Function Smart Editor, we first insert two Parameters, $CURR_SAL and $PREV_SAL, both with Data type decimal(10,2) and with Parameter types Input and Input/Output respectively.

Custom Function Definition

We also modify the Return Parameter Data type to decimal(10,2).

5. Next we define the custom function body as below and Validate it.

$PREV_SAL = $PREV_SAL + $CURR_SAL;
Return $PREV_SAL;

The purpose of defining the Parameter and the Custom Function is to perform Parameter short-circuiting. Within the function, we add the current row's salary to $PREV_SAL, so $PREV_SAL holds the sum of salaries up to the current processing row. Since $PREV_SAL is of type Input/Output, the calculated running sum is passed back into the Data flow Parameter after every call. So, by using a Custom Function, we can modify and pass values back to a Data flow Parameter; in effect, the Parameter defined at the Data flow level is short-circuited with the Input/Output Parameter of the Custom Function.
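The feedback mechanism of the Input/Output parameter can be sketched outside Data Services, for example in Python: the value returned by the function is fed back into the parameter before the next row is processed. The names mirror the tutorial's $CURR_SAL and $PREV_SAL, but this is only an illustration of the logic, not DS script.

```python
def cf_cume_sum_sal(curr_sal, prev_sal):
    """Mimics CF_CUME_SUM_SAL: add the current salary to the running
    total and return it; Data Services would pass it back to the
    Data flow parameter via the Input/Output parameter."""
    prev_sal = prev_sal + curr_sal
    return prev_sal

prev_sal = 0.00                          # $PREV_SAL, initialized at the Job level
salaries = [1000.0, 2000.0, 1500.0]      # hypothetical SALARY values, one per row
for s in salaries:
    # The return value is written back into prev_sal, just as the
    # Input/Output parameter updates $PREV_SAL after each row.
    prev_sal = cf_cume_sum_sal(s, prev_sal)
    print(prev_sal)
```

After the last row, the parameter holds the total of all salaries processed so far, which is exactly the value emitted into the cumulative-sum output column for that row.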

6. Let's go back and design the Data flow. First of all, we take the File Format defined earlier from the Local Object Library and use it as the Source.

Data flow

7. Next we place a Query transform, say QRY_CUME_SUM. First we select the columns DEPTNO and SALARY from the Schema In of the Query transform and map them to the output.

Next we specify a New Function Call in the Schema Out of the Query transform. Choose Custom Functions from the Function categories and select the Function name CF_CUME_SUM_SAL.

Next we Define Input Parameters. We specify the inputs as below:


Function Input Parameters

Select the Return column as the Output Parameter.

Query transform

8. Finally we place a Template Table as Target in the Target Datastore.

Data Preview

Next scenario: Getting the value from the previous row in the current row.

