In this tutorial we will discuss some scenario-based questions and their solutions using SAP Data Services. This article is meant mainly for Data Services beginners.

Consider the following Source data in a flat file:

DEPTNO  SALARY
10      1000
20      2000
30      3000
40      4000

Scenario 1: Let's load the cumulative sum of the department salaries into the target table. The target table data should look like below:

DEPTNO  SALARY  CUMULATIVE_SALARY
10      1000    1000
20      2000    3000
30      3000    6000
40      4000    10000
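
Before building the Job, the expected output is easy to sanity-check outside Data Services. Below is a minimal Python sketch (purely illustrative, not Data Services code) that reproduces the CUMULATIVE_SALARY column from the source salaries:

from itertools import accumulate

deptnos = [10, 20, 30, 40]            # DEPTNO column from the source file
salaries = [1000, 2000, 3000, 4000]   # SALARY column from the source file

# accumulate() yields the running total after each row, which is
# exactly the CUMULATIVE_SALARY value we want to load per department
for deptno, cume in zip(deptnos, accumulate(salaries)):
    print(deptno, cume)               # 10 1000 / 20 3000 / 30 6000 / 40 10000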

Solution:

1. Let us first define the Source File Format. This same file format will be reused for the next set of scenario questions.

[Image: File Format]

2. Next we create a new Batch Job, say JB_SCENARIO_DS. Within the Job we create a Data Flow, say DF_SCENARIO_1.

3. At the Data Flow level, i.e. Context DF_SCENARIO_1, we Insert a new Parameter using the Definitions tab. Let's name it $PREV_SAL, with Data type decimal(10,2) and Parameter type Input.

[Image: Parameters - Data flow]

[Image: Parameter Properties]

At the Job level, i.e. Context JB_SCENARIO_DS, we initialize the Parameter $PREV_SAL using the Calls tab. We set the Argument value to 0.00.

[Image: Parameters - Job]

[Image: Parameter Value]

4. Next we create a New Custom Function from the Local Object Library. Let's name it CF_CUME_SUM_SAL.

[Image: Custom Function]

Within the Custom Function Smart Editor, we first Insert two Parameters, $CURR_SAL and $PREV_SAL, both with Data type decimal(10,2) and with Parameter types Input and Input/Output respectively.

[Image: Custom Function Definition]

We also modify the Return Parameter Data type to decimal(10,2).

5. Next we define the custom function body as below and Validate it.

# Add the current row's salary to the running total retained in $PREV_SAL
$PREV_SAL = $CURR_SAL + $PREV_SAL;
# Return the running total; being Input/Output, $PREV_SAL also carries it back to the Data Flow
Return $PREV_SAL;

The purpose of defining the Parameter and the Custom Function this way is to perform Parameter short-circuiting. Within the function we set the Input/Output Parameter $PREV_SAL to the sum of salaries up to the current processing row. Because it is of type Input/Output, the calculated running total is passed back into the Data Flow Parameter after each call, so the call for the next row starts from the retained sum. In other words, a Custom Function can modify and pass values back to a Data Flow Parameter: the Parameter defined at the Data Flow level is short-circuited with the Input/Output Parameter of the Custom Function.
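
To make the mechanism concrete, here is a small Python analogy (an illustration of the behaviour only, not Data Services code). The one-element list plays the role of the Input/Output Parameter $PREV_SAL, which retains its value from one row's function call to the next:

def cf_cume_sum_sal(curr_sal, prev_sal):
    # Mirrors the custom function body:
    # $PREV_SAL = $CURR_SAL + $PREV_SAL; Return $PREV_SAL;
    prev_sal[0] = curr_sal + prev_sal[0]
    return prev_sal[0]

prev_sal = [0.00]   # the Data Flow Parameter $PREV_SAL, initialized to 0.00 at the Job level
for salary in [1000, 2000, 3000, 4000]:
    print(cf_cume_sum_sal(salary, prev_sal))   # 1000, 3000, 6000, 10000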

6. Let's go back and design the Data Flow. First of all we take the File Format defined earlier from the Local Object Library and use it as the Source.

[Image: Data flow]

7. Next we place a Query transform, say QRY_CUME_SUM. First we select the columns DEPTNO and SALARY from the Schema In of the Query transform and Map to Output.

Next we specify a New Function Call in the Schema Out of the Query transform. Choose Custom Functions from the Function categories and select the Function name CF_CUME_SUM_SAL.

Next we Define Input Parameters. We specify the inputs as below:

$CURR_SAL = FF_SRC_DEPT.SALARY
$PREV_SAL = $PREV_SAL

[Image: Function Input Parameters]

Select the Return column as the Output Parameter.

[Image: Query transform]
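
Putting steps 6 and 7 together, the Data Flow behaves row by row like the following Python sketch (again purely illustrative; it repeats the function analogy from the sketch above, and the names mirror the objects defined earlier):

def cf_cume_sum_sal(curr_sal, prev_sal):
    prev_sal[0] = curr_sal + prev_sal[0]   # $PREV_SAL = $CURR_SAL + $PREV_SAL;
    return prev_sal[0]                     # Return $PREV_SAL;

prev_sal = [0.00]                                               # $PREV_SAL, set in the Calls tab
ff_src_dept = [(10, 1000), (20, 2000), (30, 3000), (40, 4000)]  # source File Format rows

# QRY_CUME_SUM: DEPTNO and SALARY are mapped through unchanged, and the
# function's Return column becomes the CUMULATIVE_SALARY output column
target = [(deptno, salary, cf_cume_sum_sal(salary, prev_sal))
          for deptno, salary in ff_src_dept]
print(target)   # [(10, 1000, 1000), (20, 2000, 3000), (30, 3000, 6000), (40, 4000, 10000)]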

8. Finally we place a Template Table as Target in the Target Datastore.

[Image: Data Preview]

Read the next scenario - Getting the value from the previous row in the current row.

