We can use the Data Services Scripting Language to write scripts and custom functions that implement complex logic.

Jobs and work flows can use scripts to define detailed steps in the flow of logic. A script can run functions and assign values to variables, which can then be passed to other steps in the flow.

Expressions are a combination of constants, operators, functions, and variables that evaluate to a value of a given data type. Expressions can be used inside script statements or added to data flow objects.
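For instance, a single script step can assign a value to a variable and print it for later steps to use (a minimal sketch; $G_Run_Date is an illustrative global variable, not part of any standard setup):

    # Capture the job start date into a global variable for use in later steps
    $G_Run_Date = sysdate( );
    print( 'Job started on: [$G_Run_Date]' );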

Scripting Guidelines

  • Statements in a script object or custom function must end with a semicolon (;).
  • Comment lines must start with a # character.
  • Use single quotes for string constants.
  • Use the backslash (\) as the escape character.
    E.g. 
    single quote (') -> ' WHERE NAME LIKE \'SAURAV\' '
    backslash (\) -> 'C:\\DS'
    

    Also, strings containing curly braces or square brackets cause a processing error. We can avoid the error by preceding each brace or bracket with a backslash (\).

  • Data Services does not strip trailing blanks from strings used in scripts or custom functions. To remove trailing blanks, we can use the rtrim or rtrim_blanks function.
  • Variable Names must be preceded by a dollar sign ($).
  • Global Variables used in a script or expression must be defined at the Job level using the Variables and Parameters window.
  • Local Variables used in a script or expression must be defined in the Job or Workflow context that calls the script, using the Variables and Parameters window; Local variables used in a Custom Function must be defined using the Smart Editor.
  • Square brackets ([]) indicate that the enclosed expression should be evaluated and its value substituted into the string.
  • Curly braces ({}) indicate that the value of the enclosed expression should be substituted and enclosed in single quotation marks.
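To illustrate the two substitution styles (a sketch; $G_Load_Date is an illustrative global variable assumed to hold a date string):

    # [] substitutes the value as-is
    print( 'Loading data for [$G_Load_Date]' );
    
    # {} substitutes the value wrapped in single quotes, which is handy inside SQL
    sql( 'DataStore_Name', 'DELETE FROM SALES WHERE load_date = {$G_Load_Date}' );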

Sample Examples

  1. #Drop Table if it exists in Database
    sql( 'DataStore_Name', 'IF EXISTS(SELECT NAME FROM DBO.SYSOBJECTS 
     WHERE NAME = \'Table_Name\' AND XTYPE = \'U\') DROP TABLE Table_Name');
    
    #Create Table in Database
    sql( 'DataStore_Name', 'CREATE TABLE Table_Name( Col1 VARCHAR(10), 
     Col2 INT )' );
    print( 'Table Created' );
    
  2. #Print Total Records in a Database Table
    $LV_Count = total_rows(DataStore_Name.Owner_Name.Table_Name);
    print( 'Total Records in Table: ' || $LV_Count );
    
    #Alternatively
    print( 'Total Records in Table: 
     [total_rows(DataStore_Name.Owner_Name.Table_Name)]' );
    
  3. #Delete Data from table
    
    $LV_Load_Date = sql('DataStore_Name','SELECT MAX(load_date) 
     FROM Batch_Control_Tbl');
    
    sql( 'DataStore_Name', 'DELETE FROM TABLE_NAME WHERE load_date =
     {$LV_Load_Date}');
    
  4. #  Various ways to define select
    sql( 'DataStore_Name', 'SELECT sal FROM emp WHERE EMPNO ='
     || $LV_EMPNO );
    sql( 'DataStore_Name', 'SELECT sal FROM emp WHERE ENAME LIKE
     {$LV_ENAME}' );
    sql( 'DataStore_Name', 'SELECT sal FROM emp WHERE ENAME LIKE 
     \'' || $LV_ENAME || '\'' ); 
    
  5. #Copy Contents of two files into a single file (Windows)
    print(exec('cmd', '/C copy "C:\\POC\\Header.txt"+"C:\\POC\\Detail.txt" 
     "C:\\POC\\ConvertedFile.txt"', 8));
    

Other useful custom scripts

  1. Drop a database table after determining the database type:
    # $G_Table_Name is the global variable to hold table name say STG_COMPLAIN
    # DS_SRC_REPO is the Data Store name
    # $V_String is a local variable
    
    #1 For Microsoft_SQL_Server
    if (upper( db_type( 'DS_SRC_REPO')) = 'MICROSOFT_SQL_SERVER')
    begin
    $V_String = 'SELECT NAME FROM sys.objects WHERE NAME = {$G_Table_Name}';
    end
    #2 For Oracle
    if (upper( db_type( 'DS_SRC_REPO')) = 'ORACLE')
    begin
    $V_String = 'SELECT TABLE_NAME FROM USER_TABLES WHERE TABLE_NAME = {$G_Table_Name}';
    end
    
    # If table exists, then drop
    if ( sql( 'DS_SRC_REPO',$V_String) IS not null )
    begin
    sql( 'DS_SRC_REPO','DROP TABLE '||$G_Table_Name);
    sql( 'DS_SRC_REPO','CREATE TABLE [$G_Table_Name] ( COMPLAIN_ID int ) ');
    end
    
  2. Set the repository password to the repository user name unless it is supplied via a global variable at job execution
    # If the repository password is not the same as the repository user name, 
    # we need to set the global variable $G_Repo_Password beforehand
    
    $G_Repo_Password = nvl($G_Repo_Password,datastore_field_value('DS_SRC_REPO','user'));
    
  3. Test repository connection at run-time
    # $G_Repo_Server is the global variable for Repository Server name 
    # $G_Repo_Username is the global variable for Repository User name;
    # $G_Repo_Password is the global variable for Repository Password;
    # $G_Repo_Name is the global variable for Local Repository name;
    
    #1 For Microsoft_SQL_Server with SQL server Authentication
    IF(db_type('DS_SRC_REPO') = 'Microsoft_SQL_Server' 
    and datastore_field_value('DS_SRC_REPO','mssql_windows_authentication') <> 'yes')
    begin
    $G_Repo_Server = nvl($G_Repo_Server,datastore_field_value('DS_SRC_REPO','sql_server_dataserver')); 
    $G_Repo_Username = nvl($G_Repo_Username,datastore_field_value('DS_SRC_REPO','user'));
    $G_Repo_Password = nvl($G_Repo_Password,datastore_field_value('DS_SRC_REPO','user'));
    $G_Repo_Name = nvl($G_Repo_Name,datastore_field_value('DS_SRC_REPO','sql_server_database'));
    $G_Repo_Info_for_al_engine ='-U[$G_Repo_Username] -P[$G_Repo_Password] 
    -S[$G_Repo_Server] -NMicrosoft_SQL_Server -Q[$G_Repo_Name]';
    
    print('INFO - The database type of the repository is: Microsoft_SQL_Server with SQL server Authentication');
    end 
    
    # Test repository connection 
    # Successful message: Connected successfully to the repository. (BODI-300098)
    # Unsuccessful message: Could not connect to the repository. (BODI-300099)
    $V_Test_Repo = exec('al_engine.exe',' '||$G_Repo_Info_for_al_engine||' -test_repo',0);
    print('INFO - Test repository connection: '||$V_Test_Repo);
    If(index( $V_Test_Repo ,'BODI-300098',1) is null )
    begin
    print('ERROR - Could not connect to the repository <[$G_Repo_Name]>'
    ||',please check the value of the variables $G_Repo_Server,$G_Repo_Name,$G_Repo_Username and $G_Repo_Password.');
    raise_exception('ERROR - Could not connect to the repository <[$G_Repo_Name]>'
    ||',please check the value of the variables $G_Repo_Server,$G_Repo_Name,$G_Repo_Username and $G_Repo_Password.');
    end 
    
  4. Import database table at run-time
    #IMPORT DATABASE TABLES INTO DATASTORE
    print( 'Info - Importing table <'|| $G_Table_Name || '>');	
    
    $L_Message = exec( 'al_engine.exe',' '|| $G_Repo_Info_for_al_engine
    ||' -I' || 'DS_SRC_REPO.DBO.' || $G_Table_Name, 0);
    
    If( $L_Message is null)
    begin
    print('Info - The table <'|| $G_Table_Name || '> is imported successfully');
    end
    else
    begin
    print( $L_Message );
    end
    
  5. Copy the error file to custom error directory at Catch block (Windows)
    print( 'Exception Handling' );
    print( error_number( ) );
    print( error_context( ) );
    print( error_message( ) );
    $LOC = get_error_filename( );
    $LOC = replace_substr( $LOC , '/', '\\' );
    
    $FILE_NAME = 'C:\\TEMP\\Err_Customer_' || to_char( sysdate( ), 'YYYYMMDD' ) || '.txt';
    
    print( exec( 'cmd', '/C copy "' || $LOC || '" ' || $FILE_NAME, 8 ));
    
  6. Terminate a Job when the rows are rejected during loading
    if ( get_file_attribute( 'C:\\TEMP Services\\bin\\STG_CUSTOMER_REJECT.txt', 'size' ) > 0 )
    begin
    raise_exception_ext( 'Job Terminated Due to Bad Records',1);
    end
    
  7. Delete a file if exists
    $FileName= 'C:\\LANDING_DIR\\Customer_' || to_char( sysdate( ), 'YYYYMMDD' ) ||  '.csv';
    print ( $FileName );
    
    IF ( file_exists ( $FileName ) = 1 )
    begin
    exec( 'cmd', '/C DEL "' || $FileName || '"', 8 );
    end
    else
    begin
    print( 'File does not exist' );
    end
    
  8. Wait for a file
    $FileName= 'C:\\LANDING_DIR\\Sales_UK_' || to_char( sysdate( ), 'YYYYMMDD' ) ||  '.csv';
    print ( $FileName );
    
    While ( file_exists ( $FileName ) = 0 )
    Begin
    print( 'Waiting for File' );
    Sleep(10000);
    End
    print ( 'File Received Successfully' );
    print ( 'File Processing Started' );
    
  9. Print the contents of a file in the Execution log ( Windows )
    print( 'Files Processed: ' );
    print( exec( 'cmd', '/C TYPE C:\\LANDING_DIR\\FilesProcessed.txt', 8 ) );
    
  10. List the XML files and count in a directory
    wait_for_file('C:\\LANDING_DIR\\*.xml', 1, 1, -1, $file_name_list, $list_size, ','  );
    
    #PRINT THE FILE NAMES
    print( $file_name_list );
    
    #PRINT THE NUMBER OF FILES
    print( $list_size );
    

