We are pleased to start this new tutorial page for SAP BusinessObjects Data Services (BODS). If you do not know SAP Data Services (BODS) yet but wish to master this ETL tool, you have come to the right place. Even if you are already familiar with the tool, we hope you can still learn a lot from our articles as we progress. Also, please note that all the articles in this series will be backed by corresponding video tutorials, and we highly recommend watching the videos as you read the articles. Let's get started.

Doing is Learning

They say you can't learn to swim without getting in the water. You can't just read the theory behind Archimedes' principle of buoyancy and then jump into the swimming pool - that's not going to work. In order to learn swimming, you need to try swimming. Keeping this same principle of "doing is learning" in mind, we have decided to start this tutorial straight away with a hands-on session in BODS. We will start by creating our first ETL job in BODS, and then we will come back to discuss the theory, if any. Alright?

Table of Contents

Please follow the tutorials in the order provided below:

  1. Creating a Data Services Batch Job
  2. Using Lookup and Join in Data Services
  3. Case Transform in Data Services
  4. Merge Transform in Data Services
  5. Handling Multiple Input Files in Data Services
  6. Pivot Transform in Data Services
  7. Reverse Pivot Transform in Data Services
  8. Table Comparison Transform and SCD Type 1 implementation in Data Services
  9. History Preserving Transform and SCD Type 2 in Data Services
  10. Partial History Preservation SCD Type 3 in Data Services

Creating your first Batch Job

Ok. In BODS, you can create both batch jobs and real-time jobs. Batch jobs are those that run at a predefined time and repeat at a predefined interval (frequency). Any batch job in BODS contains one or more data flows or workflows. A workflow, in turn, can contain one or more data flows.

You can think of a data flow as a single logical unit in which the whole logic to transport data from one schema to another is specified. A data flow, being a logical unit, cannot execute on its own. It must be encapsulated inside a batch job in order to execute. Data flows can also be grouped under one or more workflows, and those workflows can, in turn, be executed through the batch job.
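The containment hierarchy described above (batch job → workflow → data flow) can be pictured as nested objects. The sketch below is purely illustrative - BODS defines these objects graphically in the Designer, and all the names here are hypothetical:

```python
# Illustrative sketch of the BODS object containment hierarchy.
# All object names (JOB_LOAD_CUSTOMER, WF_CUSTOMER, DF_GENERATE_CUSTOMERS)
# are made-up examples, not real BODS repository objects.
batch_job = {
    "name": "JOB_LOAD_CUSTOMER",      # only a job is directly executable
    "workflows": [
        {
            "name": "WF_CUSTOMER",    # optional grouping layer
            "data_flows": [
                "DF_GENERATE_CUSTOMERS",  # the actual ETL logic lives here
            ],
        }
    ],
}

# A data flow never runs on its own - it is always reached through a job.
for wf in batch_job["workflows"]:
    for df in wf["data_flows"]:
        print(f"{batch_job['name']} -> {wf['name']} -> {df}")
```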

Now that you know what data flows, workflows, and batch jobs are, let's see the next video, which will show you how to create these objects in the Data Services GUI. In this video, we will create one data flow that will be used to generate 10,000 test records for a sample customer table in the target database (SQL Server).

Video: Creating Data Services (BODS) Batch Job

A hands-on tutorial video that shows you how to create your first batch job in SAP Data Services (BODS).

Understanding the Data Flow

As you have seen in the above video, we are using two transforms, namely Row Generator and Query Transform, to populate a sample table (template table) in the target database with 10,000 rows. The Row Generator transform is used to generate 10,000 rows programmatically from within the data flow (without a specific source system), whereas the Query Transform is used to generate sample customer names dynamically by concatenating the serial number with the prefix "Cust_".
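In BODS this logic is built visually with the two transforms, but it may help to see the equivalent row-generation and concatenation logic as plain code. The sketch below is only an analogy - the column names are hypothetical, and in the actual data flow the concatenation would be a mapping expression in the Query Transform:

```python
# Illustrative analogy for the data flow in the video.
# Row Generator  -> produces sequential row numbers 1..N
# Query Transform -> maps each row number to a name like "Cust_42"
# Column names (cust_id, cust_name) are made-up for this sketch.

def generate_customers(row_count=10_000):
    rows = []
    for serial in range(1, row_count + 1):   # Row Generator: 1..N
        rows.append({
            "cust_id": serial,
            "cust_name": "Cust_" + str(serial),  # Query Transform mapping
        })
    return rows

customers = generate_customers()
print(len(customers))               # 10000
print(customers[0]["cust_name"])    # Cust_1
```

In the real data flow, the output of the Query Transform is simply connected to a template table, which Data Services creates in the target database on first execution.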

Hopefully this tutorial and video are helpful.

