Performance Tuning Strategies to Optimize Your SAP Data Services Environment

  • by Don Loden, Director - Data & Analytics, Protiviti
  • May 30, 2012
Step beyond traditional database tuning concepts, such as indexing and partitioning, and get expert tips on moving toward a parallel approach to code design and job construction. Examine parallel execution within SAP Data Services to see how it distributes operations and leverages the strengths of each component of your system. Get expert advice on using the pushdown capabilities delivered with SAP Data Services, including how to properly distribute logic and optimize performance.
Key Concept

SAP Data Services is SAP’s data integration and data quality solution that offers many different opportunities for tuning and scaling ETL code. Traditional database tuning should be the foundation of ETL development with SAP Data Services, but efforts should not stop at the database. Concepts such as breaking logical order, thinking in parallel, and a solid understanding of SAP Data Services features including caching, bulk insert, and Data_Transfer are crucial components for tuning the ETL code outside of the database.
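To make the pushdown idea concrete, here is a minimal sketch, not SAP Data Services code: it uses Python's built-in SQLite module as a stand-in database, with a hypothetical `sales` table, to contrast filtering and aggregating in the ETL engine's memory against pushing that logic down into the database. SAP Data Services performs this optimization by generating SQL against the source database; the principle shown here is the same.

```python
# Hedged sketch of the pushdown concept using SQLite as a stand-in
# database; the table and column names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("EMEA", 100.0), ("APAC", 250.0), ("EMEA", 75.0)])

# Without pushdown: every row crosses the wire, and the filter and
# aggregation run in the ETL engine's memory.
rows = conn.execute("SELECT region, amount FROM sales").fetchall()
in_engine = sum(amount for region, amount in rows if region == "EMEA")

# With pushdown: the filter and aggregation execute inside the
# database, and only a single result row is returned.
pushed_down = conn.execute(
    "SELECT SUM(amount) FROM sales WHERE region = 'EMEA'"
).fetchone()[0]

# Both approaches yield the same answer; pushdown simply moves the
# work to where the data lives.
assert in_engine == pushed_down == 175.0
```

The results are identical, but the second form avoids moving the full table into the engine, which is why distributing logic so that it can be pushed down matters as data volumes grow.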

Performance tuning is a constant that every developer must confront and reconcile on a daily basis. A common scenario surfaces in almost every organization: the business strives to add functionality, which in turn increases logical complexity, yet the load window never grows with these demands. The business may be working to add measures or key performance indicators (KPIs). There may be a large jump in data volume from a recent data integration effort, or the organization may have an advanced business intelligence (BI) strategy with master data operations that globalize aspects of the business and require expensive processing timelines. All of these scenarios increase complexity and load times. The key point is that as a BI strategy evolves, doing more processing in less time becomes inevitable. This is why performance tuning strategies are among the most important design considerations throughout the BI life cycle.

Proper extract, transform, and load (ETL) performance is not an accident, nor is it something that just occurs without thought and planning. SAP Data Services on its own can assist in realizing performance gains, but to be truly successful you must build performance objectives into the design throughout the life cycle of the project. In this article, I frame my discussion around using SAP Data Services for maximum performance. Through examples, I demonstrate best practices for designing high-performing ETL code. I assume you have familiarity with ETL concepts and knowledge of the SAP Data Services platform.

Don Loden

Don Loden is an information management and information governance professional with experience in multiple verticals. He is an SAP-certified application associate on SAP EIM products.  He has more than 15 years of information technology experience in the following areas: ETL architecture, development, and tuning; logical and physical data modeling; and mentoring on data warehouse, data quality, information governance, and ETL concepts. Don speaks globally and mentors on information management, governance, and quality. He authored the book SAP Information Steward: Monitoring Data in Real Time and is the co-author of two books: Implementing SAP HANA, as well as Creating SAP HANA Information Views. Don has also authored numerous articles for publications such as SAPinsider magazine, Tech Target, and Information Management magazine.

