SAP Commissions, recently rebranded as SAP SuccessFactors Incentive Management, is one of the most robust and reliable platforms for managing sales compensation programs. Businesses worldwide have relied on it for more than two decades, thanks to its ability to process large volumes of data and complex rules.

To get the best out of any tool, you need to know how to use it correctly, and SAP Commissions is no different, especially when it comes to optimizing system performance. System performance can be a crucial factor in ensuring that commission operations run smoothly. In this two-part blog series, I’ll share insights on performance considerations and potential pitfalls, along with practical tips for addressing them. Performance issues in SAP Commissions can occur in several areas, including:
- Stage Hooks
- Pipeline
- Report Preparation and Generation
- Data Extracts
- Data Imports
- Inbound Data Preparation
Performance Factors in SAP Commissions
1. Stage Hooks
Stage hooks are typically stored procedures that can be executed before or after pipeline stages. They are used to manipulate and enhance commissions data in order to implement complex custom business logic. Poorly designed stage hooks that query and manipulate large amounts of data are a potential failure point and a performance risk.

The Commissions database design often means you need to perform complex SQL joins and write extremely complex queries. Experience has shown that splitting complex queries into smaller, simpler, more manageable chunks is the way to go. This minimizes the risk of long-running queries, rollback segment errors, temp space errors, and out-of-memory conditions. Additional indexes may need to be created to support stage hook queries.

Identifying bottlenecks and problem queries is important, and logging the start and end of each query helps pinpoint slow-running ones (the first sketch below illustrates both the splitting and the logging). It is also a best practice to ensure a stage hook can run with all the pipeline run modes, including full, incremental, and preposition. Proper transaction management and locking mechanisms are essential to handle concurrency effectively and to avoid deadlocks.

Review and optimize your stage hooks regularly. Business requirements change over time, and with that, old or unnecessary logic can accumulate and degrade performance. Keep your codebase clean and up to date. Historical commission data also accumulates: queries that look at historical pipeline data degrade as more and more periods are added to the database. Implement archiving and data purging strategies to delete or move out historical data (the second sketch below shows a batched purge).
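To make that concrete, here is a minimal sketch of a stage hook split into small, logged steps. All object names (ext_hook_log, ext_post_allocate_hook, ext_stage_credits, ext_credit_source) are hypothetical placeholders rather than actual SAP Commissions objects, and the PL/SQL syntax assumes an Oracle-backed instance:

```sql
CREATE TABLE ext_hook_log (
    run_id     NUMBER,
    step_name  VARCHAR2(100),
    started_at TIMESTAMP,
    ended_at   TIMESTAMP,
    row_count  NUMBER
);

CREATE OR REPLACE PROCEDURE ext_post_allocate_hook (p_run_id IN NUMBER) AS
    v_start TIMESTAMP;

    -- Autonomous transaction so log rows survive even if the hook rolls back.
    PROCEDURE log_step (p_step VARCHAR2, p_rows NUMBER) IS
        PRAGMA AUTONOMOUS_TRANSACTION;
    BEGIN
        INSERT INTO ext_hook_log (run_id, step_name, started_at, ended_at, row_count)
        VALUES (p_run_id, p_step, v_start, SYSTIMESTAMP, p_rows);
        COMMIT;
    END log_step;
BEGIN
    -- Step 1: stage only the rows this run needs instead of one giant join.
    v_start := SYSTIMESTAMP;
    INSERT INTO ext_stage_credits (credit_id, position_id, amount)
    SELECT c.credit_id, c.position_id, c.amount
    FROM   ext_credit_source c              -- placeholder source table
    WHERE  c.run_id = p_run_id;
    log_step('stage_credits', SQL%ROWCOUNT);

    -- Step 2: apply the custom logic against the small staged set.
    v_start := SYSTIMESTAMP;
    UPDATE ext_stage_credits s
    SET    s.amount = s.amount * 1.05       -- stand-in for real business logic
    WHERE  s.amount > 0;
    log_step('adjust_credits', SQL%ROWCOUNT);
END ext_post_allocate_hook;
/
```

Querying a log table like ext_hook_log for the widest started_at/ended_at gaps quickly shows which step to tune first.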
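And a hedged sketch of the purging idea: deleting historical rows in bounded batches so a single huge DELETE does not exhaust rollback/undo space. The table name, column, and retention boundary here are illustrative only:

```sql
DECLARE
    v_keep_after NUMBER := 202301;           -- illustrative retention boundary
BEGIN
    LOOP
        DELETE FROM ext_report_history       -- placeholder custom table
        WHERE  period_seq < v_keep_after
        AND    ROWNUM <= 10000;              -- bounded batch size
        EXIT WHEN SQL%ROWCOUNT = 0;
        COMMIT;                              -- release undo after each batch
    END LOOP;
    COMMIT;
END;
/
```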
2. Pipeline Processing
Pipeline processing in SAP Commissions refers to the sequence of steps involved in commission calculation. Several factors can impact pipeline performance:
- Allocate and Reward Rules: The more rules you have, the longer the processing time, so review and consolidate rules where possible. Check how often each rule fires; some rules fire many times while others fire only rarely, and don’t be surprised to find rules that have never fired at all.
- Number of Payees and Transaction Volume: The number of payees and the transaction volume directly affect processing time. If possible, aggregate transactions during the data import (a sketch of this appears after this list), but ensure that commission detail reports still have the level of granularity they need. Implement data segmentation and parallel processing to manage high volumes.
- Complex Business Rules: Inefficient or overly complex business rules can lead to performance issues. Key factors to observe include how many credits a transaction generates on average and how many credits contribute to a Primary Measurement. Rules that use rolled-up results can cause runtime issues because they can result in all positions being assigned to a single grid worker. This is often an issue in Reward and can be affected by the design and setup of the Reporting and Position Relation hierarchies.
Certain rule types, such as order-level rules, do not scale well; consider alternative solutions for processing them. The best information on how the plans and rules are performing is in the pipeline summary logs and the Allocate and Reward logs, which state how many times each rule was evaluated and fired as well as the number of result objects created.
- Grid Setup: Inappropriate grid configurations can lead to uneven workload distribution. Ensure that the workload is spread evenly among grid workers, and consider the design and setup of the Reporting and Position Relation hierarchies.
- SQL Plugins: One often-overlooked risk is the built-in SQL plugins, for example QueryForValue. For these to work effectively, the SQL needs to be very fast (in the millisecond range). If a plugin query takes seconds and the rules execute it frequently, it adds significant time to the pipeline. Diagnosing and fixing SQL plugin problems requires access to, and a good understanding of, the system monitoring tables. The second sketch below shows the shape of a lookup query fast enough for this use.
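First, a hedged illustration of the import-time aggregation mentioned above: collapsing line-level feed rows into one row per order and payee before import, while keeping the line count so detail reports retain some granularity. The staging tables (ext_feed_lines, ext_import_txn) and their columns are assumptions for this example, not SAP Commissions tables:

```sql
-- Aggregate line-level feed rows to one transaction per order/payee.
INSERT INTO ext_import_txn (order_id, payee_id, event_date, value, unit_count)
SELECT f.order_id,
       f.payee_id,
       MAX(f.event_date),
       SUM(f.line_value),       -- one aggregated value per order/payee
       COUNT(*)                 -- retain the line count for reporting
FROM   ext_feed_lines f
GROUP BY f.order_id, f.payee_id;
```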
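Second, a minimal sketch of the kind of SQL that is safe behind a QueryForValue-style plugin: a single-row lookup on an indexed key using bind variables. The table ext_rate_lookup, its columns, and the binds are hypothetical:

```sql
-- Supporting index so the lookup resolves in milliseconds.
CREATE INDEX ext_rate_lookup_ix
    ON ext_rate_lookup (product_code, period_seq);

-- Single-row lookup; bind variables keep the execution plan cached.
SELECT r.rate
FROM   ext_rate_lookup r
WHERE  r.product_code = :product_code
AND    r.period_seq   = :period_seq;
```

If a plugin query cannot be reduced to an indexed lookup like this, precomputing the values into a lookup table before the pipeline runs is usually the safer design.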
3. Data Preparation in Stored Procedures for Reports
It’s common practice to use stored procedures to preprocess data for reports, and the SQL can get very complex and cause performance bottlenecks. There are several schools of thought on how to produce reporting data. The main one is to have the pipeline rules do all of the data preparation work and create all of the necessary reporting data. This works, but it can increase the complexity and lower the performance of the rules, and some operations are simply better handled in SQL stored procedures than in commission rules. I try to generate as much of the reporting data in the pipeline itself as I can, but I often find it necessary to create stored procedures for complex scenarios. Below are a few common approaches I use for fine-tuning complex report SQL:
- SQL Complexity: Complex SQL queries can lead to slow report generation. Keep report SQL as simple as possible and use appropriate indexing.
- Data Segmentation: Segment data for reporting to manage data volumes effectively.
- Caching: Implement caching for frequently accessed or preprocessed data to reduce redundant calculations during report generation (see the sketch after this list).
- Reporting Tools: Consider using reporting tools or data warehousing solutions to handle complex data manipulation and offload processing from SQL and stored procedures.
- Monitoring and Tuning: Regularly monitor the performance of report generation processes. Identify and address bottlenecks, slow queries, and other performance issues.
- Testing and Profiling: Rigorously test report generation with various data volumes and scenarios to identify performance limitations and optimize as needed.
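As an example of the caching approach, here is a minimal sketch of a summary table rebuilt once per pipeline run, so report queries read precomputed rows instead of re-joining detail tables on every execution. All names (ext_rpt_payee_summary, ext_refresh_rpt_summary, ext_rpt_detail) are hypothetical placeholders, and the PL/SQL assumes an Oracle-backed instance:

```sql
CREATE TABLE ext_rpt_payee_summary (
    period_seq   NUMBER,
    payee_id     NUMBER,
    total_credit NUMBER,
    total_payout NUMBER
);

CREATE OR REPLACE PROCEDURE ext_refresh_rpt_summary (p_period_seq IN NUMBER) AS
BEGIN
    -- Rebuild only the current period's cached rows.
    DELETE FROM ext_rpt_payee_summary
    WHERE  period_seq = p_period_seq;

    INSERT INTO ext_rpt_payee_summary (period_seq, payee_id, total_credit, total_payout)
    SELECT p_period_seq, d.payee_id, SUM(d.credit_amt), SUM(d.payout_amt)
    FROM   ext_rpt_detail d            -- placeholder detail source
    WHERE  d.period_seq = p_period_seq
    GROUP BY d.payee_id;

    COMMIT;
END ext_refresh_rpt_summary;
/
```

The trade-off is freshness: the summary is only as current as its last refresh, so tie the refresh to the end of the pipeline run or to the report preparation step.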
In part 2 of this blog series, I’ll cover Data Extracts, Data Imports, and Inbound Data Preparation.
About the Author
David Ritson works as a Sr. SPM Architect at SpectrumTek. He has 20+ years of experience designing and implementing SAP SuccessFactors Incentive Management (formerly known as Callidus, CallidusCloud, and SAP Commissions). Over his career he has been instrumental in successful implementations at more than 125 organizations worldwide, often implementing systems that handle millions of transactions per pipeline run, and has helped lead the product’s evolution. While at SAP/Callidus he worked in the Chief Services Architect role and liaised with engineering and support. He has also designed and developed several of the pipeline stages and imports.