Tuesday, December 30, 2008

Performance Management

Performance management helps organizations achieve their strategic goals. Rather than discarding the data accessibility previous systems fostered, performance management harnesses it to ensure that an organization’s data works in service of organizational goals, providing information that is actually useful in achieving them. The main purpose of performance management is to link individual objectives with organizational objectives so that individuals deliver real value to the enterprise. Performance management also seeks to develop people’s skills so they can realize their potential, satisfy their ambitions, and increase the firm’s profitability.

Wikipedia


Performance Management
By: Gary Rinehart

Decision Interface
White Paper - December 2008




Performance management promises to be the new—and perhaps last—frontier in software development for the enterprise. Many vendors are scrambling to articulate a performance management position that fits their existing product capabilities, while they build out the rest of the functionality that is required to offer true performance management.


It is interesting that AMR Research has coined a new term, Pervasive Performance Management, and has even written an article[1] to explain what they mean by this term. To back up this terminology, AMR Research conducted a “poll,” and the survey says . . .



  • Pervasive PM is of high importance (what is it again?)


  • Importance will continue to grow over the next three years (really!)


  • 37% of companies are either currently using or implementing pervasive PM technologies (what are these technologies?)


  • 45% plan to evaluate pervasive PM investments in the next three years (hmmm)


  • Many of the perceived challenges for pervasive PM are IT related

However, once you get past the marketing hype, you begin to understand that the promise of performance management is not really about software, but rather about managing a set of business processes to achieve a desired result. In reality, performance management has been around for decades, but has lacked both the efficiency and effectiveness to provide much value deep within the organization. This is where art mixes with science—or, more accurately, where technology mixes with the business decision process.


Consider a typical decision-making process with these simple steps (see Figure 1):



  1. Understand actual historical performance


  2. Construct a view of the future


  3. Test options to identify the best alternative


  4. Make the decision and execute the change in direction, if any


  5. Use actual results to refine assumptions and continuously improve the process
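Stripped of any particular product, the five steps can be sketched as a single loop over historical data and planning assumptions. Everything here is a hypothetical illustration: the function, the field names, and the 10% lift for the “expand” option are made-up placeholders, not a real performance management API.

```python
# A minimal sketch of the five-step decision cycle described above.
# All names and figures are illustrative assumptions.

def decision_cycle(history, assumptions):
    """Run one pass of the five-step decision process."""
    # 1. Understand actual historical performance
    baseline = sum(history) / len(history)

    # 2. Construct a view of the future from current assumptions
    forecast = [baseline * g for g in assumptions["growth_factors"]]

    # 3. Test options to identify the best alternative
    options = {
        "hold": sum(forecast),
        "expand": sum(f * 1.10 for f in forecast),  # assumed +10% lift
    }
    best = max(options, key=options.get)

    # 4. Make the decision and execute the change in direction, if any
    decision = {"action": best, "expected": options[best]}

    # 5. Actual results would feed back in to refine the assumptions
    return decision

decision = decision_cycle([100, 110, 105], {"growth_factors": [1.0, 1.05, 1.1]})
print(decision["action"])  # -> expand
```

In practice each numbered step is a separate tool with a manual hand-off between them, which is exactly the friction the rest of this paper is about.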


High-Value Decisions Benefit from a High Degree of Automation


Figure 1: The decision value curve shows how high-value decisions benefit from a high degree of automation


Organizations that have taken these steps are ahead of the curve when it comes to providing a good environment for decision making. However, simply mastering these steps will not provide the return on investment most companies are looking for in today’s competitive business environment. The key to achieving the desired results is the automation of the decision cycle itself, or more importantly, removing unnecessary manual interface points from it.


For example, in most cases the above steps are loosely tied together by a highly manual process requiring a separate product or specialized analysis tool for each step. In fact, most decision processes rely primarily on a business intelligence tool for retrieving historical data that is then exported to Excel, and becomes the subject of management meeting after management meeting. In the end, it’s really about the accuracy and timeliness of the decision, and with the right combination of best practices and technology, this can be achieved. Let’s explore this thinking further.



Most organizations start their trek down the road of understanding and decision making with historical performance. Organizations need a firm grip on their operational history: they must know when delivery performance is low, promotional lift is high, and average sales prices are falling. Historical metrics are very valuable when trying to understand where you’ve been, so you can be smarter about where you’re going. The primary vehicles used on this quest are data integration tools, data warehouses, and reporting tools, which combine to offer insight into what has already happened. This category of solution is termed business intelligence (BI).


A view into what has happened is a worthwhile step on the path to automating a decision process. Actual numbers identify past problems, and can help an enterprise isolate the causes, allowing management to take steps to ensure the same problems do not recur.


The major limitation of using actual data, however, is that the events have already occurred, providing no management guidance about what is likely to happen. For example, when a manufacturer receives a large and unexpected order from a top customer, a management alert is generated because current inventories will not support the spike in demand. The manufacturer knows that certain actions must be taken to resolve the potential out-of-stock situation, but it doesn’t know where to start. In most cases, managers rely heavily on intuition-based methods to make such decisions. Decision automation begins to step in here, but many decisions are still based on instinct and experience.
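As a sketch of the alert logic described above (the figures and names are invented for illustration), a simple supply-versus-demand check is enough to raise the flag, though it says nothing about what to do next:

```python
# Hedged sketch: flag an out-of-stock risk when an unexpected order
# exceeds on-hand inventory plus planned receipts. Figures are invented.

def stock_alert(on_hand, planned_receipts, open_orders, new_order):
    demand = open_orders + new_order
    supply = on_hand + planned_receipts
    shortfall = demand - supply
    return shortfall if shortfall > 0 else 0

# A large, unexpected order from a top customer triggers the alert
shortfall = stock_alert(on_hand=500, planned_receipts=200,
                        open_orders=400, new_order=600)
print(shortfall)  # -> 300 units short
```

The alert tells the manufacturer *that* there is a problem; resolving it is still left to intuition and experience, which is where the next steps come in.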



Depending upon the business problem, there may be leading indicators for future performance. If available, leading indicators are the next step in decision-process automation because they are actual data; yet, because of their correlation with future performance, they represent a business manager’s first indication of what is likely to happen in the future.


While leading indicators may provide a powerful tool for predicting the future, they typically have limitations:



  • True leading indicators with a high rate of return are somewhat rare and depend on business processes


  • Their predictive value is typically very tactical

For example, in a telesales business, decreasing overall call volume portends a drop in revenue. In the housing market, when interest rates rise, housing prices are likely to soften.
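The telesales case can be reduced to a toy projection: fit a revenue-per-call ratio from history, then apply it to the latest call volume. The figures are fabricated and the linear relationship is an assumption; real leading indicators are rarely this clean.

```python
# Illustrative only: project next period's revenue from call volume,
# the leading indicator. All data is made up.

calls   = [1200, 1150, 1100, 1050]       # weekly call volume (leading)
revenue = [60000, 57500, 55000, 52500]   # revenue one week later

# Average revenue per call over the observed history
ratio = sum(r / c for r, c in zip(revenue, calls)) / len(calls)

# Falling call volume this week portends lower revenue next week
projected = ratio * 1000
print(projected)  # -> 50000.0
```

This also hints at the limitation noted below: once several interacting variables drive the outcome, a single fitted ratio like this breaks down quickly.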


Leading indicators are typically tracked through BI solutions. While these BI solutions can report the actual indicator, their ability to project a forward view of the business from it is limited. In addition, the more complex the variables, the more difficult it is for these reporting tools to project forward accurately.


Another method of obtaining a view of the future relies upon the organization’s ability to integrate planning data from both operational and financial perspectives into the overall decision process so that, as actual performance changes, a direct cause-and-effect relationship can be drawn between actual and planned performance. With this method, the organization has a better idea of what changes can be made to solve the immediate issue as well as to ensure that new issues are not triggered by a tactical decision.


Taking the same manufacturing example above and applying operational production-planning information, the business analyst can see the effect of the spike in demand on the future production plan, which could very well be an out-of-stock situation in six to eight weeks. Having this additional insight also allows the business analyst to see where an increase in production from underutilized capacity might result in reversing the out-of-stock situation. This deeper level of insight results from the stronger role automation takes in the decision process—if an organization takes this approach to integrate planning data from both operational and financial systems.
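A minimal sketch of that projection, with invented numbers: roll inventory forward against the production plan to find the stock-out week, then rerun with the underutilized capacity added.

```python
# Illustrative sketch: project inventory week by week to find when the
# demand spike causes a stock-out. All quantities are hypothetical.

def first_stockout_week(start_inventory, production_per_week,
                        demand_per_week, weeks=10):
    inventory = start_inventory
    for week in range(1, weeks + 1):
        inventory += production_per_week - demand_per_week
        if inventory < 0:
            return week       # first week the plan goes negative
    return None               # no stock-out within the horizon

# Baseline plan: stock-out lands in the six-to-eight-week window
print(first_stockout_week(650, 100, 200))  # -> 7

# Drawing on 100 units/week of underutilized capacity reverses it
print(first_stockout_week(650, 200, 200))  # -> None
```

The value here is not the arithmetic but the integration: the projection is only possible because planning data sits alongside the actuals in the same decision process.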



While a view into the future is extremely helpful for business managers, it does not provide certainty. Understanding potential future states and their implications is a tremendous help to business judgment. This can be as discrete as changing a single assumption in a granular part of the business, or as complex as changing multiple assumptions and looking at end states for the entire business. This activity is typically called business modeling and is normally accomplished with complex spreadsheet models and algorithms.


However, many of these “what-if” solutions are limited because they are:



  • Done offline: Ad hoc analysis is typically performed as a back-office activity, without access to the runtime environment


  • Limited in scope: For planning applications, users can only make “what-if” changes to the plan; in modeling environments, users can only make these changes within the confines of a specific departmental model


  • Limited in user reach: In modeling environments, only specialists who understand the model can perform the “what-if” analysis. In planning environments, this analysis is typically limited to a planner, within the scope of his or her particular portion of the plan; rarely can a planner understand the aggregate impact across the entire plan

Effectively automating this portion of the decision process requires “what-if” activities that are integrated into the solution. In addition, “what-if” capabilities must be unconstrained—they must permit any user to change any assumption (or set of assumptions) across the solution and get an instant view of the incremental impact the change will have across the metrics that matter. Furthermore, by leveraging a solution with slice-and-dice capabilities, analysts, business managers, and executives can get a 360-degree view of the impacts of the change.


Key to testing these hypotheses and understanding decisions’ impacts across the organization is the feature called “versioning.” Versioning enables users to create multiple “what-if” versions, then compare, share, and merge them. This aspect of automating the decision process is invaluable, especially when decisions impact a wide swath of departments and objectives across the enterprise. This level of automation provides the controls necessary to implement good decisions that positively impact business performance across the organization, with everyone on the same page.
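Versioning can be pictured as branching copies of a plan’s assumptions that can then be diffed and merged. This is a hypothetical data-structure sketch, not any vendor’s implementation; the plan fields and values are invented.

```python
# Toy sketch of "what-if" versioning: branch a plan, compare versions,
# and merge a chosen version's changes back. Structure is hypothetical.
import copy

base = {"production": 100, "price": 9.50, "inventory_target": 300}

def branch(plan, **changes):
    """Create a new what-if version of a plan with some assumptions changed."""
    version = copy.deepcopy(plan)
    version.update(changes)
    return version

def diff(a, b):
    """Compare two versions: {field: (old, new)} for every changed assumption."""
    return {k: (a[k], b[k]) for k in a if a[k] != b[k]}

revenue_focus = branch(base, price=10.50, production=120)
cost_focus = branch(base, inventory_target=200)

print(diff(base, revenue_focus))   # only the changed assumptions show up
merged = branch(base, **{k: v[1] for k, v in diff(base, cost_focus).items()})
print(merged["inventory_target"])  # -> 200
```

The diff is what keeps everyone on the same page: each version carries an explicit record of exactly which assumptions it flexes relative to the shared baseline.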


Using our manufacturing example, we can see how building various decision scenarios can benefit the decision maker by offering different methods to solve the potential future out-of-stock situation. For example, one scenario might look at ways to solve the problem by maximizing revenue potential, while another might focus on reducing inventory cost. By providing different methods for solving the same problem, the decision maker is empowered to take several variables or value-levers into consideration, resulting in a more insightful and favorable outcome. Furthermore, by integrating the data into a single decision interface, this process can be highly iterative and responsive. It can provide an environment for continuous planning, or in the case above, continuous supply and operations planning.


Today, planning solutions are focused on the creation process for a single plan. The output of the application is the plan itself (e.g., a merchandise plan). While these solutions put tremendous effort into the creation of the plan, they do nothing to help business managers understand how a particular plan works and interacts with other plans. For example, when our manufacturing company intersects its demand and supply plans, it may find it is actually planning to run out of stock for some SKUs. Thus, an infrastructure that integrates and synchronizes multiple plans with actual historical data provides the best environment for true decision automation.



Enterprises do not have a single plan. Instead, they have a collection of many plans, focused on different areas of the business from different points of view (e.g., dollars vs. units, day vs. month, SKU vs. family, division vs. region). Each plan is created without a clear understanding of what is happening in the other plans, making it more difficult (or even impossible) to come to a decision that is “enterprise aware.” However, by synchronizing and aligning these plans, managers can see an integrated, forward-looking view of the business. This gives them a directional view of what the enterprise as a whole plans to accomplish over the period, and is a key driver in the decision process.


However, it is rare to find a solution that automates this activity with respect to the decision process. Those that do are often built in complex code or proprietary application development environments, are specific to the process in question, and are not easily modified when process changes arise. Automating this aspect of the decision process requires a workflow engine capable of writing the results to the system of record when the decision has been approved. In addition, the decision workflow must include an audit of the potential changes to be made, not just the results. This is important because it gives the decision maker a clear picture of both the dependent and independent variables attached to a single decision.


Again using our manufacturing example, the business analyst can settle on a few options for solving the potential out-of-stock situation, with each option flexing a different aspect of the business from the most to least profitable action. Each scenario is submitted to a decision workflow that is routed to the appropriate operational managers for approval. Once consensus is reached, a single decision is then executed and the necessary data is written back to the operational system—in this case, the actual production plan is updated for several future periods to accommodate the recent demand spike.
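The workflow described here might be sketched as follows; the approver roles, plan keys, and consensus rule are all illustrative assumptions. Note that the audit trail records each proposed change (old and new values), not just the final result:

```python
# Hypothetical sketch of a decision workflow: route a scenario for
# approval, and write back to the system of record only on consensus.

system_of_record = {"production_plan": {"week_7": 100, "week_8": 100}}
audit_trail = []

def execute_decision(scenario, approvers):
    if not all(approvers.values()):          # consensus required
        return False
    for key, new_value in scenario["changes"].items():
        old = system_of_record["production_plan"][key]
        audit_trail.append((key, old, new_value))  # audit the change itself
        system_of_record["production_plan"][key] = new_value
    return True

scenario = {"name": "raise output", "changes": {"week_7": 180, "week_8": 180}}
approved = execute_decision(scenario, {"ops_mgr": True, "finance_mgr": True})
print(approved, system_of_record["production_plan"]["week_7"])  # -> True 180
```

Keeping the write-back behind the approval gate is what makes the decision, rather than the analysis, the unit of automation.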


Attaining this level of automation in the decision process is rare, but those companies that can reach it can minimize the impact of human errors based on “gut instinct” and institute fact-based decisions that positively impact the organization as a whole.



While it is easy to understand why organizations want to track historical decisions, the fact is that because most decisions are made through highly manual processes, incorporating masses of data in complex spreadsheet models, keeping track is impossible in many cases. The best way to understand the effectiveness of your decisions is to keep a persistent audit trail of the actions that were taken; this is a by-product of automating the previous four steps in the decision process.


Because most companies track history fairly well, it should be easy to establish your own decision-audit process in the spirit of continuous improvement, or (more importantly) to comply with requirements such as those outlined in Sarbanes-Oxley. Continuing with our manufacturing example, the analyst now has the hindsight to understand the impact of the available options and the decisions made as actual operations play through. The benefit here is continuous improvement and refinement of the business and decision-making processes. Tracking decisions is also a good way to establish plans for variable compensation or “management by objective” incentives based on target attainment.



You might look at this five-step process and say to yourself, “This is not rocket science.” For the most part, you would be correct. The problem is that each step requires a different set of data coupled with a specific set of skills on behalf of the user. Combine that with the fact that few vendors can provide solutions for all five steps in the process, and you have a real challenge.


As a result, decisions are made in many organizations by what I call the decision triangle—people, process, and Excel. This triangle works for the most part, but it’s highly manual in terms of process and data, and thus requires considerable resources for a single decision to be executed. This explains the strong focus on the promise of packaged performance management solutions: they can provide data integration, persistence, and automated decision processing. They are able to reduce the number of resources required to execute the decision (see Figure 2).


Figure 2: Performance management and the decision triangle


A decision that wasn’t executed, or that wasn’t executed in time to achieve the desired results, was just a good idea. Being able to ensure a decision is executed within the relevant time frame closes the decision loop and allows the company to ensure “control” over its business direction as it “tunes” the business to its ever-changing market.


As a business moves further along the decision-automation value curve, it will realize increased benefits from faster, more educated decisions utilizing fewer resources, and ultimately provide a compelling return on its investment in both process improvements and technology. The real promise of performance management hinges on the ability of solutions to address the full lifecycle of the decision process, and most importantly, to enable accurate and timely decision making.


[1] “Could IT be an Obstacle to Successful Pervasive PM?”, AMR Research, March 28, 2008
