Interesting session this afternoon at the Gartner BI Summit - "Business Intelligence and the Business Process Platform". David Newman presented some thought-provoking material on how the combination of business intelligence and a business process platform might be used to create an environment in which processes are self-configuring and driven by customers and transactions. As companies move to more composite application assembly using SOA and business process management software, one gets "true" business activity monitoring and "analytical process control". To quote the presentation:
Analytical Process Controlling (APC) enables an organization to investigate process execution realities, evaluate their business impact, and ultimately improve and optimize its business processes.
The presentation went on to discuss some of the prerequisites for widespread adoption of this approach. It made me think about how an EDM approach, combining business rules and predictive analytics, might deliver on this promise today. Let's assume you have automated critical decisions within a process using business rules and then enhanced those decisions with predictive analytic models. How does that deliver on the promise?
- You could use business rule logging to record the critical decisions within a process instance, allowing you to "investigate process execution realities"
- You could use rules-based and decision-model-based simulation to see how changes in the rules or models would have changed the outcome of processes (assuming your key decisions were automated, they would determine the outcome of your process, at least the outcome that impacts the bottom line, such as the price offered or the claim amount paid). This would let you "evaluate the business impact".
- By making the rules easy to change when business users see value in doing so, and by combining them with regularly updated, easy-to-deploy predictive models, you could "improve" the business process.
- By building a decision model, showing how the rules and models interact, you could go to the last step and run simulations that would let you "optimize" the process given the constraints under which you operate.
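To make the first two steps concrete, here is a minimal sketch (all names and thresholds are hypothetical, not from the presentation) of an automated decision: a stand-in predictive model produces a score, business rules consume the score plus the transaction data, and every score and rule firing is logged so you can later investigate the "process execution realities" or replay transactions against changed rules.

```python
# Hypothetical sketch of an automated, logged decision step.
# The "model" and rule thresholds are illustrative assumptions only.

def risk_score(applicant):
    """Stand-in for a predictive model: income raises the score, debt lowers it."""
    return max(0, 700 - applicant["debt"] // 100 + applicant["income"] // 1000)

def decide(applicant, log):
    """Apply business rules to the model score; log every decision made."""
    score = risk_score(applicant)
    log.append(("scored", score))               # record model output
    if score >= 700:
        log.append(("rule_fired", "auto_approve"))
        return "approve"
    if score >= 600:
        log.append(("rule_fired", "refer_to_underwriter"))
        return "refer"
    log.append(("rule_fired", "auto_decline"))
    return "decline"

log = []
decision = decide({"income": 80000, "debt": 10000}, log)
# The log now records both the score and which rule fired, so a batch of
# historical transactions could be replayed against revised rules to
# simulate the business impact of a change.
```

Because every decision is captured in the log, "evaluating the business impact" of a rule change becomes a matter of re-running `decide` with different thresholds over the same logged inputs and comparing outcomes.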
If you did this you would end up with a transaction that drives the process. The data in the transaction, or metadata attached to it, would determine what scores the models generate; the combination of scores and data would determine which rules fired, and that would decide which steps to take to complete the transaction. This lets you "invert" the process and have it flow from the customer to the organization. An example of this would be an origination process where the data entered by the customer is used to drive models and rules that determine which products are available and what additional data and steps are required.
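The inverted origination flow described above can be sketched as follows (a hypothetical illustration, not a real origination system): the customer-supplied data itself selects which products to offer and which additional process steps are needed, rather than the process dictating a fixed sequence.

```python
# Hypothetical sketch of a transaction-driven ("inverted") process:
# the customer's data determines the products offered and the next steps,
# so the process flows from the customer to the organization.

def next_steps(application):
    """Derive product offers and remaining steps from the data supplied so far."""
    steps, products = [], []
    if "income" not in application:
        steps.append("collect_income")           # a data gap creates a process step
    else:
        if application["income"] >= 75000:
            products.append("premium_card")      # richer data unlocks more products
        products.append("standard_card")
    if application.get("income", 0) >= 75000 and "tax_return" not in application:
        steps.append("collect_tax_return")       # the premium offer needs more evidence
    return products, steps

products, steps = next_steps({"income": 90000})
```

Each time the customer supplies more data, `next_steps` is re-evaluated, so the process configures itself transaction by transaction instead of following a predetermined path.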
It's not exactly the vision of an APC, but it's as close as you are going to get today I think...