It’s OK to “ignore the man behind the (predictive analytics) curtain” with EDM

(By guest Blogger, Gib Bassett)

The title of this post quotes the film "The Wizard of Oz," in which Dorothy's dog Toto draws back a curtain to reveal the "real" wizard (a man) pulling levers and pushing buttons to project a frightening image of a smoky, not-too-nice false wizard. The jig is up, and the wizard is exposed for what he is: just a man. If only the same could be said of the experts driving data mining and predictive analytic software in today's leading companies.

While not exactly wizards in the Harry Potter sense, many of these folks have advanced degrees in statistics and mathematics, and thus are able to help their employers comb through mountains of data to identify actionable insights for improving business performance. For executives, it may as well be smoke and mirrors, such is often the complexity of these analyses.

Some software vendors would have you believe otherwise, but in reality the tools which enable predictive analysis are most often designed for an analyst first, to make him or her more productive.  There has yet to be a software application into which you can speak, “give me a list of the customers I should retain over the next 12 months to optimize my profit given x, y and z” that doesn’t require a “wizard” of sorts performing the analysis in the background for you.  And there may never be.

Having said (or written) that, it's easy to understand the conclusions reached by Stephen Swoyer in his November 7 column on The Data Warehousing Institute (TDWI) website, titled "Predictive Analytics: Slow Adoption Despite Big Benefits." Swoyer observes that adoption of traditional, rearward-looking BI has been much stronger than that of predictive analytics – not surprising when you consider that a traditional BI solution is architected around a defined target database, fed from consistent sources, from which reports and other analyses are generated. The "mystery" of data collection and storage has largely become a repeatable science, yet data preparation remains the single largest time and money component of data mining (predictive analysis) projects. This remains the case because it typically takes someone with a good deal of analytical talent to identify and massage source data so that meaningful predictive analysis can occur; it is as much art as science.

Further recent evidence of this can be found in the FinTech 100 report from American Banker and Financial Insights magazines. This annual ranking of top vendors serving the financial services industry includes an article titled "Software Playing Catch-Up with the Promise of Data Analytics," by Jeanne Capachin, a research VP with Financial Insights. Capachin observes that "Many of the big failed CRM projects of the late 1990s and the early part of this decade were failures as the analytic solutions were put ahead of strong information management." She uses "information management" broadly, to cover both data warehouses at one extreme and source data for predictive analytics/data mining at the other, but in both cases the implication is that there is little turnkey about predictive analytic software. Yet despite these hurdles, "...almost 80% of the institutions surveyed plan to increase spending on business intelligence and analytic applications over the next 12 months" – so attractive are the benefits reported by successful companies.

Capachin's conclusions parallel those in my upcoming December 3 DM Review article, "Decision Services: Pragmatic Real-Time Analytics." What you can infer from Capachin's comment is that early CRM solutions were not just immature but overhyped with respect to their analytical components. While these components have improved, the gains have come mostly in the tools used by expert users. Decision Services are a Service-Oriented Architecture (SOA) concept designed to encapsulate the rules and/or models to be executed by operational systems, and therefore offer a foundation for predictive analytics at a pace appropriate for an organization's business, technical and analytical maturity.

For example, before moving to predict and act upon customer behavior via interaction channels like your website or call center, does it not first make sense to automate the decisions you must already take with your customers via these channels? Consider the things you know to be self-evident, such as whether or not to grant a loan to someone with a given credit score, or a particular pattern of fraud that has been flagged for investigation repeatedly. Capturing and automating decisions such as these with a solution grounded in Enterprise Decision Management (EDM) principles sets the stage for successful deployment of predictive analytics, at a time when an organization is ready.
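To make the idea concrete, here is a minimal sketch of a decision service. Everything in it is hypothetical (the class name, method names and the credit-score threshold are illustrative assumptions, not taken from the article or any vendor product); the point is only the shape of the design: the business rule lives behind a stable interface that operational channels call, so a predictive model can later replace the hand-written rule without the callers changing.

```python
# Hypothetical sketch of a "decision service": a known business rule is
# encapsulated behind one interface so operational systems (website, call
# center) consume the *decision*, not the rule itself. All names and the
# threshold below are illustrative assumptions.

class LoanDecisionService:
    """Encapsulates a loan-approval decision behind a stable interface."""

    def __init__(self, min_credit_score: int = 650):
        # Placeholder business rule; a real threshold would come from policy.
        self.min_credit_score = min_credit_score

    def decide(self, credit_score: int) -> str:
        # Today: a rule the business already knows to be self-evident.
        # Later: this body could delegate to a predictive model, with no
        # change required in the channels that call the service.
        if credit_score >= self.min_credit_score:
            return "approve"
        return "refer"


# A channel consumes the decision without knowing how it is made:
service = LoanDecisionService()
print(service.decide(720))  # -> approve
print(service.decide(600))  # -> refer
```

The design choice mirrors the Decision Services idea in the article: because callers depend only on the interface, an organization can start with rules it already trusts and introduce predictive analytics behind the same interface when it is ready.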

This raises the next issue: where to begin – an important adoption question Swoyer's article addresses via a quote from TDWI Director of Research Wayne Eckerson:

"Most (business managers) have only a vague notion about the business areas or applications that can benefit from predictive analytics," he writes. "[M]ost don’t know how to get started: whom to hire, how to organize the product, or how to architect the environment."

Decision Yield is a methodology that pulls back the curtain on a company's operations, identifying the decisions most likely to benefit from an EDM solution. The result can serve as a map of sorts toward reaping the benefits of predictive analytics – and it doesn't take a degree in wizardry to understand.
