Systematic Experimentation – Tomorrow’s Analytics

By Andrew Jennings

In my last post, I discussed "Analytics 3.0" as described by Thomas Davenport, which involves embedding analytics in key business processes.

Of course, embedded analytics is already here, and it’s more evolutionary than revolutionary. The next big transformation in analytics (we can call it Analytics 3.1, or better yet, 4.0) will be systematic experimentation.

Without a systematic approach to testing and learning, analytic methods rely on chance to solve problems. Knowing how to tease out predictive relationships in data, and how to search for unknown patterns, will be a key determinant of success in the Analytics 3.1/4.0 era, enabling organizations to make analytic discoveries faster and more efficiently.
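To make the idea concrete, here is a minimal sketch of systematic experimentation in Python. The model, hyperparameters, and scoring function are all hypothetical stand-ins; the point is the discipline of enumerating and logging every configuration rather than trying a few at random:

```python
import itertools

# Hypothetical stand-in for a real model's validation score, which
# here depends on two hyperparameters (peak at lr=0.1, depth=4).
def validation_score(learning_rate, depth):
    return 1.0 - abs(learning_rate - 0.1) - 0.05 * abs(depth - 4)

# Systematic experimentation: enumerate the full grid of
# configurations and record every result for later comparison.
grid = {
    "learning_rate": [0.01, 0.1, 0.5],
    "depth": [2, 4, 8],
}

results = []
for lr, depth in itertools.product(grid["learning_rate"], grid["depth"]):
    results.append({"learning_rate": lr, "depth": depth,
                    "score": validation_score(lr, depth)})

# The winning configuration is found by search, not by luck.
best = max(results, key=lambda r: r["score"])
print(best)
```

A real experimentation platform would distribute these runs across cloud resources and track far richer metadata, but the core loop is the same: define the space, test it exhaustively or strategically, and learn from every trial.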

This is an area of analytic science we are researching aggressively at FICO, and I believe we will see significant innovations in the near future.

Cloud computing will be one of the fundamental enabling technologies of systematic experimentation. The cloud allows appropriate computing and data resources to be brought to bear on problems at acceptable costs to facilitate analytic breakthroughs that were previously out of reach.

For the first time, winners and losers in analytic battles will not be determined simply by which organization has access to more data or more money. With the cloud acting as the great equalizer, systematic experimentation will place a premium on smarter analytics to achieve world-changing breakthroughs in a faster, more efficient and more methodical manner.
