I am often asked by clients what “world class” analytic organizations are doing to stay ahead of their competition. The answer used to be pretty standard – hire the best people, invest in training and retaining them, enable them with the best tools, and make sure their contribution to the business is well understood. I would then typically add that analytic teams should be not just skilled in their craft but also able to communicate why their work matters to non-technical audiences.
Recently, I have had to rethink my response. Here’s my prediction for 2019: Machine learning will be critical not only to improving the accuracy of models but also to driving data efficiency.
As data - both traditional and new - piles up, many organizations are having difficulty getting projects off the ground. We’ve always said that getting the data ready is 80% of the job - and the most tedious part. Has it now become 95%? Speed-to-value is an important criterion for staying competitive. The problem is that efficiency goals and productivity metrics are not the most exciting objectives to put in front of analytic teams.
What excites these teams is working on interesting, challenging business problems and continuing to advance their technical skills - machine learning, for example. Business executives are demanding that their teams start building machine learning models. There is a surge of interest in the topic: “machine learning” is generating an increasingly high number of Google queries, while terms like “analytics” are starting to drop off.
My perspective is that machine learning is critical in solving the practical problem that organizations face in getting data ready for modeling. In fact, machine learning is relevant at every stage, from transforming raw data all the way to validating models.
Machine learning helps convert raw data into organized, summarized and enriched data, ready for use in a variety of analytic tasks, including developing models. In this way, it automates and scales what human experts can do. When faced with millions upon millions of transaction records, an automated way to generate thousands of candidate characteristics - and then algorithmically surface only the top 200 most important ones for analysts to examine further - saves weeks or months of time.
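To make the generate-then-rank idea concrete, here is a minimal sketch in plain Python. The data, the characteristic names (`txn_count`, `txn_mean`, etc.) and the correlation-based ranking are all illustrative assumptions, not the method any particular vendor uses; real systems would generate thousands of characteristics and use more sophisticated importance measures.

```python
import statistics

# Hypothetical toy data: a few transaction amounts per customer,
# plus a binary outcome flag per customer. All names are illustrative.
transactions = {
    "c1": [120.0, 40.0, 75.0, 10.0],
    "c2": [15.0, 15.0, 20.0],
    "c3": [500.0, 320.0],
    "c4": [60.0, 80.0, 55.0, 90.0, 70.0],
}
target = {"c1": 0, "c2": 0, "c3": 1, "c4": 0}

# Step 1: automatically generate candidate characteristics per customer.
def generate_features(amounts):
    return {
        "txn_count": len(amounts),
        "txn_sum": sum(amounts),
        "txn_mean": statistics.mean(amounts),
        "txn_max": max(amounts),
        "txn_min": min(amounts),
    }

rows = {cid: generate_features(a) for cid, a in transactions.items()}
feature_names = list(next(iter(rows.values())))

# Step 2: score each characteristic against the target and keep only
# the top k for analysts to review (here, absolute Pearson correlation).
def pearson(xs, ys):
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5 if vx and vy else 0.0

ys = [target[cid] for cid in rows]
scores = {
    name: abs(pearson([rows[cid][name] for cid in rows], ys))
    for name in feature_names
}
top_k = sorted(scores, key=scores.get, reverse=True)[:3]
print(top_k)
```

The same two-step shape - mechanical feature generation followed by algorithmic pruning - is what lets the analyst start from 200 vetted characteristics instead of raw transaction history.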
Other types of machine learning tools do what humans can’t do. That includes finding complex interactions across diverse varieties of data. An example is a social network analysis tool, which uncovers linkages between entities with common characteristics (like a shared address, mutual acquaintance or e-commerce transactions). But interactions are difficult to understand, so visualization tools are key to bringing these interrelationships to the attention of the analysts and human experts who need to focus on them. All these machine learning techniques are applied to automate, comprehensively search and discover what’s hidden in the data — and all this before the analyst starts developing the model!
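The shared-characteristic linkage described above can be sketched with a toy example: link entities that share any attribute value, then group the linked entities into clusters. The records, attribute names and clustering-by-traversal approach are illustrative assumptions; production social network analysis tools handle far richer link types and scoring.

```python
from collections import defaultdict

# Hypothetical entities with a couple of attributes each; purely illustrative.
records = {
    "a1": {"address": "12 Oak St", "phone": "555-0101"},
    "a2": {"address": "12 Oak St", "phone": "555-0202"},
    "a3": {"address": "9 Elm Ave", "phone": "555-0202"},
    "a4": {"address": "4 Pine Rd", "phone": "555-0303"},
}

# Step 1: index entities by (attribute, value), then link any that share one.
by_value = defaultdict(list)
for ent, attrs in records.items():
    for key, value in attrs.items():
        by_value[(key, value)].append(ent)

edges = defaultdict(set)
for ents in by_value.values():
    for a in ents:
        for b in ents:
            if a != b:
                edges[a].add(b)

# Step 2: find clusters (connected components) via a simple graph traversal.
def clusters(nodes, edges):
    seen, groups = set(), []
    for start in nodes:
        if start in seen:
            continue
        group, stack = set(), [start]
        while stack:
            n = stack.pop()
            if n in group:
                continue
            group.add(n)
            stack.extend(edges.get(n, ()))
        seen |= group
        groups.append(group)
    return groups

groups = clusters(records, edges)
print(groups)  # a1-a2-a3 form one linked cluster; a4 stands alone
```

Here a1 and a2 share an address and a2 and a3 share a phone number, so all three fall into one cluster - an indirect linkage no single pairwise comparison would reveal, which is exactly why visualizing these clusters for analysts matters.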
Working with the volume and variety of Big Data is taking valuable time away from analysts. Adopting machine learning not just to enhance the accuracy of models but also to drive efficiency is critical to amplifying a key asset - the analytic team - and staying competitive.
While you’re here, why not check out our other prediction pieces for 2019?
- Government Predictions 2019: Automate, Enhance and Secure
- Cyber Predictions 2019: The Year of Cyber Insecurity
- Consumer Banking Predictions 2019: Four Trends to Watch
- Public Policy Predictions 2019: Regulatory Reforms Ahead
- Fraud & Payments Predictions 2019: Go Cashless – with Care
- Collections Predictions 2019: SOP Won’t Cut It
- Analytics Predictions 2019: Innovations for Ethical AI