
Predictive Analytics is More than a Magic Act

Posted by: Gary Corrigan

Nov 3, 2014 8:04:22 AM

The way analytics are described in some marketing and sales circles, it almost seems like you just need to push a button, numbers are spit out, and your company gains an amazing knowledge transfusion to make better decisions. It’s like having a direct line to a magic eight ball that actually works. As James Taylor (yeah, you guessed it, not that James Taylor) explained, there is much more work involved in realizing the value that predictive analytics brings. Companies whose predictive analytics projects fail tend to lose sight of the most essential steps.

Taylor, the CEO of Decision Management Solutions, explained how to avoid common pitfalls of predictive analytics in the MIT Sloan Management Review article "The Four Traps of Predictive Analytics" (http://sloanreview.mit.edu/article/the-four-traps-of-predictive-analytics/).

In explaining his first trap, he pointed out that for predictive analytics to work, organizations need to identify exactly which data sets they want to analyze and establish the clear purpose of this analysis.

According to Taylor, predictions can only come from four areas: risk, opportunity, fraud, and demand. This narrows the playing field. Organizations expecting predictive capability outside of these four areas won’t gain the value they seek.

Now, even if the predictions you seek fall within these four areas of discovery, you also need to build the appropriate models. And as Taylor pointed out, building one model for all areas isn’t going to cut it. Each area demands its own models, sometimes down to a different model for each question being asked.
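To make that concrete, here is a minimal sketch of the one-model-per-question idea. It assumes tabular data and scikit-learn, and the features, targets, and model choices are made-up illustrations rather than anything prescribed in Taylor's article.

```python
# Sketch: a separate model for each prediction question, not one model for everything.
# The synthetic data and column meanings below are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 1000

# Risk question: "Which customers are likely to churn?" -> a classification model.
risk_features = rng.normal(size=(n, 4))    # e.g., tenure, usage, support tickets, spend
churned = (risk_features[:, 0] + rng.normal(scale=0.5, size=n) < 0).astype(int)
Xr_train, Xr_test, yr_train, yr_test = train_test_split(risk_features, churned, random_state=0)
churn_model = LogisticRegression().fit(Xr_train, yr_train)
print("Churn model accuracy:", churn_model.score(Xr_test, yr_test))

# Demand question: "How many units will we sell next week?" -> a regression model.
demand_features = rng.normal(size=(n, 3))  # e.g., price, promotion flag, seasonality index
units_sold = 50 + 10 * demand_features[:, 1] + rng.normal(scale=5, size=n)
Xd_train, Xd_test, yd_train, yd_test = train_test_split(demand_features, units_sold, random_state=0)
demand_model = GradientBoostingRegressor().fit(Xd_train, yd_train)
print("Demand model R^2:", demand_model.score(Xd_test, yd_test))
```

Two different questions, two different targets, two different models; neither one can stand in for the other.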

For models to be successful, the data also needs to be prepared for analytics. The Computerworld article "12 Predictive Analytics Screw-Ups" identified a commonly missed step: putting the data through a QA process to check its integrity.

However, if QA shows that the data isn’t in perfect condition, that doesn’t mean you can’t start a predictive analytics project. If anything, Computerworld says organizations spend too much time trying to perfect their data (even though data is rarely perfect) and end up derailing their project launch. Issues inevitably come up, and they need to be resolved throughout the course of the project.
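As a rough illustration of what that QA pass might look like, here is a short sketch using pandas; the "orders" table and its column names are hypothetical, and the checks are examples rather than an exhaustive list.

```python
# Sketch of a data-integrity QA pass before modeling.
# The "orders" table and its columns are hypothetical examples.
import pandas as pd

orders = pd.DataFrame({
    "customer_id": [1, 2, 2, 4, None],
    "order_total": [120.0, -35.0, 89.5, 4200.0, 60.0],
    "order_date": ["2014-01-05", "2014-02-30", "2014-03-11", "2014-04-02", "2014-05-19"],
})

qa_report = {
    "missing_customer_ids": int(orders["customer_id"].isna().sum()),
    "duplicate_customer_ids": int(orders["customer_id"].duplicated(keep=False).sum()),
    "negative_order_totals": int((orders["order_total"] < 0).sum()),
    "unparseable_dates": int(pd.to_datetime(orders["order_date"], errors="coerce").isna().sum()),
}
print(qa_report)
```

The point isn’t to reach a flawless dataset before starting; it’s to know which issues exist so they can be triaged during the project instead of silently skewing the models.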

Some of the other noteworthy mistakes from the article included:

  • Rushing the project because you think your data is perfect
  • Ignoring subject-matter experts (SMEs) when building out models
  • Defining a project around a foundation your data can’t support

As you can see, predictive analytics requires much more than sleight of hand for decision-makers to gain the ROI they demand. There is substantial planning involved. Companies that consider and put forth the required effort at the outset will experience greater success.

What are some of the outcomes you want to predict using your data? Let us know.
