QueBIT Blog: 3 Predictive Analytics Practices Every Data Analyst Should Follow

Written by Jennifer Field | Dec 2, 2015 2:44:56 PM

One of the most important lessons we are learning in the Big Data and Analytics Age is that simply having access to innovative technology isn’t enough to improve business outcomes. Technology needs the right human input and interaction to deliver the results business leaders expect; it shouldn’t be treated as a magic wand that will always deliver on request.

In the words of Thomas Watson, “Our machines should be nothing more than tools for extending the powers of the human beings who use them.”

With that thought in mind, it’s easy to see why certain predictive analytics initiatives fail. Failure, in many instances, can be traced to over-reliance on the technology. Predictive analytics are meant to be a roadmap for decision-making, but the predictions aren’t always right.

So when these predictions are taken as absolutes, and they don’t end up working out, users only have themselves to blame.

Users need to combine their own reasoning and knowledge with the added insight they gain from analytics to arrive at the best decision possible. They also need to be prepared to make corrections after the fact, refining their models and continuing to work with the software to generate actionable results.

As the Harvard Business Review explained, “Predicting the future can – in the spirit of Dan Ariely’s Predictably Irrational – unfortunately bring out the worst cognitive impulses in otherwise smart people. The most enduring impact of predictive analytics comes less from quantitatively improving the quality of prediction than from dramatically changing how organizations think about problems and opportunities.”

Predictive Analytics Times discussed three approaches organizations should consider to counter those cognitive pitfalls and improve their analysis:

  • Handle Predictive Models with Care — as environmental factors change in a business ecosystem, the assumptions fed to a predictive model also need to be updated. If they aren’t, the model can lose predictive power and accuracy over time.
  • Start Slow and Iterate Quickly — if decision-makers are lukewarm about acting on insights because of the risks, they should run trial initiatives before a wider expansion. A smaller rollout at least limits the damage if a model underperforms.
  • Don’t Be Afraid to Analyze a Larger Feature Set — organizations sometimes limit themselves to a small feature set (data attributes) because they want a narrow focus. However, a sizable feature set can help triangulate the data and drive more reliable predictions overall (see the sketch after this list).
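To make the larger-feature-set point concrete, here is a minimal sketch in Python. It assumes scikit-learn is available and uses its bundled breast-cancer dataset with a random forest purely for illustration; the dataset, model, and feature counts are assumptions, not something from the article.

```python
# Minimal sketch: comparing a narrow feature set against a larger one.
# The dataset and model here are illustrative assumptions only.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Load a small tabular dataset with 30 feature columns.
X, y = load_breast_cancer(return_X_y=True)

model = RandomForestClassifier(n_estimators=200, random_state=0)

# Narrow focus: only the first 5 attributes.
narrow_score = cross_val_score(model, X[:, :5], y, cv=5).mean()

# Larger feature set: all 30 attributes.
full_score = cross_val_score(model, X, y, cv=5).mean()

print(f"Narrow feature set (5 columns): {narrow_score:.3f}")
print(f"Full feature set (30 columns):  {full_score:.3f}")
```

If the additional attributes carry real signal, the cross-validated score on the full feature set should come out higher, which is the kind of “triangulation” benefit described above.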

Buying into the process behind predictive analytics is more important than the outcomes themselves.

Being invested in predictive analytics means learning from its shortcomings. There has to be a discipline of asking why when certain predictions don’t pan out.

Maybe the assumptions were incorrect. Maybe the original inputs didn’t account for new factors. Maybe the right data wasn’t used to run the predictive model.

Once those questions are answered, human input to guide the analysis in the right direction becomes the most important next step in extracting the most value from predictive analytics.

How are you using predictive analytics for your business?