Along with estimating the probabilities of particular business outcomes, it is equally important to put your data to the test once it’s ready. In that regard, Nate Silver recommends a “trial and error” approach to derive the most value from large data sets.
Really, what Silver advocates is bypassing the theory stage, jumping into the heart of advanced analytics and decision-making, and not being afraid of the missteps that could follow. It’s definitely a scary thought. However, without making mistakes, how can a company get better?
In fact, Domino’s has embraced the idea of “not being afraid to make mistakes,” and its latest ad campaign is largely based on this principle. The pizza franchise pokes fun at itself for coming up with ideas that didn’t work, but being undeterred. You can find the commercial here: http://www.usatoday.com/story/money/business/2014/04/10/dominos-pizza-fast-food-pizza-chicken/7503235/
InformationWeek explained Silver’s “trial and error” philosophy (as an extension of his overall beliefs on big data) even further: “There’s no magic to Silver’s methods. There is hard work, a willingness to make mistakes and adjust, and a realization that the common wisdom is not wisdom at all.”
With all that being said, Silver also advocates testing and running simulations to make sure that the data is ready for action. Specifically, in his FiveThirtyEight blog (in which he details a seven-step process for forecasting the 2012 Senate election), the “practical statistician” lists simulation as the seventh step.
The key points he lists include:
- Associating an error estimate with every forecast (in the Senate races, those errors were both national and local)
- Understanding the mean forecast and the standard error for the margin between two candidates
- Keeping the margin of error at the appropriate levels
With this approach built into the model, Silver was able to provide a probabilistic assessment of the outcome of any given race.
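The simulation step can be sketched as a simple Monte Carlo exercise: if you assume the margin between two candidates is roughly normally distributed around the mean forecast with the stated standard error, you can draw many simulated outcomes and count how often each candidate comes out ahead. The function and numbers below are purely illustrative assumptions, not Silver’s actual model:

```python
import random

def simulate_race(mean_margin, std_error, trials=100_000, seed=42):
    """Estimate the probability that Candidate A beats Candidate B.

    mean_margin: forecast margin (A minus B, in percentage points)
    std_error:   standard error of that margin
    Assumes the margin is normally distributed -- an illustrative
    simplification, not a claim about Silver's actual model.
    """
    rng = random.Random(seed)
    wins = sum(
        1 for _ in range(trials)
        if rng.gauss(mean_margin, std_error) > 0
    )
    return wins / trials

# A +2-point forecast margin with a 5-point standard error leaves
# plenty of uncertainty: Candidate A wins only about 65% of the time.
print(simulate_race(2.0, 5.0))
```

Note how the win probability depends on both the mean forecast and the error term: a comfortable-looking lead can still translate into a genuinely competitive race once the margin of error is taken into account, which is exactly why Silver treats simulation as its own step.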