How analytics can prevent rather than explain tragedy

17.06.2016
The political response to yet another mass shooting, this time in Orlando, showcases not only a problem in politics but also a core problem with how we make decisions. We are all plagued by confirmation bias, which means we tend to see things through the framework of positions we have already taken rather than focusing on the actual causes. In this instance, one politician doubled down on blocking Muslims from entering the country, while the other focused on gun control talking points. Meanwhile, the reason all those people died had more to do with the failure to identify what should have been a clear coming threat.

We had all the elements: a disgruntled employee, a history of domestic violence, high intolerance, and behavior consistent with a coming attack (the purchase of multiple weapons and large quantities of ammunition). All of this was reported after the fact, largely from pre-existing data, suggesting that if we actually want to prevent this in the future we need to connect those data elements (which apparently no one is currently doing) and provide an effective early warning system. Yet this is the one thing no one running for president appears to be discussing, because they jumped to solutions without looking at the causes first.

This is how many of us approach most decisions. We make the decision first, then force-fit the data to back it up. If we approached the problem the other way around, we'd have a better chance of actually being right, as opposed to merely being able to argue successfully that we are right.

This is a core problem as we move to analytics as decision support.

There are situations where analytics could actually accelerate bad decisions. Years ago I attended a class that stuck with me. This was back when we did a lot more market and business analysis as a practice than I think we do today. The instructor drew a typical x/y chart on the wall. The horizontal axis was speed, with faster to the right; the vertical axis was direction, with the correct direction on top and the wrong direction on the bottom. He argued that most companies focus on speed more than they focus on direction, and the end result is that they go in the wrong direction faster, which, by any measure, actually makes things worse. You need to focus on direction first, then speed.

Analytics can be used to help formulate a decision, which is relatively hard because you have to cast a broad data net and, done right, you may have no idea whether the answer is one management will like. Or you can take a decision that has already been made and use analytics to compile just the reasons why that decision was smart. This is comparatively simple and carries no career risk.
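The difference between the two approaches can be made concrete with a toy sketch. Everything here is invented for illustration: the evidence values and the function names are hypothetical, not any real analytics product. The point is only that filtering data to fit a prior decision and weighing all the data first can yield opposite answers from the same inputs.

```python
# Hypothetical evidence signals for some decision: positive values
# support "yes," negative values argue "no."
evidence = [+3, -5, +1, -4, +2, -6]

def justify(decision_made, data):
    """Decision-first: keep only the data points that back the decision."""
    if decision_made:
        supporting = [x for x in data if x > 0]
    else:
        supporting = [x for x in data if x < 0]
    return sum(supporting)

def formulate(data):
    """Question-first: weigh every data point, then decide."""
    return sum(data) > 0

print(justify(True, evidence))   # 6  -- a seemingly strong case for "yes"
print(formulate(evidence))       # False -- the full data set says "no"
```

Cherry-picking produces a confident score of 6 for a decision the complete evidence (which nets out to -9) actually argues against.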

I saw this in action back when I was in competitive analysis myself. I was part of one of two competitive analysis sister groups. Our group was chartered to report as accurately as possible, but the manager of the other group found that he got more rewards by telling his management what they wanted to hear. The end result: we were defunded, and the organization the other group supported eventually failed and was sold off. There was clearly a right path and a wrong path, but in the end both groups lost their jobs. In the interim, however, the group focused on telling management what it wanted to hear fared better professionally, even though it contributed greatly to the failure of its unit.

I think the unfortunate conclusion is that we are far more likely to use analytics badly, with the end result being the failure of our firms, than to use it correctly to make better decisions. Since it is line management that will likely make these very bad decisions, it suggests IT could play a role in ensuring this isn't done, or at least in making management aware that the tool is being misused.

IT, because it sits outside the line management structure, could likely do this far better than some poor analyst inside it, and the end result would be a true value-add for the firm. And by focusing on the process rather than the result, IT is less likely to get into a pissing match with a decision-maker who may not like that the analytics product has identified him or her as an idiot.

Something to noodle on this weekend.  

(www.cio.com)

Rob Enderle