How unconscious bias, naysayers, and technical jargon can derail an advanced analytics project.
When a company implements or expands an advanced analytics program, the technical part is relatively easy. It's the ingrained culture of unconscious bias, miscommunication, and misunderstanding of what analytics success actually means that often causes these programs to fail.
That’s the assertion from Gerhard Pilcher, the CEO of Elder Research, and Jeff Deal, the company’s vice president. In a recent webinar, they were the first to admit that they “sometimes got in the way of analytics or an analytics model that could help us.”
Pilcher and Deal found that, in the course of their business, they were able to achieve a 90 percent success rate in rolling out the technical side of an analytics solution, but that only 65 percent of those projects achieved an actual return on investment (ROI). They figured there must be a reason for the wide difference between technical and financial success: a gap between how companies do analytics on a technical level and how they talk about what success in analytics actually means.
Advanced analytics challenges
Deal argues that unconscious bias, the assumption that our belief systems must be correct, gets in the way of many analytics rollouts. Businesses often think of analytics as just a method of making iterative improvements while they focus their attention on other projects they see as more meaningful.
He gave the example of a telecom company that was struggling with high churn in the months before the original iPhone was released. He says, "The people were convinced that the iPhone was the solution to the problem, and if they could get the ability to sell that, they would solve their churn problem." In the meantime, the company hired Elder Research to use analytics to reduce churn, and the effort was successful, at least until the iPhone came out, at which point the company promptly shut down the churn project.
When churn went back up, Deal's point was proven: the company's unconscious bias had prevented it from seeing the real problem, which wasn't the lack of an iPhone. Deal and Pilcher argue that companies need to continuously assess which solutions are making a meaningful impact and which are only assumed to be doing so.
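One way to make that assessment concrete is to compare outcomes with and without the solution in place. The minimal Python sketch below is not from the webinar; the customer counts, the two-group setup, and the function name are hypothetical. It runs a standard two-proportion z-test on monthly churn, with a retention program switched on for one group and off for a comparison group:

```python
import math

def churn_impact(churned_on, total_on, churned_off, total_off):
    """Two-proportion z-test: is churn lower with the retention
    program on than with it off?"""
    p_on = churned_on / total_on
    p_off = churned_off / total_off
    # Pooled churn rate under the null hypothesis of "no difference"
    pooled = (churned_on + churned_off) / (total_on + total_off)
    se = math.sqrt(pooled * (1 - pooled) * (1 / total_on + 1 / total_off))
    z = (p_on - p_off) / se
    # One-sided p-value from the standard normal CDF
    p_value = 0.5 * (1 + math.erf(z / math.sqrt(2)))
    return p_on, p_off, p_value

# Hypothetical monthly numbers: 420 of 10,000 customers in the program
# churned, versus 610 of 10,000 in the comparison group.
p_on, p_off, p = churn_impact(420, 10_000, 610, 10_000)
print(f"churn with program: {p_on:.1%}, without: {p_off:.1%}, p = {p:.2e}")
```

If a program were only assumed to be working, the two churn rates would drift back together over time, which is exactly the kind of ongoing check Deal suggests companies keep running after rollout.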
Jargon and the language of analytics
Another big problem is how companies talk about analytics and the benefits it will bring. Pilcher says that technical people tend to frame analytics in terms of statistics and data science, which is difficult for sales or marketing people to understand. When no one sells the analytics story in plain terms, it's difficult to get buy-in across the board.
Pilcher argues that businesses should first ask a lot of questions, and then define the actual problem they're trying to solve, before even thinking about which tool is right for the job. The more companies can shift that conversation from technical language to business language, the better.
Deal agrees: "There is an incredible array of tools available to people; some of them are very expensive enterprise tools, some are open source tools that are easily learned by someone with a little knowledge. By taking a little time to better define the problem, and ask the right questions, organizations can select the right tool, save a lot of money, and get started off on the right foot."
Return on investment and inertia
Finally, Pilcher and Deal have found that a number of obstacles can get in the way of following through with an analytics solution and recognizing the value it brings: a lack of leadership, the stopping power of "naysayers," and an inability to understand what value analytics actually delivers to the organization.
Unless these issues are addressed, they found, they create an "inertia" that can derail even a technically successful analytics rollout.
Pilcher says, "Success comes when a leader is willing to invest back in it, believes in it enough, and is persistent enough to get to that level of involvement." By making a big deal of analytics successes, leaders can pull others on board and convince them that analytics is a viable path forward. And by operating under a cycle of continuous improvement, which Pilcher and Deal organize as "plan, cue, response, habit," leaders can continue to prove the benefits of investing in analytics.
Naysayers can be curtailed by bringing them into the fold early and giving them a voice in how the solution is rolled out. Many of these people feel as though their way of working is being attacked, and Deal argues that they need to "see themselves as part of the solution, rather than someone that a solution is being imposed upon."
Finally, everyone needs to agree on what analytics success actually looks like; otherwise, the various definitions will never align. By pulling together these various threads of analytics misalignment, companies can reduce the risk of becoming another victim of "the gap": a technically successful analytics solution that no one knows how, or wants, to actually take advantage of.