Predicting future demand for a specific good or service is a challenging endeavor that remains so close yet so far for data scientists…for now.
Predictive analytics attracts a lot of enterprise investment for a simple reason — to be forewarned is to be forearmed. But not all predictive analytics applications are equally feasible.
For example, predictive maintenance applications are relatively straightforward because the units of measure involved are well-defined. Predicting future demand for a specific good or service, however, is a much more challenging endeavor that remains tantalizingly beyond the reach of data science.
Dr. Stefan Sigg, chief research and development officer at Software AG, says the real challenge with predicting demand for anything is the plethora of factors that can have an impact. Everything from weather and traffic conditions to swings in customer moods or unexpected political controversies makes it difficult to accurately predict demand.
That’s not stopping organizations from trying. But the ability to consistently predict demand remains beyond the reach of current data science because it’s not well understood which algorithms and statistical models should be applied at any given time. “Predictive demand is a lot like pursuing the Holy Grail,” says Sigg.
The reason so many organizations are investing in predictive demand projects is that accurate demand prediction would allow them to optimize pricing for a product or service. In addition, inventories and supply chains could be made more efficient in ways that would enable an organization to achieve a sustainable competitive edge.
Demand forecasting, of course, has been around since medieval times. But it’s always been an inexact science. In fact, despite all the advances in analytics, predictive demand remains more art than science.
Still a long digital road ahead
Most demand-forecasting software relies on historical data to determine probable future outcomes. Businesses use that information to, for example, project future revenues. Big data analytics and tools such as sentiment analysis are improving the ability to forecast accurately in some use cases.
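To make that concrete, here is a minimal sketch of the kind of extrapolation such software performs, assuming a hand-rolled Holt’s linear smoothing in Python and made-up monthly figures; it is not any particular vendor’s method, and a production system would fold in the external signals Sigg describes.

```python
# Minimal sketch: extrapolating demand from historical data alone.
# The demand numbers and smoothing parameters are illustrative, not real.

def holt_forecast(history, alpha=0.5, beta=0.3, horizon=3):
    """Holt's linear (double exponential) smoothing over past demand values."""
    level, trend = history[0], history[1] - history[0]
    for demand in history[1:]:
        prev_level = level
        level = alpha * demand + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
    # Project the fitted level and trend forward; no external signals involved.
    return [level + (step + 1) * trend for step in range(horizon)]

# Hypothetical monthly unit demand for a single product.
monthly_demand = [120, 132, 118, 145, 160, 152, 171, 190, 205, 198, 220, 240]
print([round(units) for units in holt_forecast(monthly_demand)])
```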
But even when the latest technologies are applied to demand forecasting, most companies can’t respond fast enough to changes in demand because they don’t have in place the type of event-driven IT environment needed to turn massive amounts of raw data into something akin to actionable intelligence. Because of these issues, the number of businesses that miss or exceed quarterly projections each year remains relatively constant.
Many companies are hoping that once everything becomes more instrumented in a digital economy there will be enough data collected to truly predict demand. But even if that occurs it may take years to figure out how to apply the right algorithms and models to all those data streams.
In recent years, Sigg notes, far too many organizations outsourced whatever ability they once had to write code, so many of them today are essentially starting from scratch when it comes to developing software.
The search for the real “Holy Grail” has in one way or another motivated several great quests over the centuries. Wiseguy Reports this week predicted the global predictive analytics market will grow from $3.4 billion to $12.6 billion in 2023, a compound annual growth rate of 20 percent. While that may sound impressive, many of the organizations embarking on the analytics quests those forecasts represent would be well-advised to first make sure the outcome they are trying to predict is actually a solvable equation.
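For what it’s worth, the horizon implied by those figures falls out of the numbers themselves; a back-of-the-envelope check using the standard compound-growth formula, assuming nothing beyond the values quoted above:

```latex
n = \frac{\ln(V_{\text{end}} / V_{\text{start}})}{\ln(1 + r)}
  = \frac{\ln(12.6 / 3.4)}{\ln(1.20)} \approx 7.2\ \text{years}
```

Whether demand itself reduces to that kind of solvable equation is, as Sigg suggests, far less certain.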