While it is bad form to sneer at the rapid fall of cryptocurrencies, there are some serious opportunities emerging as a result. For those not aware, crypto miners in the past few years have been buying up pretty much every high-capacity GPU available on the market. This bid up prices and reduced availability to the point where even major cloud providers could not get their hands on current models.
Combined with the normal march of Moore’s Law, this has created a situation where the average GPU being used for anything other than crypto is several years old and probably a quarter as powerful as normal market conditions would support. The scarcity has also led many software companies to avoid optimizing their wares for GPUs. So, on average, the software you are using is probably 10x slower than it should be.
That is probably the largest market opportunity in a generation, and smart companies should be looking now at how to exploit it. Speeding up your word processor or spreadsheet by 10x is unlikely to unlock any major business value. But there are several important areas which will.
The most obvious are database systems, particularly those operating on big data. The digitization of the world overall has not slowed down, and as a result, systems built on top of legacy databases are struggling these days just to keep up. This isn’t always apparent to end users as a database issue but typically manifests as painfully slow screen refresh rates or stuck busy cursors.
This has been mitigated somewhat by a move to cloud computing with automatic horizontal scaling (adding more CPU nodes). However, as data volumes get really big, moving data between those nodes becomes the rate-limiting step. The result is sub-linear returns, where doubling the compute applied gets you, for example, only 50% more speed.
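To make that effect concrete, here is a toy cost model in Python, with entirely hypothetical numbers rather than a benchmark: per-node compute time shrinks as nodes are added, but the time spent shuffling data between nodes grows, so the overall speedup flattens and eventually reverses.

```python
# Toy cost model (hypothetical numbers, not a benchmark): per-node compute
# time shrinks as nodes are added, but cross-node data shuffling grows,
# so the overall speedup flattens and can even reverse.

def query_hours(nodes, compute_hours=8.0, shuffle_hours_per_node=0.25):
    compute = compute_hours / nodes            # work divides across nodes
    shuffle = shuffle_hours_per_node * nodes   # data movement grows with the cluster
    return compute + shuffle

baseline = query_hours(1)
for n in (1, 2, 4, 8, 16):
    t = query_hours(n)
    print(f"{n:2d} nodes: {t:5.2f} h, speedup {baseline / t:4.2f}x")
```

In this sketch, going from four nodes to eight buys nothing at all, and sixteen nodes are slower than four, which is the pattern many teams see once shuffle-heavy queries dominate.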
The implicit response of most companies in this situation is essentially to stop looking at all the data. For instance, you might aggregate hourly data to daily, or daily to monthly. Under normal operating conditions with well-understood data, this can be fine. However, it comes with risk, because modern data science techniques require access to the primary granular data to drive a fundamental type of insight: anomaly detection.
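A minimal sketch of why this matters, using made-up hourly order counts: a spike in a single hour is diluted to an unremarkable bump in the daily total, while a simple robust outlier test on the hourly data flags it immediately.

```python
import statistics

# Hypothetical hourly order counts for one day; hour 13 contains a spike
# worth investigating (a fraud burst, an outage recovery, a whale customer).
hourly = [102, 99, 101, 98, 100, 103, 97, 101,
          100, 98, 102, 99, 100, 400, 101, 97,
          99, 102, 100, 98, 101, 100, 103, 99]

daily_total = sum(hourly)                       # what a daily rollup keeps
typical_day = 24 * statistics.median(hourly)    # roughly what a normal day looks like
print(f"daily total {daily_total} vs. typical {typical_day:.0f}")  # ~12% bump, easy to shrug off

# On the granular data, a robust (median/MAD) z-score isolates hour 13 at once.
med = statistics.median(hourly)
mad = statistics.median([abs(h - med) for h in hourly])
for hour, count in enumerate(hourly):
    z = 0.6745 * (count - med) / mad
    if abs(z) > 3.5:
        print(f"hour {hour}: count={count}, robust z={z:.0f}  <-- anomaly")
```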
Anomalies can be either good or bad, but they are rarely neutral. They represent your best and your worst customers and your company’s best and worst responses. They include issues of high business risk as well as high reward. So, solving a technology limitation by ignoring outliers is penny-wise and pound-foolish.
A classic example is the utilities that, until recently and in some cases still today, use 1 km resolution data to monitor tree-strike and forest-fire risk. A single pixel in such a system might contain 1,000 healthy trees and one dead one. But it only takes a single tree hitting a power line to spark a wildfire big enough to bankrupt a major utility. The business risk, in that case, is hidden within decades-old data-collection decisions and even older database technology, but it is nonetheless very real. And today would be a very good time to start addressing it, since data sources and methods have evolved rapidly over the last five years, yet most deployments exploit neither GPU analytics nor new hardware.
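The arithmetic behind that risk is easy to sketch. Assuming a hypothetical per-tree vegetation-health index on a 0–1 scale, a 1 km pixel effectively averages roughly a thousand trees into one number, so a single dead tree barely moves the pixel value, while per-tree or high-resolution data exposes it directly.

```python
# Hypothetical per-tree health scores inside one 1 km pixel:
# 999 healthy trees (~0.80) and a single dead one (0.05).
healthy = [0.80] * 999
pixel_trees = healthy + [0.05]

pixel_mean = sum(pixel_trees) / len(pixel_trees)
print(f"1 km pixel value (mean of ~1,000 trees): {pixel_mean:.4f}")       # ~0.7993 -- looks healthy
print(f"worst tree in the same footprint:        {min(pixel_trees):.2f}")  # 0.05 -- obvious at tree level
```

In this toy example the coarse pixel reads only about 0.1% lower than a fully healthy one, well inside sensor noise; only finer-grained data makes the outlier visible at all.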
A similar situation exists with prospect and customer data within many businesses. An accounting mindset and older technology can lead to routine aggregation of data into monthly and quarterly reports ad nauseam. But you should never forget that your customers are individuals whose cumulative experience across multiple touchpoints forms the basis of their likelihood to buy or recommend, or the lack of it. Just as with the risk above, market opportunities are hidden by default in common aggregations like sums and averages.
This brings up another very important issue in business analytics: who within an enterprise is empowered to find such risks or opportunities. Perhaps the most important reason to upgrade older systems with GPU analytics is the availability of interactive no-code visual analytics. As the name implies, this allows a much wider range of people within an organization to notice a risk or opportunity and dig in interactively to confirm or dismiss it. That could well be a salesperson or a front-line employee not traditionally thought of as a ‘data analyst’ or ‘data scientist.’
Next Steps
All business situations are unique, so an enterprise’s next move here will vary. But as a simple next step, managers should ask which of the business functions they are responsible for rely on datasets or software tools more than five years old. Then look more specifically at what ‘big’ data is available beyond what their current systems handle, and what value it might bring. If they see an area of opportunity, the next question is what kind of quick pilot could validate it. Paradoxically, that can be hard to evaluate without access to interactive GPU analytics, so businesses should talk to vendors and consider testing in a cloud environment. The crypto miners’ pain may well be enterprises’ gain.