Using Artificial Intelligence for IoT Integration: Bit Stew’s Approach


Supervised and unsupervised learning approaches rapidly map data into a semantic model that can be used in an IT architecture.

GE Digital’s acquisition of Bit Stew Systems, a small startup with about 100 employees, should come as no surprise to those familiar with the challenges of industrial IoT projects.

GE’s Predix is a major industrial IoT platform that targets sectors such as manufacturing, aviation, and energy, with use cases in predictive maintenance and performance optimization for massive assets such as multi-million-dollar gas pipelines, jet engines, and gas turbines.

Bit Stew, based in Vancouver, Canada, has software that uses machine learning algorithms to filter and integrate data from industrial equipment, databases, and control systems, creating a semantic data model for use throughout an IT architecture – from cloud to edge.

“Most of our customers maintain 30 connected systems to our platform, and are managing millions of connected devices,” Franco Castaldini, vice president of marketing and product management at Bit Stew, told RTInsights.

 “The integration of our technologies will provide the Predix platform with a greater capability to integrate data while it is in motion from the edge to the cloud,” Harel Kodesh, CTO at GE Digital, said in a press release. “This combination will help us to accelerate our industrial offerings, providing customers with contextual understanding of their assets and operations.”

Importance of IoT integration

In addition to GE Digital’s acquisition of Bit Stew, SAP recently acquired the small startup Plat.One, which developed a way to communicate using more than 40 different machine protocols.

According to a 2016 report from Gartner, data integration is one of the top barriers to adoption of industrial IoT use cases. Gartner has estimated that through 2018, half the cost of implementing IoT solutions will be spent on integration, and that through 2020, 75 percent of IoT projects will use some form of stand-alone (third-party) integration platform.

“For many industrial enterprises  … some large companies budget anywhere from $5 million to $8 million per project just to deal with the data integration problems before tackling the high-value functionality,” Bit Stew said in a technical paper. Integration costs are the reason nearly half of all industrial IoT projects fail, the company added.


Using artificial intelligence for data integration

Bit Stew’s platform, Mix Core, uses a NoSQL architecture and applies a schema to data as it is ingested, allowing it to integrate data six times faster than traditional extract, transform, and load (ETL) processes, the company said.
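What Bit Stew describes is essentially a schema-on-read approach: rather than transforming records into a predefined target schema up front, as ETL does, a schema is inferred and attached as each record arrives. Below is a minimal sketch of that idea in Python; the function names and sample records are illustrative, not Bit Stew's API.

```python
import json
from datetime import datetime

def infer_field_type(value):
    """Guess a field's type from a sample value (schema-on-read)."""
    if isinstance(value, bool):
        return "boolean"
    if isinstance(value, (int, float)):
        return "number"
    # Try common timestamp formats before falling back to plain text.
    for fmt in ("%Y-%m-%dT%H:%M:%S", "%Y-%m-%d %H:%M:%S"):
        try:
            datetime.strptime(str(value), fmt)
            return "timestamp"
        except ValueError:
            pass
    return "string"

def ingest(record, index):
    """Attach an inferred schema to a record and store it in a simple index."""
    schema = {field: infer_field_type(v) for field, v in record.items()}
    index.append({"schema": schema, "data": record})
    return schema

# Example: two differently shaped records from two source systems.
index = []
ingest({"meter_id": "MTR-001", "kwh": 12.4, "read_at": "2016-11-01T06:00:00"}, index)
ingest({"asset": "TURBINE-7", "vibration_mm_s": 3.2, "alarm": True}, index)
print(json.dumps(index[0]["schema"], indent=2))
```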

For one East Coast utility, Mix Core was able to integrate 52 different data sources in about 10 days, whereas traditional ETL processes would have taken more than a year to complete, Castaldini said.

Bit Stew said it has developed a semantic model that covers operational controls, sensor channels, business information, environmental data, and geospatial data, as well as the relationships and associations among them. Under an approach called “intelligent semantic modeling,” the system uses both supervised and unsupervised learning methods to apply and extend the model.

Under the system, source data is fed to a “features extractor,” which creates feature vectors that are stored in an index. A relationship associator uses the vectors to build a relationship matrix, and a field classifier then applies supervised and unsupervised learning algorithms to relate and map the data. Finally, a modeler uses the relationship matrix and the data map to create a semantic model, which can be sent to a target system such as a database.
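As a rough illustration of that pipeline, the sketch below uses off-the-shelf scikit-learn pieces: character n-gram features stand in for the features extractor, and a nearest-neighbor classifier stands in for the field classifier. The field names, labels, and model choices are assumptions made for illustration, not Bit Stew's actual components.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neighbors import KNeighborsClassifier

# "Features extractor": turn each source field into text built from its
# name and a few sample values, then into a character n-gram vector.
def field_to_text(name, samples):
    return name + " " + " ".join(str(s) for s in samples)

# Fields already mapped to the semantic model (acts as supervised training data).
labeled_fields = {
    "meter_kwh":  ("energy_reading", [12.4, 13.1]),
    "read_time":  ("timestamp",      ["2016-11-01T06:00:00"]),
    "substation": ("grid_asset",     ["SUB-12", "SUB-40"]),
}
# Fields arriving from a newly connected source system.
new_fields = {
    "consumption_kwh": [11.9, 10.2],
    "ts":              ["2016-11-02T06:00:00"],
}

vectorizer = TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4))
train_text = [field_to_text(name, s) for name, (_, s) in labeled_fields.items()]
train_labels = [label for label, _ in labeled_fields.values()]
X_train = vectorizer.fit_transform(train_text)

# "Field classifier": relate unknown fields to the closest known field.
clf = KNeighborsClassifier(n_neighbors=1).fit(X_train, train_labels)
X_new = vectorizer.transform([field_to_text(n, s) for n, s in new_fields.items()])

# "Modeler": the resulting map feeds the semantic model in a target system.
semantic_map = dict(zip(new_fields, clf.predict(X_new)))
print(semantic_map)  # e.g. {'consumption_kwh': 'energy_reading', 'ts': 'timestamp'}
```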

Automatic data decoding


Artificial intelligence can be used to integrate data from flight control systems and jet engines.

Sandy Mangat of Bit Stew explained how Mix Core could tackle the problem of integrating data from a major U.S. airline.

“There’s data coming off of the jet engine and the entire flight system, including the airport code. But some of it might be data we have never seen—a sequence of numbers and letters. So how would the platform be able to map any of those to the semantic models without having custom decoders?”

The machine learning algorithms detect a recurring three-letter sequence and figure out that it represents an airport code. “And then it creates a composite information pattern for the data source, path, and how the machine should handle it,” Mangat said. The data is then mapped to a semantic model. “You can readily adapt to different data types using that sequencing,” she said.
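A toy version of that kind of value-shape classification might look like the following; the regular expression, field labels, and sample values are invented for illustration, and a production system would validate codes against a reference table rather than a pattern alone.

```python
import re
from collections import Counter

IATA_PATTERN = re.compile(r"^[A-Z]{3}$")   # three upper-case letters

def classify_field(values):
    """Guess a semantic type for a field from the shape of its values."""
    counts = Counter()
    for v in values:
        s = str(v).strip()
        if IATA_PATTERN.match(s):
            counts["airport_code"] += 1
        elif s.replace(".", "", 1).isdigit():
            counts["numeric_reading"] += 1
        else:
            counts["text"] += 1
    # Majority vote across the sampled values decides the mapping.
    label, _ = counts.most_common(1)[0]
    return label

print(classify_field(["JFK", "LAX", "ORD", "SEA"]))   # -> airport_code
print(classify_field([640.2, 655.9, 641.0]))          # -> numeric_reading
```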

Data cleansing and analytics

The platform also performs data cleansing and preprocessing, dealing with problems such as out-of-sequence data streams, delayed streams, data gaps, and signal noise (a sketch of this kind of cleansing appears after the list below). The Mix Core Analytics Framework (MCAF) contains a library of algorithms and methods, including:

  • Prediction and forecasting – how a process is trending, and confidence in steady-state operation.
  • System reliability – likelihood of system failure.
  • Machine classification and clustering – how to offload repetitive decisions to a machine.
  • Natural language processing – what a piece of ostensible “text,” such as field notes from crews, actually means.
  • Temporal and structural behavioral patterns – outliers, interdependence within and between temporal processes, and predictable patterns that can be gleaned.

There are also algorithms and methods for association rule mining, filtering and de-noising, business decision rule-making, and exploratory diagnostics. The system can layer on top of major Big Data technologies such as GE Predix, Hadoop, Teradata, and others.
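To make the cleansing step concrete, here is a small sketch, using pandas, of how out-of-sequence arrivals, a data gap, and a noise spike in a sensor stream might be handled. The timestamps, values, and thresholds are invented, and this is not MCAF's actual implementation.

```python
import pandas as pd

# Hypothetical sensor stream arriving out of order, with a missing minute
# (00:04) and a noise spike (55.0 at 00:06).
raw = pd.DataFrame({
    "ts": ["2016-11-01 00:02", "2016-11-01 00:00", "2016-11-01 00:01",
           "2016-11-01 00:03", "2016-11-01 00:06", "2016-11-01 00:05",
           "2016-11-01 00:07", "2016-11-01 00:08"],
    "value": [10.1, 10.0, 10.2, 10.1, 55.0, 10.2, 10.3, 10.2],
})
raw["ts"] = pd.to_datetime(raw["ts"])

cleaned = (
    raw.sort_values("ts")            # fix out-of-sequence arrivals
       .set_index("ts")
       .resample("1min").mean()      # expose the gap at 00:04 as NaN
       .interpolate()                # fill the gap by interpolation
)

# Flag points that sit far from a rolling median (simple de-noising).
rolling_median = cleaned["value"].rolling(window=3, center=True, min_periods=1).median()
cleaned["outlier"] = (cleaned["value"] - rolling_median).abs() > 5.0
print(cleaned)  # only the 55.0 spike ends up flagged
```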

Use cases for industrial IoT data integration

Customers use Mix Core for operational asset performance and predictive maintenance — sensor data can be integrated for large turbines, pipelines, and refineries.

Large industrial customers have also used the Mix Core Analytics Framework to ingest and analyze data on wind, temperature, moisture, and air pressure, combined with predictive techniques for wildfire spread, lightning strikes, and earthquakes, to understand the impact on field assets. Still another use case is revenue-loss detection, such as using smart meter data to predict under-billing.

According to a case study from Bit Stew, BC Hydro in British Columbia used Mix Core to quickly identify outages and mobilize field crews to restore power faster. The data integration tackled smart meters, grid-asset health tracking, distribution grid management, work management, and geospatial information. That included 2 million smart meters, more than 5,000 relays, 2,000 routers, and 30 different operational and IT systems, including homegrown and legacy systems.

“BC Hydro also implemented real-time analytics to the data, generating rich visualizations and providing a ‘single pane of glass’ view that enables operators to see important activity happening across the smart grid,” the case study said. BC Hydro’s operators “now have a contextual view of the entire operations. Operators can easily access geo-spatial views of all grid assets and generate real-time visualizations of meter outages, communications performance issues, distribution grid load and voltage issues, as well as other critical events.” That helps operators “more easily assess situations, triage alerts and alarms, and gain the actionable intelligence they need.”

For one of GE’s enterprise customers, a gas pipeline operator, Bit Stew reported that it modeled, mapped, and indexed the data needed to create asset risk profiles for pipelines. The customer will roll out the solution across 15,000 miles of gas pipeline in North America. “The outcome was gaining a real-time contextual understanding of operations, something previously unattainable,” Bit Stew said.

Edge intelligence

Intel, which makes gateways for use in industrial IoT edge analytics, describes an architecture in which Mix Core sits at both the cloud and the edge. In this scenario, the system detects a disturbance from time-series vibration data using outlier detection mechanisms; a small component failure, for example, could stop a production line. Mix Core would route an alert to the appropriate maintenance worker, and summary information would be communicated from the edge Mix Core instance to the central Mix Core instance for overall operational awareness and logging.

Mix Core would pick up messages off the MQTT bus; ingest the data into its NoSQL index; model, map, and analyze the data; alert technicians; and communicate summary information to the Mix Core cloud instance.
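A skeletal version of that edge loop, written against the open-source paho-mqtt client, might look like the sketch below. The topic names, the in-memory stand-in for the index, the broker address, and the alert threshold are hypothetical placeholders, not Bit Stew interfaces.

```python
import json
import paho.mqtt.client as mqtt

local_index = []                       # stand-in for the edge NoSQL index
ALERT_THRESHOLD = 50.0                 # hypothetical vibration alarm level

def handle_message(client, userdata, msg):
    """Ingest, analyze, alert, and forward a summary for one sensor reading."""
    record = json.loads(msg.payload)
    local_index.append(record)                      # ingest into the edge index

    if record.get("vibration_mm_s", 0.0) > ALERT_THRESHOLD:
        # "Alert technicians" -- here simply republished on an alert topic.
        client.publish("plant/alerts", json.dumps(record))

    # Forward a compact summary to the central (cloud) instance.
    summary = {"asset": record.get("asset"), "n_readings": len(local_index)}
    client.publish("cloud/summaries", json.dumps(summary))

client = mqtt.Client()                              # paho-mqtt 1.x-style setup
client.on_message = handle_message
client.connect("edge-broker.local", 1883)           # hypothetical broker address
client.subscribe("plant/sensors/#")
client.loop_forever()
```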

With partners Intel and Cisco, Bit Stew said that in one case the technology is used to analyze vibration data at 25,000 samples per second for failure detection, using analyses such as F-tests and spectrograms. In another case, predictive failure detection was based on packet-sniffing captures of communication traffic between actuators and the control systems. “All this is conducted at the edge, eliminating timely and costly data transfers from sensors to the data center or cloud—and all using the identical adaptive data integration and MCAF libraries,” the company stated.
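As a rough illustration of that kind of vibration analysis (not the partners' actual pipeline), a 25,000-sample-per-second signal can be reduced to per-segment band energies via a spectrogram and compared against a healthy baseline with a one-way F-test. The synthetic signals, frequency band, and threshold below are invented.

```python
import numpy as np
from scipy.signal import spectrogram
from scipy.stats import f_oneway

FS = 25_000                                  # 25,000 samples per second
t = np.arange(0, 1.0, 1.0 / FS)

# Synthetic vibration signals: a healthy baseline and a faulty one with an
# extra high-frequency component (purely illustrative).
healthy = np.sin(2 * np.pi * 120 * t) + 0.1 * np.random.randn(t.size)
faulty = healthy + 0.5 * np.sin(2 * np.pi * 3_000 * t)

def band_energy(signal, fmin=2_500, fmax=3_500):
    """Per-segment spectral energy in a frequency band of interest."""
    f, _, sxx = spectrogram(signal, fs=FS, nperseg=1024)
    band = (f >= fmin) & (f <= fmax)
    return sxx[band].sum(axis=0)             # one energy value per time segment

# One-way F-test: has the band-energy distribution shifted versus baseline?
stat, p_value = f_oneway(band_energy(healthy), band_energy(faulty))
if p_value < 0.01:
    print(f"possible failure signature (F={stat:.1f}, p={p_value:.3g})")
```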

About Chris Raphael

Chris Raphael covers fast data technologies and business use cases for real-time analytics. Follow him on Twitter at raphaelc44.
