Note: This is the second blog of a two-part series examining some of the differences among advanced analytics techniques, including cognitive analytics. This post applies the examination to the example of asset failure prediction as part of an overall asset performance management (APM) strategy. If you missed it, read Part 1.
I’ve stated numerous times that it can be difficult to sort through the many and confusing ways analytics are being packaged and marketed, and I’ve discussed how industrial platforms are now adding another layer to the decision-making process. In sorting through the market noise, I’ve maintained my position that analytics aren’t about best or worst; they are about fit. In this second blog about cognitive analytics, I’ll work through some of the reasons users are considering this specific technique, using the example of asset failure prediction.
Given that 82 percent of all assets fail randomly, many variables might need to be considered to predict failure with a high degree of accuracy. Contributing factors might reside within a wide range of sources: work orders, visual and audio recordings, operating data, sensors, engineering inspection notes, operator logs, etc. There may also be value in third-party information, such as weather data, as well as in historically untapped information like customer data. In these situations, the data is abundant, but much of it might be unstructured and rarely, if ever, used.
Four Capabilities Cognitive Analytics Can Apply to APM
Cognitive analytics are suited for the volume and random nature of this data, particularly if unsupervised learning is called for. When applied to assets, cognitive analysis can identify new patterns or outliers that lead to degradation and failure.
With these considerations in mind, cognitive (deep learning) analytics provide four capabilities for solving asset-failure problems. These are:
- Detecting similarities and anomalies: The ability to identify outliers or commonalities is the root of failure prediction. Cognitive methods aren’t locked into a data relationship, so they can detect anomalies and similarities in data where no prior relationships were understood. This ability to discover the previously unknown contrasts with engineered algorithms and single-layer analysis, which detect the hidden but known.
- Adaptability: Cognitive algorithms work well for unsupervised learning. This characteristic is suited for operating environments where the data from physical objects, such as assets within the same class, naturally varies from use to use. The underlying objective doesn’t change, but the analytics can adapt as asset operating conditions change or related data varies.
- Scalability: Cognitive analytics improve the more they are exposed to data. As the analytic examines additional data and receives feedback, predictions become more precise and accurate. Assets (and related processes) are constantly producing new data, whether directly or via systems and people. These data may provide new insight into predicting asset failure, so scalable learning is required if the analytics are to keep pace with change.
- Digitizing knowledge: This prescriptive capability is often overlooked or misunderstood by end users. Cognitive analytics can be applied to discover, crowdsource, contextualize, and share knowledge from disparate sources that are often tightly siloed, collected but rarely used, or not part of an “active” knowledge base. These data are usually a mix of structured and unstructured formats residing within a range of sources, including work logs, applications, event reports, images, emails, manuals, historians, the Internet, etc. Additionally, techniques like natural language processing (NLP) can deliver prescriptive feedback at the point of decision by interacting with humans in common language. For example, a technician working in a difficult environment, such as elevated in a wind turbine or on an oil platform, can speak directly with an NLP-enabled application. The application can carry on a conversation, identify problems, query knowledge sources, and return answers while the work is underway.
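The anomaly-detection idea in the first capability above can be sketched in a few lines. This is a deliberately simple statistical illustration, not a cognitive or deep learning implementation: the vibration readings and the z-score threshold are invented for the example, not drawn from any real asset.

```python
import statistics

def find_anomalies(readings, z_threshold=2.5):
    """Flag readings whose z-score exceeds the threshold.

    A stand-in for the "detect outliers without prior relationships"
    idea; the threshold of 2.5 is illustrative only.
    """
    mean = statistics.fmean(readings)
    stdev = statistics.stdev(readings)
    return [i for i, x in enumerate(readings)
            if abs(x - mean) / stdev > z_threshold]

# Simulated vibration readings from an asset sensor; index 5 is abnormal.
vibration = [0.51, 0.49, 0.50, 0.52, 0.48, 2.75, 0.50, 0.51, 0.49, 0.50]
print(find_anomalies(vibration))  # → [5]
```

A real cognitive system would learn multivariate patterns across many data sources rather than a single sensor channel, but the core move — flagging what deviates from a learned baseline — is the same.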
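The scalability point — that predictions sharpen as the analytic is exposed to more data — can be illustrated with a baseline that updates incrementally as each new reading arrives, rather than being retrained from scratch. This sketch uses Welford's online mean/variance algorithm; again, the readings and threshold are hypothetical.

```python
class RunningStats:
    """Incrementally updated mean/variance (Welford's algorithm), so the
    learned baseline adapts with every new sensor reading."""

    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self._m2 = 0.0  # sum of squared deviations from the running mean

    def update(self, x):
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self._m2 += delta * (x - self.mean)

    def is_anomalous(self, x, z_threshold=3.0):
        """True if x falls far outside the baseline learned so far."""
        if self.n < 2:
            return False  # not enough history to judge
        stdev = (self._m2 / (self.n - 1)) ** 0.5
        return stdev > 0 and abs(x - self.mean) / stdev > z_threshold

stats = RunningStats()
for reading in [0.50, 0.51, 0.49, 0.52, 0.48, 0.50, 0.51, 0.49]:
    stats.update(reading)

print(stats.is_anomalous(2.75))  # → True: far outside the baseline
print(stats.is_anomalous(0.50))  # → False: consistent with normal operation
```

The design choice matters for APM: assets emit data continuously, so an analytic that can fold each new observation into its model keeps pace with change without a costly retraining cycle.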