Introduction
The rise in computational power over the last decade has raised the question of whether, and to what extent, quantitative methods such as data science can improve reliability programs. While data science has the power to revolutionize the reliability industry, it will only do so with strong guidance and review from subject matter experts (SMEs).
The combination of SME expertise and data science enables facilities to develop solutions to a variety of reliability challenges based on each discipline's unique strengths. SMEs provide a wealth of knowledge, much-needed context, and experience that has proved instrumental in making facilities safer and more reliable. Data science, combined with machine learning (ML) techniques, has revolutionized how facilities sift through tremendous volumes of data and find insights in near real-time.
Leveraging data to make better decisions continues to be a theme across the industry, helping decision-makers reach more informed strategic conclusions at a faster pace. This article will highlight the efficacy of a combined SME and data science approach through four example applications:
- Using equipment data and associated corrosion rates across multiple reformer units to show how predictive models using data science compare to traditional industry templates and expertise-driven models.
- Leveraging Bayesian statistics to introduce uncertainty into remaining life and probability of failure calculations, empowering experts to better define variables, identify and reduce uncertainty, improve equipment remaining life estimates, and reduce overall risk (see the sketch after this list).
- Leveraging data science to quantify the confidence of damage detection, including the benefit-to-cost of taking readings at, or omitting, particular condition monitoring locations (CMLs).
- Leveraging natural language processing on CMMS and IDMS data to identify equipment that should have been flagged for positive material identification but was not.
In each of these applications, we will discuss the challenge and how bringing various data science methodologies into the solution approach allowed experts to make quicker, more strategic decisions and achieve enhanced outcomes.
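To make the Bayesian application above concrete, the sketch below shows one way such a calculation might look. It is a minimal illustration, not the approach described later in this article: the thickness history, prior, measurement noise, and minimum required thickness are all hypothetical, and a conjugate normal model stands in for whatever likelihood an SME would actually choose.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical wall-thickness history for one CML (inches) vs. years in service.
years = np.array([0.0, 3.0, 6.0, 9.0, 12.0])
thickness = np.array([0.500, 0.487, 0.471, 0.462, 0.449])

# Point-to-point corrosion rates observed between surveys (inches/year).
observed_rates = -np.diff(thickness) / np.diff(years)

# Prior on the corrosion rate, e.g. from an industry template or SME judgment.
prior_mean, prior_std = 0.005, 0.002   # in/yr (assumed)
meas_std = 0.002                       # assumed measurement-driven rate noise

# Conjugate normal-normal update of the mean corrosion rate.
n = len(observed_rates)
post_var = 1.0 / (1.0 / prior_std**2 + n / meas_std**2)
post_mean = post_var * (prior_mean / prior_std**2 + observed_rates.sum() / meas_std**2)
post_std = np.sqrt(post_var)

# Propagate the rate uncertainty into a remaining life distribution via Monte Carlo.
t_min = 0.380                          # hypothetical minimum required thickness
current = thickness[-1]
rate_samples = np.clip(rng.normal(post_mean, post_std, 100_000), 1e-5, None)
remaining_life = (current - t_min) / rate_samples

print(f"Posterior corrosion rate: {post_mean:.4f} +/- {post_std:.4f} in/yr")
print(f"Median remaining life:    {np.median(remaining_life):.1f} yr")
print(f"P(failure within 10 yr):  {(remaining_life < 10).mean():.2%}")
```

The expert's contribution here is choosing the prior and the noise model; the data science contribution is propagating that uncertainty into a full remaining life distribution and probability of failure rather than a single point estimate.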
Modeling Corrosion Rates with Facility Data
Challenge: Can we leverage historical inspection data to create the next generation of accurate corrosion models?
Corrosion estimates are typically determined by SMEs using a variety of methods. First, SMEs make heavy use of industry standard tools, such as API specifications that map facility conditions to corrosion rates. Additionally, SMEs typically review historical inspection data to get an idea of how corrosion has manifested in the past. Finally, they often rely on the wealth of experience they have accumulated to predict how corrosion may manifest in the future.
There are several limitations with current methods. For example, the theoretical corrosion rates provided by industry standard tools may differ significantly from how corrosion manifests at a given facility. Each facility will experience its own corrosion profile due to different environmental conditions, maintenance and operation practices, and other factors. Further, while historical data provides an important touchpoint for corrosion rate analysis, the sheer volume of available data can be overwhelming for a human SME to analyze adequately. Finally, while an SME's experience can provide valuable insight, it is inherently subjective and may not always serve as an accurate predictor of corrosion.
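One way to turn that historical inspection data into a facility-specific model is sketched below. This is a hedged illustration on synthetic data: the feature names (temperature_F, sulfur_wt_pct, velocity_fps), the random forest choice, and the 90th-percentile "template" baseline are assumptions for demonstration, not the models compared in this study.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical CML-level dataset: operating conditions plus the corrosion rate
# measured from historical thickness surveys (all values synthetic).
n = 500
df = pd.DataFrame({
    "temperature_F": rng.uniform(400, 800, n),
    "sulfur_wt_pct": rng.uniform(0.1, 3.0, n),
    "velocity_fps":  rng.uniform(1, 20, n),
})
df["measured_rate_mpy"] = (
    0.01 * df["temperature_F"] + 2.0 * df["sulfur_wt_pct"]
    + 0.3 * df["velocity_fps"] + rng.normal(0, 1.5, n)
)

X = df[["temperature_F", "sulfur_wt_pct", "velocity_fps"]]
y = df["measured_rate_mpy"]
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Data-driven model trained on the facility's own inspection history.
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)

# Stand-in for a template lookup: one conservative rate applied everywhere.
template_rate = np.full(len(y_test), y_train.quantile(0.9))

print(f"Template MAE: {mean_absolute_error(y_test, template_rate):.2f} mpy")
print(f"Model MAE:    {mean_absolute_error(y_test, model.predict(X_test)):.2f} mpy")
```

In practice, such a comparison would use the facility's own CML histories and the applicable industry template rates, with SMEs reviewing both the chosen features and the model's residuals before the predictions inform inspection planning.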