Introduction
Baseball is arguably the most data-driven of the major US sports, largely because of the sheer amount of data a 162-game season produces. In the 1970s, the baseball world coined the term SABRmetrics (now known colloquially as sabermetrics) for the use of empirical data to measure in-game activity. Historically, team batting average (hits divided by total at-bats) was thought to be the metric that correlated best with team runs scored. However, batting average ignores other ways of reaching base, so on-base percentage was created: it puts every manner of reaching base in the numerator and divides by total plate appearances. From there, the world of sabermetrics exploded. Slugging percentage, on-base plus slugging, wins above replacement, and many other metrics followed, helping teams understand not only the value of their players but also how those players might perform against the competition. As teams learned more about applying statistics and predictive models, computing evolved from the old IBM S/360 running FORTRAN and BASIC programs into cloud computing and powerful modern languages like R and Python, and teams are now learning to take even greater advantage of the data they have been collecting for the past 100 years.
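For readers who prefer to see the arithmetic, here is a minimal Python sketch of the two metrics just described. The on-base percentage formula shown is the standard MLB definition (hits plus walks plus hit-by-pitch, over at-bats plus walks plus hit-by-pitch plus sacrifice flies); the sample season line is invented purely for illustration.

```python
def batting_average(hits: int, at_bats: int) -> float:
    """Hits divided by official at-bats."""
    return hits / at_bats

def on_base_percentage(hits: int, walks: int, hit_by_pitch: int,
                       at_bats: int, sac_flies: int) -> float:
    """All ways of reaching base, divided by total plate appearances
    counted for OBP (AB + BB + HBP + SF)."""
    return (hits + walks + hit_by_pitch) / (
        at_bats + walks + hit_by_pitch + sac_flies)

# A hypothetical season line: 150 H, 70 BB, 5 HBP in 500 AB with 4 SF.
print(f"BA:  {batting_average(150, 500):.3f}")                # 0.300
print(f"OBP: {on_base_percentage(150, 70, 5, 500, 4):.3f}")   # ~0.389
```

The same hitter looks noticeably more valuable under OBP than under batting average, which is exactly the gap the newer metric was built to expose.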
Simple Arithmetic
So how do baseball and the sports world relate to the energy and manufacturing industries? Inspection departments have been collecting inspection data, qualitative and quantitative, for as long as they have been operating. When API published 510 and 570, the industry finally had guidance on how to use some of this data. Initially, that guidance took the form of short-term and long-term corrosion rates, which we have been using fairly effectively. Yet we still experience leaks in our equipment, especially in piping. So the question is this: are long-term and short-term corrosion rates an adequate way to predict loss of containment?
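To make that "simple arithmetic" concrete, the sketch below computes the two rates in the manner API 510 and API 570 describe: the long-term rate spans the initial and most recent thickness readings, the short-term rate spans the two most recent readings, and remaining life divides the remaining corrosion allowance by the governing rate. The thickness values, dates, and required thickness are hypothetical.

```python
# Long-term and short-term corrosion rates in the style of API 510/570.
# Thicknesses in inches, intervals in years; all values hypothetical.

def corrosion_rate(t_earlier: float, t_later: float, years: float) -> float:
    """Metal loss per year between two thickness readings."""
    return (t_earlier - t_later) / years

def remaining_life(t_actual: float, t_required: float, rate: float) -> float:
    """Years until the actual thickness reaches the required thickness."""
    return (t_actual - t_required) / rate

# Initial reading, a mid-life reading, and the most recent reading.
t_initial, t_previous, t_actual = 0.500, 0.460, 0.440   # inches
years_total, years_recent = 10.0, 2.0                   # elapsed years
t_required = 0.300                                      # minimum required thickness

lt_rate = corrosion_rate(t_initial, t_actual, years_total)    # 0.006 in/yr
st_rate = corrosion_rate(t_previous, t_actual, years_recent)  # 0.010 in/yr

# Common practice is to let the more conservative (larger) rate govern.
governing = max(lt_rate, st_rate)
print(f"LT rate: {lt_rate:.4f} in/yr, ST rate: {st_rate:.4f} in/yr")
print(f"Remaining life: {remaining_life(t_actual, t_required, governing):.1f} years")
```

Two subtractions and two divisions: that is the entire predictive toolkit these formulas offer, which is precisely the point of the question above.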
This author holds the opinion that the industry should be doing better than these basic calculations. In the 1950s, when API 510 was originally issued, a calculator looked like a typewriter and could perform only basic arithmetic. In 1993, when API 570 was issued, Texas Instruments had just released the TI-82, an update of its original graphing calculator. It was unrealistic to expect the industry to change its perspective on data then. It was expected, however, that publication of these standards would produce a downward trend in the number of loss-of-containment incidents reported worldwide. Figure 1 shows only a slight reduction in accidents per year when trended, beginning around 1993 [1].