Introduction
Fired equipment (e.g., fired heaters, fired boilers, flares, and thermal oxidizers) is a critical component of almost every major refining or chemical process unit. These fired systems involve a complex interaction of many disciplines, including process engineering, combustion engineering, metallurgical/mechanical engineering, operations, and inspection, each of which plays a role in how the systems are operated, optimized, and maintained. These core disciplines must work together consistently and effectively to maximize the value of fired equipment while keeping it safe and reliable. “Siloing” information and failing to understand the holistic nature of equipment health tend to degrade the safety, reliability, and optimization of fired equipment, which can easily cost a medium-sized refinery (say, 150 MBPD crude charge) 1 to 10 million dollars per year, depending upon circumstances.

A critical tool for proactive monitoring of process and mechanical concerns on fired heaters and boilers is infrared (IR) thermography. It is one of the most important tools available to cross-functional support teams for maintaining safety and reliability while operating the equipment within defined operating limits to maximize margin capture and optimize reliability. Many organizations deploy IR thermography through an inspection department, where the accuracy of the data is often questionable and, in our experience, the data is commonly neither acted upon nor shared outside that department. If thermographic data correction is not performed correctly, measured temperatures can deviate from the actual mid-wall temperature of the tube by as much as +/- 150° F. By achieving the needed levels of accuracy with IR thermography and engaging cross-functional support teams, this data can become quite valuable for safe and reliable operation.
Discussion
High-temperature IR thermography was pioneered in the 1970s and 1980s as a method for understanding tube temperatures inside fired equipment while in operation. The initial urgency behind its development was tied to tube failures that resulted in safety and reliability events; a method to accurately measure and monitor tube metal temperature was needed to identify issues and prevent failures proactively. Several parameters must be considered to achieve the desired level of accuracy, such as emissivity, measurement technique, and correction factors. The accuracy of the analysis is dependent upon the accuracy of the temperature measurements.
Over time, the equipment used for high-temperature thermography has improved, and it is now a mature technology. While the hardware continues to improve, accurate analysis of the data remains very challenging. Accurately measuring tube wall temperatures is a complex process because of the variable nature of surface scale, combustion characteristics, and reflected radiation. The underlying physics of correcting for these factors can be understood through correct application of Planck’s Radiant Function and the Stefan–Boltzmann equation, shown in Figures 1 and 2. Also critical to the analysis is an understanding of the underlying process conditions of the subject heater. The complexity of correcting the data has led to inaccurate uses of thermography throughout industry, which in general erodes the technology’s overall credibility. Given the correct equipment and technique, proper application in the field, and correct data analysis, thermography can be very accurate. Some applications, with adequate data and the proper tools, can be consistently accurate to within 5° C of true operating conditions. This makes thermography a valuable approach for keeping equipment safe and reliable, and it may enable operations to capture high-profit margin opportunities.
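To illustrate the kind of correction involved, the following is a minimal sketch, not any vendor’s or inspection program’s algorithm. It uses a simplified gray-body Stefan–Boltzmann balance to correct a camera’s apparent (blackbody-equivalent) tube temperature for surface emissivity and reflected background radiation, and includes Planck’s radiant function for reference. Real band-limited cameras integrate Planck’s law over the detector waveband, and the emissivity and background-temperature values below are illustrative assumptions only.

```python
import math

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2*K^4)
H = 6.62607015e-34      # Planck constant, J*s
C = 2.99792458e8        # speed of light, m/s
K_B = 1.380649e-23      # Boltzmann constant, J/K

def planck_spectral_radiance(wavelength_m, temp_k):
    """Planck's radiant function: blackbody spectral radiance, W/(m^2*sr*m)."""
    a = 2.0 * H * C**2 / wavelength_m**5
    b = H * C / (wavelength_m * K_B * temp_k)
    return a / math.expm1(b)

def corrected_tube_temp_k(apparent_k, emissivity, reflected_k):
    """Correct an apparent IR reading for emissivity and reflected radiation.

    Simplified total-radiation (gray-body) balance for an opaque surface:
        sigma*T_app^4 = eps*sigma*T_true^4 + (1 - eps)*sigma*T_refl^4
    Solved for the true surface temperature T_true.
    """
    t4 = (apparent_k**4 - (1.0 - emissivity) * reflected_k**4) / emissivity
    return t4 ** 0.25

# Illustrative example (assumed values): a scaled tube with emissivity ~0.85
# viewed against hotter refractory at ~1200 K. The reflected radiation
# inflates an apparent reading of 900 K; the corrected value is ~798 K.
print(corrected_tube_temp_k(900.0, 0.85, 1200.0))
```

Note the direction of the correction: because the refractory background is hotter than the tube, uncorrected data overstates the tube temperature, which is one reason uncorrected thermography can be off by large margins.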