Inspectioneering
Blog

Understanding Inspection Data Management Systems, Part 2: Selecting the Right IDMS

By Efrain Rios at Fortress Oil & Gas, LLC. June 6, 2018

INTRODUCTION

In the first blog of this 5-part series, "Understanding Inspection Data Management Systems, Part 1: An Overview of Common Issues and Causes," readers were given an overview of inspection data management system (IDMS) applications, common challenges that users often face, and the most frequent sources of those issues.  In this second part, readers will learn the most prominent steps in an IDMS selection process and the key areas beyond core functionality that must be considered to ensure that the right IDMS is selected.  The other three parts in this series will be released in the coming months and are listed below:

  • Part 3: Filling Functionality Gaps Through Development
  • Part 4: The Importance of Standardization
  • Part 5: The Implementation Hurdle

WHY SELECT A NEW IDMS?

In years past, inspection results were captured by hand and all necessary calculations (corrosion rate, remaining life, inspection due date, etc.) were done by hand on paper.  Formal and informal records were all stored in filing cabinets and information management was limited to a single location, the file room.  However, today we have the technology to instantly process calculations, retrieve pertinent records, mine data, and report information from anywhere in the world.  Utilizing an IDMS to perform these basic functions clearly makes it easier for companies to effectively manage their mechanical integrity programs.

Many companies implemented an IDMS more than twenty years ago and are now finding that their growing business needs have outpaced the evolution of their chosen software.  Other companies delayed IDMS implementation and are still managing their inspection programs using pen and paper.  In either case, these companies must eventually select and implement a new application to better manage their businesses.

WHAT ARE SOME AVAILABLE OPTIONS?

There are many IDMS applications on the market today and each addresses the core functionality requirements in its own way.  What generally separates them from one another is:

  1. How users interact with the data,
  2. How the settings are configured to process the data, and
  3. What kind of ancillary functionality exists. 

Some of the most widely recognized IDMS names in the industry include Metegrity Visions Software, PCMS, Meridium, Aware, RBMI, ACET, and UltraPIPE (aka PS AIM).  However, the market continues to change as old applications are improved upon and new applications are created.  Therefore, it is very important that each new IDMS selection project begin with a refreshed list of applications.

SELECTION PROCESS OVERVIEW

Whether implementing an IDMS for the first time or replacing the current application, users must determine which available software best fits their business needs.  This can be overwhelming, and some project teams don’t know where to begin.  Figure 1 below illustrates the overall process, highlighting its prominent stages.

Figure 1. IDMS Selection Process Overview


Define Business Needs

  1. Identify Key Stakeholders:  This is a critical first step that is far too often overlooked.  Many companies involve only their IT department because the project involves software selection.  Other companies leave the entire project to the inspection teams.  The reality is that this kind of project requires collaboration between several different teams within the company, representing both end users and support functions.  Having the proper organizational representation from the very beginning helps ensure that all necessary requirements are considered throughout the selection process and that the business chooses the best overall fit for the company.

  2. Define Support Function Requirements:  A properly assembled IDMS selection team will contain members representing broader business interests beyond the core functionality needs of end users.  Their input to the list of requirements will help ensure that the chosen application with its desired functionality will meet longevity needs without increasing corporate risk.  Common parties to consider in this category include IT, Cybersecurity, Procurement, and Legal.  Below are just a few examples of how these extended teams can help ensure the right decision is made.
      • IT – Technology necessary for deployment, potential integration with other applications and technology, user access limitations, disaster recovery protocols, and programming language longevity issues
      • Cybersecurity – Penetration (“hack-ability”) tests, “poison pill” technology, protective security barriers, and corporate policy alignment
      • Procurement – IDMS market engagement, benchmark analysis history, pricing discussions
      • Legal – Contract reviews (Master Software Licensing Agreement, End User Licensing Agreement) and regulatory compliance functionality

  3. Define End User Work Processes:  In some cases, companies will select an IDMS based on certain criteria (e.g., low cost) even though it does not fit the company’s work processes very well.  They consciously decide to change their business to align with a software application’s functionality.  It’s hard to imagine a construction crew loading 1,500 lbs. of material into the back of a 4-cylinder sedan simply because the sedan was less expensive than a pick-up truck.  However, companies often choose the least expensive IDMS available and then change their work processes to fit the software.  It is very important to first define the work processes that the business needs and then develop the list of functionality requirements to support those processes.  Having well-defined processes will not only prevent the business from changing how it operates but will also expedite implementation since the software must be configured to align with these processes.

  4. Document Desired Functionality:  With the end user work processes defined and the support functions’ requirements documented, the list of desired functionality and features can be completed.  This is the final stage in determining what the business would like from any potential IDMS candidate.  It aggregates the company’s needs from core functionality to ancillary features and helps to eliminate, early in the process, those applications which are too far removed from the company’s vision.  Key items to review in this stage are swim-lane diagrams, roles and responsibilities matrices, forms, templates, reporting needs, legislative requirements, and corporate policies.  The final list should include not only the technical requirements of the end users (e.g., the need to capture inspection anomalies and prioritize the findings for resolution), but also incorporate the needs of the support functions (e.g., the need for inspectors to meet legislative deadlines to sign and accept a vessel’s continued service).

Create Selection List

  1. Review Full IDMS Market:  In exploring the available options on the market, the project team should cast a wide net and return as many IDMS candidates as possible.  It is not uncommon to have over 10 applications on this early list.  The goal is to leave no stone unturned and a long list of potential applications will allow the company to thoroughly review all available options and ensure that it selects the overall best fit.  The project team should also obtain a summary of each application’s functionality.

  2. Narrow Down List of Options:  Now that the project team has documented its desired functionality and obtained a features list from each of the software companies, it can remove any clearly unqualified candidates from the selection pool.  This narrows the list by eliminating applications that do not meet the team’s most basic needs.  All IDMS applications remaining on the list will then have their functionality demonstrated to the project team during the upcoming live sessions.

Define Grading System

  1. Finalize Grading Criteria:  The project team should now separate the list of desired functionality into two categories: “Need” and “Want.”  Each item on the “Need” list must be fully functional before the application can be deployed to users.  The “Want” list, however, is purely desirable functionality that is nice to have, but not critically necessary for implementation.  This is an important step that forces companies to draw a clear line between what they would like to have and the minimum requirements necessary to support their work processes.

  2. Allocate Points to Prioritized Criteria List:  With the list now separated, a maximum point value can be applied to each item on the “Need” list.  These values should be weighted based on the level of effort required for programmers to develop the functionality.  This allows the project team to account for (1) the unlikelihood of having every need met by a single application and (2) the programming requirements for critical functionality that is missing.  As an example, adding a checkbox to note whether scaffolding is required for access to a certain condition monitoring location (CML) will require much less programming effort than adding an entire screen to capture thickness data and manage corrosion rates.  Therefore, the checkbox should receive fewer points than the CML data screen.  From a scoring perspective, this rewards applications for meeting more needs out-of-the-box without over-penalizing them for gaps that are relatively simple to close.

    Maximum point values can now also be applied to items on the “Want” list; however, these should be weighted based on how desirable the functionality is.  For example, a company may want an application that incorporates artificial intelligence, but there may be a more immediate goal to automate an electronic interface with their work management system (Maximo, SAP PM, etc.).  In this example, more points should be assigned to the work management system interface than to the artificial intelligence feature.

    The point values assigned to these two categories now serve as a built-in prioritization of features with a clear separation of critical and desirable functionality.  They also provide a foundation for the project team to quantify how well an IDMS fits the business and how the applications rank against one another.
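As a brief sketch, such a grading system can be captured as a simple weighted criteria list.  The criterion names and point values below are hypothetical illustrations only, not recommendations:

```python
# Hypothetical grading criteria: each item carries a category ("Need" or
# "Want") and a maximum point value, weighted as described above.
criteria = [
    # "Need" items, weighted by estimated development effort
    {"name": "CML thickness data screen",         "category": "Need", "max_points": 50},
    {"name": "Scaffolding-required checkbox",     "category": "Need", "max_points": 5},
    {"name": "Anomaly capture and prioritization","category": "Need", "max_points": 40},
    # "Want" items, weighted by desirability
    {"name": "Work management system interface",  "category": "Want", "max_points": 20},
    {"name": "Artificial intelligence features",  "category": "Want", "max_points": 5},
]

# The maximum achievable totals draw the line between critical and
# desirable functionality for the upcoming demo scoring.
need_total = sum(c["max_points"] for c in criteria if c["category"] == "Need")
want_total = sum(c["max_points"] for c in criteria if c["category"] == "Want")
print(f"Need maximum: {need_total}, Want maximum: {want_total}")
```

Even a lightweight structure like this, maintained in a spreadsheet or script, keeps the weighting decisions explicit and auditable when the team later debates the results.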

Grade and Compare Applications

  1. Score During Live Demo Sessions:  Prior to the demonstration sessions, the project team should distribute a list of scenarios they would like the IDMS companies to demonstrate.  This ensures a consistent approach for each session and helps provide the project team with an “apples to apples” view for more effective comparison and grading.  As the project team assigns points to each demonstrated feature, it is important for them to consider not only whether a certain feature exists but also how it functions in a given scenario.  Simplicity and close alignment with work processes should yield more points for a given feature (up to its maximum value).  Also, it is highly recommended to record these demonstrations and all subsequent Q&A for use with later follow-up discussions.

Analyze Results and Select IDMS

  1. Analyze Scoring Results:  With the desired features now listed, categorized, prioritized, weighted, and scored, the points for each IDMS candidate can be summed and the totals compared.  The highest point total will correspond to the application that best fits the business’s needs.  However, it must be mentioned that even this application likely will not meet every need.  As previously noted, it is unlikely that any single IDMS will satisfy every requirement; in most cases, at least one critical feature will be missing.  Since each critical item on the “Need” list must be fully functional before the application can be deployed to users, a Functionality Enhancement Plan must now be created and managed.  This plan will outline the requirements for developing, testing, and incorporating missing critical functionality into the selected IDMS prior to its deployment.  This topic will be the subject of Part 3: Filling Functionality Gaps Through Development.
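The roll-up described above can be sketched as follows, assuming demo scores have already been recorded against the weighted criteria.  The application names, criterion names, and values are hypothetical examples:

```python
# Hypothetical demo scores per candidate application, keyed by criterion.
scores = {
    "IDMS A": {"CML data screen": 50, "Anomaly management": 30, "WMS interface": 20},
    "IDMS B": {"CML data screen": 40, "Anomaly management": 40, "WMS interface": 0},
}

# Maximum point values for the "Need" criteria, used to flag critical gaps
# that would feed a Functionality Enhancement Plan before deployment.
need_max = {"CML data screen": 50, "Anomaly management": 40}

# Sum each candidate's points and rank the applications, highest first.
totals = {app: sum(s.values()) for app, s in scores.items()}
ranked = sorted(totals, key=totals.get, reverse=True)

# Any "Need" item scored below its maximum is a critical gap to develop.
gaps = {app: [c for c, mx in need_max.items() if s.get(c, 0) < mx]
        for app, s in scores.items()}
print(ranked, gaps)
```

Note that the top-ranked candidate can still carry critical gaps; the gap list, not just the total, drives the Functionality Enhancement Plan.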

  2. Select IDMS Application:  End users and support functions working collaboratively and following the process described herein have the greatest chance of selecting the right IDMS, one which works well not only today but also in the future.  Utilizing the categorization and point-weighting systems outlined herein will help ensure that the best fit is selected and that minimal additional effort is required prior to implementation.

KEY CONSIDERATIONS

With businesses’ needs and the IDMS market continually changing, it is impossible to list here every feature which should be considered in the selection process.  However, the list below does outline several key areas and can serve as a prompt for a project team’s discussions.  This is not an exhaustive list, but instead a starting point from which each project team can collaboratively build their own.

End User Considerations

  • Equipment Registry
  • Inspection Management
    • Planning (support crafts, access methods, etc.)
    • Scheduling
    • Execution and Data Capture
    • Results Analysis
    • Anomaly Management
  • Document Storage and Access
    • Equipment Records
    • Drawings
    • Photos
  • Key Performance Indicators (KPIs)
  • Standard and Ad-Hoc Reporting
  • Structured Workflow Management
  • Integrated Risk Assessments
  • Mobile Data Collection
  • Interface to Work Management Systems

Support Function Considerations

  • Software Longevity
    • Programming Language Support
    • Software Company’s Corporate Commitment
  • User Count and Associated Cost
    • Site Specific or Enterprise Licensing Structure
    • Maintenance and Support Fees
    • Monthly Subscription Cost Per User
    • Peripheral Applications (AutoCAD, MS Office, etc.)
  • User Locations and Accessibility
    • Remote Location Internet Connectivity
    • Web Based Applications
    • CITRIX Installation
  • Cybersecurity
    • Internal Corporate Standards
    • Penetration Tests
    • Web Browser Security Flaws
    • Poison Pill Technology
  • Legislative or Regulatory Compliance
  • Master Software Licensing Agreement
  • End User Licensing Agreement
  • Data Hosting Strategy
  • Disaster Recovery Plan
  • Defect Management Processes

CONCLUSION

The process described above helps users select more than just a good IDMS.  It helps them choose the right IDMS for their company.  It requires engagement from other members of the organization and a thorough, collaborative effort throughout the process to ensure that the IDMS selected fits the company well for as long as possible.

Part 3 of this series, “Filling Functionality Gaps Through Development,” will discuss what to do when critical functionality is missing and why customization is not always a great option.

