Air Liquide America (ALA) is part of the Air Liquide Group, the global leader
in industrial and medical gases, headquartered in Paris, France. Its products
include air gases such as nitrogen, oxygen, and argon as well as hydrogen,
CO2, electricity, and steam. In the United States, the company maintains
more than 125 production facilities and 700 customer installations.

Late in 1999, ALA’s management team realized that higher reliability performance
was key to retaining existing clients and winning prospective ones. Concerned
that current performance levels needed to be raised, the team commissioned a
standardized benchmark assessment of maintenance and reliability capabilities.

This article summarizes the original assessment in 2000, the improvement efforts
from 2000 through 2003, and the benchmark update in late 2003, highlighting the
progress ALA achieved over three years.

Situation in 2000
Since the company’s beginnings in the United States, the maintenance function
had been decentralized and was primarily the responsibility of the plant managers.
The plants were supported by technical resources at headquarters, but were
largely autonomous. Few reports or key performance indicators (KPIs) measured
maintenance or reliability performance; performance was gauged mainly by two
indicators: costs and headcount. This arrangement served ALA well for many years,
but by the late 1990s reliability issues had begun to affect customer satisfaction,
and maintenance costs were rising and unpredictable.

In 1999, ALA commissioned a regional maintenance concept designed to support
up to two dozen sites from regional reliability centers (RCs). Initially, the
RCs were staffed primarily with a manager and technicians drawn from the plant
sites. Maintenance engineering support was provided, as were planning and scheduling.
However, the RCs were mostly reactive, trying to provide resources for plant
shutdowns and emergency responses.

After a year, the RCs were having only a moderate impact on reliability, and
maintenance costs remained unpredictable. As a result, the plants saw minimal
value in the new centralized approach. Rather than abandon the effort, ALA
executives decided to commission a Maintenance Benchmarking Study to pinpoint
the issues whose resolution could accelerate progress.

Benchmarking process
Benchmark assessments usually involve the collection of pertinent data and
a mechanism to validate the data. For most studies, a base of comparison
data already exists. The challenge is to collect “apples-to-apples” client
data. Using an unvalidated database can introduce a wide variation in data
submissions for key information such as maintenance costs, replacement asset
value, and personnel counts. These variations, in turn, can significantly
affect comparisons and interpretations.
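
To make the point concrete (the specific ratios used in the ALA studies are not
detailed here), maintenance benchmarks are commonly normalized against replacement
asset value (RAV) so that plants of different sizes can be compared on the same
footing:

\[
\text{annual maintenance cost as \% of RAV} = \frac{\text{annual maintenance cost}}{\text{replacement asset value}} \times 100
\]

A site reporting $1.5 million of maintenance cost against a $50 million RAV scores
3 percent, but if another site excludes contractor labor from its cost figure or
values its assets at historical rather than replacement cost, the two ratios are
no longer comparable. That is exactly the kind of inconsistency data validation
is intended to catch.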

In the ALA studies in 2000 and 2003, all data was validated through on-site
review of definitions, data reconciliation, and interviews. Comparison data was
likewise validated project by project to ensure that comparisons were as consistent
as possible.

On-site validation not only confirms the submitted data, but also allows observation
of maintenance practices. In practice, the validation visit provides the opportunity
to:
• Validate data
• Interview personnel
• Tour and observe the plant and its conditions
• Develop data comparisons and key issues in a team-based environment
• Draw conclusions that the plant team understands and supports
• Develop consensus lists of plant strengths and improvement opportunities

Interviews conducted during the site visits allow the process to move beyond
the collection and comparison of data. The interviews typically highlight problem
areas and obstacles to improvement, and they very often support the conclusions
implied by the data comparisons.

When the on-site work is completed and the benchmark team has discussed the
issues and key improvement needs, the assessment report documentation begins.
At this stage, the team understands the hard number comparisons and the key
areas for improvement. The final report and the subsequent presentation to
management are designed to highlight issues to be addressed and resources
required. The report also quantifies the potential financial gain from improvements.
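
The article does not spell out how that gain was calculated, but a simple way to
sketch such an estimate is to compare a plant’s validated annual maintenance cost
with the cost implied by the benchmark ratio for comparable plants:

\[
\text{potential annual gain} \approx \text{current annual maintenance cost} - \left(\text{benchmark \% of RAV} \times \text{RAV}\right)
\]

For example, a plant spending 4 percent of a $50 million RAV ($2.0 million per
year) against a 3 percent benchmark ($1.5 million) would show roughly $500,000
per year of potential improvement. The figures here are purely illustrative; the
actual method and numbers in the ALA report may differ.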

Benchmark 2000 conclusions
In applying the benchmark assessment process at ALA in 2000, standard techniques
were used with accommodations for the typically smaller size of ALA’s United
States plant locations. The same validation processes, interviews, and team
approaches were applied, as described above.

The areas identified for improvement from the initial 2000 assessment were:
• Improve cost control through improved reliability
• Coordinate maintenance and reliability with capital projects
• Restore key support resources
• Redesign the reliability center concept
• Strengthen or replace the computerized maintenance management system (CMMS)
• Develop a contractor management strategy
• Institutionalize root cause failure analysis (RCFA)
• Perform reliability centered maintenance (RCM) analyses
• Institute work planning and work scheduling
• Strengthen spare parts management

Benchmark comparisons were provided in the assessment to allow the ALA team
to gauge appropriate staffing levels for direct maintenance as well as for
support functions including planning, reliability improvement, and parts
management. The comparisons with external data and practices also provided
a frame of reference for total maintenance costs, maintenance organization
structures, and maintenance philosophies.
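
As one illustration of the staffing comparisons involved (the specific targets
from the ALA study are not reproduced here), planner coverage is often expressed
as a simple ratio:

\[
\text{planner coverage} = \frac{\text{number of maintenance technicians supported}}{\text{number of planners}}
\]

Rules of thumb on the order of one planner per 20 to 30 technicians are frequently
cited in the maintenance planning literature; comparing a plant’s actual ratio
against such reference points is one way a benchmark study frames whether support
functions are adequately resourced.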

Given the number of issues highlighted by the study, it was clear that ALA
would have to prioritize its targets. A potential cost reduction of up to
25 percent was identified, but it would come slowly, given the economic downturn
in 2000.

Using the benchmarking report as a basis, a maintenance improvement team
began to develop a strategy and implementation plans. The elements of that
strategy can be downloaded.