Commissioning (Cx) one piece of equipment, or even one system in a data center, can be a fairly straightforward process. But, when the challenge at hand is Cx of a 373,000-sq-ft mission critical superstructure, where as many as 50 pieces of equipment must be tested daily for three months, requiring more than 16,000 total Commissioning Authority (CxA) hours worked, the domino effect comes into play. In many cases, each task is preceded by another, and accuracy, timeliness of reporting, and communication protocols are critical to success.
Large, high-profile projects will have multiple processes running in parallel, and pressures driven by the cost of delayed turnover tend to shorten timelines even though there’s more to do. Staying on top of it all is challenging at best, and strategic tracking is extremely important. On large, high-profile projects, the Cx team has a unique opportunity to learn from early experiences and mistakes, ultimately turning them into process improvements further down the road.
For example, where 180 of the same type of equipment will be tested, clear lessons learned will be apparent after the first few samples have been tested. These can then be applied to improve testing efficiency for the remainder of the sample size. When a Cx team can slow down just enough to recognize and capitalize on these initial lessons learned, the Cx process will benefit from significant time savings, minimized retesting and rework, and ultimately give the owner more confidence based on improved overall process implementation.
COMMON LESSONS LEARNED
Large, complex Cx projects will share similar, easily identifiable characteristics. Common themes, pain points, and most importantly, lessons learned, will emerge. Consider the following widespread lessons learned and best practices for Cx of large, fast-paced, high-profile data center projects.
Communicate, communicate, communicate. Both internally among the Cx team and across the entire project team, effective communication will solve many problems and avoid costly delays in a schedule that has little room to accommodate them. This includes proper activity handoff, testing plans, and tactical scripting. For example, if a Cx team member starts an activity in the field, but doesn’t finish it, how is its progress tracked? Do other Cx team and project team members know completion is outstanding? Is there a plan in place to reschedule the activity, assign the proper resources and support personnel, provide the needed test equipment, and confirm that finishing the task at another time will not conflict with other previously-scheduled activities? When there are thousands of activities to complete, they must be tracked in a visible and easily adaptable way.
- Best practice: Set up a project roadmap. The Cx plan should act as a project roadmap, set up early on and then communicated to the entire project team. Development of a plan that incorporates a fully populated equipment list will drive the format and development of tests. This format will then influence all of the Cx planning documentation, including schedules, documentation management checklists, and staffing plans.
- Best practice: Document every activity. If an activity is started, the Cx team has to manage tracking procedures to ensure that it gets finished. One way to do this is to break the schedule down into smaller, more manageable bites at the equipment level or even down to a specific testing level.
This added granularity helps expose which components of the process must be completed as predecessors to others so that the process can continue as planned without interruption. Smaller chunks of effort also increase the likelihood that they will be completed in their entirety and reduce the risk that a portion of the activity will be forgotten and never completed. Having a dedicated Cx team member audit completed tests every 48 hours has proved extremely helpful for catching incomplete documentation; anything missed can be rectified while still fresh in the mind of the original tester.
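The predecessor relationships described above can be tracked with something as simple as a map from each activity to its prerequisites. Below is a minimal sketch in Python; the activity names and data structure are purely illustrative, not taken from any particular Cx tool:

```python
# Hypothetical activity list with predecessor links; names are illustrative.
activities = {
    "pre-functional checklist": [],
    "point-to-point verification": ["pre-functional checklist"],
    "functional performance test": ["point-to-point verification"],
}

# Activities the Cx team has verified as fully complete.
completed = {"pre-functional checklist"}

def ready_to_start(activity):
    """An activity may start only once all of its predecessors are complete."""
    return all(p in completed for p in activities[activity])
```

With this in place, an auditor or scheduler can see at a glance that point-to-point verification may proceed while the functional performance test is still blocked.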
- Best practice: Communicate daily progress. Developing daily progress reports provides a record of what happened, how things went, what remains, and any major issues discovered. This also allows the Cx team and the project team to consistently manage the actual completed activities vs. the scheduled activities and to course correct as they go. Updates to resource needs, scheduling updates, and overall schedule impacts can be discussed as a team, and a new plan can be developed to ensure that the most efficient path is taken while all other project aspects are also considered. In addition, unrelated limitations, like construction activities in the area, interruptions to power, etc., that may not have been known, will float to the top and become part of the plan to move forward.
Don’t overlook contractor resource requirements. In order to perform Cx effectively, the contractor must engage the right testing equipment and people. This can be especially difficult when dealing with complex equipment. Only a limited number of technicians possess the level of understanding needed to perform the actions required in mission critical testing applications. Without the right people on site to perform the tests at the right times, projects can experience significant delays.
- Best practice: Lead functional performance testing. The project contractor must understand the testing goals and vision and understand what it’s going to take from their end to achieve successful results: the amount of manpower, tools, etc. Creating functional performance testing outlines, and even leading functional performance testing page turns with the contractor and vendors, are crucial steps to ensure that the contractor is aware of what the CxA will be asking of them. Cx is intended to be a collaborative process whereby the project team comes to a consensus to ensure that the actions performed during testing will confirm that the owner’s project requirements and the basis of design are adhered to without jeopardizing the safety of personnel or installed equipment.
Equipment firmware revision logs. This very common Cx lesson learned is most prominent when dealing with electrical equipment that contains programmable logic controllers (PLCs). When equipment is tested, the PLC, or the computer that is governing its operation, relies on the installed firmware to respond to certain actions or external events with an expected, predictable output. Planned testing activities can uncover problems that may require firmware to be updated.
These changes can only be made by a very limited number of people who possess this skill, and there is rarely significant visibility into what changes are made. Sometimes modifications to the code in the program fix an identified problem while inadvertently creating another. It’s easy to lose track of which version of firmware was actually tested. This also raises the question of whether the entire system needs to be retested if a minor change is made to the firmware. The answer is usually project-specific and requires a discussion with all of the experts involved to make a decision based on project-specific information.
- Best practice: Employ a firmware revision log. The use of a firmware revision log can help provide those not familiar with the code a summary of what changed and when it changed. Also associating these changes with a firmware version number can help the project team understand which version of the program the system is running and prove that the final testing was completed on the latest version. Keeping accurate firmware revision logs will help avoid the need for unnecessary retesting of equipment.
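As an illustration, a firmware revision log can be as simple as an ordered list of entries per piece of equipment. The sketch below is hypothetical; the field names, versions, and test identifiers are invented for the example, not taken from any specific project or tool:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class FirmwareRevision:
    """One entry in a PLC firmware revision log (illustrative fields)."""
    version: str          # firmware version identifier
    date_installed: date  # when the revision was loaded onto the PLC
    author: str           # who made the change
    summary: str          # plain-language description of what changed
    tests_affected: list = field(default_factory=list)  # tests that must be rerun

# The log is an ordered list of revisions for one piece of equipment.
log = [
    FirmwareRevision("1.2.0", date(2020, 3, 1), "Vendor tech A",
                     "Baseline firmware at start of functional testing"),
    FirmwareRevision("1.2.1", date(2020, 3, 9), "Vendor tech A",
                     "Corrected generator start delay logic",
                     tests_affected=["FPT-GEN-03"]),
]

def latest_version(log):
    """Return the version string of the most recent revision."""
    return log[-1].version
```

Pairing each entry with the tests it affects lets the team show that final testing was performed on the latest version, and scope any retesting to the listed tests rather than the whole system.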
Complete reports in a timely fashion. There is a maxim in Cx: If it’s not written down, it never happened. Usually, supplemental documentation that results from testing, such as screenshots, alarm logs, and photographs, is created and needs to be attached to the correct test for record keeping purposes. If there are 184 pieces of the same type of equipment to test, it is easy to lose track of which supplemental documentation attachments should be appended to which test.
- Best practice: Document it now. Attaching supplemental documentation to reports immediately prevents data loss and avoids the time wasted on unnecessary retesting to recreate it. Properly documenting all activities within 24 hours is paramount, especially if the plan calls for going on to test the same like-type piece of equipment over and over again.
COLLECTING, DOCUMENTING, AND IMPLEMENTING THE LESSONS LEARNED IN REAL TIME
Everyone seems to understand the value of documenting Cx lessons learned. Many project managers and companies mandate that lessons learned meetings be included as part of their process. However, it takes time and efficient management for lessons learned to provide real value. It also requires participation from several team members, which can make scheduling and inclusion difficult. Even if the lessons learned are written down, how can a Cx team ensure that they provide value and actually get utilized on future projects? How are they used to ensure that the next project is better than the last and does not contain what should now be foreseeable mistakes?
- Best practice: Institute procedures to track lessons learned easily throughout the project. Creating an easily accessible and visible location to log and review lessons learned throughout the project will help drive immediate improvement. Not waiting until the end of the project to record the lessons learned will improve the contributions to the lessons learned log.
All members of the project team need to have access, and the lessons learned log can then be reviewed as part of the regular weekly Cx meeting. This will allow lessons learned to be added, worked on, and addressed throughout the project. In many cases, the project can benefit from items discovered in real time. This effort is especially valuable in phased projects where significant portions of the project will be completed in the exact same manner, only at a later date. A final meeting at the end of the project should then be held to discuss implementation of lessons learned going forward, as well as any additions that were recognized late in the project or never implemented.
When this happens, improvement opportunities are infused into the project in the middle and not at the end.
- Put lessons learned into categories and group them together using filtering for easy review and compiling. This can be done quickly when lessons learned are documented in an online commissioning tool.
- Review pertinent lessons learned at the beginning of new, similar projects.
- Assign actions to the Cx team between projects, with accountability goals, to build momentum on incorporating lessons learned into the Cx team’s process in a proven way that will benefit future efforts.
- Find ways to involve everyone on the Cx team so they gain visibility to past lessons learned and know what to avoid in the future.
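The categorize-and-filter step above amounts to tagging each log entry and filtering on the tag. A minimal sketch, with hypothetical category names and entries invented for the example:

```python
# Hypothetical lessons-learned log entries; categories are illustrative.
lessons = [
    {"category": "documentation", "lesson": "Attach screenshots within 24 hours"},
    {"category": "firmware", "lesson": "Log every PLC firmware revision"},
    {"category": "documentation", "lesson": "Audit completed tests every 48 hours"},
]

def by_category(entries, category):
    """Filter the log down to one category for review or compiling."""
    return [e["lesson"] for e in entries if e["category"] == category]
```

An online commissioning tool typically provides this kind of filtering out of the box; the point is that a consistent category field is what makes quick review possible.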
While complex data center Cx can be daunting at first glance, putting together a solid plan that can account for required adaptability can make the job much easier. Understanding that things will not go as planned and knowing how to adjust to minimize schedule and cost impact is the intellectual property that today’s CxAs are selling to their clients.
In addition, being able to report on project status in real time with accuracy, even if things are behind schedule, can be very valuable to clients and is generally appreciated. Having this information allows owner/operators to make decisions that help the project team, set stakeholder expectations, and develop confidence in what is being done to stay on track or get back on it. The use of online Cx tools is strengthening the ability to make the process more visible and provide more quantifiable metrics, which will drive awareness of where problems may exist earlier in the Cx process.
On a recent data center project for a West Coast-based technology client, the ESD Cx team used a unique metric to rally the entire team around lowering the average time it took to close Cx issues discovered on the project. This was possible due to tracking metrics provided by the online Cx tool employed as well as support and pressure from the client.
The software tracked and averaged how long it took to close an issue from when it was first opened. The life cycle of an issue usually goes as follows: detection of the issue; opening of the issue in an issue tracking process; assignment of the issue to the appropriate party; investigation of the issue by that party; correction of the issue; change of the issue to pending status for CxA verification; and closure of the issue upon CxA verification.
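That life cycle is effectively a linear state machine, which can be sketched as follows. The state names and transition table below are an assumption based on the stages described above, not the actual model used by any particular Cx tool:

```python
from enum import Enum

class IssueState(Enum):
    DETECTED = "detected"
    OPEN = "open"
    ASSIGNED = "assigned"
    UNDER_INVESTIGATION = "under_investigation"
    CORRECTED = "corrected"
    PENDING_VERIFICATION = "pending_verification"  # awaiting CxA verification
    CLOSED = "closed"

# Each state may only advance to the next stage in the life cycle.
NEXT = {
    IssueState.DETECTED: IssueState.OPEN,
    IssueState.OPEN: IssueState.ASSIGNED,
    IssueState.ASSIGNED: IssueState.UNDER_INVESTIGATION,
    IssueState.UNDER_INVESTIGATION: IssueState.CORRECTED,
    IssueState.CORRECTED: IssueState.PENDING_VERIFICATION,
    IssueState.PENDING_VERIFICATION: IssueState.CLOSED,  # CxA verifies, then closes
}

def advance(state):
    """Move an issue to the next stage of its life cycle."""
    if state is IssueState.CLOSED:
        raise ValueError("Issue is already closed")
    return NEXT[state]
```

Making each stage explicit is what lets a tool timestamp every transition and compute how long an issue sat at each step.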
On this project, the client took advantage of this tracking capability to set a goal of closing issues within 10 days. To meet it, a plan was built around each issue using the online commissioning tool’s tracking features, and achievable, trackable metrics were created and implemented.
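The metric itself is simple arithmetic: average the calendar days between each issue’s open and close dates and compare against the goal. A sketch with invented sample dates (the real figures would come from the Cx tool’s issue log):

```python
from datetime import date

# Hypothetical issue records: (opened, closed) dates pulled from the Cx tool.
issues = [
    (date(2020, 5, 1), date(2020, 5, 8)),    # 7 days to close
    (date(2020, 5, 3), date(2020, 5, 15)),   # 12 days to close
    (date(2020, 5, 10), date(2020, 5, 18)),  # 8 days to close
]

def average_days_to_close(records):
    """Average calendar days from issue opened to issue closed."""
    total = sum((closed - opened).days for opened, closed in records)
    return total / len(records)

def meets_goal(records, goal_days=10):
    """True when the running average meets the closure goal."""
    return average_days_to_close(records) <= goal_days
```

Recomputing the running average as issues close is what lets the team see immediately when the trend drifts above the 10-day target and rally around the outliers.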