
Energy Efficient Computing

Energy Monitoring to Improve Efficiency

NERSC has instrumented its machine room with state-of-the-art wireless monitoring technology from SynapSense. To date, the center has installed 834 sensors, which gather information on variables important to machine room operation, including air temperature, pressure, and humidity. Because the sensors flag temperature changes quickly, NERSC has already seen benefits in center reliability. For example, after the cabinets of a large, decommissioned system were shut down and removed, cold air pockets developed near Franklin's air handling units. The SynapSense system alerted administrators, who were able to adjust fans and chillers quickly in response to these changes in airflow and room temperature before Franklin was negatively impacted.
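At its core, this kind of alerting amounts to comparing each sensor reading against an acceptable band and flagging anything outside it. The short Python sketch below illustrates that idea with hypothetical sensor names and threshold values; it is not the SynapSense software or its API.

```python
# Illustrative sketch only: a minimal threshold alert over machine-room
# sensor readings, using a hypothetical list of (sensor_id, temp_F) pairs.
# The thresholds are assumed values, not NERSC's operating limits.

COLD_LIMIT_F = 55.0   # assumed lower bound for acceptable air temperature
HOT_LIMIT_F = 80.0    # assumed upper bound near equipment intakes

def find_alerts(readings):
    """Return sensors whose temperature falls outside the assumed band."""
    alerts = []
    for sensor_id, temp_f in readings:
        if temp_f < COLD_LIMIT_F:
            alerts.append((sensor_id, temp_f, "cold pocket"))
        elif temp_f > HOT_LIMIT_F:
            alerts.append((sensor_id, temp_f, "hot spot"))
    return alerts

if __name__ == "__main__":
    sample = [("AHU-3-inlet", 48.2), ("row-7-mid", 72.5), ("row-9-top", 83.1)]
    for sensor_id, temp_f, kind in find_alerts(sample):
        print(f"ALERT ({kind}): {sensor_id} reads {temp_f:.1f} F")
```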

Responding quickly to accurate machine room data not only enhances cooling efficiency but also helps ensure the reliability of the center's systems. Future projects in this area include controlling water flow and fan speeds within the air handling units based on heat load, and placing curtains or "tophats" around systems to optimize the airflow in their vicinity.
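As a rough illustration of what heat-load-based control could look like, the sketch below scales a fan-speed setpoint linearly between a minimum and a maximum as the measured heat load rises. The load range and speed limits are assumed values for illustration, not NERSC's actual control parameters.

```python
# Minimal sketch of proportional fan-speed control driven by heat load.
# All numbers are assumptions for illustration, not NERSC control settings.

MIN_SPEED = 0.30     # fraction of full fan speed at or below the low load
MAX_SPEED = 1.00     # fraction of full fan speed at or above the high load
LOW_LOAD_KW = 50.0
HIGH_LOAD_KW = 400.0

def fan_speed(load_kw: float) -> float:
    """Interpolate a fan-speed fraction from the measured heat load (kW)."""
    if load_kw <= LOW_LOAD_KW:
        return MIN_SPEED
    if load_kw >= HIGH_LOAD_KW:
        return MAX_SPEED
    frac = (load_kw - LOW_LOAD_KW) / (HIGH_LOAD_KW - LOW_LOAD_KW)
    return MIN_SPEED + frac * (MAX_SPEED - MIN_SPEED)

print(fan_speed(225.0))  # -> 0.65 under these assumed parameters
```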

Innovations in Systems Installations to Improve Efficiency

Carver Installation Saves Energy, Space, Water

NERSC staff configured the IBM iDataplex system Carver in a manner so efficient that, in some cases, the cluster actually cools the air around it. By setting row pitch at 5 ft, rather than the standard 6 ft, and reusing the 52-degree F water that comes out of Franklin to cool the system, the team was able to reduce cooling costs by 50 percent and to use 30 percent less floor space. NERSC collaborated with vendor Vette Corporation to custom-design a cooling distribution unit (CDU) for the installation. The center has also shared these innovations with the wider HPC community and with IBM, which now uses NERSC techniques in new installations.

By narrowing row pitch to 5 ft from cabinet-front to cabinet-front (a gap of only 30 inches between rows), the Carver team was able to use only one "cold row" to feed the entire installation. (A standard installation would require cooling vents along every other row.) At such a narrow pitch, the IBM iDataplex's transverse cooling architecture recirculates air from the previous row while it is still at a low temperature. When the air passes through Carver's water-cooled backdoors, chilled by the cool water coming from Franklin, it is cooled further and passed to the next row. When the air finally exits the last row of Carver, it can be several degrees cooler than when it entered.
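A back-of-the-envelope footprint comparison helps make the pitch change concrete. The sketch below assumes a hypothetical installation of 8 equipment rows, each 40 ft long, and only accounts for the pitch reduction itself; the additional savings from consolidating cooling into a single cold row depend on layout details not given here.

```python
# Rough floor-footprint comparison for the row-pitch change.
# Row count and row length are assumed values for illustration only.

ROWS = 8
ROW_LENGTH_FT = 40.0      # assumed length of each cabinet row

def footprint(pitch_ft: float) -> float:
    """Floor area (sq ft) occupied by ROWS rows at the given pitch."""
    return ROWS * pitch_ft * ROW_LENGTH_FT

standard = footprint(6.0)   # standard 6 ft pitch
narrow = footprint(5.0)     # 5 ft pitch used for Carver
savings = 1.0 - narrow / standard

print(f"standard: {standard:.0f} sq ft, narrow: {narrow:.0f} sq ft")
print(f"pitch change alone saves {savings:.0%} of floor space")
```

Under these assumptions the pitch change alone saves roughly 17 percent of floor space; the remainder of the reported 30 percent savings presumably comes from eliminating the cooling rows a standard layout would require.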

Using a design concept from NERSC, Vette Corporation custom-designed a cooling distribution unit (CDU) that reroutes the used water coming out of the Franklin (Cray XT4) system into Carver. Water from the chillers enters Franklin's cooling system at about 50 degrees F and exits at about 65 degrees F. Sending the water back to be chilled at such a low temperature is inefficient. Instead, the custom-designed CDU increases efficiency by rerouting the water through Carver to pick up more heat, sending it back to the chillers at over 70 degrees F and enabling one CDU to cool 250 kW of computing equipment.
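The relationship between heat load, water flow, and temperature rise follows directly from Q = m_dot * c * delta_T. The sketch below works that arithmetic for the 250 kW figure, assuming an illustrative 5-degree F rise across Carver's rear doors; the actual design flow rate and temperature rise are not stated in the source.

```python
# Back-of-the-envelope flow-rate estimate from Q = m_dot * c * delta_T.
# The 5 F temperature rise across Carver is an assumed, illustrative value.

HEAT_LOAD_KW = 250.0          # cooling capacity handled by one CDU
C_WATER_KJ_PER_KG_K = 4.186   # specific heat of water
DELTA_T_F = 5.0               # assumed rise through Carver's rear doors
DELTA_T_K = DELTA_T_F * 5.0 / 9.0

# Required mass flow in kg/s, which for water is roughly litres per second.
m_dot = HEAT_LOAD_KW / (C_WATER_KJ_PER_KG_K * DELTA_T_K)
gpm = m_dot * 15.85           # ~15.85 US gal/min per (kg/s) of water

print(f"~{m_dot:.0f} kg/s (~{gpm:.0f} gpm) absorbs {HEAT_LOAD_KW:.0f} kW "
      f"at a {DELTA_T_F:.0f} F rise")
```

With these assumed numbers, roughly 21 kg/s (about 340 gpm) of water would be needed; a larger temperature rise would reduce the required flow proportionally.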

Green Flash

Embedded Microprocessor Technology for HPC