
Four Companies that Went Green 

It's not easy to be green, but it's possible. It requires some planning and investment. Here are four companies with their own motivations for, and approaches to, building green datacenters.

There are many motivations for building a greener datacenter. Let's face it, the biggest one for business is saving money. With the proper retrofits, new buildings, better cooling systems, optimal location and other factors, companies can save millions of dollars per year.

But not everyone is looking just at the bottom line. Some executives have one eye on environmental issues and reducing the carbon footprint.

Green Computing Report offers a look at four companies that recently decided to create greener datacenters, including why and how they did it and the results they achieved.

First up: Yahoo

Yahoo: Chicken Coops are Cool

Techniques: facilities design, prefab structures, passive air cooling, location

Yahoo has gone through some difficult times—and several CEOs—in recent years. But one thing for which it has received a lot of praise is its new green datacenter design, inspired by the simple air-flow approach of a chicken coop. When the Yahoo Computing Coop was unveiled in 2011, then-CEO Carol Bartz said that the motivation was both cost savings and environmental concerns. “It was very, very essential that our impact on our bottom line and on the environment be as small as possible,” she said at the time. The company spent five years (starting in 2005) rethinking its datacenter designs with the goals of making them more energy efficient as well as faster and cheaper to build.

The first technique was the design of the building itself: It is a prefab metal structure measuring 60 feet by 120 feet with louvered vents in the walls that let the wind blow through. Datacenter sites can be expanded by simply adding more prefab units (Yahoo may be planning to expand its first Computing Coop site; local news reports say it intends to buy an additional 14 acres next to the existing 30-acre site). The vents can be adjusted to catch the prevailing winds. The air flows through two rows of cabinets into a contained center hot aisle. A chimney at the top of the aisle can recirculate air back into the room or vent it out through louvers in a cupola on the roof. There are no raised floors, just poured concrete slabs. The tradeoff is that the servers are not packed in tightly, but must have room to breathe.

Location played a role in Yahoo's efficiency plans. The Yahoo Computing Coop (YCC) is a 155,000-square-foot, 15 MW data center in Lockport, New York, just about 30 miles southeast of Toronto, Ontario, Canada as the crow flies (over Lake Ontario). That gives it nice cool winds to harness. An added advantage of the location: the site has access to Niagara Falls-supplied hydroelectric power.

The facility doesn't use any chillers, and no more than 1% of the air needs to be cooled artificially with evaporative coolers—and then only during the estimated 212 hours per year when it gets too warm even for chickens. The coolers are expected to use just 1% of the datacenter's energy.

Results: With a reported PUE of 1.08, Yahoo says the Computing Coop uses 40% less energy and 95% less water than the average datacenter (although it doesn't give a direct comparison to the PUE of its own, older datacenters). But it isn't stopping there. Yahoo is looking into ways to improve the efficiency even more with techniques such as more efficient power supplies, LED lighting, flash memory storage systems, data center monitoring and more virtualization. Recognizing the efficiency of the design, the Department of Energy awarded Yahoo a $9.9 million grant to help fund the $150 million plant's construction.
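PUE (power usage effectiveness) is the ratio of total facility energy to the energy delivered to the IT equipment, so a PUE of 1.08 means only about 8% overhead for cooling, power conversion and lighting; 1.0 is the theoretical floor. A minimal sketch of the metric (the figures below are invented for illustration, not Yahoo's actual data):

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power usage effectiveness: total facility energy divided by IT energy."""
    return total_facility_kwh / it_equipment_kwh

# Hypothetical annual figures, chosen only to illustrate a 1.08 result.
it_load_kwh = 100_000_000      # servers, storage and network gear
overhead_kwh = 8_000_000       # cooling, power distribution, lighting
print(round(pue(it_load_kwh + overhead_kwh, it_load_kwh), 2))  # -> 1.08
```

The same metric frames the Verizon Wireless (1.47) and NSIDC (2.03 down to 1.2) numbers later in this article.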

Next: Verizon Wireless
Verizon Wireless: Fast, Modular and Efficient

Techniques: Modular design, environmentally conscious construction, water efficiency, co-generation, recycled water, DCIM, balanced heat rejection system

In 2009 Verizon looked at how much energy its network equipment used and decided to do something about it. The company told its equipment suppliers to reduce energy consumption by 20%. Some of them complied.

With that success, Verizon Wireless decided to increase the efficiency of its datacenters as well. It needed to expand its facility in Twinsburg, Ohio, and wanted to do it quickly. So it built a new 140,000 sq. ft. facility that includes 50,000 sq. ft. of datacenter area using a modular design. It took 16 months to build the 9.6 MW center.

This was more than a money-saving exercise; it was a genuine effort to focus on sustainability, starting with the construction itself. Materials were sourced and fabricated locally, and the company recycled scrap and waste materials. Its racks, for example, contain at least 65% recycled steel.

Much of the focus is on water efficiency. It uses warm water cooling for the servers, and much of the heat generated by servers is re-used to warm the building and melt snow around the facility in winter. Rainwater collected from the roof is cleaned and re-used in the facility. The melted snow is also cleaned and recycled to supplement the cooling system and to provide water to the lavatories. Recycled water is cleaned with an electronic water treatment system from Clearwater Systems, which cuts water discharge by more than 20% compared to chemically-treated water. Discharge water from the cooling tower can be used as grey water for landscaping. It also uses well water as a backup water supply.

The center uses high-efficiency, variable frequency pumps and fans with electronically commutated (EC) motors. The facility also uses LED lighting. To minimize energy use for cooling, building automation programs balance three heat-rejection systems: a chiller/cooling tower, heat recovery, and free cooling. It also has a raised open ceiling that uses thermal stratification for passive heat rejection. It uses a datacenter infrastructure management (DCIM) system from Nlyte to measure and monitor its energy savings.
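The article doesn't spell out the control logic, but conceptually the building automation picks whichever heat-rejection path is cheapest for the current conditions. The sketch below is a hypothetical simplification: the mode names, thresholds and priorities are invented for illustration and are not taken from the Twinsburg system.

```python
def select_heat_rejection_mode(outdoor_temp_c: float, building_needs_heat: bool) -> str:
    """Choose a heat-rejection path for the current conditions.

    Hypothetical priorities for illustration: reuse server heat when the
    building can absorb it, use free cooling when outside air is cold
    enough, and fall back to the chiller/cooling tower otherwise.
    """
    if building_needs_heat:
        return "heat recovery"         # warm the building with waste server heat
    if outdoor_temp_c < 10.0:          # invented threshold
        return "free cooling"          # reject heat to cold outside air, chillers off
    return "chiller/cooling tower"     # mechanical cooling as the last resort

print(select_heat_rejection_mode(outdoor_temp_c=4.0, building_needs_heat=False))
# -> free cooling
```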

The results: Verizon Wireless says the facility uses 20% less water than the industry average. Although the company rates the datacenter's PUE at a relatively modest 1.47, it says the facility also consumes 20% less energy per year than the industry average for a comparably sized datacenter. The U.S. Green Building Council recognized it with LEED (Leadership in Energy and Environmental Design) Gold certification, and the Uptime Institute awarded it a 2012 GEIT Award for Facility Design – Implementation. Next step: in March 2012, parent company Verizon pledged to cut its CO2 intensity to half its 2009 level by 2020.

 

Next: NSIDC
National Snow and Ice Data Center: Controlling the climate at a climate center 

Techniques: Retrofit, virtualization, external air cooling, evaporative cooling, environmental monitoring, solar power, web-based monitoring

The National Snow and Ice Data Center (NSIDC) at the University of Colorado at Boulder collects NASA satellite data and shares it with earth scientists and climatologists around the world. Many of those researchers study, and are concerned about, climate change, so NSIDC wanted to show it is possible to increase the efficiency of U.S. datacenters by reducing its own impact on the environment and sharing its results with the public.

When the time came to replace its old direct expansion CRAC (computer room air conditioning) units, the folks at NSIDC decided it was a good time to show how it's done. Aside from the CRAC units, the existing systems were functional, although inefficient. NSIDC had to justify the price of a retrofit by assuring a payoff with lower energy costs (not to mention an enhanced reputation among CO2 reduction advocates). The retrofit was supported by the Green Data Center project, funded by the National Science Foundation (NSF), with additional support from NASA.

The team managed to do it, first by reconfiguring the 1,600 sq. ft. data center into hot and cold aisles. They used server virtualization to help reduce the IT energy demand, which also allowed the equipment to be more densely packed and consolidated into a single server room, taking up less space in the 77,000-sq-ft building in which it's located.

They replaced the CRAC units with eight new coolers that use airside economization and indirect evaporative cooling. The system cools air drawn in from outside the building by 16-22 degrees C (30-40 degrees F) without using compressors. The coolers also produce warm, saturated air that's used to increase humidity in the data center on demand.

But that's not all. The center also added a 50 kW roof-mounted PV solar array to charge a UPS, eliminating the need for an extra backup generator. The array produces about 75,000 kWh annually, and when it isn't charging the UPS, it feeds power back to the grid, almost completely offsetting the data center's energy use on a sunny day.
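Those two figures hang together: a 50 kW array producing about 75,000 kWh a year implies a capacity factor of roughly 17%, which is plausible for rooftop solar in Colorado. A quick back-of-envelope check:

```python
# Back-of-envelope check of the reported solar figures.
rated_kw = 50            # nameplate capacity of the rooftop array
annual_kwh = 75_000      # reported annual production
hours_per_year = 8_760

capacity_factor = annual_kwh / (rated_kw * hours_per_year)
print(f"{capacity_factor:.1%}")  # -> 17.1%
```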

Finally, the center added a web-based monitoring system (designed by a graduate student at the University of Colorado) so the public can see the energy efficiency improvements in real time. It's based on a Campbell Scientific data logger that collects temperature, humidity, airflow, and electrical power measurements. The system shows the heat gain from the data center equipment, the heat removed by the cooling system, the energy consumption of the cooling equipment, and the uniformity of conditions (such as temperature, humidity and airflow) within the data center.
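The article doesn't give the logger's formulas, but the heat carried by an air stream can be estimated from the airflow and the temperature rise across the equipment (heat = mass flow x specific heat x temperature difference). The sketch below is a simplified illustration of that kind of calculation; the constants, variable names and sample readings are assumptions, not NSIDC's actual code or data.

```python
AIR_DENSITY_KG_M3 = 1.0            # rough value for Boulder's ~1,650 m altitude (assumption)
AIR_SPECIFIC_HEAT_KJ_KG_K = 1.005  # dry air

def sensible_heat_kw(airflow_m3_s: float, inlet_c: float, outlet_c: float) -> float:
    """Heat picked up (or removed) by an air stream, in kW.

    Q = density * volumetric flow * specific heat * temperature difference.
    """
    mass_flow_kg_s = AIR_DENSITY_KG_M3 * airflow_m3_s
    return mass_flow_kg_s * AIR_SPECIFIC_HEAT_KJ_KG_K * (outlet_c - inlet_c)

# Illustrative reading: 4 m^3/s through the racks, 18 C cold aisle, 30 C hot aisle.
print(f"{sensible_heat_kw(4.0, 18.0, 30.0):.1f} kW of heat gain")  # -> 48.2 kW
```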

Results: The cooling system now averages less than 2.5 kWh of energy use per month, almost 95% lower than the old system. The PUE at the datacenter was reduced from 2.03 to 1.2.

 

Finally: Apple
Apple: Making Peace with Greenpeace

Techniques: solar power, fuel cells, biogas, chilled water storage, environmentally conscious construction, air-cooled waterside economizers

Last spring, Greenpeace came down hard on Apple for using too much dirty energy to power its datacenters. The green advocacy group graded Apple with three D's, for energy transparency, for energy efficiency and greenhouse gas mitigation, and for its use of renewables and green advocacy. Apple got an F for infrastructure siting. That's not as bad as some of the other companies on the list (Amazon got three F's and one D), but Greenpeace then targeted Apple with a campaign to get it to clean up its act. As part of the report, Greenpeace estimated that Apple's Maiden, North Carolina data center used coal for 61.5% of its power and nuclear for 38%, while just 3.6% came from renewable sources. The estimates were based on the makeup of the power grid in North Carolina.

But Apple had already started making changes, especially at the Maiden facility that Greenpeace criticized, weaning it off the power grid. It is installing two 20 MW solar arrays, each of which can produce 42 million kWh annually. One is being built on 100 acres at Apple's facility, the other on a 100-acre site a few miles away. The solar collectors track the sun as it moves across the sky.

By October 2012 Apple had also finished installing, and begun testing, a 4.8 MW fuel cell system from Bloom Energy that will generate 40 million kWh annually. Then, in December, the Charlotte Observer found regulatory filings indicating that Apple wanted to expand the fuel cell system to 10 MW, and that the fuel cells themselves may be powered by biogas. Apple has said that the data center, which draws 20 MW at full power, will soon be 100% powered with renewable energy—60% of it generated on-site and 40% bought from local and regional sources.
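A rough sanity check of those numbers (a back-of-envelope sketch, not Apple's accounting): the reported annual output implies a capacity factor of about 24% for each 20 MW tracking solar array and roughly 95% for the 4.8 MW fuel cell system, consistent with near-continuous operation.

```python
HOURS_PER_YEAR = 8_760

def capacity_factor(annual_kwh: float, rated_mw: float) -> float:
    """Fraction of the theoretical maximum annual output actually produced."""
    return annual_kwh / (rated_mw * 1_000 * HOURS_PER_YEAR)

# Figures as reported in the article.
print(f"solar array: {capacity_factor(42_000_000, 20):.0%}")   # -> 24%
print(f"fuel cells:  {capacity_factor(40_000_000, 4.8):.0%}")  # -> 95%
```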

The Maiden facility also uses a chilled water storage system that allows it to shift 10,400 kWh of energy consumption to off-peak hours. During evenings and cool days, it uses outside air cooling through a waterside economizer, allowing the chillers to remain off more than 75 percent of the time. Finally, the datacenter increases efficiency with variable-speed fans, higher-voltage power distribution to reduce losses, LED lighting with motion sensors, and real-time monitoring and analytics.
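The chilled water storage pays off by running the chillers when electricity is cheap and drawing on the stored cooling during peak hours. A simplified illustration (the 10,400 kWh is the shifted load reported above; the per-kWh rates are invented):

```python
def off_peak_savings(shifted_kwh: float, peak_rate: float, off_peak_rate: float) -> float:
    """Cost avoided by shifting chiller energy from peak to off-peak pricing."""
    return shifted_kwh * (peak_rate - off_peak_rate)

# 10,400 kWh shifted, with hypothetical $0.12 peak and $0.06 off-peak rates.
print(f"${off_peak_savings(10_400, peak_rate=0.12, off_peak_rate=0.06):,.0f} saved per cycle")
# -> $624 saved per cycle
```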

During construction the Maiden datacenter used 14% recycled materials, diverted 93% of construction waste from landfills, and sourced 41% of purchased materials within 500 miles of the site. A white roof also keeps the building cooler.

Results: Apple can (for the moment) claim to have the largest end-user-owned solar array and fuel cell systems in the U.S. Although not yet completed, the site already has LEED Platinum certification from the U.S. Green Building Council. To prove how clean it is, Apple will register the renewable energy generated with the North Carolina Renewable Energy Tracking System. Apple has not revealed any PUE rating or comparisons for the facility.
