10 Innovations That Have Made Data Centers More Efficient

Data centers can mean huge energy consumption, but several recent innovations are making them more efficient.

In April 2014, Greenpeace passed out energy-efficiency report cards to companies in the data center business. In somewhat of a surprise, two big names, Amazon and Twitter, failed. However, three well-known corporations, Apple, Facebook and Google, topped the honor roll.

Making the honor roll was not by chance. All three companies are invested in finding ways to cut energy consumption. It might be interesting to check out the new technology they’re using. Moreover, there’s a good chance some of the technology might end up reducing personal electricity consumption as well.

Reflective Roofing

It may seem like a no-brainer, but how many white roofs do you see in locales with few signs of winter? Apple decided it was worth it when building its Maiden, North Carolina, data center. A roof color may not seem like much, but when it comes to data center efficiency, keeping things cool is everything. Apple’s white "cool roof" maximizes solar reflectivity, reducing the building’s cooling load and, therefore, its energy use.

Efficient Fuel Cells

Fuel cells are one of the most efficient ways to produce electricity. The only method more efficient is hydroelectric energy. Apple again took the lead when building its North Carolina data center, at first deploying a 4.8-megawatt Bloom Energy fuel cell system that runs on biogas. Not long afterward, Apple doubled the installation’s capacity to 10 megawatts.

And there’s more good news for those following fuel-cell technology; in July 2014, GE announced the development of a fuel cell that is 65 percent efficient. When a waste-heat processor is added, the efficiency jumps to 95 percent. Imagine having a fuel cell powering your house or even your computer!
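Those GE figures can be put into quick back-of-the-envelope form. The sketch below is illustrative only; the function and its framing are mine, with the 65 and 95 percent figures taken from the article:

```python
# Toy arithmetic using the efficiency figures cited above: a fuel cell that
# converts 65% of fuel energy to electricity, plus a waste-heat recovery
# stage that brings total useful-energy capture to 95%.
def useful_energy(fuel_kwh, electrical_eff=0.65, total_eff=0.95):
    electricity = fuel_kwh * electrical_eff
    recovered_heat = fuel_kwh * (total_eff - electrical_eff)
    return electricity, recovered_heat

# For every 100 kWh of fuel energy in:
electricity, heat = useful_energy(100.0)
print(electricity, heat)  # 65 kWh of electricity plus ~30 kWh of usable heat
```

In other words, only 5 of every 100 units of fuel energy would be truly wasted, which is what makes the combined-cycle number so striking.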

Massive Solar-Panel Arrays

Creating electricity using solar panels is not new. But according to Lisa Jackson, the vice president of environmental initiatives at Apple, "On any given day, 100 percent of the data center’s needs are being generated by solar power and the fuel cells" at Apple’s North Carolina data center.


In fact, simply coordinating the various power grids at the facility is quite a feat in and of itself.

Immersion Cooling

Immersion cooling has been around since the 1980s, when Cray Research began using the technology for its supercomputers. However, it wasn’t ready for data centers at that time. That is no longer the case. Allied Control builds and sells systems that immerse rack-style computer equipment in 3M’s Novec Engineered Fluid. One of Allied Control’s systems is helping a Hong Kong data center maintain a Power Usage Effectiveness (PUE) of 1.02. (In layman’s terms, it’s one of the most efficient data centers in the world.)
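PUE, the metric cited above, is simply total facility energy divided by the energy that actually reaches the IT equipment, so 1.0 is the theoretical ideal. A minimal sketch:

```python
# Power Usage Effectiveness (PUE): total facility energy divided by the
# energy delivered to IT equipment. A PUE of 1.0 would mean every watt the
# facility draws goes to computing, with nothing spent on cooling or power
# distribution overhead.
def pue(total_facility_kwh, it_equipment_kwh):
    return total_facility_kwh / it_equipment_kwh

# A facility drawing 1.02 kWh overall for every 1 kWh of IT load:
print(pue(1.02, 1.00))  # 1.02
```

At a PUE of 1.02, only 2 percent of the facility’s energy goes to anything other than the computers themselves; conventional data centers have historically run well above that.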

Higher Equipment Operating Temperatures

This may sound counterproductive, but it works because Google designed servers that tolerate higher operating temperatures. Thermostats in Google data centers are set at 80°F, and workers are encouraged to wear shorts. In a company video, Google mentions that it has saved more than $1 billion on cooling costs from this change alone.

Dell jumped into the fray as well. Graphs in a report from Dell and APC show 80°F to be roughly the point at which the energy saved by raising the room temperature still exceeds the extra energy consumed by server fans having to run faster.
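The trade-off behind that graph can be sketched with a toy model. Every coefficient below is an illustrative assumption, not data from the Dell/APC report; the one physical anchor is the fan affinity laws, under which fan power rises roughly with the cube of fan speed:

```python
# Toy model of the setpoint trade-off (all coefficients are made-up
# illustrations, not vendor data): raising the room setpoint cuts chiller
# energy roughly linearly, while server fan power grows steeply with speed
# (fan affinity laws: power ~ speed**3).
def net_saving(setpoint_f, baseline_f=68.0,
               chiller_kw_per_degree=2.0,    # assumed linear chiller saving
               base_fan_kw=5.0,
               fan_speedup_per_degree=0.05): # assumed fan speed increase
    degrees_raised = setpoint_f - baseline_f
    chiller_saving = chiller_kw_per_degree * degrees_raised
    fan_speed = 1.0 + fan_speedup_per_degree * degrees_raised
    fan_penalty = base_fan_kw * (fan_speed ** 3 - 1.0)
    return chiller_saving - fan_penalty

for temp in (72, 80, 95):
    print(temp, round(net_saving(temp), 2))
```

With these assumed coefficients the net saving peaks around 80°F and turns negative well above it, because the cubic fan term eventually overwhelms the linear chiller saving; that crossover is the general shape the real report’s graphs capture.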

The Use of Containerized Units

Data-center modules are becoming popular as a simple and quick way to expand data center capacity. Pre-built production modules allow manufacturers to test and optimize cooling and electrical needs, thus saving energy.

The Elimination of Power-Conversion Loss

At Facebook’s Prineville, Oregon, data center, the 480V-to-208V conversion step required by most servers was eliminated. That innovation cut the data center’s electricity bill by close to 15 percent. (More on this later.)
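Here is a rough sketch of why dropping a conversion stage matters. The stage efficiencies below are assumed for illustration, not Facebook’s published figures; each transformation step loses a few percent, and the losses compound:

```python
# Illustrative power-chain model (efficiency figures are assumptions):
# power passes through a chain of conversion stages, each keeping only a
# fraction of its input.
def delivered_power(input_kw, stage_efficiencies):
    for eff in stage_efficiencies:
        input_kw *= eff
    return input_kw

# Conventional chain: 480V -> 208V transformer, then the server PSU.
conventional = delivered_power(100.0, [0.96, 0.97])
# Facebook-style chain: 480V fed straight to a PSU that accepts it.
direct = delivered_power(100.0, [0.97])

print(round(direct - conventional, 2))  # kW recovered per 100 kW of input
```

Multiply a few recovered percent by a data center’s megawatt-scale draw, around the clock, and the savings add up quickly.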

Redesigned Servers and Racks

Both Google and Facebook redesigned their servers and racks. Google is quiet about what it did, while Facebook has published details for both its servers and its racks. One of the benefits for Facebook, as mentioned earlier, was the ability to run 480V right to the rack. Both companies have optimized every aspect of their servers to run as efficiently as possible and save energy.

Optimized Equipment Loading

Data center managers provision their facilities for the worst-case scenario to meet guaranteed response times. This means that much of the time servers are underloaded and wasting electricity. Much effort is now being directed at reducing this "underloading" by activating only the servers that are needed and placing the rest in standby.
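One generic way to reduce underloading is to consolidate workloads onto as few servers as possible so the remainder can be placed in standby. The sketch below uses first-fit-decreasing packing, a textbook heuristic rather than any particular operator’s algorithm:

```python
# Hedged sketch of load consolidation via first-fit decreasing: place each
# workload (largest first) on the first server with room for it, opening a
# new server only when nothing fits. Servers not in the result can sleep.
def consolidate(loads, capacity=100):
    servers = []  # each entry is one server's current utilization
    for load in sorted(loads, reverse=True):
        for i, used in enumerate(servers):
            if used + load <= capacity:
                servers[i] = used + load
                break
        else:
            servers.append(load)  # no existing server had room
    return servers

# Six workloads (utilization as a percentage of one server's capacity)
# packed onto just two active servers:
active = consolidate([20, 50, 40, 10, 30, 30])
print(len(active), active)  # 2 [100, 80]
```

Real schedulers must also weigh wake-up latency and migration cost, which is why the response-time guarantees mentioned above make this harder than simple bin packing.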

Open-Source Innovations

Although not an innovation in the strict sense, both Google and Facebook are sharing what each has found to work when it comes to increasing energy efficiency. In particular, Facebook started the Open Compute Project, which collects the details of what Facebook and other members have found to cut energy losses. From the group’s mission statement:

    We believe that openly sharing ideas, specifications and other intellectual property is the key to maximizing innovation and reducing operational complexity in the scalable computing space. The Open Compute Project Foundation provides a structure in which individuals and organizations can share their intellectual property with Open Compute Projects.

It would be impossible to quantify the energy savings that have occurred because of the Open Compute Project, but any time useful information is shared, everyone benefits.


Michael Kassner

Michael Kassner is a veteran technology reporter, wordsmith, and founder of MKassner Net (LLC).