Friday, December 9, 2011

Facebook Expands Green Open Hardware Push

The Open Compute Project is looking deeper into green data centres, from storage to systems management

If Facebook officials have their way, their Open Compute Project will go beyond servers and power supplies, touching on every aspect of a data centre’s infrastructure.

The initiative kicked off in April when Facebook open-sourced the server and data centre specifications the social networking giant employed in building its data centre in Prineville, Oregon. The project has since enrolled an impressive array of members, from Intel, Asus and Rackspace to Mellanox, Huawei and Red Hat, not to mention a few research and education institutions.

Spreading the initiative

The expansion is an indication of the various directions in which the project is rapidly moving, Amir Michael, hardware design manager at Facebook, said in an interview with eWEEK during the recently concluded SC11 supercomputing show in Seattle. Facebook is already moving forward with the next generation of the custom servers it has designed, Michael said.

At the same time, project members also are looking to tackle other aspects of the data centre, including systems management, storage and I/O. The push in these directions should help solve the key issue Facebook officials saw when surveying data centre technology: proprietary products from large and small vendors alike can address the mainstream needs present in most enterprises, but often do not meet the unique demands of a particular business.

Growing green roots

About two years ago, Facebook engineers set out to design their own servers using standard off-the-shelf technologies. Up to that point, the company had been using systems from traditional OEMs. Facebook worked with chip makers Intel and Advanced Micro Devices, as well as systems makers Hewlett-Packard and Dell, to create the custom servers.

The aim was to build systems that offer the performance needed to run a fast-growing social network with 800 million-plus members while keeping down capital, power and cooling costs in densely populated data centres. The Facebook-developed systems are 1.5U (2.63 inches) tall – rather than the more traditional 1U (1.75 inches) – which, among other benefits, makes for better air flow and lower cooling costs, Michael said.
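
For reference, the rack-unit arithmetic behind those figures is simple: one "U" is a standardised 1.75 inches of vertical rack space (per the EIA-310 rack standard). A minimal sketch in Python:

```python
RACK_UNIT_INCHES = 1.75  # EIA-310: height of one rack unit

def u_to_inches(units: float) -> float:
    """Convert a rack-unit height to inches."""
    return units * RACK_UNIT_INCHES

print(u_to_inches(1.0))  # 1.75  -- a traditional 1U server
print(u_to_inches(1.5))  # 2.625 -- the taller Open Compute chassis
```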

The systems carry none of the paint or logos found on servers from OEMs, which not only reduces capital costs but also makes them lighter. They also use a more energy-efficient power supply, and they are easier to service, with tool-less components from fans to power supplies.
The Oregon facility also uses outside air to keep the systems cool, rather than running expensive chiller units, Michael said.

Energy efficiency benefits

The result of the work was a 38 percent increase in energy efficiency at the Oregon facility, at a 24 percent lower cost, compared with Facebook’s other data centres, he said. The data centre also has a power usage effectiveness (PUE) ratio of 1.07. PUE measures how efficiently a facility uses its energy – the ratio of total facility power to the power delivered to the IT equipment – so the closer to 1.0, the better. The Environmental Protection Agency’s standard is a PUE of 1.5.
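
To put those ratios in concrete terms, here is a minimal sketch of the PUE calculation in Python. The wattage figures are invented for illustration; the article reports only the ratios.

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power / IT equipment power.

    1.0 is the theoretical ideal, where every watt goes to IT gear
    and none to cooling, power conversion or other overhead.
    """
    return total_facility_kw / it_equipment_kw

# A facility drawing 1,070 kW overall to feed 1,000 kW of IT load:
print(pue(1070, 1000))  # 1.07 -- Prineville's reported ratio
print(pue(1500, 1000))  # 1.5  -- the EPA benchmark figure
```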

Facebook expects to get similar results as it builds new data centres, Michael said. Last month, company executives said they plan to build their next data centre in Luleå, Sweden, just on the edge of the Arctic Circle, to serve users in Europe and other regions. The site was chosen for its cold air and access to hydroelectric power.

The company also is working on its next generation of servers, which will add such technologies as the Intelligent Platform Management Interface (IPMI) and the ability to reboot machines over the LAN. The servers will continue to be powered by Intel and AMD chips, though Michael said Facebook also is keeping an eye on other processors, including those from ARM Holdings. ARM-designed chips from the likes of Nvidia, Qualcomm and Samsung are found in most smartphones, tablets and other mobile devices, but ARM also is looking to move up the ladder and into low-power servers.
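
IPMI gives administrators out-of-band access to a server’s baseboard management controller, so a machine can be queried or power-cycled over the network even when its operating system is down. Below is a minimal sketch of that kind of remote power control, using Python to drive the standard ipmitool utility; the host and credentials are placeholders, and this is not Facebook’s actual management tooling.

```python
import subprocess

def ipmi_power(host: str, user: str, password: str,
               action: str = "status") -> str:
    """Send an IPMI chassis power command to a remote BMC over the LAN.

    action: one of "status", "on", "off", "cycle", "reset".
    """
    result = subprocess.run(
        ["ipmitool", "-I", "lanplus",  # IPMI-over-LAN (RMCP+) interface
         "-H", host, "-U", user, "-P", password,
         "chassis", "power", action],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()

# Hypothetical usage: reboot a node remotely, no OS access required.
# print(ipmi_power("10.0.0.42", "admin", "secret", "cycle"))
```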

Adapted from http://www.eweekeurope.co.uk


