Facebook – Continues Cutting Edge Technology


Introduces Latest Cold Storage Data Facility in Prineville

The quest for lower energy costs in data centers is an ongoing pursuit for large tech companies, and Facebook is leading a revolution in the industry with its cold storage data center in Prineville and its successful Open Compute Project.

The Prineville data center recently opened its doors to journalists for an exclusive tour to unveil its latest money-saving, energy-reducing innovation: cold storage. Although the catchy phrase instantly brings other connotations to mind, it is an old term with a new high-tech definition, used by Facebook to describe its approach to storing seldom-accessed data.

With more than 1.15 billion people now using Facebook and more joining every day, it can be mind-boggling to imagine what technology is used to handle this enormous amount of data, let alone how to store it.

Facebook opened its first custom-designed, energy-efficient data center in 2011, which received LEED Gold Certification from the U.S. Green Building Council. The enormous data center continues to reduce its energy demands, resetting industry standards by using 52 percent less energy than comparable data centers built to code requirements.

Facebook isn’t resting on its laurels, however, having launched its latest effort to make the lowest-energy data centers in the industry even more efficient than before.

The importance of the new cold storage facility is best understood once one appreciates the amount of heat that a data center produces.

Chuck Goolsbee, site director for Facebook’s Prineville data centers, started the tour from the ground up with a visit to the massive 330,000-square-foot data facility built on 120 acres of high desert. The colossal structure is big enough to fit two WalMart supercenters, or, if stood on end, would be 84 stories tall, longer than the USS Abraham Lincoln.

It was important to start the tour at the beginning of the data stream storage challenge, in the building that houses the entire Facebook cyber population. All 1.15 billion people who use Facebook to post a status, tag friends, upload a picture or message one of their 5,000 close personal friends share this gigantic matrix, where data bits stream through miles of networking wires into thousands upon thousands of servers with spinning hard drives. It’s hard not to think of The Matrix at this point, realizing that all the photos, uploads and comments are nothing more than bits of electromagnetic data stored on hard drives in what appears to be a catacomb of hot electrical impulses.

Goolsbee leads us to a data bank and starts by explaining, “The data center is a bits factory and bits create heat… This heat over time wears out electrical components and can affect the operation and efficiency of computer systems. This entire facility is built to efficiently handle the outside cool and hot indoor air temperature so that all the systems can work in optimal fashion with the lowest energy costs.”

Goolsbee removes one computer board from one of thousands of tall racks, each of which holds eight computers with 15 hard drives apiece, controlled by one master server above each rack. He demonstrates how “fluid servers” work by easily removing a module to show that it is designed to run cooler with air flowing constantly over the unit. Goolsbee says, “Facebook’s Open Compute (OCP) program led to many of the cost-saving ideas and innovation in this facility.”

The OCP website reports, “Facebook set out to develop [an] innovative data center and data center server solutions that were both energy- and cost-efficient…Facebook decided to share its designs with the larger community in an effort to promote and encourage power efficiency and future innovation. The Open Compute Project (OCP) was born from this desire and officially launched in April of 2011.”

To put the above description into tech-savvy urban language, it means Facebook invited brilliant computer hackers to help develop new energy-saving technology by participating in the Open Compute Project. This allowed Facebook engineers to invite hardware development using the same model traditionally associated with open source software projects.

Data Center Operations Manager Kenneth Pratchett explains, “The OCP helps to create a demand for the industry which overall drives energy efficiency in the data center space.”

This same attitude carried forward when Facebook came to realize that older data didn’t need to be accessed as often as recent data, especially when it came to photos, of which 350 million are uploaded to Facebook servers every day. Michael Kirkland, a communications manager, reports, “Users have added about 250 billion photos since Facebook started allowing photo uploads.”

With this amount of data flooding into Facebook servers, it wasn’t long before the company built the new 16,000-square-foot cold storage data facility, which uses even less energy per square foot than the large data center.

Goolsbee further explains, “This was accomplished in part with the use of the windmill server concept that allows each disk in the cold storage gear to hold four terabytes of data, and each two unit system contains two levels of hard disks. This configuration allows for four petabytes of cold storage in a rack, each storage head has two petabytes attached, and there are two heads per rack.

“The disks are rarely spinning – perhaps one will at any given time. As a result, these use less power than the open compute racks filled with servers, and there’s no need to put electrical infrastructure on each row. Instead, power can be distributed across multiple racks.”
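Goolsbee’s figures can be checked with a little arithmetic. The short sketch below is not Facebook code, just a back-of-the-envelope verification of the quoted numbers (4 TB per disk, 2 PB per storage head, 2 heads per rack):

```python
# Back-of-the-envelope check of the cold storage rack figures quoted above.
TB = 10**12  # one terabyte in bytes
PB = 10**15  # one petabyte in bytes

disk_capacity = 4 * TB      # each disk holds four terabytes
head_capacity = 2 * PB      # each storage head has two petabytes attached
heads_per_rack = 2          # two heads per rack

disks_per_head = head_capacity // disk_capacity
rack_capacity = heads_per_rack * head_capacity

print(disks_per_head)           # 500 disks behind each storage head
print(rack_capacity // PB)      # 4 petabytes per rack, as quoted
```

The numbers are internally consistent: 500 four-terabyte disks per head, two heads per rack, four petabytes in all.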

This allows fewer computers to process the data, with hard drives spun up only when the information is accessed by a Facebook member, which uses much less energy than the racks of computers in the larger data center. Goolsbee said, “Less than a week into its operation, the cold storage facility has already stored nine petabytes of user data and is adding more daily.”

Does anyone remember when a couple of megabytes was a lot of storage? It all started with a megabyte or two at the beginning of the computer age, so what is a petabyte? The easiest way to describe a petabyte, says Pratchett, is to imagine 4,000 pennies laid flat on the floor, with each one the base of a stack of pennies reaching from the earth to the moon; all those pennies together would equal one petabyte of data.
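Pratchett’s analogy holds up surprisingly well if each penny stands for one byte. The figures below (penny thickness of about 1.52 mm, average earth-moon distance of about 384,400 km) are assumed reference values, not from the article:

```python
# Rough check of the penny analogy: 4,000 earth-to-moon stacks of pennies,
# one penny per byte. Assumed figures, not from the article:
penny_thickness_m = 1.52e-3      # a US penny is about 1.52 mm thick
earth_moon_m = 384_400_000       # average earth-moon distance in meters

pennies_per_stack = earth_moon_m / penny_thickness_m   # ~2.5e11 pennies
total_pennies = 4000 * pennies_per_stack               # ~1.0e15 pennies

print(f"{total_pennies:.2e}")  # roughly 1e15, one penny per byte of a petabyte
```

The total comes out to roughly a quadrillion pennies, which is why the image works: a petabyte is 10^15 bytes.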

The cold storage data center uses the same cool-air handling systems as the other data center buildings, where the heat generated by the massive electrical systems is harnessed and managed with the natural cool air of the high desert climate.

Facebook has designed all of its buildings to draw in cool desert air and achieve precisely controlled, clean, filtered air temperature and humidity levels. The buildings likewise capture the waste heat from the electrical equipment inside and mix it with the outside air to reach the optimum operating temperature at the lowest energy cost and with minimal equipment failure. The data centers capture as much heat as needed, with enough left over to warm the office and administrative areas as well as the massive data halls.

“When it’s full, the 16,000-square-foot cold storage building would be able to hold thousands of petabytes of data,” Goolsbee said. “Getting it to capacity should take several years.

“Facebook has room to expand with two additional wings to the new cold storage building, adding 32,000 square feet for additional servers in future years. Cold storage won’t be installed at every Facebook data center, just at the Prineville site and the one in Forest City, North Carolina, and that should be fine for now. Facebook is projecting that the first cold-storage building will be filled to capacity by 2017.”
