Containers Expo Blog: Blog Feed Post

How Green is Your Data Center?

Give us your opinions and experiences designing and implementing the green data center

Data Center “X” just announced a 2 megawatt expansion to its facility in Northern California. That is a major increase in data center capacity, and a source of great joy for the company. It is also the source of potentially 714 additional tons of carbon introduced into the environment each month.

Many groups and organizations are gathering to address the need to bring our data centers under control. Some are focused on providing marketing value for their members, but most appear genuinely concerned with the amount of power consumed within data centers, the amount of carbon they produce, and the potential for using alternative or clean energy initiatives within them. Some reports claim the data center industry is using up to 5% of the power consumed within the United States, which, if true, makes this a really important discussion.

If you do a “Bing” search on the topic of “green data center,” you will find around 144 million results, three times as many as a “paris hilton” search. That makes it a fairly saturated topic, indicating a heck of a lot of interest. The first page of the Bing search gives you a mixture of commercial companies, blogs, and “ezines” covering the topic, as well as an organization or two.

With this level of interest you might expect just about everybody in the data center industry to be aggressively implementing “green data center best practices.” Well, not really. In the past month the author (me!) toured no fewer than six commercial data centers. In every one I saw major best-practice violations, including:

  • Large gaps within cabinets forcing hot air recirculation (missing blanking panels, plus loose PCs and tower servers placed ad hoc on cabinet shelves)
  • Failure to use hot/cold aisle separation
  • High-density cabinets mounted in open 4-post racks
  • Gaps between cabinets in high-density server areas
  • Failure to use any level of hot or cold air containment in high-density spaces, including those with raised floors and drop ceilings that would support hot-air plenums

And other more complicated issues, such as failing to integrate electrical and environmental data into a building management system.

The Result of Poor Data Center Management
The Green Grid developed a metric called Power Usage Effectiveness (PUE) to measure how effectively power is used within a data center. The equation is very simple: PUE is the total facility power consumption divided by the amount of power actually consumed by internal IT equipment (or, in the case of a public data center, by customer-facing or revenue-producing equipment). A factor of 2.0 indicates that for every watt consumed by IT equipment, another watt is required by support equipment such as air conditioning and lighting.

Most data centers today consider a target value of 1.5 good, while some companies such as Google are trying to drive their PUE below 1.2, an industry benchmark.
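The PUE arithmetic above is simple enough to sketch in a few lines of Python. The wattage figures below are illustrative assumptions, not measurements from a real facility:

```python
# Minimal sketch of the PUE calculation described above.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power."""
    return total_facility_kw / it_equipment_kw

# A facility drawing 3,000 kW overall, with 1,500 kW reaching IT equipment:
print(pue(3000, 1500))  # -> 2.0: one watt of overhead per watt of IT load
```

The closer the result gets to 1.0, the less power is being spent on cooling, lighting, and electrical losses rather than on revenue-producing equipment.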

Other data centers are not even at the point where they can collect meaningful PUE data. The previous Google link has an extended description of data collection methodology, which is a great introduction to the concept. The Uptime Institute of course has a large amount of support materials. And a handy Bing search reveals another 995,000 results on the topic. No reason why any data center operator should be in the dark or uninformed on the topic.

So let’s use a simple PUE example and carbon calculation to determine the effect of a poor PUE:

Let’s start with a 4 MW data center. The data center currently has a PUE of 4.0, meaning that of the 4 MW of power consumed within the data center, 3 MW are consumed by support infrastructure and 1 MW by actual IT equipment. In California, using the carbon calculator, this works out to 357 tons of carbon produced each month by the IT equipment and 1,071 tons produced by support equipment such as air conditioning, lighting, and poorly maintained electrical equipment.

1,071 tons of carbon each month, much of it generated by waste that could be controlled through better design, management, and operations in our data centers. Most commercial data centers are in the 4~10 MW range. Scary.
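The worked example above can be reproduced as a quick back-of-the-envelope script. The factor of 357 tons of carbon per MW per month is the California figure used in this article; real grid emission factors vary by region and year:

```python
# Back-of-the-envelope version of the carbon example above.

TONS_CO2_PER_MW_MONTH = 357  # California figure assumed in the article

def monthly_carbon(total_mw: float, pue: float) -> tuple[float, float]:
    """Split a facility's monthly carbon between IT load and overhead."""
    it_mw = total_mw / pue            # power reaching IT equipment
    overhead_mw = total_mw - it_mw    # power lost to cooling, lights, etc.
    return it_mw * TONS_CO2_PER_MW_MONTH, overhead_mw * TONS_CO2_PER_MW_MONTH

it_tons, overhead_tons = monthly_carbon(total_mw=4.0, pue=4.0)
print(it_tons, overhead_tons)  # -> 357.0 1071.0
```

Rerunning the same facility at a PUE of 1.5 shows how much of that 1,071 tons of overhead carbon is avoidable.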

The US Department of Energy recently did an audit entitled “Department of Energy Efforts to Manage Information Technology in an Energy-Efficient and Environmentally Responsible Manner,” which highlights the fact that even tightly regulated agencies within the US Government have ample room for improvement.

“We concluded that Headquarters programs offices (which are part of the Department of Energy’s Common Operating Environment) as well as field sites had not developed and/or implemented policies and procedures necessary to ensure that information technology equipment and supporting infrastructure was operated in an energy-efficient manner and in a way that minimized impact on the environment.” (OAS-RA-09-03)

What Can We Do?
The easiest thing to do is quickly replace all traditional lighting with low-draw LED lamps, and use the lamps only when human beings are actually working within the data center space. Lights generate a tremendous amount of heat and consume a tremendous amount of electricity, and every watt of heat becomes additional air-conditioning load. That is completely wasted power and completely unnecessary production of carbon. If you are in a 10,000 sq ft data center, you may have 100 lighting fixtures in the room. Turn them off.
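To get a feel for the scale of that waste, here is a rough sizing of the lighting example above. The per-fixture wattage and occupancy hours are assumptions for illustration only, and the figure excludes the additional air-conditioning load the lights create:

```python
# Rough monthly savings from occupancy-based lighting, under assumed figures.

FIXTURES = 100              # the article's 10,000 sq ft example
WATTS_PER_FIXTURE = 200     # assumed draw for a conventional fixture
HOURS_PER_MONTH = 24 * 30   # lights on around the clock
OCCUPIED_HOURS = 4 * 30     # assume staff present about 4 hours per day

always_on_kwh = FIXTURES * WATTS_PER_FIXTURE * HOURS_PER_MONTH / 1000
occupancy_kwh = FIXTURES * WATTS_PER_FIXTURE * OCCUPIED_HOURS / 1000
print(always_on_kwh - occupancy_kwh)  # -> 12000.0 kWh saved per month
```

Swapping the assumed fixtures for lower-wattage LEDs shrinks even the occupied-hours figure further.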

If your data center requires security cameras 24×7, consider using dual-mode cameras with low-light vision capability.

Place blanking panels in all cabinets. Consider removing all open racks from your data center unless you are using them for passive cabling, cross-connects, or very low power equipment. Consider using hot or cold aisle containment models for your cabinet lineups. There is plenty of debate on the merits of hot-aisle vs. cold-aisle containment, but the bottom line is that cool air entering a server makes the server run better, reduces the electrical draw of its fans, and increases the value of every watt applied to your data center.

Consider this: if you have 10 servers sharing 1,920 watts (a 120 V circuit on a 20 amp breaker, loaded to 16 amps), you have the potential to run those 10 servers at their full specification draw of 192 watts each. That includes the internal fans, which spin up as needed to keep internal components within their operating thresholds. If a server is running hot, it is pulling its full 192 watts. If it is running with cool air on the intake side and no hot-air recirculation heating the circuit boards, you can reasonably expect its electrical draw to drop.

If you are able to reduce each server’s actual draw by 30~40% by removing hot-air recirculation and keeping the supply side cool, you may be able to add servers to the cabinet and increase your potential processing capacity for each breaker and cabinet by another 30~40%. This will definitely increase your efficiency, cost you less in electricity, and give you additional processing potential.
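The breaker arithmetic above can be sketched as follows. The 80% continuous-load derating (20 A breaker loaded to 16 A) follows the example in the text, and the 35% reduction is an assumed midpoint of the article's 30~40% range:

```python
# Sketch of the circuit-capacity math: how many extra servers fit on one
# breaker if cool intake air cuts each server's draw by an assumed 35%.

VOLTS, BREAKER_AMPS, DERATE = 120, 20, 0.8
usable_watts = VOLTS * BREAKER_AMPS * DERATE      # 1920 W per circuit

servers = 10
watts_per_server_hot = usable_watts / servers     # 192 W at full draw
watts_per_server_cool = watts_per_server_hot * (1 - 0.35)  # ~124.8 W

extra_servers = int(usable_watts // watts_per_server_cool) - servers
print(extra_servers)  # -> 5 additional servers on the same breaker
```

Five more servers on the same 20 A circuit is a 50% capacity gain, which is how a cooling fix turns into a provisioning win.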

Sources of Information
Quite a few sources of information beyond the Bing search are available to help IT managers and data center managers. APC probably has the most comprehensive library of white papers supporting the data center discussion (although, as with all commercial vendors, you will see a few references to their own hardware and solutions). HP also has several great, easy-to-understand white papers, including one of the best reviewed, “Optimizing facility operation in high density data center environments,” a step-by-step guide to deploying an efficient data center.

The Bing search will give you more data than you will ever be able to absorb; the good news is that it is a great way to read through individual experiences, both success stories and horror stories. Learn from others’ experiences, and start down the road to reducing your carbon footprint and getting the most out of your data center or data center installation.

Give us your opinions and experiences designing and implementing the green data center – leave a comment and let others learn from you too!

John Savageau, Long Beach


More Stories By John Savageau

John Savageau is a lifelong telecom and Internet geek, with a deep interest in the environment and all things green. Whether drilling into the technology of human communications, cloud computing, or describing a blue whale off Catalina Island, Savageau will try to present complex ideas in terms that are easily appreciated and understood.

Savageau is currently focusing efforts on data center consolidation strategies, enterprise architectures, and cloud computing migration planning in developing countries, including Azerbaijan, The Philippines, Palestine, Indonesia, Moldova, Egypt, and Vietnam.

John Savageau is President of Pacific-Tier Communications dividing time between Honolulu and Burbank, California.

A former career US Air Force officer, Savageau graduated with a Master of Science degree in Operations Management from the University of Arkansas and also received Bachelor of Arts degrees in Asian Studies and Information Systems Management from the University of Maryland.


