Powering the Engines of the Internet — Google Datacenters Use Machine Learning & Cost-Effective Cooling to Maximize Energy Efficiency

Written by: Alexandra Anderson


Alexandra is a web marketer, Agile Product Owner, and die-hard wordsmith who's contributed to HostingAdvice, InMotion Hosting, HostGator, and other prominent hosting and technology blogs, as well as Forbes. She has a master's degree in information technology from Virginia Tech and more than 10 years of experience building websites, advising on web and mobile app design, and crafting content that engages and converts. Her primary subject matter expertise spans WordPress, UX design, Agile project management, and, of course, web hosting.


Edited by: Lillian Castro


Lillian brings more than 30 years of editing and journalism experience to our team. She has written and edited for major news organizations, including The Atlanta Journal-Constitution and the New York Times, and she previously served as an adjunct instructor at the University of Florida. Today, she edits HostingAdvice content for clarity, accuracy, and reader engagement.


TL;DR: Answering 1.2 trillion search queries a year and powering billions of Gmail inboxes is expensive. Fueling the web’s #1 search engine is a high-cost, energy-guzzling game with no way around it, right? Google says otherwise. After 10 years of commitment to engineering efficiency into their datacenters, Google now gets 3.5 times the computing power from the same amount of energy used five years ago. They’ve pledged to reach 100% landfill diversion for all waste produced by their facilities, and six of their 14 locations have already hit the mark. By zeroing in on how their datacenters operate, how they cool their servers, and how they can repurpose hardware, Google is saving millions of dollars and preserving our environment, helping put an end to the notion that it costs too much green to go green.

At any given moment, we can look through our online search history and find snapshots of our lives recorded in our browsers. With recent queries ranging wildly from “Grey’s Anatomy season opener” to “buy more MacBook Air storage” to “Gmail login,” our online searches say a lot about us. For many, they prompt the understated realization that we use Google for everything.

Indisputably the web’s most popular search engine, Google commands a 64% share in what some have coined the Search Engine Market Share War, three times that of its closest competitor. Google has such a presence in our everyday lives that we’ve turned its name into a verb, turning to the amorphous influencer for direction on a daily basis.

The reality is that Google is more than an inexplicable, all-knowing source of information. The engine handling 1.2 trillion requests each year is backed by a massive server network that carries roughly $11 billion in annual operational costs. Yikes.

As overwhelming as those numbers are, Google takes the costs and demand in stride, challenging themselves to keep scaling up, keep latency down, and minimize their effect on the environment. The upshot: these efforts save the company millions of dollars in material costs.

10 Years of Development Yields Datacenters That Embody Efficiency

While Generation Z may not know of a time without Google, the company actually came to be in the late ’90s, and they’ve been iterating on the search engine ever since. Over the last 10 years, special attention has been paid to engineering environmentally friendly solutions — infusing efficiency into their datacenters.

Machine Learning Ekes Out More Power, Requiring 40% Less Energy to Cool

A major challenge for any datacenter owner is keeping servers cool enough to run. Typically, this is done with pumps, chillers, and cooling towers, a system that is tremendously tedious to optimize for energy efficiency. The hardware, its physical environment, and the humans who operate it have a complex relationship, and the system can’t adapt to changes easily. Furthermore, each individual datacenter is unique, with its own set of complications for staying green.

Image: a collage of Google datacenters and the employees who work on them.

Google and DeepMind engineers teamed up to develop an adaptive framework for optimizing datacenter efficiency.

Needless to say, the sophistication of Google’s datacenters makes advances in energy efficiency an expensive and slow-moving ordeal. That’s part of what makes one of the company’s recent announcements on this matter so noteworthy.

In July 2016, Google revealed that after two years of applying machine learning to their datacenter operations, they were able to reduce the energy required to cool their datacenters by 40%.

In recent months, Google has called upon DeepMind researchers for assistance in building a “more efficient and adaptive framework to understand datacenter dynamics and optimize efficiency,” according to the post by DeepMind Research Engineer Rich Evans and Google Data Center Engineer Jim Gao.

The collaborating teams compiled historical data (temperature, power, pump speed, etc.) from thousands of sensors scattered throughout the datacenters and used it to “train an ensemble of deep neural networks.” Those networks were trained to predict the average future power usage effectiveness (PUE), or the ratio between the total energy used in the facility and the energy used for computing. Two additional groups of networks were then trained to predict the datacenter’s temperature and pressure over the following hour, which helped ensure operating constraints would not be exceeded.
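To make that pipeline concrete, here is a minimal sketch of the general idea: train an ensemble of small neural networks on historical sensor readings and average their predictions of future PUE. Everything below (the sensor features, the synthetic data, and the model sizes) is an illustrative assumption, not Google’s or DeepMind’s actual code or telemetry.

```python
# Minimal sketch: train an ensemble of small neural networks to predict PUE
# from datacenter sensor readings. Features, data, and model sizes are
# synthetic stand-ins, not Google's or DeepMind's actual telemetry or models.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n_samples = 5000

# Hypothetical sensor features: outside air temp (C), IT load fraction,
# chilled-water pump speed, and cooling-tower fan speed (both 0-1).
X = np.column_stack([
    rng.uniform(-5, 35, n_samples),
    rng.uniform(0.2, 1.0, n_samples),
    rng.uniform(0.3, 1.0, n_samples),
    rng.uniform(0.3, 1.0, n_samples),
])

# Synthetic "ground truth" PUE: overhead grows with heat load and with
# cooling settings that are mismatched to the load.
pue = (1.05
       + 0.004 * X[:, 0] * X[:, 1]
       + 0.05 * np.abs(X[:, 2] - X[:, 1])
       + rng.normal(0, 0.01, n_samples))

X_train, X_test, y_train, y_test = train_test_split(X, pue, random_state=0)

# Train several networks with different seeds and average their predictions.
ensemble = [
    MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=seed)
    .fit(X_train, y_train)
    for seed in range(5)
]
predicted_pue = np.mean([m.predict(X_test) for m in ensemble], axis=0)

print("Mean absolute error on held-out data:", np.abs(predicted_pue - y_test).mean())
```

In a real deployment, predictions like these would feed recommendations back into the cooling plant’s setpoints; the sketch only shows the shape of the pipeline: sensor readings in, a predicted PUE out.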

Their model was deployed live in Google’s datacenter, producing a “15% reduction in overall PUE overhead after accounting for electrical losses and other non-cooling inefficiencies.” That’s the lowest PUE recorded onsite, and Jim and Rich stated that they plan to “roll out this system more broadly” and share the specifics of their methodology so that other operators (and the environment) can benefit from this advance.

In-House-Built Servers Ensure Power Is Directed Only to Computing Components

To have an energy-efficient datacenter, you can’t ignore the servers themselves. Google designs theirs to use as little energy as possible, particularly when awaiting their next task. They have to be highly performant, and they have to be running constantly, so special attention is paid to minimizing power loss as well.

Google says traditional servers are often built with minimal startup costs and lower efficiency standards in mind, which often leads to higher energy costs in the long run. “A typical server wastes up to a third of the energy it uses before any of that energy reaches the parts that do the actual computing,” according to the company’s website.

Instead, Google uses efficient voltage regulator modules to keep power flowing only to the components that actually do the computing. By putting backup batteries directly on their racks, they’ve eliminated two power supply stages in which AC voltages are converted to DC (a common culprit for efficiency loss). As for the hardware itself, they’ve removed any parts deemed “unnecessary,” like peripheral connectors and video cards, and ensured their cooling fans spin “just enough to keep (their) machines cool enough to run.”
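To see why eliminating conversion stages matters, here is a rough back-of-the-envelope sketch. The per-stage efficiency figures are made-up assumptions chosen only to mirror the “wastes up to a third” figure above, not measurements from Google’s designs.

```python
# Why eliminating conversion stages matters: losses multiply across the chain.
# All efficiency figures below are illustrative assumptions, not Google data.

def delivered_fraction(stage_efficiencies):
    """Fraction of input power that survives a chain of conversion stages."""
    fraction = 1.0
    for efficiency in stage_efficiencies:
        fraction *= efficiency
    return fraction

# Hypothetical traditional path: UPS double conversion (AC->DC->AC),
# power distribution, the server's power supply, then voltage regulators.
traditional = [0.90, 0.94, 0.85, 0.90]

# Hypothetical rack-battery design that skips two AC/DC conversion stages.
rack_battery = [0.98, 0.94, 0.92]

for name, stages in [("traditional", traditional), ("rack battery", rack_battery)]:
    percent = delivered_fraction(stages) * 100
    print(f"{name}: {percent:.0f}% of input power reaches the computing components")
```

The point is simply that losses multiply: every conversion stage you can remove raises the fraction of power that actually reaches the chips.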

Various Forms of Recycled Water are Used for More Efficient Cooling

As counterintuitive as it may seem, water plays a powerful role in helping datacenter-scale computers run. The billions of kilowatt-hours of electricity pumped through datacenters to power the servers produce a tremendous amount of heat, and water can serve as an energy-efficient way to carry that heat away.

In Google’s datacenters, fans pull hot air from behind the servers and push it through water-cooled coils into Google’s Hot Huts, temporary homes where the hot air can chill out (pardon the pun). Once cooled, the air is circulated back into the main area of the datacenter facility. The water that absorbed the heat, meanwhile, is pumped to Google’s cooling towers, where fans speed up evaporation; the evaporation carries heat away, and the cooled water is cycled back into the facility to be used again.
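For a feel of the physics involved, the heat carried away by that airflow follows the standard sensible-heat relation Q = ṁ · c_p · ΔT. The sketch below plugs in purely illustrative airflow and temperature numbers; they are assumptions for the sake of the example, not figures from Google’s facilities.

```python
# Rough estimate of heat removed as hot server exhaust passes over
# water-cooled coils, using the sensible-heat relation Q = m_dot * c_p * dT.
# All numbers are illustrative assumptions, not figures from Google.

AIR_DENSITY = 1.2          # kg per cubic meter near room temperature
AIR_SPECIFIC_HEAT = 1005   # joules per kilogram per kelvin

def heat_removed_kw(airflow_m3_per_s, inlet_temp_c, outlet_temp_c):
    """Heat carried away when the airstream is cooled from inlet to outlet temperature."""
    mass_flow = airflow_m3_per_s * AIR_DENSITY             # kg/s
    delta_t = inlet_temp_c - outlet_temp_c                 # kelvin
    return mass_flow * AIR_SPECIFIC_HEAT * delta_t / 1000  # kW

# Example: 50 cubic meters per second of exhaust air cooled from 37 C to 24 C.
print(f"{heat_removed_kw(50, 37, 24):.0f} kW of heat removed")
```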

Google uses sea water, rainwater, and recycled wastewater to cool down their 14 datacenter facilities.

In the interest of sustainability, it’s important to note that water is not an infinite resource. Google proposed this simple solution: “Instead of using potable (or drinking) water for cooling, we use non-drinkable sources of water and clean it just enough so we can use it for cooling.”

One of their datacenters is cooled with rainwater (seriously cool, in the non-temperature sense), and others are cooled with recycled water from various sources, including treated city wastewater and water from an industrial canal.

Google’s Goal: Zero Waste to Landfill

Nearly half of Google’s datacenters have achieved “zero waste to landfill,” and the company has pledged to hit that goal at all 14 datacenters. Six down, eight to go! So far, 86% of waste across all of Google’s datacenter operations is diverted away from landfills.

How They Measure Efficiency

PUE is the industry-wide metric for measuring datacenter efficiency. Google calculates performance across all of their datacenter facilities and throughout the entire year (rather than only tracking efficiency in their newest facilities during cooler fall or winter months).

Many non-Google facilities use as much non-computing energy as the electricity that powers their servers. Google has reduced that overhead to 12%, so most of their energy consumption goes directly toward delivering answers to Google searches and powering Google products.

For highly accurate PUE calculations, Google considers only servers, storage, and networking equipment to be IT power.
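In formula terms, PUE is simply total facility energy divided by IT equipment energy (servers, storage, and networking). The quick sketch below uses made-up meter readings to show how a 12% overhead corresponds to a PUE of 1.12.

```python
# PUE = total facility energy / IT equipment energy.
# The meter readings below are made-up numbers for illustration only.

def pue(total_facility_kwh, it_equipment_kwh):
    """Power usage effectiveness: 1.0 would mean zero non-computing overhead."""
    return total_facility_kwh / it_equipment_kwh

it_energy = 1_000_000  # kWh used by servers, storage, and networking gear
overhead = 120_000     # kWh used for cooling, power conversion, lighting, etc.

value = pue(it_energy + overhead, it_energy)
print(f"PUE = {value:.2f} -> {(value - 1) * 100:.0f}% overhead")
```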

A full discussion of how Google measures efficiency in their datacenters can be found on the company’s website.

How They’re Doing It

Google’s major strides in engineering energy efficiency come down to many factors: recycled hardware, efficient cooling methods, and dedicated research, to name a few. While we’ve covered the cooling research above, I think the hardware strategy itself is particularly noteworthy.

These machines power my six Gmail accounts and billions of others. They stream countless hours of YouTube videos and help answer trillions of life’s inexplicable questions. This requires constant machine upgrades and management.

“From the moment we decide to purchase a piece of equipment to the moment we retire it, we reduce, reuse, and recycle as much as we can,” according to Google. Even easily forgotten details, like shipping the inventory to the datacenter, are considered. “Whenever possible, we use local vendors for heavier components like our server racks,” the team stated. “By limiting the shipping distance, we reduce the environmental impact of transportation.”

Over half of the parts used for machine upgrades were refurbished inventory in 2015, and the small percentage of the hardware that can’t be reused or resold, for whatever reason, gets recycled instead. Since 2007, Google has avoided purchasing more than 300,000 replacement servers by remanufacturing and repurposing old ones.

The Results So Far Are Incredible

Google’s Mayes County, Oklahoma, datacenter was the first of six to achieve zero-waste-to-landfill status. “Compared to five years ago, we now get around 3.5 times the computing power out of the same amount of energy,” said Amy Atlas, Global Communications and Sustainability Rep for Google. We sum up some of the most notable victories below:

- 3.5 times the computing power from the same amount of energy used five years ago
- 40% less energy needed to cool the datacenters, thanks to machine learning
- A fleet-wide energy overhead of just 12%, versus the roughly 100% overhead common at other facilities
- Six of 14 datacenters at zero waste to landfill, with 86% of waste diverted overall
- More than 300,000 replacement server purchases avoided since 2007 through remanufacturing and reuse

These points merely illustrate a slice of the environmental impact being made by Google’s datacenter efficiency measures. Expect the innovating to only continue from here.

Keeping Up With Google — Driving the Industry Forward

When I think about all that I use Google for (my email, my document sharing, my research, and my entertainment), I’m overwhelmed by the sheer power required to run the machines behind so many areas of my life. Google is no doubt a world leader in search and numerous other technological fields, but Google is also leading us toward a better, cleaner world.

“Although the last 10 to 20 percent of diversion will be the most difficult to solve, it is also where we see the most opportunity to get creative about new community partnerships and designing waste streams out altogether,” said Jim Miller, VP of Global Operations for Google.


