Google, Meta and Microsoft are using AI to green their data centers

Artificial intelligence is being used to optimize servers' energy and water consumption. It also makes it possible to design catalysts for low-carbon hydrogen fuel to power emergency generators.

Data centers absorbed 1% of the energy produced worldwide in 2020. With the exponential growth in stored data, that share could reach a fifth of global consumption by 2025, according to the International Energy Agency. And according to a French Senate report published in June 2020, data centers account for 14% of the digital sector's carbon footprint in France. The energy these sites draw mainly powers the servers, whose heat is then largely rejected into the environment, whatever the cooling system: ambient-air conditioning, water cooling, etc. At a time when climate change is no longer up for debate, the major players in this market are turning massively to green energy. At the same time, they are relying on AI to optimize their spending on energy and coolant.

One of the first to take this path is none other than Google. Since 2016, the Mountain View group has been developing a recommendation engine to optimize the power usage effectiveness (PUE) of its data centers. Built on two sets of neural networks, it ingests historical measurements gleaned from the thousands of sensors installed in each data center: temperature, pressure, power, pump flow, etc. Weather data is added on top. Once trained, the models anticipate, a few hours ahead, the atmospheric conditions the servers will face, then recommend actions to improve the PUE. These recommendations are derived from decisions made in the past, then refined through simulation. The results of an initial deployment having proved convincing, Google has since rolled the system out across all of its data centers.
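For readers unfamiliar with the metric, PUE is simply the ratio of a facility's total power draw to the power consumed by the IT equipment itself, with 1.0 as the theoretical ideal. The sketch below is a deliberately simplified illustration rather than anything Google has published: it shows the metric and a naive "pick the lowest predicted PUE" recommendation; the sensor fields and candidate actions are hypothetical.

```python
# Illustrative sketch only: a toy version of the idea described above, not
# Google's system. Sensor names and candidate actions are hypothetical; PUE
# itself is the standard ratio of total facility power to IT power.
from dataclasses import dataclass

@dataclass
class SensorSnapshot:
    it_power_kw: float        # power drawn by the servers themselves
    cooling_power_kw: float   # pumps, chillers, fans
    overhead_power_kw: float  # lighting, power-distribution losses, etc.

def pue(s: SensorSnapshot) -> float:
    """Power Usage Effectiveness: total facility power / IT power (1.0 is ideal)."""
    total = s.it_power_kw + s.cooling_power_kw + s.overhead_power_kw
    return total / s.it_power_kw

def recommend(candidates: dict[str, SensorSnapshot]) -> str:
    """Naive 'recommendation': pick the candidate action whose simulated
    snapshot yields the lowest PUE."""
    return min(candidates, key=lambda name: pue(candidates[name]))

if __name__ == "__main__":
    now = SensorSnapshot(it_power_kw=1000, cooling_power_kw=350, overhead_power_kw=80)
    print(f"current PUE: {pue(now):.2f}")                     # -> 1.43
    options = {
        "raise chilled-water temperature 1°C": SensorSnapshot(1000, 310, 80),
        "keep current settings": now,
    }
    print("recommended action:", recommend(options))
```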

Adjust cooling in real time

In 2018, Google kicked things into higher gear with the deployment of a real-time optimization AI. Dubbed the “level 2 automated control system”, it builds on the earlier work. “It results in more precise adjustments than those a human would generally make,” says Joe Kava, vice president of data centers at Google, in DataCenter Knowledge. “If, for example, the outside temperature goes from 22 to 24 degrees Celsius, a human operator would not think to modify the cooling system's parameters, reasoning that the wet-bulb temperature at the system inlet is roughly the same and that the impact on PUE would be minimal.”
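To make the quote concrete, here is a minimal, purely illustrative contrast between a coarse human-style rule that only reacts to large wet-bulb changes and a finer continuous adjustment; the numbers and set-point formulas are assumptions, not Google's control logic.

```python
# Toy contrast between a threshold rule and a continuous one. All values are
# illustrative assumptions, not Google's actual control law.
def human_rule(wet_bulb_c: float, baseline_c: float = 18.0, setpoint_c: float = 20.0) -> float:
    # Only change the cooling set-point if the wet-bulb temperature
    # has moved by more than a full degree from the baseline.
    if abs(wet_bulb_c - baseline_c) > 1.0:
        return setpoint_c + (wet_bulb_c - baseline_c)
    return setpoint_c

def continuous_rule(wet_bulb_c: float, baseline_c: float = 18.0, setpoint_c: float = 20.0) -> float:
    # Adjust proportionally, however small the deviation.
    return setpoint_c + 0.5 * (wet_bulb_c - baseline_c)

for wb in (18.0, 18.4, 19.5):
    print(f"wet bulb {wb}°C -> human {human_rule(wb):.2f}°C, AI-style {continuous_rule(wb):.2f}°C")
```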

“The goal is to streamline rack design and server layout to minimize wasted energy, network capacity and cooling”

At the end of 2021, Microsoft followed suit, deploying AI-based “anomaly detection methods”. The models in question ingest telemetry data from the electrical and mechanical equipment in the group's data centers. “The objective is twofold: on the one hand, to optimize the electricity and water consumption of existing equipment; on the other, to streamline rack design and server layout in order to minimize waste in energy, network capacity and cooling,” Microsoft explains.
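Microsoft has not detailed its method, but a very simple form of telemetry anomaly detection can be sketched with a z-score test; the sample pump readings and threshold below are invented for illustration.

```python
# Minimal sketch of anomaly detection on equipment telemetry, in the spirit of
# what the article describes. The z-score rule and the sample data are
# assumptions for illustration only.
from statistics import mean, stdev

def flag_anomalies(readings: list[float], threshold: float = 3.0) -> list[int]:
    """Return indices of readings more than `threshold` standard deviations
    away from the mean of the series (a classic, very simple anomaly test)."""
    mu, sigma = mean(readings), stdev(readings)
    return [i for i, x in enumerate(readings) if abs(x - mu) > threshold * sigma]

# Hypothetical pump-power telemetry (kW): one reading clearly out of line.
pump_power_kw = [41.2, 40.8, 41.5, 40.9, 41.1, 55.3, 41.0, 41.3]
print(flag_anomalies(pump_power_kw, threshold=2.0))  # -> [5]
```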

The same goes for Meta. Mark Zuckerberg's group is turning to reinforcement learning to limit the amount of air its data centers draw in for cooling. Based on the configuration of the server infrastructure, the heat produced, weather data and the target temperature, it trains a deep learning engine by trial and error. The goal? To adjust the air intake as precisely as possible and, ultimately, reduce the volume of hot air released.
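As a rough illustration of learning a setting by trial and error, the toy below runs an epsilon-greedy bandit against a simulated cooling cost; the environment, airflow levels and cost function are all assumptions, not Meta's system.

```python
# Toy trial-and-error learner for an airflow setting. The simulated "data hall",
# cost function and bandit-style learner are assumptions made for illustration.
import random

AIRFLOW_SETTINGS = [0.6, 0.8, 1.0, 1.2]  # normalized airflow levels

def simulated_cost(airflow: float) -> float:
    """Pretend environment: too little air overheats the hall, too much wastes energy."""
    overheating_penalty = max(0.0, 0.9 - airflow) * 10
    fan_energy = airflow ** 2
    return overheating_penalty + fan_energy + random.gauss(0, 0.05)

def learn(trials: int = 2000, epsilon: float = 0.1) -> float:
    avg_cost = {a: 0.0 for a in AIRFLOW_SETTINGS}
    counts = {a: 0 for a in AIRFLOW_SETTINGS}
    for _ in range(trials):
        # epsilon-greedy: mostly exploit the best-known setting, sometimes explore.
        if random.random() < epsilon or not any(counts.values()):
            action = random.choice(AIRFLOW_SETTINGS)
        else:
            action = min(avg_cost, key=lambda a: avg_cost[a] if counts[a] else float("inf"))
        cost = simulated_cost(action)
        counts[action] += 1
        avg_cost[action] += (cost - avg_cost[action]) / counts[action]  # running mean
    sampled = {a: c for a, c in avg_cost.items() if counts[a] > 0}
    return min(sampled, key=sampled.get)

print("learned airflow setting:", learn())
```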

A hydrogen catalyst

Meta also uses AI to design greener concrete for building its data centers. The challenge is to find alternatives to cement, a major component of concrete whose production consumes large amounts of energy and emits a great deal of CO2. Among these alternatives, the American group's R&D has identified slag, crushed glass and fly ash from coal combustion. The role of AI? To pinpoint the optimal recipe among some 1,000 different mix configurations, according to the desired criteria: strength, insulation, material durability, and so on.
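The search itself can be pictured as scoring many candidate mixes against a handful of criteria and keeping the best one. The sketch below does exactly that, with made-up components, weights and property proxies standing in for Meta's real data and models.

```python
# Illustrative brute-force search over candidate concrete mixes. The scoring
# weights and property proxies are invented placeholders, not Meta's model.
from itertools import product

def score(slag: float, glass: float, fly_ash: float) -> float:
    cement = 1.0 - slag - glass - fly_ash            # remainder of the binder
    co2_saving = slag + glass + fly_ash              # less cement, less embodied CO2
    strength_proxy = cement + 0.6 * slag + 0.4 * fly_ash + 0.2 * glass
    return 0.5 * co2_saving + 0.5 * strength_proxy   # arbitrary 50/50 weighting

# Candidate mixes: slag / crushed glass / fly ash fractions in 10% steps,
# keeping at least 30% cement (an assumed strength floor).
steps = range(0, 8)                                  # 0%..70%
candidates = [(s / 10, g / 10, f / 10)
              for s, g, f in product(steps, repeat=3) if s + g + f <= 7]
best = max(candidates, key=lambda mix: score(*mix))
print(f"{len(candidates)} mixes scored; best (slag, glass, fly ash) = {best}")
```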

“We are developing an AI to predict atomic interactions much faster than today’s simulators”

Still with the aim of greener data centers, Meta, together with Carnegie Mellon University, is developing machine learning models for chemistry aimed at designing a low-energy catalyst for producing hydrogen: a fuel with no pollutant emissions (it generates only water) that could then supply the emergency generators of the company's data centers.

“Discovering catalysts is an arduous process. Assuming a catalyst is made from three of the 40 known metals, that already gives almost 10,000 combinations of elements. And since each combination then has to be tested by adjusting the configurations and ratios of the elements, we end up with billions of possibilities to test,” explains Larry Zitnick, a researcher at the Meta AI lab. “To meet this challenge, we are developing an AI designed to accurately predict atomic interactions much faster than today's simulators. The goal is to cover billions of possible catalysts per year.”
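The “almost 10,000” figure matches the basic combinatorics: choosing 3 metals out of 40 gives 9,880 unordered combinations, as a two-line check confirms.

```python
# Quick check of the arithmetic in the quote above, using the standard library.
from math import comb

print(comb(40, 3))  # -> 9880 unordered choices of 3 metals out of 40
```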

Microsoft, Google and Meta have all made energy transition commitments. The first has pledged to power its data centers with 100% renewable energy by 2025, the second by 2030. As for Meta, it says it reached that goal in 2020.
