Where does data center CSR intersect reduced carbon footprint?

Tuesday, September 14, 2021

Every hour of every day, an astronomical amount of data is being stored, processed, analyzed, watched, read, and discussed. We socialize, share files, and hold meetings via handheld devices that access the internet. I would make the case that most of us do not really know where this internet cloud is. All we know is that it is ‘somewhere else’. As long as the connection works for us, we feel that where the storage actually sits is none of our concern.

In my day job, I store my files in the cloud. I’m also lucky to work with experts in many engineering fields, building the physical cloud. Our competence already supports data center owners in going green, and we are eager to continue exploring this path further. Data center technologies intersect with several of our fields of expertise, where the common denominators are energy efficiency and energy transformation.

The message we want to bring forward with this blog post is that we are ready to listen to the future wishes and expectations of data centers, and we are ready to make them happen in real life. 

Data centers can reduce their impact to the point of climate neutrality in three ways, all without compromising data availability:

  1. Lowering power usage effectiveness (PUE)
  2. Integrating renewables and storage into data centers
  3. Using waste heat from data centers for heating and/or desalination

The accelerating CO2 footprint of data centers

So, let’s return to that question of data in the cloud. The cloud is physical, and it is located in data centers: dedicated buildings on secure sites. The cloud can even be located underground. We are building data centers at an ever-increasing rate. They consume enormous amounts of energy, roughly in proportion to the amount of data processed, with a significant share going to heat management. We don’t have an exact figure for this global energy consumption, but we know it is accelerating. Today it is roughly in the range of the combined energy consumption of Norway and Finland. Estimates suggest that annual electricity demand from data centers could grow to 1,100-8,000 TWh by 2030 [Nature, 2018].

Therefore, the CO2 footprint of data centers is large, ever-growing, and needs to be tamed. We need to decrease our global data carbon footprint, and there are many actions yet to be taken. Trendsetters in the industry are taking steps towards greening data centers and improving efficiency with a focus on PUE.
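To make the PUE lever concrete: PUE is simply the ratio of total facility energy to the energy that actually reaches the IT equipment, so a value of 1.5 means half a kilowatt-hour of cooling and conversion overhead for every kilowatt-hour of compute. A minimal sketch, with purely illustrative figures rather than measurements from any specific facility:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy / IT equipment energy."""
    return total_facility_kwh / it_equipment_kwh

# Illustrative example: a facility drawing 15 GWh/year in total,
# of which 10 GWh/year reaches the IT equipment.
print(pue(15_000_000, 10_000_000))  # 1.5 -> 0.5 kWh of overhead per kWh of compute
```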

Fortunately, alongside voluntary initiatives, the growing impact of data centers on carbon emissions is also being recognized by governments. Europe is already taking the first steps towards regulation. For example, the EU has established a code of conduct [EU Science Hub, 2016], and Norway has even raised the bar a notch higher [Norwegian Ministry of Foreign Affairs, 2016].

In some cases, regulation is not required, because green data centers are already a good business case. Let’s not underestimate the less tangible benefits for data giants: good CSR branding is also immensely valuable and a powerful influence on customer loyalty [Economist, 2021].

Commercial viability is strong

Energy-efficient and carbon-neutral data centers make sense both financially and ethically. On the economic side, it pays to stack values and create multiple revenue streams for the data center: data processing itself, selling waste heat, and participating in the energy market to support a grid challenged by the ongoing electrification.

Additionally, Environmental, Social and Governance (ESG)-aware companies walk the talk to remain attractive to investors. They assume responsibility for their impact on people, the environment, and the communities and societies in which they operate. This attitude is exemplified by companies that state goals of carbon neutrality within certain timeframes, demonstrating that sustainability matters. Many investors value the ethics of this approach while still demanding their dividends. These investors range from anonymous funds to the man on the street: you and me. By putting our money where our beliefs are, we can demonstrate grassroots support for the ongoing greening of data centers. [Economist, 2021]

Synergies from other industries

In developing new energy-efficient solutions, we can learn a new approach to power conversion systems from other industries under pressure to reduce their emissions. For example, let’s learn from the shipping industry, where DC grids are increasingly popular as a means of improving efficiency in hybrid marine propulsion projects. In urban centers, solutions using DC grids are under consideration for use in buildings and factories to increase energy efficiency, encountering similar challenges to the marine industry. Research projects are evaluating how DC grids will work in distribution networks in future decentralized power grids. Why is DC now rearing its head in several industries at once? Because the technology delivers energy savings and facilitates easier integration of different energy sources.

In data centers, as in other industries, DC grids contribute to energy savings because backup generation no longer has to hold a fixed AC frequency and can instead run at variable speed. For data centers, the introduction of DC grids also has the potential to reduce the number of conversion steps compared with existing technology, resulting in energy savings as large as 15%. Additionally, a DC grid enables us to eliminate automatic transfer switches (ATS), which are notoriously unreliable in data center backup power supply.
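As a rough illustration of where such savings could come from, the sketch below chains per-stage conversion efficiencies for a conventional AC distribution path and a shorter DC path. The stage list and the efficiency figures are assumptions chosen only to show the arithmetic, not measurements of any particular product or topology:

```python
from math import prod

# Hypothetical per-stage efficiencies (assumptions for illustration only).
ac_path = {
    "UPS rectifier (AC->DC)": 0.96,
    "UPS inverter (DC->AC)": 0.96,
    "Server PSU rectifier (AC->DC)": 0.94,
    "Server DC/DC regulation": 0.95,
}
dc_path = {
    "Central rectifier (AC->DC)": 0.97,
    "Server DC/DC regulation": 0.95,
}

eta_ac = prod(ac_path.values())
eta_dc = prod(dc_path.values())
print(f"AC chain efficiency: {eta_ac:.1%}")   # ~82%
print(f"DC chain efficiency: {eta_dc:.1%}")   # ~92%
print(f"Relative saving:     {1 - eta_ac / eta_dc:.1%}")
```

Fewer conversion stages simply means fewer places to lose a few percent each; the exact saving depends on the equipment, which is why the figure above is indicative only.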

Selling waste energy as a product to urban heating plants — #sectorintegration

The principles of the circular economy encourage us to re-use valuable resources. Energy is one of these resources, though a challenging one. The main waste product of a data center is energy in the form of heat, which is less versatile than electricity and comes with constraints that make it more difficult to re-use. However, if data centers can overcome those constraints and learn to sell their waste heat, they can go beyond what PUE alone can capture: re-use metrics such as the Energy Reuse Effectiveness (ERE) can even drop below 1.0. Voilà: we have used the energy twice.
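Here is a minimal sketch of how that re-use shows up in the numbers, using the ERE metric, which subtracts the exported heat from the facility total; the energy figures are illustrative assumptions, not data from a real site:

```python
def ere(total_kwh: float, reused_kwh: float, it_kwh: float) -> float:
    """Energy Reuse Effectiveness: (total facility energy - reused energy) / IT energy."""
    return (total_kwh - reused_kwh) / it_kwh

# Illustrative assumption: 13 GWh total, 10 GWh to IT,
# and 4 GWh of heat sold to a district heating network.
print(ere(13_000_000, 4_000_000, 10_000_000))  # 0.9 -> below 1.0 thanks to re-use
```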

To achieve good re-use of waste energy, we need to identify customers for the waste heat and understand their needs. Each recipient will have its own requirements for that heat. For example, heating a swimming pool requires a waste heat temperature of 30 °C, while central heating requires a temperature of at least 55 °C. Potential customers of heat energy could, for example, be residential buildings and industrial processes.
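A minimal sketch of that matching exercise: given the supply temperature of the waste heat, flag which customers can be served directly and which would need a heat pump to lift the temperature. The customer list beyond the two examples above, and the assumed 35 °C server-loop supply temperature, are hypothetical illustrations, not data from a real project:

```python
# Hypothetical waste heat customers and the minimum supply temperature they require (deg C).
CUSTOMERS = {
    "swimming pool": 30,
    "residential central heating": 55,
    "industrial drying process": 80,
}

def match_waste_heat(supply_temp_c: float) -> None:
    """Report, per customer, whether waste heat can be used directly or must be boosted."""
    for customer, required_c in CUSTOMERS.items():
        if supply_temp_c >= required_c:
            print(f"{customer}: direct use possible ({supply_temp_c} >= {required_c} deg C)")
        else:
            print(f"{customer}: needs a heat pump lift of ~{required_c - supply_temp_c} deg C")

# Assumed example: a liquid-cooled server loop returning water at 35 deg C.
match_waste_heat(35.0)
```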

Waste heat for desalination plants: an energy source for purifying water

Consider this: on a global scale, clean water is a scarce resource, yet vital for human beings. The majority of the world’s water is saline (sea water). Transforming sea water into potable water requires desalination, a process that is very energy intensive. Still, it is viable in arid countries. This is where excess heat from data centers can contribute to powering desalination plants in arid locations. By co-locating desalination plants and data centers, we can conveniently re-use the waste energy from data centers. There is a nice symmetry in implementing this concept where water is scarce, since data centers are themselves intensive consumers of potable water, using billions of liters every year [Nature, 2018].

There is no shortage of inspiration for the greening of our data centers. But we must still remember that these commercial enterprises run first and foremost on competitive terms. Customers don’t care whether their Google search is carbon neutral if it doesn’t respond immediately when they need it. In this commercial environment, speed and reliability beat climate goals every time. Therefore, the cornerstones of reliability and good risk management must be in place before addressing energy efficiency. Although energy efficiency makes economic sense, it is still a secondary consideration.

Risk management in data centers: redundancy and availability

Uptime is critical for data centers, and good redundancy is therefore a high priority in risk management. Several measures must be taken carefully to ensure the expected availability. Dealing with redundancy involves much more than just design: it involves submitting the design to a third-party review for evaluation, validation, and testing.
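As a back-of-the-envelope illustration of why redundancy must be validated rather than assumed, the sketch below estimates the availability of an N+1 set of power modules from an assumed per-unit availability, under the idealized assumption that failures are independent. Real systems also suffer common-mode failures, which is exactly what third-party review and testing are meant to catch; the module count and availability figure here are assumptions for illustration:

```python
from math import comb

def n_plus_one_availability(n_required: int, unit_availability: float) -> float:
    """Probability that at least n_required of n_required + 1 identical,
    independent units are available (classic N+1 redundancy)."""
    n_units = n_required + 1
    a = unit_availability
    # The system is up if zero or one unit is down.
    return a**n_units + comb(n_units, 1) * a**(n_units - 1) * (1 - a)

# Illustrative assumption: 4 modules needed to carry the load, each 99.5% available.
print(f"{n_plus_one_availability(4, 0.995):.6f}")  # ~0.999752
```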

Testing is good, and it is vital to be conscious of expected results and of what unexpected results may mean. Designing correct testing methods to uncover hidden failures requires experience. At Danfoss, we evaluate relevant failure modes with our customers to reach the expected level of uptime without destroying system components during testing.

All these efforts are insurance against faults, in either design or equipment, which can lead to a less redundant system than intended. To improve risk management for data centers, we can look to models in other industries that already have well-established processes, such as UL listing or the marine industry with its classification societies.

How do we adapt machine learning to reach data center PUE goals? #ai

Energy and cooling management in data centers is complex. We also know that as the complexity of a process increases, it becomes more challenging to optimize. To illustrate this, imagine how pump speed relates to liquid flow. That’s easy; it’s a simple curve. Then imagine how to control pump speed as a function of YouTube clicks and weather forecasts. The complexity increases.
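The pump part really is that simple: by the ideal pump affinity laws, flow scales roughly linearly with speed while power scales with roughly the cube of speed, which is also why slowing pumps and fans down saves so much energy. A one-line sketch:

```python
def pump_power_fraction(speed_fraction: float) -> float:
    """Ideal affinity laws: flow ~ speed, pressure ~ speed^2, power ~ speed^3."""
    return speed_fraction ** 3

# Running a pump at 80% speed (roughly 80% flow) needs only ~51% of full power.
print(pump_power_fraction(0.8))  # 0.512
```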

In a data center, a huge number of set-points and parameters must be adjusted to achieve the optimum point of operation for a varying load. But how do we do that on the fly, based on load and environmental predictions? This is where data centers can find useful inspiration, or cross-pollination if you prefer that analogy, from a different field: machine learning. The Google subsidiary DeepMind has shown how machine learning can digest this complexity and reduce the energy used for data center cooling by as much as 40% [Deepmind, 2016]. This requires us to consider how the components we make interface with machine learning technology.
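The sketch below is a deliberately tiny stand-in for that idea, not DeepMind’s actual system: it fits a plain least-squares model that predicts PUE from a few operating parameters (IT load, outdoor temperature, cooling setpoint) on synthetic data, the kind of surrogate model an optimizer could then search over for better setpoints. All data, features, and coefficients here are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic operating data (assumptions for illustration): IT load [MW],
# outdoor temperature [deg C], chilled-water setpoint [deg C].
n = 500
it_load = rng.uniform(2.0, 10.0, n)
outdoor_temp = rng.uniform(-5.0, 35.0, n)
setpoint = rng.uniform(14.0, 22.0, n)

# Invented "ground truth": PUE worsens with outdoor heat and low setpoints, plus noise.
pue = (1.15 + 0.004 * outdoor_temp - 0.01 * (setpoint - 18.0)
       + 0.005 * it_load + rng.normal(0.0, 0.01, n))

# Fit a simple surrogate model of PUE with ordinary least squares.
X = np.column_stack([np.ones(n), it_load, outdoor_temp, setpoint])
coeffs, *_ = np.linalg.lstsq(X, pue, rcond=None)

# Use the surrogate to compare two candidate setpoints under tomorrow's assumed conditions.
for candidate in (15.0, 20.0):
    predicted = np.array([1.0, 6.0, 28.0, candidate]) @ coeffs
    print(f"setpoint {candidate:4.1f} deg C -> predicted PUE {predicted:.3f}")
```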

At Danfoss, we are continuously developing new interfaces and simulation models that represent the process, and our devices can be used to collect the data that such learning processes need.

We can already achieve a lot using artificial intelligence and learning processes with the technology we have today. What is unique about Danfoss is its deep knowledge of electrification through power conversion and of components for energy management in data centers. Thanks to decades of experience in data centers and related fields, we understand the challenges data center operators face, and we bring that experience to the table. We are continuously exploring how to meet upcoming new requirements for components, and where we can add even more value to our customers’ business.

Check out how Danfoss utilizes excess heat from its own data center to heat the Danfoss headquarters, showing that digital transformation and green transition go hand in hand.

 

Author: Ottar Skjervheim, Electrification Leader, Danfoss Drives

Sources:

Nature, 2018: https://www.nature.com/articles/s41545-021-00101-w

EU Science Hub, 2016: Code of Conduct for Energy Efficiency in Data Centres, EU Science Hub (europa.eu)

Economist, 2021: https://www.economist.com/finance-and-economics/2021/03/27/the-impact-of-green-investors

Deepmind, 2016: https://deepmind.com/blog/article/deepmind-ai-reduces-google-data-centre-cooling-bill-40

UN Water: Water Facts/Water Scarcity, United Nations, https://www.unwater.org/water-facts/scarcity/

Microsoft Blog, September 21, 2020: https://blogs.microsoft.com/blog/2020/09/21/microsoft-will-replenish-more-water-than-it-consumes-by-2030/

Danfoss Group, 2021: Read how Danfoss decarbonizes by building green data centers, Danfoss