Facebook is committed to sustainability and the fight against climate change. That's why, in September 2020, we announced our commitment to reaching net-zero emissions across our value chain by 2030. Part of Facebook's sustainability effort involves data center efficiency, from building servers that require less energy to run to developing a liquid cooling system that uses less water.
To learn more about these data center sustainability efforts at Facebook and how we're engaging with the academic community in this space, we sat down with Dharmesh Jani ("DJ"), the Open Ecosystem Lead on the hardware engineering team and the Open Compute Project Incubation Committee (IC) Chair, and Dr. Katharine Schmidtke, Facebook's Director of Sourcing for application-specific integrated circuits and custom silicon. DJ's and Schmidtke's teams are working to achieve four goals:
- Extend the lifecycle of Facebook data center equipment and make our gear reusable by others.
- Improve energy efficiency in Facebook infrastructure via hardware and software innovations.
- Reduce carbon-heavy content in our data centers.
- Work with industry and academia to drive innovation on sustainability across our value chain.
DJ and Schmidtke discuss why it's important to build data centers that are as energy efficient as possible, how we're working with and supporting academia and industry partners in this space, and potential research challenges that Facebook researchers and engineers could tackle next.
Building energy-efficient data centers
For over a decade, Facebook has been committed to sustainability and energy efficiency. In 2009, Facebook built its first company-owned data center in Prineville, Oregon, one of the world’s most energy-efficient data centers with a power usage effectiveness (PUE) ratio between 1.06 and 1.08. In 2011, Facebook shared its designs with the public and — along with other industry experts — launched the Open Compute Project (OCP), a rapidly growing global community whose mission is to design, use, and enable mainstream delivery of the most efficient designs for scalable computing. However, there’s more to be done.
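For context, PUE is the ratio of total facility energy to the energy that actually reaches IT equipment, so a PUE between 1.06 and 1.08 means only 6 to 8 percent of the facility's energy goes to anything other than the servers themselves. A minimal sketch of the calculation, using illustrative numbers rather than Facebook's actual figures:

```python
# Power usage effectiveness (PUE) = total facility energy / IT equipment energy.
# The numbers below are illustrative, not actual Facebook measurements.

it_equipment_energy_kwh = 10_000_000    # energy consumed by servers, storage, network
total_facility_energy_kwh = 10_700_000  # IT load plus cooling, power distribution, lighting

pue = total_facility_energy_kwh / it_equipment_energy_kwh
overhead_pct = (pue - 1) * 100

print(f"PUE = {pue:.2f} ({overhead_pct:.0f}% of energy spent on non-IT overhead)")
# PUE = 1.07 (7% of energy spent on non-IT overhead)
```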
"On average, data centers use 205 TWh of electricity per year, which is the equivalent of roughly 145 million metric tons of CO2 emissions," explains DJ. "With the growth of hyperscale data centers in the coming years, these emissions are going to increase dramatically if mitigation is not considered today (source 1, source 2). Facebook wants to help address these growing emissions as well, to ensure we run efficient operations and achieve our goal of net zero carbon by 2030."
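The conversion from electricity use to emissions depends on the carbon intensity of the grid supplying the power. As a rough check of the figures above, assuming a grid-average emission factor of about 0.7 kg CO2e per kWh (an assumption for illustration, not a number from the interview):

```python
# Back-of-the-envelope conversion from electricity use to CO2 emissions.
# The 0.7 kg CO2e/kWh emission factor is an assumed average; real grids vary widely.

annual_energy_twh = 205
kwh_per_twh = 1e9
emission_factor_kg_per_kwh = 0.7

co2_kg = annual_energy_twh * kwh_per_twh * emission_factor_kg_per_kwh
co2_million_metric_tons = co2_kg / 1e9   # 1 million metric tons = 1e9 kg

print(f"~{co2_million_metric_tons:.0f} million metric tons of CO2 per year")
# ~144 million metric tons per year, in the same ballpark as the figure cited above
```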
According to DJ, Facebook is taking multiple approaches to address these problems: "The sustainability team within Facebook is working across organizations to align on goals that lead to a reduction in carbon. Circularity is one of the emerging efforts within infrastructure to extend equipment life cycles, which has the biggest impact on the net-zero-carbon effort. We're driving sustainability and circularity efforts in the industry through the Open Compute Project," he says.
Data center construction itself also contributes to carbon emissions. Achieving high utilization of already-built data centers is key to reducing the demand for new construction. Over the years, Facebook has developed a suite of industry-leading technologies to control and manage the peak power demand of its data centers. As a result, many more servers can be hosted in existing data centers with limited power capacity, which has reduced data center construction demand by more than 50%. These technologies were developed in-house with the help of academic collaborations and research internship programs, and key research findings and hyperscale operational experience are shared back with the community through publications at top academic conferences, such as Dynamo: Facebook's Data Center-Wide Power Management System and Coordinated Priority-aware Charging of Distributed Batteries in Oversubscribed Data Centers.
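The core idea behind this kind of peak power management is oversubscription: host more servers than the nominal power budget would strictly allow, and cap power draw on the rare occasions when aggregate demand approaches the limit. The sketch below is not Dynamo itself, just a minimal, hypothetical illustration of that control loop (all names, numbers, and thresholds are invented):

```python
# Minimal illustration of power oversubscription with a capping control loop.
# Hypothetical sketch only; not Facebook's Dynamo implementation.

from dataclasses import dataclass

@dataclass
class Server:
    name: str
    power_w: float               # current measured power draw
    cap_w: float | None = None   # active power cap, if any

RACK_BUDGET_W = 10_000   # provisioned power budget for the rack (hypothetical)
CAP_TRIGGER = 0.95       # start capping when draw exceeds 95% of the budget

def enforce_budget(servers: list[Server]) -> None:
    total = sum(s.power_w for s in servers)
    if total <= RACK_BUDGET_W * CAP_TRIGGER:
        # Under budget: lift any caps so workloads run at full speed.
        for s in servers:
            s.cap_w = None
        return
    # Over the trigger threshold: scale every server's draw down proportionally
    # so the rack lands back under the budget. Real systems would prioritize
    # by service criticality rather than capping uniformly.
    scale = (RACK_BUDGET_W * CAP_TRIGGER) / total
    for s in servers:
        s.cap_w = s.power_w * scale

rack = [Server("web-1", 3_600), Server("web-2", 3_500), Server("cache-1", 3_400)]
enforce_budget(rack)
for s in rack:
    print(s.name, "cap:", None if s.cap_w is None else round(s.cap_w))
```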
Learn more about Facebook data center efficiency on the Tech@ blog, and read our latest Sustainability Report on Newsroom.
Partnerships and collaborations
Developing energy-efficient technology isn't something that industry can do alone, which is why we often partner with experts in academia and support their pioneering work. "Facebook has launched a number of research collaborations directed at power reduction and energy efficiency over the past few years," Schmidtke says. "Recently, Facebook sponsored the Institute for Energy Efficiency at UC Santa Barbara with a gift of $1.5 million over three years. We hope our contribution will help foster research in data center energy efficiency."
“Another example is the ongoing research collaboration with Professor Clint Schow at UCSB,” Schmidtke says. “The project is focused on increasing the efficiency of optical interconnect data transmission between servers in our data center network. The research has just entered its second phase and is targeting highly efficient coherent optical links for data transmission.”
Facebook is also an industry member of the Center for Energy-Smart Electronic Systems, in partnership with the University of Texas at Arlington, and the Future Renewable Electric Energy Delivery and Management Systems Engineering Research Center at North Carolina State University.
In addition to fostering innovation within the academic community, Facebook is working closely with industry partners. According to DJ, "We're looking to drive sustainability-related initiatives within the OCP community to align other industry players across the value chain. We plan to define sustainability as one of the OCP tenets so that all future contributions can focus on it."
What’s next
DJ offers three sustainability challenges that researchers in the field could tackle next, all of which would involve industry collaborations with academia and other research organizations.
One research challenge is making computation more carbon neutral. The AI field's computing demands have grown dramatically: Since 2012, the amount of compute used in the largest AI training runs has been doubling roughly every 3.4 months, a 300,000x increase from AlexNet to AlphaGo Zero. "How can we make AI more efficient when the current approach of ever-increasing computation is not viable?" says DJ. "This is one of the biggest challenges in the field, so I'm eager to see more Green AI initiatives."
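As a quick sanity check on those figures, a 300,000x increase at a 3.4-month doubling time works out to roughly five years of growth, which matches the AlexNet-to-AlphaGo Zero window:

```python
# Consistency check: a 300,000x compute increase at a 3.4-month doubling time.
import math

growth_factor = 300_000
doubling_time_months = 3.4

doublings = math.log2(growth_factor)               # ~18.2 doublings
elapsed_months = doublings * doubling_time_months  # ~62 months

print(f"{doublings:.1f} doublings ~ {elapsed_months:.0f} months (~{elapsed_months/12:.1f} years)")
# 18.2 doublings ~ 62 months (~5.2 years), roughly AlexNet (2012) to AlphaGo Zero (late 2017)
```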
Another challenge is scheduling workloads within data centers during periods when grid carbon intensity is low. "We have to think about the amount of workload coming into the data centers and the complex interactions involved to optimize for such use cases," explains DJ. "I hope to see novel algorithmic ways of reducing energy consumption, distributing workloads, and lowering carbon emissions."
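One way to picture this kind of carbon-aware scheduling is a scheduler that holds delay-tolerant jobs until the forecast grid carbon intensity dips, while latency-sensitive work runs immediately. The sketch below is a hypothetical illustration of that idea, not a description of Facebook's production schedulers:

```python
# Hypothetical carbon-aware scheduler: run flexible jobs in the greenest
# forecast hour before their deadline; urgent jobs run immediately.
from dataclasses import dataclass

@dataclass
class Job:
    name: str
    deadline_hour: int      # latest hour (offset from now) the job may start
    flexible: bool = True   # False = latency-sensitive, run right away

def schedule(jobs: list[Job], carbon_forecast: list[float]) -> dict[str, int]:
    """Map each job to a start hour, minimizing forecast gCO2/kWh for flexible jobs."""
    plan = {}
    for job in jobs:
        if not job.flexible:
            plan[job.name] = 0  # run now
            continue
        window = carbon_forecast[: job.deadline_hour + 1]
        plan[job.name] = min(range(len(window)), key=lambda h: window[h])
    return plan

# Forecast grid carbon intensity (gCO2/kWh) for the next 8 hours (illustrative values).
forecast = [450, 430, 380, 220, 180, 200, 350, 420]
jobs = [
    Job("ml-training-batch", deadline_hour=7),
    Job("photo-reencode", deadline_hour=5),
    Job("news-feed-serving", deadline_hour=0, flexible=False),
]
print(schedule(jobs, forecast))
# {'ml-training-batch': 4, 'photo-reencode': 4, 'news-feed-serving': 0}
```

In practice, the intensity forecast would come from the grid operator or an emissions-data service, and a real scheduler would also weigh capacity, data locality, and deadlines.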
An additional potential area of focus is technology that utilizes chiplets. Chiplets can be thought of as reusable, mix-and-match building blocks that come together to form more complex chips, an approach that yields more efficient systems with a smaller carbon footprint. "I'm looking forward to new computer architectures that are domain specific and driven by chiplets," says DJ. "We have only explored the tip of the iceberg in terms of sustainability. There is much we can do together in this space to further the goal of a greener tomorrow."
Facebook is committed to open science, and we value our partnerships with industry and academia. We are confident that together we can help drive technology and innovation forward in this space.