The world’s growing thirst for energy to power computing and communications technology is a clear threat to Earth’s warming climate. That was the candid assessment from speakers at the two-day in-depth panel discussion on the Climate Implications of Computing and Communications, held March 3 and 4 and organized by the MIT Climate and Sustainability Consortium (MCSC), the MIT-IBM Watson AI Lab, and the MIT Schwarzman College of Computing.
The virtual event featured rich discussion and highlighted opportunities for collaboration among an interdisciplinary group of MIT faculty and researchers and industry leaders across multiple sectors, underscoring the combined power of academia and industry.
“If we continue on the current trajectory of compute energy, by 2040 we could reach the world’s energy production capacity,” said Bilge Yildiz, the Breene M. Kerr Professor in MIT’s departments of Nuclear Science and Engineering and Materials Science and Engineering, one of the event’s 18 presenters. This compute energy forecast draws on the Semiconductor Research Corporation’s decadal report.
To give one example: information and communications technologies already account for more than 2 percent of global energy demand, on a par with the aviation industry’s emissions from fuel.
“We are in the beginning of this data-driven world. We really need to start thinking about this and act now,” said presenter Evgeni Gousev, senior director at Qualcomm.
Innovative energy-saving options
The panel presentations explored a wide range of energy-saving options, including specialized chip design, data center architecture, better algorithms, hardware modifications, and changes in consumer behavior. Industry leaders from AMD, Ericsson, Google, IBM, iRobot, NVIDIA, Qualcomm, Tertill, Texas Instruments, and Verizon outlined their companies’ energy-saving programs, while experts from across MIT described current research that could yield more efficient computing.
Panel topics ranged from “Custom Hardware for Efficient Computing” to “Hardware for New Architectures” to “Algorithms for Efficient Computing,” among others.
A visual representation of the conversation during the panel session titled “Energy Efficient Systems.”
Image: Haley McDevitt
The goal is to improve the energy efficiency of computing by more than a million-fold, Yildiz said.
“I think part of the answer to how we make computing much more sustainable has to do with specialized architectures, where the degree of utilization is very high and the design is as ‘elegant’ as possible,” said Darío Gil, IBM senior vice president and director of research.
As an example, Gil illustrated an innovative chip design that uses vertical stacking to reduce the distance data must travel, and thus reduce energy consumption. Surprisingly, more efficient use of tape, a traditional medium for long-term data storage, combined with specialized hard disk drives (HDDs), can yield significant savings in carbon dioxide emissions.
Gil and fellow presenters Bill Dally, chief scientist and senior vice president of research at NVIDIA, and Ahmad Bahai, CTO of Texas Instruments, among others, also addressed data storage. Gil compared data to an iceberg: the smaller visible portion is the “hot data” we need quick access to, while the large mass underwater is the “cold data” that tolerates higher latency. Think of your digital photo storage, Gil said. “Honestly, are you really retrieving all of those photographs continuously?” Storage systems should provide an optimized mix of HDD for hot data and tape for cold data, based on how the data is accessed.
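Gil’s iceberg analogy amounts to a simple tiering policy: frequently accessed data stays on HDD, rarely accessed data moves to tape. A minimal sketch of such a policy (the file names, access rates, and threshold below are hypothetical illustrations, not figures from the panel):

```python
# Sketch of hot/cold storage tiering in the spirit of Gil's iceberg analogy.
# The one-access-per-month threshold is a made-up illustration, not a real policy.

def choose_tier(accesses_per_month: int, hot_threshold: int = 1) -> str:
    """Route frequently accessed ("hot") data to HDD and rarely
    accessed ("cold") data to tape, which tolerates higher latency."""
    return "hdd" if accesses_per_month >= hot_threshold else "tape"

# Hypothetical photo library: most old files are never retrieved.
photos = {"vacation_2023.jpg": 4, "tax_scan_2009.pdf": 0}
placement = {name: choose_tier(rate) for name, rate in photos.items()}
print(placement)  # the rarely touched file lands on tape
```

Real tiering systems track access patterns over time and migrate data automatically; the point here is only that the decision itself is cheap relative to the energy saved.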
Bahai emphasized the significant energy savings gained from separating standby operation from full processing. “We need to learn how to do nothing better,” he said. Dally spoke of mimicking the way our brains wake from a deep sleep: “We can wake [computers] up much faster, so we don’t need to keep them running at full speed.”
Several panel speakers spoke of focusing on “sparsity,” where matrices have mostly zero-valued elements, as a way to improve efficiency in neural networks. Or as Dally put it, “Never put off until tomorrow what you can put off forever,” effectively describing efficiency as extracting the most information with the fewest bits and doing the most work with the least energy.
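The efficiency argument behind sparsity is that multiply-accumulate work scales with the number of nonzero entries, not with the full matrix size, so zeros can be skipped entirely. A minimal pure-Python sketch of this idea (illustrative only; production systems use dedicated sparse formats and hardware support):

```python
# Sparse matrix-vector product that stores and touches only nonzero
# entries, illustrating why sparsity saves compute in neural networks.

def sparse_matvec(rows, x):
    """rows: one {column_index: value} dict per matrix row, holding
    only the nonzero entries, so only they cost any work."""
    return [sum(v * x[j] for j, v in row.items()) for row in rows]

# A 3x3 matrix that is mostly zeros: [[2,0,0],[0,0,1],[0,3,0]].
# Only 3 of its 9 entries are stored, and only 3 multiplies are done.
rows = [{0: 2}, {2: 1}, {1: 3}]
x = [1, 2, 3]
print(sparse_matvec(rows, x))  # [2, 3, 6]
```

A dense implementation would perform nine multiplies here; the sparse one performs three, which is the kind of saving the panelists described, scaled up to networks with millions of mostly-zero weights.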
Comprehensive and multidisciplinary approaches
“We need both efficient algorithms and efficient hardware, and sometimes we need to co-design both the algorithm and the hardware for efficient computing,” said one presenter from MIT’s Department of Electrical Engineering and Computer Science (EECS).
Several presenters were optimistic about innovations already underway. According to Ericsson’s research, up to 15 percent of global carbon emissions could be reduced through the use of existing solutions, noted Mats Pellbäck Scharp, head of sustainability at Ericsson. For example, GPUs are more efficient than CPUs for AI workloads, and the evolution from 3G to 5G networks boosts energy savings.
“5G is the most energy efficient standard ever,” said Scharp. “We can build 5G without increasing energy consumption.”
Companies such as Google are optimizing energy use in their data centers through improved design, technology, and renewable energy. “Five of our data centers around the globe are operating near or above 90 percent carbon-free energy,” said Jeff Dean, Google senior fellow and senior vice president of Google Research.
Yet, pointing to a possible slowdown in the doubling of transistors on an integrated circuit (aka Moore’s Law), “we need new approaches to meet this computational demand,” said Sam Naffziger, AMD senior vice president, corporate fellow, and product technology architect. Naffziger spoke of addressing performance “overkill.” For example, “we’ve found that in the gaming and machine learning space we can make use of lower-precision math to deliver images that look just as good with 16-bit computations as with 32-bit, and instead of the legacy 32-bit math for training AI networks, we can use lower-energy 8-bit or 16-bit computations.”
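One concrete way to see the case for lower precision is memory footprint: halving the bit width halves the bytes that must be stored and moved, and data movement is a major share of the energy cost in ML workloads. A quick NumPy illustration (memory size is used here as a rough proxy for energy; the array shape is arbitrary):

```python
import numpy as np

# The same 1024x1024 matrix of values stored at three precisions.
# Fewer bits per element means fewer bytes to store and move.
for dtype in (np.float32, np.float16, np.int8):
    a = np.zeros((1024, 1024), dtype=dtype)
    print(dtype.__name__, a.nbytes)  # bytes shrink with bit width
```

The 16-bit copy occupies exactly half the memory of the 32-bit one, and the 8-bit copy a quarter. The accuracy trade-offs of training or inference at these precisions are workload-dependent, which is why the panelists framed this as matching precision to the task rather than reducing it everywhere.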
A visual representation of the conversation during the panel session titled “Wireless, Networked, and Distributed Systems.”
Image: Haley McDevitt
Other presenters identified edge computing as a major energy hog.
“We also have to change the devices that are shipped to our customers,” said Heidi Hemmer, senior vice president of engineering at Verizon. As we think about how we use energy, it is common to jump to data centers, but it really starts at the device itself and the energy the devices use. Then we can think about home web routers, distributed networks, the data centers, and the hubs. “The devices are actually the least energy-efficient out of all of that,” Hemmer concluded.
Some presenters held different views. Several called for the development of specialized silicon chipsets to increase efficiency. However, panel moderator Muriel Médard, the Cecil H. Green Professor in EECS, described research at MIT, Boston University, and Maynooth University on the GRAND (Guessing Random Additive Noise Decoding) chip: rather than using a different chip for each new code or standard, “you can use one chip for all codes.”
Whatever the new chip or algorithm, Helen Greiner, CEO of Tertill (maker of a weeding robot) and co-founder of iRobot, emphasized that to get products to market, “we have to learn to let go of the desire to have the latest and greatest, most advanced processor, which is usually more expensive.” She added, “I like to say that robot demos are a dime a dozen, but robot products are very infrequent.”
Greiner emphasized that consumers can play a role in promoting more energy-efficient products – just as drivers are starting to demand electric vehicles.
Dean also recognizes the role of the environment for the end user.
“We have made it so our cloud customers can select which cloud region they want to run their computation in, and they can decide how important it is to them to have a low carbon footprint,” he said, citing other interfaces that could allow consumers to decide which airline flights are more efficient, or what the impact of installing solar panels on their home would be.
Still, said Scharp, “Prolonging the life of your smartphone or tablet is really the best climate action you can take if you want to reduce your digital carbon footprint.”
Facing growing demand
Despite the optimism, the presenters acknowledged that the world faces increasing computing demand from machine learning, AI, gaming, and especially blockchain. Panel moderator Vivienne Sze, associate professor in EECS, noted the conundrum.
“We can do a great job of making computing and communication really efficient. But there is this tendency that once things become very efficient, people use more of them, and this might result in an overall increase in the use of these technologies, which will then increase our total carbon footprint,” Sze said.
The speakers saw great potential in academic/industry partnerships, particularly from research efforts on the academic side. “By combining these two forces together, you can really amplify the impact,” Gousev concluded.
Speakers at the Climate Implications of Computing and Communications event also included: Joel Emer, professor of the practice in EECS at MIT; David Perreault, the Joseph F. and Nancy P. Keithley Professor of EECS at MIT; Jesús del Alamo, the Donner Professor and professor of electrical engineering in EECS at MIT; Heike Riel, IBM Fellow and head of science and technology at IBM; and Takashi Ando, principal research staff member at IBM Research. Recorded sessions of the event are available on YouTube.