Moore’s Law is Intel co-founder Gordon Moore’s famous prediction that the number of transistors on a chip will double every year or two. This prediction has mostly been met or exceeded since the 1970s – computing power doubles every two years, while better and faster microchips become less expensive.
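The doubling described above compounds quickly. As a toy illustration (our arithmetic, not a figure from the article), here is what a doubling every two years implies for transistor counts; the starting point of roughly 2,300 transistors is the Intel 4004 from 1971, a commonly cited baseline:

```python
def transistors(initial: int, years: int, doubling_period: float = 2.0) -> int:
    """Projected transistor count after `years` of growth,
    assuming one doubling every `doubling_period` years."""
    return int(initial * 2 ** (years / doubling_period))

# Starting from the Intel 4004's ~2,300 transistors in 1971,
# 50 years of doubling every two years projects to:
print(transistors(2300, 50))  # tens of billions of transistors
```

That projection lands in the tens of billions, roughly the scale of today’s largest chips, which is why the trend has been so consequential.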
This rapid growth in computing power has fueled innovation for decades, but at the turn of the 21st century, researchers began sounding the alarm that Moore’s Law was slowing down. With standard silicon technology, there are physical limits to how small transistors can get and how many can be squeezed onto an affordable microchip.
Neil Thompson, an MIT research scientist at the Computer Science and Artificial Intelligence Laboratory (CSAIL) and the Sloan School of Management, and his team set out to quantify how important more powerful computers have been to improving outcomes across society. In a new paper, they analyzed five areas where computation is critical, including weather forecasting, oil exploration, and protein folding (important for drug discovery). The paper was co-authored by research assistants Gabriel F. Manso and Shuning Ge.
They found that between 49 and 94 percent of the improvement in these areas can be explained by computing power. In weather forecasting, for example, a tenfold increase in computing power improves three-day predictions by a third of a degree.
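The weather example implies a logarithmic relationship: each tenfold increase in compute buys roughly the same fixed improvement. Here is a minimal sketch of that relationship, assuming (our assumption, not the paper’s model) that the gain scales with the base-10 logarithm of the compute ratio:

```python
import math

def forecast_improvement(compute_ratio: float, gain_per_10x: float = 1/3) -> float:
    """Degrees of improvement in a three-day forecast, assuming a gain of
    `gain_per_10x` degrees per tenfold increase in computing power."""
    return gain_per_10x * math.log10(compute_ratio)

print(forecast_improvement(10))    # one tenfold increase -> ~0.33 degrees
print(forecast_improvement(1e12))  # a trillionfold increase -> ~4 degrees
```

Under this toy scaling, even the trillionfold increase in compute discussed below yields a finite, steadily accumulating improvement, which is consistent with the diminishing-returns point made later in the interview.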
However, growth in computing power is slowing down, and that could have far-reaching effects on the economy and society. Thompson spoke with MIT News about this research and the implications of the end of Moore’s Law.
Q: How do you approach this analysis and quantify the impact of computers on different sectors?
A: It is difficult to quantify the effect of computing on real outcomes. The most common way to study computing power, and IT advancement in general, is to look at how much money companies spend on it and see how that correlates with results. But spending is an imprecise measure, because it reflects only a fraction of the value of the computing power being purchased. For example, a computer chip today may cost the same as one from last year, but it is also much more powerful. Economists try to adjust for that quality change, but it is hard to pin down exactly what that number should be. For our project, we measured computing power more directly – for instance, by looking at the capabilities of the systems used when protein folding was first done with deep learning. By looking at capabilities directly, we get more precise measurements and thus better estimates of how computing power affects performance.
Q: How can more powerful computers improve weather forecasting, oil and gas exploration, and protein folding?
A: The short answer is that the increase in computing power has had a tremendous effect on these areas. In weather prediction, we found that the amount of computing power used for these models increased by a factor of a trillion. That shows how much computing power has grown, and also how we have harnessed it. This is not someone simply taking an old program and running it on a faster computer; instead, users constantly redesign their algorithms to take advantage of 10 or 100 times more computing power. There is still a great deal of human ingenuity going into improving performance, but our results show that much of that ingenuity is focused on harnessing ever more powerful computational engines.
Oil and gas exploration is an interesting case because it gets harder over time: the easy wells are drilled first, so what remains is more difficult. Oil companies counter that trend with some of the largest supercomputers in the world, using them to interpret seismic data and map subsurface geology. This helps them do a better job of drilling in the right place.
Using computers to do protein folding better has been a long-standing goal because understanding the three-dimensional shape of these molecules is essential to determining how they interact with other molecules. In recent years, the AlphaFold system has made significant breakthroughs in this area. Our analysis shows that these improvements are well predicted by the massive increase in the computing power they use.
Q: What are some of the biggest challenges when doing this analysis?
A: When you are looking at two trends that are growing over time, in this case performance and computing power, one of the most important challenges is distinguishing cause from effect – or is it really just correlation? We can answer that question, in part, because in the areas we studied, companies are investing enormous amounts of money, so they do a lot of testing. In weather modeling, for example, they do not just spend tens of millions of dollars on new machines and hope they work. They do an evaluation first and find that running a model for twice as long improves performance. Then they buy a system powerful enough to do that calculation in less time so they can use it operationally. That gives us a lot of confidence. There are also other ways we can see the causal relationship. For example, we see that there were a number of big jumps in the computing power used by NOAA (the National Oceanic and Atmospheric Administration) to predict the weather. And when they bought a bigger computer and it was installed all at once, performance really skyrocketed.
Q: Would these advances be possible without the exponential increase in computing power?
A: That is a difficult question because there are so many different inputs: human capital, traditional capital, and also computing power. All three change over time. One might say that if you have a trillionfold increase in computing power, surely that has the biggest effect. That is good intuition, but you also have to account for diminishing marginal returns. For example, going from no computer to one computer is a huge change. But going from 100 computers to 101 computers does not add nearly as much benefit. So there are two competing forces – a massive increase in computing on one hand, but decreasing marginal benefits on the other. Our research shows that even though we already have tons of computing power, it is growing so quickly that it explains much of the performance improvement in these areas.
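The two competing forces described above can be sketched numerically. Assuming a log-shaped benefit curve (a stand-in assumption for illustration, not the paper’s model), the jump from zero computers to one is large, while the jump from 100 to 101 is tiny:

```python
import math

def benefit(n_computers: int) -> float:
    """Total benefit of owning n computers, under a log-shaped
    benefit curve (an illustrative assumption)."""
    return math.log(1 + n_computers)

# Going from 0 to 1 computer is a big jump...
print(benefit(1) - benefit(0))      # ~0.69
# ...while going from 100 to 101 barely moves the needle.
print(benefit(101) - benefit(100))  # ~0.01
```

The argument in the interview is that compute has grown exponentially, fast enough to keep delivering large total gains even as each marginal unit contributes less.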
Q: What are the effects of Moore’s Law slowing down?
A: The consequences are quite worrying. As computing improved, it powered better weather predictions and the other areas we studied, but it also improved countless other areas we did not measure that are nonetheless critical parts of our economy and society. If that engine of improvement slows down, all of those downstream effects slow down as well.
Some might disagree, arguing that there are many ways to innovate – if one path slows, the others will compensate. To some extent that is true. For example, we are seeing growing interest in designing specialized computer chips as a way to offset the end of Moore’s Law. But the problem is the magnitude of these effects. The gains from Moore’s Law are so great that, in many areas of application, other sources of innovation will not be able to compensate.