Reducing IT’s environmental footprint requires efficient algorithms


This article was written by Neil Thompson, a researcher at MIT's Computer Science and Artificial Intelligence Laboratory and the Initiative on the Digital Economy.

As IT applications become more complex and data sets grow larger, IT's environmental impact intensifies. Historically, this was not much of a problem: rising computational needs were offset by improvements in hardware efficiency, the steady progress colloquially referred to as Moore's Law. But as hardware improvements diminish, another (often invisible) source of efficiency takes center stage: improving algorithms.

Our growing appetite for computing is reflected in the proliferation of data centers, which can span millions of square feet and consume large amounts of electricity. The International Energy Agency estimates that data centers account for about 1% of global electricity consumption and 0.3% of all global CO2 emissions. Without ways to make IT more efficient, this damage will grow as we tackle ever-larger big data challenges in our increasingly sensor-laden world.

In a recent study, Yash Sherry (an MIT Sloan affiliate researcher) and I looked at how quickly algorithms improve and compared that to what has historically been the most important counterweight to the growing appetite for computation: Moore's Law. Driven by the miniaturization of the building blocks of computer hardware, Moore's Law has enabled many decades of vast year-over-year improvements in computing efficiency. Just as increased agricultural productivity has fueled global population growth, so increased hardware productivity has fueled the growth of computing around the world.

But while Moore's Law is the flashy sibling that is always in the news, algorithm improvement is the one quietly working behind the scenes.

Algorithms are the recipes that tell computers what to do and in what order. And while Moore's Law has given us computers capable of performing many more operations per second, improving algorithms has given us better recipes for deciding what to do with each of those operations – and the benefits can be huge. For example, imagine that you are Google Maps and you have to find the shortest routes among 1,000 popular destinations. Calculating this with an older algorithm could easily take a million times more calculations than with a modern one. Another example we documented is text matching, such as when search engines look for keywords in web pages or lawyers search for particular references in legal documents. A better algorithm can easily make such a search 100 times faster, cutting both computing time and power consumption.
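To make the text-matching case concrete, here is a minimal sketch (our own illustration, not code from the study): a naive substring search that can require on the order of n×m character comparisons, versus Knuth-Morris-Pratt (KMP), a classic algorithmic improvement that needs only about n+m. Same hardware, same answer – far fewer operations.

```python
def naive_search(text: str, pattern: str) -> int:
    """Return the index of the first match, or -1. O(n*m) worst case."""
    n, m = len(text), len(pattern)
    for i in range(n - m + 1):
        if text[i:i + m] == pattern:
            return i
    return -1


def kmp_search(text: str, pattern: str) -> int:
    """Return the index of the first match, or -1. O(n + m)."""
    if not pattern:
        return 0
    # Failure table: length of the longest proper prefix of pattern[:i+1]
    # that is also a suffix of it.
    fail = [0] * len(pattern)
    k = 0
    for i in range(1, len(pattern)):
        while k > 0 and pattern[i] != pattern[k]:
            k = fail[k - 1]
        if pattern[i] == pattern[k]:
            k += 1
        fail[i] = k
    # Scan the text once, never re-reading characters already matched.
    k = 0
    for i, ch in enumerate(text):
        while k > 0 and ch != pattern[k]:
            k = fail[k - 1]
        if ch == pattern[k]:
            k += 1
        if k == len(pattern):
            return i - len(pattern) + 1
    return -1


if __name__ == "__main__":
    # Pathological input for the naive scan: long runs of near-matches.
    text = "a" * 100_000 + "b"
    pattern = "a" * 50 + "b"
    assert naive_search(text, pattern) == kmp_search(text, pattern) == 99_950
```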

But while individual examples can be impressive, we wanted a broader view. For this study, we examined 57 textbooks and more than a thousand research articles to find the algorithms that computer scientists consider most important. From these, we extracted 113 "algorithm families" (sets of algorithms that solve the same problem in different ways) that the textbooks had highlighted as most important. For each of the 113, we tracked every time a new algorithm was proposed for that problem, from the 1940s to the present day.

So how do algorithm improvements compare to hardware improvements? For big data problems, 43% of algorithm families had year-over-year improvements that equaled or exceeded the gains from Moore's Law. Of these, 14% had improvements that vastly outpaced those that came from better hardware. These improvements completely transformed what was doable in those areas, making problems tractable in a way that no hardware improvement could. Equally important in our current era of escalating data sizes, the gains from improving algorithms grow larger the bigger the problem being solved.
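As a rough illustration of that last point (our own arithmetic, not a figure from the study), consider the speedup of an O(n log n) algorithm over an O(n²) one, ignoring constant factors: the advantage is about 100x for a thousand items but roughly 33 million–fold for a billion. The bigger the problem, the bigger the algorithmic dividend.

```python
# Sketch: how the speedup of an O(n log n) algorithm over an O(n^2) one
# grows with problem size n (constant factors ignored for illustration).
import math

for n in (10**3, 10**6, 10**9):
    speedup = (n * n) / (n * math.log2(n))
    print(f"n = {n:>13,}: ~{speedup:,.0f}x fewer operations")
```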

State-of-the-art companies and research labs are already responding to the need to invest in better algorithms. The median organization devotes 6-10% of its IT developers to building new algorithms and 11-20% to improving existing ones – a very large investment. Other organizations, accustomed to simply buying new hardware to improve computing, will increasingly have to follow the lead of these algorithmic frontrunners to stay competitive.

The growing importance of algorithms is part of a larger shift in what drives advances in computing. Historically, improvement came mostly from hardware, but with the end of Moore's Law that is changing. Instead, improving algorithms will increasingly come to the fore, providing the engine for tackling new, harder computational problems.

But pushing the boundaries of computing is only one benefit of better algorithms; the other is efficiency. For those in government or academia, or simply anyone concerned about the sustainability of computing, better algorithms are an ideal option: they let us achieve the same results at significantly reduced environmental cost.

Neil Thompson is a researcher at MIT's Computer Science and Artificial Intelligence Laboratory and the Initiative on the Digital Economy. Previously, he was an assistant professor of innovation and strategy at the MIT Sloan School of Management, where he co-directed the Experimental Innovation Lab. Thompson has advised businesses and government on the future of Moore's Law and served on National Academies panels on transformational technologies and scientific reliability.
