A Highly Wasteful Industry

While the industry claims to be concerned about power, it only does so for secondary reasons, and massive levels of waste go unaddressed.


The systems industry as a whole is not concerned about power. I know that is a bold statement, but I believe it to be true. The semiconductor industry is mildly concerned, but only indirectly. It cares about power because thermal issues are limiting the functionality that can be squeezed onto a chip, or into a package.

Some users, such as data center operators, claim to care about power because it impacts the amount of infrastructure and cooling they need, but their words ring somewhat hollow: I have never seen them question the power efficiency of any of the software they run on that hardware. They pass the cost on, and because their competitors do the same, there is no competitive pressure. Their economies of scale still often make them cheaper than in-house data centers.

At best, the power concern for a semiconductor company, beyond what may limit its chip, is relative. If a competitor offered a chip that consumed half the power, customers might be willing to pay a little more for it, or to favor it over a more power-hungry solution. But how much more would they pay? And is it worth the investment? Battery life is a secondary concern to functionality and, in the case of an Apple product, style.

In many of my recent interviews, I have heard implied disgust from an increasing number of people about the magnitude of the power waste. I have written articles trying to highlight that waste, but the only thing most people will talk about is the techniques that are available to reduce power enough that chips don't burn up. They don't go any further than that. Nobody will address the real power waste that is happening. As a simple example, when the screens on my desktop computer go to sleep, why does the GPU keep rendering an image? There needs to be a feedback path to say whether the data being generated is actually being used. So long as the frame buffer is maintained, or can be regenerated in a timely manner, everything else is waste, and the GPU burns through a significant portion of my computer's total power.

Software remains the biggest culprit, because software companies always claim that productivity is the most important thing. In my last set of interviews, one person said that a smartphone's battery would probably last 5X longer if its software were written in an efficient language. Others have said that software engineers will not use tools that analyze performance or power unless they run at, or close to, real-time speeds. Nor are they willing to pay for anything that can provide that. Basically, they have no incentive to improve their software beyond picking appropriate algorithms or concentrating on tight loops. Even then, few seem to get it right, and they give no consideration to efficient data layout or anything like that.

I know from past experience as a software manager in the EDA industry how inefficient even low-level software packages are. While at the time I was only interested in performance, I forbade my engineering team from using about half of the standard C library. Routines like malloc and printf attempt to be so general-purpose that they contain massive amounts of bloat, which can easily be avoided. Engineers had to provide me with evidence as to why they should be given an exception, which was rare. Instead, we invested a small amount of time creating routines that were tailored to our needs and ran many times faster. That would also have resulted in much lower power consumption.

I know other EDA companies that did similar things, but that was 20 years ago, and I am not sure if that is still done today. I doubt it, but please comment if similar things continue to be done.

Outside of our work environments, an increasing number of people say they are concerned about the environment. Those words are also somewhat hollow. Yes, they may buy an electric car, or make some changes, but they are also happy to use things like ChatGPT, which consumes huge amounts of power, or free software such as social media platforms. They never question the true environmental cost of those. Just because the environmental damage is somewhat hidden does not make it acceptable.

We are addicted to free software, and the environment is paying the price. We are being encouraged to use methodologies that map software onto huge farms of machines instead of developing better algorithms. I want power options in software that allow me to turn off unnecessary graphics, or overly fancy interfaces. Give me the cheap and frugal option.

Much of the development in AI is for purposes that do not advance mankind or provide a net benefit. While some question the ethics of AI, I also question if we can afford the environmental impact associated with these massive data models.

Maybe I am becoming jaded in my old age, but I am getting tired of the tech world being two-faced. It is time we truly started to be concerned about power, even if it costs more.


Mark L Schattenburg says:

Right on, brother!
