Reducing energy consumption involves striking a balance between universal and highly specialized architectures.
Attention has increasingly turned to the electricity consumption of electronic systems. At the moment, the discussion focuses on data centers: if their electricity consumption continues to rise at its current rate, it will account for a significant share of global electricity consumption in the future. Yet there are other, less visible electricity consumers whose power needs are also growing steadily. One example is mobile communications, where ongoing network expansion – especially with the current 5G standard and the future 6G standard – is pushing up the number of base stations required. This, too, will drive up electricity demand, since total demand grows linearly with the number of stations, at least as long as the demand per base station is not reduced. Another example is control electronics for household appliances and industrial equipment: more and more such systems are being installed, and their electronics are becoming significantly more powerful. Today they are optimized for performance rather than for power consumption.
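As a rough illustration of this linear relationship, the following sketch computes the annual energy demand for different numbers of base stations at a fixed average power draw per station. All figures are assumptions chosen purely for illustration, not measured data.

```c
#include <stdio.h>

/* Illustrative only: total electricity demand scales linearly with the
 * number of base stations as long as the power draw per station stays
 * constant. The per-station figure below is an assumption. */
int main(void)
{
    const double kw_per_station = 5.0;     /* assumed average draw of one base station */
    const double hours_per_year = 8760.0;

    for (int stations = 10000; stations <= 40000; stations += 10000) {
        /* kWh -> GWh conversion via the factor 1e6 */
        double gwh_per_year = stations * kw_per_station * hours_per_year / 1e6;
        printf("%6d stations -> %6.0f GWh per year\n", stations, gwh_per_year);
    }
    return 0;
}
```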
This state of affairs cannot continue for two reasons: first, the price of electricity will keep rising worldwide; and second, many companies have committed to becoming carbon neutral. That commitment in turn makes electricity even more expensive and restricts the quantity available far more severely. As a result, there will be significant demand for efficient electronics in the coming years, particularly with regard to electricity consumption.
This development is already evident today, especially in power electronics, where new semiconductor materials such as GaN or SiC have made it possible to reduce power consumption. A key driver for the development and introduction of these materials was the electric-car market, since reduced losses in the electronics lead directly to increased vehicle range. In the future, these materials will also find their way into other areas; for instance, they are already beginning to establish themselves in voltage converters across various industries. However, this shift requires more factories and more suppliers for production, and further work is needed to develop circuit concepts suited to these technologies.
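As a back-of-the-envelope sketch of why lower converter losses translate directly into range, the following example compares a conventional silicon inverter with a SiC one. The battery capacity, drive consumption, and efficiency figures are assumptions chosen for illustration only.

```c
#include <stdio.h>

/* Hypothetical illustration (all numbers assumed): how a small efficiency
 * gain in the traction inverter translates into additional driving range. */
int main(void)
{
    const double battery_kwh     = 60.0;   /* assumed usable battery capacity        */
    const double consumption_kwh = 0.17;   /* assumed drive energy per km at the motor */

    const double eff_si  = 0.96;           /* assumed inverter efficiency, Si devices  */
    const double eff_sic = 0.985;          /* assumed inverter efficiency, SiC devices */

    double range_si  = battery_kwh * eff_si  / consumption_kwh;
    double range_sic = battery_kwh * eff_sic / consumption_kwh;

    printf("Range with Si inverter : %.0f km\n", range_si);
    printf("Range with SiC inverter: %.0f km\n", range_sic);
    printf("Gain                   : %.1f %%\n",
           100.0 * (range_sic - range_si) / range_si);
    return 0;
}
```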
In addition to new materials, other concepts for reducing energy consumption are needed. The data center sector will require increasingly better-adapted circuits – ones developed for a specific task that can therefore perform it far more efficiently than universal processors. This means striking the optimum balance between universal architectures, such as microprocessors and graphics cards, and highly specialized architectures suited to only one use case; some products will also fall between these two extremes. The increased energy efficiency is then “purchased” through the effort and expense of developing such highly specialized architectures. The more specialized an architecture is, however, the smaller its market – which means such architectures will only be economically viable if they can be developed efficiently. This calls for new approaches that derive these architectures directly from high-level hardware/software optimization, without the additional implementation steps that are still necessary today. In short, making this possible requires novel concepts and tools that generate circuits directly from a high-level description.
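To make the idea of generating a circuit from a high-level description more concrete, here is a hedged sketch: a fixed-function FIR filter written as plain C. The function name, tap count, and coefficients are hypothetical; the point is that a high-level synthesis flow can map such a description to a dedicated datapath rather than executing it on a universal processor.

```c
#include <stdint.h>

/* Hypothetical sketch: an 8-tap FIR filter as a plain C description.
 * On a general-purpose processor this runs as ordinary software; a
 * high-level synthesis (HLS) flow can instead map the same description
 * to a dedicated datapath (shift register, multipliers, adder tree)
 * that handles only this one task, but with far less energy per sample. */

#define TAPS 8

static const int16_t coeff[TAPS] = { 3, -7, 12, 58, 58, 12, -7, 3 };

int32_t fir_step(int16_t sample)
{
    static int16_t delay[TAPS];   /* becomes a shift register in hardware */
    int32_t acc = 0;

    /* shift in the new sample */
    for (int i = TAPS - 1; i > 0; i--)
        delay[i] = delay[i - 1];
    delay[0] = sample;

    /* multiply-accumulate; an HLS tool can unroll this into TAPS parallel
     * multipliers feeding an adder tree, fixing the architecture to this
     * single use case in exchange for much better energy per operation */
    for (int i = 0; i < TAPS; i++)
        acc += (int32_t)coeff[i] * delay[i];

    return acc;
}
```

The same source can still be compiled and run as software, which is precisely what makes a high-level starting point attractive: one description, two very different implementation targets.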