The next big breakthroughs in performance will come from completely different approaches.
For the past half-century, chipmakers have followed the same roadmap for improving chip performance and reducing chip cost. That approach has proven tremendously effective, packing more computing into less space at lower cost and allowing people to carry around in their pocket what used to be a multi-million-dollar mainframe.
That approach is beginning to lose momentum. It’s getting too expensive to continue shrinking features, and the benefits from scaling are diminishing. While there are probably another couple of nodes where this will remain viable, those power/performance improvements most likely will have to be paired with some sort of advanced packaging. Samsung, TSMC, GlobalFoundries, UMC and Intel all offer advanced packaging options today. So do all of the major packaging houses.
Collectively, these approaches should provide sufficient improvements in performance, power, and ultimately cost through the next decade. That’s an excellent prognosis for an industry that has been struggling node-to-node ever since 28nm. But the big changes that lie ahead are less about shrinking boxes with a screen, and more about distributing processing everywhere and making it more mobile. This includes neuromorphic approaches in cars, smarter sensors, and flexible hybrid electronics and other new form factors.
Along with this will be a slew of new materials. In some cases, those materials could be clothing fabric or what today looks like a Band-Aid. In others, they could be standard chips with man-made materials that allow electrons to pass through more quickly. But this also can be taken much further. There is research underway at the U.S. Naval Research Laboratory to combine standard electronics with magnetoelectronics, creating single-spin devices that control the behavior of electrons and enable precise band engineering for logic and memory. That would make materials such as graphene much more useful, and it would have a profound effect on where and how processing can be done.
Finally, there is a massive amount of work underway in quantum computing. At least part of this is driven by sheer panic that quantum computing will be able to crack some very complex ciphers much more quickly than is possible today, and no one wants to be left behind in the increasingly complex world of cyberwarfare. But some of it also is a recognition that the next big gains in computing will happen where all noise can be controlled and electrons can move freely, namely in isolation at temperatures close to absolute zero. This certainly includes qubits, but it also includes memory such as DRAM, which is superconductive at those temperatures. And if researchers continue to increase the lifespan of qubits, most likely with spin engineering, there may be a play for photonics in there, as well, which also is the subject of ongoing research.
Put in perspective, the end of device scaling as we know it is just the beginning of significant progress in other approaches to electronics. But the fundamental building blocks that will be required to make this work all need to be developed. This is almost like hitting the reset button on semiconductors. What worked before won’t necessarily work in the future, and everyone needs to go back to the drawing board to figure out the best tools, equipment, and methodologies to achieve the same kinds of economies of scale and reliability that will guide this industry through the next half century.