Consolidation across the semiconductor industry could have a big impact on future choices.
The rule of thumb in business is that consolidation in a maturing industry improves the health of the surviving companies. In most market sectors that’s true. In the semiconductor industry, that formula doesn’t work.
The reason comes down to what might be called foundational economics. It may be possible to keep cutting the cost of making chips for years to come, but at some point the basic building block of all electronics, the transistor, reaches a level of complexity at which researching, designing and developing chips at the pace necessary for progress is no longer profitable.
This is different in the chip industry than in any other, because the costs of developing new ideas are so enormous that no single company can afford to do it all alone: not Intel, not Samsung, and not TSMC. The problems being solved are extraordinarily hard, and they consume unpredictable sums of money because not all of the research pans out, and very little of it pans out on an expected time frame. Multipatterning would never have been considered if EUV had been available. 2.5D and 3D architectures would never have been seriously contemplated if not for the difficulty of pushing electrons through thin wires surrounded by even thinner oxides, all competing for limited resources on a chip. And that doesn't even begin to address quantum research and quantum effects, which add uncertainty about how quickly electrons can move through various materials.
The equipment side is a whole different kind of problem. Fully equipping an advanced fab may take $10 billion to $15 billion. Developing that equipment takes many billions of dollars more in investment.
Slow down research in any part of this ecosystem, which has been finely tuned over five decades, and there will be huge repercussions well beyond the semiconductor industry. The chip, despite how little attention it gets from investors, is the cornerstone of all electronics. It's only possible to keep cutting costs in other industries, or alternatively to keep adding features and functions, because of the massive investment in chips. That is the engine driving progress in smartphones, cars, industrial automation, televisions and, in the future, the IoT. And while it's always possible to improve on what's already there, a reduction in semiconductor R&D ripples out in all directions.
This isn't just about rising per-node costs, as defined by Moore's Law. It's about industry-wide consolidation eroding the basis for R&D. Mega acquisitions don't result in combined R&D budgets. In marketing speak, they create efficiency gains. In plain English, companies slash costs wherever possible to make acquisition costs more palatable to shareholders. The problem is that semiconductor R&D budgets have to grow to keep pace with increasingly difficult physics problems. If they don't, over the next few years there will be a falloff in advanced research and nothing to take its place.
It's possible to squeak by for a generation or two of hardware (think of a generation as about six months) with software upgrades and architectural tweaks. There is so much untapped performance in today's devices that simply using the processing capabilities already built into semiconductors more efficiently will yield gains. But push that out a year or two, and even the best code won't work well enough to satisfy new markets. At that point, even the largest system vendors will begin breaking into a sweat over what makes their products different or new.
Consolidation may be good for some things: beefing up the capabilities and market reach of companies, adding expertise where needed, and providing enough mass to attack nascent markets such as the IoT/IoE. But it also can be very bad if it undermines R&D in the foundational technology on which all of this is built. That creates a disruption that no single company, or even a single country, can afford to fix.