Change is no longer confined to the chip; it now has a direct bearing on your skill set.
Complexity isn’t always bad. The key is being able to deal with it intelligently and economically.
Systems engineers are in the middle of one of the most complex periods in chip development. In the past, they typically had to deal with one problem at a time. At 1 micron, the wall was lithography. At 90 nanometers, it was low-k dielectric insulation (and yes, copper interconnects and 300mm wafers). At 65nm it was new gate structures, high-k insulators and strained silicon.
But at 45nm, 32nm and 22nm, not to mention all the half nodes in between, it’s not just a single thing. It’s all of the above, plus potentially new substrates, power islands, power gating and stepping, more lithography, more IP and potentially even more problems such as high defect density. And they’re stacking up in an ever-larger pile at each process node.
This is going to force the hand of companies that have been relying on home-grown tools because they’re cheaper, and it’s going to test the faith of systems engineers who have to design and verify these incredibly complex devices. They can’t possibly understand every facet of development anymore, so now they have to work within safe zones—boxes they create in which they know pieces work properly.
For engineers who have been developing chips for more than a decade, this is an unnerving development. It’s a matter of plugging data into a black box and waiting for the answer to come out the other side. On one hand, it’s hard to meet tape-out schedules without using this kind of technology. On the other, it’s hard not to feel like your skills are being marginalized along the way. Even analog engineers, who considered themselves to be the last holdouts, are heading down this path.
IBM began working with this approach publicly in the early part of this decade when it began offering commercial foundry services and promising first-pass silicon. Since then, most companies have been experimenting with it in one form or another, whether it’s TLM 2.0, IP-XACT or the new low-power IEEE 1801 standard.
Complexity is forcing more compliance, and that isn’t necessarily all bad—as long as engineers can figure out a way to stay current and add new value. But the days when an engineer could learn a skill and continue applying it over several decades are over. At 90nm, we passed from expertise as a fixed asset to expertise as a relative asset—one that needs constant refreshment. At 32nm, engineers will need a smattering of knowledge in physics, and at 22nm they will need to understand subatomic physics, chemistry and, increasingly, business.
For those who can adapt and create value for their companies, complexity will serve them well. For those who can’t—well, it’s better to at least believe you can.
–Ed Sperling