Rapid technology changes and uncertain interactions create huge challenges.
AI and 5G bode well for the semiconductor industry. They will require many billions of new, semi-customized and highly complex chips from the edge all the way to the data center, and they will require massive amounts of engineering time and tooling. But these technologies also are raising lots of questions on the design and verification front about what else can be automated and how to do it.
Existing tools can handle individual interactions between blocks of memory and logic. They can optimize layout for data flow and identify parasitics that can cause reliability problems. They can even simulate thermal maps of chips under a variety of conditions, not all of which are obvious to design engineers.
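To make the thermal point concrete, here is a toy sketch, purely illustrative and not based on any commercial tool, of how a power-density map translates into a temperature gradient across a die. It relaxes a simple 2-D steady-state model with a few Jacobi iterations; real thermal signoff engines solve far more detailed physics, and every number here is invented.

```python
import numpy as np

# Relative power density per tile of a hypothetical 32x32 floorplan grid.
power = np.zeros((32, 32))
power[8:12, 8:12] = 5.0      # a hot logic block
power[20:24, 18:22] = 3.0    # a second, milder hotspot

# Relax toward a smooth steady-state temperature field (boundary held at 0).
temp = np.zeros_like(power)
for _ in range(500):
    temp[1:-1, 1:-1] = 0.25 * (temp[:-2, 1:-1] + temp[2:, 1:-1] +
                               temp[1:-1, :-2] + temp[1:-1, 2:]) + 0.1 * power[1:-1, 1:-1]

print(f"peak relative temperature:    {temp.max():.2f}")
print(f"average relative temperature: {temp.mean():.2f}")
```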
However, there is a whole set of new challenges that require a higher level of abstraction. AI and 5G chips are all about system-level data flow, including storage and acceleration both on-chip and off-chip, and those are very large data paths to simulate and verify even with all the resources in the cloud.
There are two main problems that are beginning to show up. First, most tools require an additional level of abstraction, preferably one that is open to multiple vendors’ tools with a standardized integration scheme. In effect, what is required is a dashboard that can be used to track different options before a device is built—basically what amounts to pathfinding for heterogeneous architectures—allowing design teams to zoom in or out on different scenarios.
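As a rough illustration of what that kind of pathfinding looks like, the sketch below scores a few hypothetical heterogeneous-architecture candidates against power, performance and area estimates. Every name, weight and number is invented for illustration; a real dashboard would pull these estimates from multiple vendors' tools through whatever standardized integration scheme eventually emerges.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    tops: float        # estimated throughput (TOPS)
    watts: float       # estimated power (W)
    area_mm2: float    # estimated die/package area (mm^2)

def score(c: Candidate, w_perf=1.0, w_power=0.5, w_area=0.2) -> float:
    """Higher is better: reward throughput, penalize power and area."""
    return w_perf * c.tops - w_power * c.watts - w_area * c.area_mm2

candidates = [
    Candidate("monolithic_7nm", tops=100, watts=75, area_mm2=400),
    Candidate("chiplet_logic_plus_hbm", tops=120, watts=90, area_mm2=550),
    Candidate("edge_accelerator_array", tops=40, watts=15, area_mm2=120),
]

# Rank the candidate architectures before anything is built.
for c in sorted(candidates, key=score, reverse=True):
    print(f"{c.name:28s} score={score(c):7.1f}")
```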
The chip industry historically has been resistant to raising abstraction levels. That has changed recently, as evidenced by the rapid adoption of high-level synthesis (HLS). HLS had been touted as a highly useful tool for nearly two decades before it finally caught on, in part because new chips aren’t just new versions of older chips with smaller features, and in part because some of the companies developing these chips have no pre-existing design infrastructure or biases.
Adding fuel to this demand is the fact that the power/performance benefits of shrinking features are dwindling, and there is a raft of new options around which there is little industry experience. This includes heterogeneous architectures with thousands of small processors and memories, various types of advanced packaging, and neuromorphic designs that balance throughput and processing power against variables like precision or more customized memories and compute elements.
To make matters worse, all of this has to be done faster than ever before. Competition in AI and 5G is explosive, fueled by a massive infusion of venture capital reminiscent of the dot-com days. The difference this time is that investors are chasing real opportunities in automotive, medical, industrial and consumer markets that have money to spend, but each of those markets requires highly customized solutions rather than everyone competing for one or two designs. And unlike the dot-com era, when investors funded innovative but often unsustainable or badly crafted business models, these are proven markets with real business opportunities.
The second problem is that 5G and AI add their own unique challenges. In the case of 5G, there is no obvious way yet to test and verify that these devices will work as expected. Some of this technology is still in the research phase, particularly on the millimeter-wave front. So while sub-6 GHz versions will add some needed performance improvements, beyond that it’s a dotted line between PowerPoint and silicon. No one is even sure how well or how far signals will carry in the real world, where atmospheric conditions can affect range and speed, and beam-forming could sharply reduce battery life.
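Part of that uncertainty is simple physics. The Friis free-space path loss formula already shows how much harder millimeter-wave links are before atmospheric absorption, rain fade or blockage is even modeled. The short sketch below compares a sub-6 GHz carrier with two mmWave carriers over the same distance; the frequencies and distance are illustrative only.

```python
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB: 20 * log10(4 * pi * d * f / c)."""
    c = 3.0e8  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

# Compare a sub-6 GHz band with two millimeter-wave bands over a 200 m link.
for freq_ghz in (3.5, 28.0, 60.0):
    loss = fspl_db(200.0, freq_ghz * 1e9)
    print(f"{freq_ghz:5.1f} GHz over 200 m: {loss:5.1f} dB free-space loss")
```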
On the AI front, a device is supposed to adapt to its use models within a given distribution. But how to verify it will stay within that distribution, or perform within acceptable specs in all conditions, isn’t clear at this point. Building chips for acceptable behaviors is a very gray area to begin with. Setting parameters for tools to design chips within those distributions will require looking at designs from a much wider perspective.
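One way to make "staying within the distribution" concrete, purely as an illustration and not a description of any production verification flow, is a runtime check that flags inputs straying too far from the data a device was characterized on. The sketch below uses a crude per-feature z-score threshold on made-up data; real approaches would need far richer statistics and coverage.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Characterization" data: the distribution the device was designed and tested for.
train = rng.normal(loc=0.0, scale=1.0, size=(10_000, 8))
mean, std = train.mean(axis=0), train.std(axis=0)

def out_of_distribution(x: np.ndarray, threshold: float = 4.0) -> bool:
    """True if any feature of x is more than `threshold` standard deviations
    from the characterized mean, a crude proxy for 'outside acceptable specs'."""
    z = np.abs((x - mean) / std)
    return bool(np.any(z > threshold))

in_spec = rng.normal(0.0, 1.0, size=8)   # looks like characterization data
drifted = rng.normal(6.0, 1.0, size=8)   # shifted field conditions
print(out_of_distribution(in_spec))      # expected: False
print(out_of_distribution(drifted))      # expected: True
```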
All of these are big, system-of-systems types of issues, and the reach of the tools will have to extend much further than in the past in order to be effective. This is a huge opportunity for EDA companies, and most of them are well aware of it. So far, they just aren’t saying much about it.
Related Stories
Pushing AI Into The Mainstream
Why data scrubbing and social issues could limit the speed of adoption and the usefulness of this technology.
Debug Tops Verification Tasks
Time spent in debug on the rise; what needs to be fixed and why.
5G Test And Deployment
How test will change as next-gen wireless evolves toward higher frequencies.