Hyperscale computing and a focus on the interaction between domains are pushing math-based analysis to the forefront.
You may have seen the term “computational software” more often recently. What are some prominent examples? Why do we in the electronic design automation (EDA) industry have to deal with math in the first place? Wasn’t chip design all about drawing polygons at one point? I’m glad you asked!
Computational software supports and manages the complexity of fundamental industry trends—hyperscale computing, 5G, artificial intelligence, IIoT and autonomous driving—and transforms electronics across several critical verticals—consumer, compute, mobile, networking, automotive, aerospace/defense, industrial and health. It’s all about math, as I outlined in my blog post “Math and Electronic Design Automation” back in April.
To be more specific, let’s look at some of the computational software algorithms used in EDA today, as outlined in this figure:
The figure illustrates how to refine a system of systems into smaller systems that contain printed circuit boards (PCBs) and systems on chips (SoCs) in packages. The SoCs themselves are often complex combinations of hardware and software that must be verified and then implemented as digital or custom/analog designs. IP reuse plays a big role, and development teams need to bring up software as early as possible. Some computational software algorithms that are widely used today are indicated as well.
For instance, finite element analysis (FEA) is widely used for structural analysis, calculation of heat transfer, fluid flow and mass transport, and for electromagnetic analysis, which is needed extensively when assessing silicon-in-package effects. It all comes down to complex math that most of us would like to forget, like partial differential equations in two or three space variables. FEA solves them numerically by subdividing the larger system into smaller finite elements, discretizing space via a mesh of the object to be analyzed. This boils the problem down to a system of algebraic equations, from which FEA approximates the unknown function. Math galore! (To learn more, GraspEngineering’s “Practical Introduction and Basics of Finite Element Analysis” provides a great overview in 55 minutes well spent.)
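To make that recipe concrete, here is a minimal sketch in Python of FEA in the simplest possible setting: a one-dimensional Poisson problem with linear elements on a uniform mesh. The problem, mesh size and load function are purely illustrative assumptions, not how any production EDA solver works.

```python
# Minimal 1-D finite element sketch (illustrative only): solve
# -u''(x) = f(x) on [0, 1] with u(0) = u(1) = 0.
# The domain is meshed into small elements, each contributing a local
# stiffness matrix; assembling them yields a system of algebraic equations.
import numpy as np

def solve_poisson_1d(f, n_elements=100):
    """Approximate u with piecewise-linear elements on a uniform mesh."""
    n_nodes = n_elements + 1
    x = np.linspace(0.0, 1.0, n_nodes)
    h = x[1] - x[0]                       # element size (mesh spacing)

    K = np.zeros((n_nodes, n_nodes))      # global stiffness matrix
    b = np.zeros(n_nodes)                 # load vector

    # Assemble element contributions: each element couples two nodes.
    k_local = (1.0 / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])
    for e in range(n_elements):
        nodes = [e, e + 1]
        K[np.ix_(nodes, nodes)] += k_local
        # Simple lumped integration of the load over the element.
        b[nodes] += 0.5 * h * f(x[nodes])

    # Apply the boundary conditions u(0) = u(1) = 0 by reducing the system.
    interior = slice(1, -1)
    u = np.zeros(n_nodes)
    u[interior] = np.linalg.solve(K[interior, interior], b[interior])
    return x, u

# Example: f(x) = pi^2 * sin(pi*x), whose exact solution is u(x) = sin(pi*x).
x, u = solve_poisson_1d(lambda x: np.pi**2 * np.sin(np.pi * x))
print(np.max(np.abs(u - np.sin(np.pi * x))))   # small discretization error
```

Even in this toy form, the pattern is the one real solvers scale up: mesh the object, assemble local element contributions into a global system of equations, and solve for the unknown field.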
Other examples of computational software—as indicated in the figure—are the various types of solvers used in formal verification, as well as advanced algorithms for constraint solving as used in verification. And then there are, of course, all the compile optimizations for mapping into different execution fabrics like FPGA-based prototyping and emulation using specialized silicon. Plus, technologies like SPICE predict the timing, frequency, voltage, current or power at the transistor level.
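For a flavor of what “predicting voltage and current” boils down to, here is a toy DC nodal analysis in Python. It is a drastic simplification of what SPICE actually does (no nonlinear devices, no transient analysis), and the circuit topology and values are made up for illustration.

```python
# Toy DC nodal analysis in the spirit of SPICE (heavily simplified):
# a 5 V source through 1 kOhm into node 1, 2 kOhm from node 1 to node 2,
# and 1 kOhm from node 2 to ground.
import numpy as np

R1, R2, R3 = 1e3, 2e3, 1e3
V_src = 5.0

# Model the source + series resistor as a Norton current injection into node 1.
# Nodal conductance matrix G: diagonal = sum of conductances at a node,
# off-diagonal = minus the conductance between the two nodes.
G = np.array([
    [1/R1 + 1/R2, -1/R2],
    [-1/R2,        1/R2 + 1/R3],
])
I = np.array([V_src / R1, 0.0])

v = np.linalg.solve(G, I)   # node voltages, from G v = I
print(f"V(node1) = {v[0]:.3f} V, V(node2) = {v[1]:.3f} V")  # 3.750 V, 1.250 V
```

Real circuit simulators build far larger versions of this conductance matrix, add device models and iterate over time and operating points, but the core remains linear algebra.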
While EDA has used very detailed math-based analysis and simulation pretty much since its inception—SPICE was originally developed in 1972—several converging trends are now leading to a renewed emphasis on computational software.
First, with the advent of cloud-based computing and its promise of scalable compute capacity, the industry has found many more innovative ways to parallelize the problems. These approaches allow developers to maintain accuracy while dealing with much more complex problem sets, applying technologies like FEA to full phones and server racks.
Second, the interaction between the different domains has become much more complex. Developers need to know, as early as possible, what the electromagnetic, power and thermal effects will be. They want to combine the electrical and mechanical aspects of their simulations as early as possible, because decisions made early may incur a much higher cost later on, when they are harder to correct. The trends towards digital twins, “shifting left” software development and continuous integration from the earliest point in time are all outgrowths of this dynamic.
Third, with the abundance of compute available, more developers are exposed to more computations. For instance, when I myself started as a designer, I did not have to use SPICE for my first design. Other developers had already done that. It was the era of gate arrays when we used simulation at the gate level and drew the metal layer manually using Cadence’s “edge” tooling. All the underlying circuit simulation had already been done, and it would have been impossible from a compute perspective to do it all for every project. With today’s scalable compute in the era of hyperscale, one can simply get more math done.
Lastly, there are business aspects at play here, too. According to CIMdata, EDA made up about 20.7% of the roughly $48B Product Lifecycle Management (PLM) market in 2018. Simulation and analysis was about 13.5%, and the mechanical CAD-related market was about 14.8%. With the different domains growing so much closer together, there are a lot more shades of grey in the world of simulation, with players overlapping and markets growing closer together. Simply put, the sizes of the fish tanks for the various players are growing.
As we are now in the era of hyperscale computing, domain-specific architectures and domain-specific languages, math is cooler than ever, and computational software is the foundation on which previously disparate industries are growing closer together.