The Future Is Bright: DARPA Is Driving Electronic Resurgence

Revisiting old ideas leads to new paths forward as changes in technology, architecture, and application specificity converge.


This week, DARPA ran the Electronics Resurgence Initiative (ERI) Summit in San Francisco, and while we are certainly staring at some daunting challenges to sustaining the fast pace of development in electronics, the future actually looks quite bright. I found myself whistling “How lucky we are to be alive right now” from Lin-Manuel Miranda’s Hamilton when leaving the stunning Palace of Fine Arts. The future is bright. Lots of innovation lies ahead and we are living in exciting, transformative times.

I had been invited to present and participate in the “What’s Next” technical brainstorming at the hardware emulation workshop held prior to the actual ERI Summit. We had fascinating discussions moderated by DARPA’s program manager, Andreas Olofsson. The workshop opened with Waymo, Xilinx, and NXP giving the automotive perspective, followed by Boeing, Lockheed Martin, and Raytheon talking about emulation challenges in the aerospace domain. National Instruments, Synopsys, Mentor, and yours truly for Cadence added the vendor perspective.


Systems of Systems Challenge (Source: Bigstock)

The questions that DARPA posed were intriguing and triggered a lively discussion on emulation and general development of “Systems of Systems”:

  • Is real-time emulation possible? The question really became what the use case for real-time emulation would be. We ended up “bucketing” emulation and prototyping together. Airplanes with emulators onboard to test how different control algorithms would work in real time generally did not sound like a great idea. There are, however, applications and cases for which the alternative—virtualization—would not reflect the appropriate real-world effects.
  • Is emulation in the field possible? Situations like car manufacturers driving around real prototypes of new ECUs came to mind and were discussed. The question here very quickly becomes one of balancing the fidelity of the prototype against speed and cost. Digital twinning also comes to mind: collect data in the real world and replay it against a digital twin of an airplane or car (a minimal sketch of this idea follows the list below). This is promising, but as soon as actuators are in the loop, it may break down.
  • Are standard interfaces required? Andreas had put up a slide showing a software stack for self-driving cars. Generally, the answer was yes, but the discussion quickly drifted to the scope of what the interfaces should cover. Of course, interface standards like CAN, Ethernet AVB, and others are needed; software standards like Autosar help re-shape the design chain and the dependencies between suppliers and OEMs. Depending on the application domain and the scope to be covered, broader standards make sense as well, like the Apollo software stack sponsored by Baidu.
  • Is virtual qualification possible? This was a topic that Boeing had presented earlier in the session: using emulation to re-apply data to a part that had to be re-developed because of obsolescence. Interesting future aspects going beyond pure electronics will be of interest here: does the part still work at all the angles and across all the acceleration specs that were defined? Users can re-apply the same tests that were used for qualification years ago and perhaps qualify the new part more easily.

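To make the digital-twinning point above a little more concrete, here is a minimal sketch of the replay idea. Everything in it is hypothetical: the signal names, the toy vehicle model, and the numbers simply stand in for whatever real telemetry and plant model a car or airplane program would actually use.

    # Illustrative only: replay recorded sensor data against a toy "digital twin" model.
    recorded_speed = [0.0, 2.1, 4.0, 6.2, 8.1]   # m/s, as captured from the real vehicle
    throttle_log   = [0.2, 0.2, 0.3, 0.3, 0.3]   # throttle command at each sample

    def twin_step(speed, throttle, dt=1.0, gain=10.0, drag=0.05):
        """Crude first-order vehicle model: acceleration from throttle minus drag."""
        return speed + dt * (gain * throttle - drag * speed)

    # Drive the twin with the recorded commands and compare against reality.
    predicted = [recorded_speed[0]]
    for throttle in throttle_log[:-1]:
        predicted.append(twin_step(predicted[-1], throttle))

    for real, pred in zip(recorded_speed, predicted):
        print(f"real={real:5.2f}  twin={pred:5.2f}  residual={real - pred:+.2f}")

Large residuals flag exactly where the twin, or the actuator path it cannot capture, needs refinement, which is where the approach discussed above starts to break down.
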
The bottom line is that emulation has a fascinating future ahead in the space of “systems of systems.” For example, the complexity of cars is quite daunting. We also discussed how to grow beyond the traditional space of digital electronics to representations of the analog and electromechanical aspects.

The last time I had been in the Palace of Fine Arts building was when it still housed the Exploratorium. Fun memories. While I had originally considered leaving after the workshop, the quality of the keynotes and presentations simply blew me away and I ended up staying. Professor John Hennessy, Chairman of Alphabet, delivered a keynote on “The New Golden Age.” He described how we arrived at the end of an era that had delivered 40 years of stunning progress: today’s microprocessors deliver roughly 10^6 times the throughput of their predecessors from four decades ago. Add to that three major architectural innovations in processors: going from 8-bit to 64-bit word widths, introducing instruction-level parallelism, and delivering multicore capabilities now up to 32 cores. And finally, add in clock rates moving from 3MHz to 4GHz. All of this was made possible by semiconductor technology, driven by Moore’s Law and Dennard Scaling.
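
As a quick back-of-the-envelope check on those numbers (my own arithmetic, not a slide from the talk), the compound annual improvement implied by a 10^6 gain over 40 years, and the raw clock-rate gain, work out as follows:

    # What annual improvement compounds to ~10^6 over 40 years?
    years = 40
    total_gain = 1e6
    annual_rate = total_gain ** (1.0 / years) - 1
    print(f"Implied annual improvement: {annual_rate:.1%}")   # roughly 41% per year

    # The clock-rate portion alone: 3 MHz -> 4 GHz
    clock_gain = 4e9 / 3e6
    print(f"Clock-rate gain: about {clock_gain:,.0f}x")       # roughly 1,333x

In other words, clock frequency alone accounts for only a small slice of the overall gain; the rest came from the architectural innovations and the transistor budgets described above.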

Today, as the three changes in technology, architecture, and application specificity converge, we are really re-thinking what Hennessy calls “the same old ideas.” With the emergence of domain-specific architectures (DSAs), hardware becomes “sexy again.” If we now combine DSAs with domain-specific languages (DSLs), then portability and performance can co-evolve. In addition, Hennessy sees plenty of opportunity and potential still in silicon, with new methods for efficient energy scaling, in packaging, and in technologies such as carbon nanotubes and quantum computing. Especially with regard to DSAs and DSLs, Hennessy closed with the comment, “Everything old is new again,” referencing the development of the ILLIAC IV back in 1975 and a quote from Dave Kuck on how applications and hardware architecture didn’t really match that well. And that was over 40 years ago!
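
To make the DSA/DSL idea a bit more tangible, here is a small sketch of my own (not an example from the keynote): the same matrix multiplication written once as an explicit loop nest, where the execution order is baked into the source, and once as a single domain-level operation, where the mapping is left to the library or compiler and can therefore be retargeted to vector units, GPUs, or other domain-specific accelerators.

    import numpy as np

    # Explicit loop nest: the schedule (loop order, tiling, parallelism) is fixed in the source.
    def matmul_loops(a, b):
        m, k = a.shape
        _, n = b.shape
        c = np.zeros((m, n))
        for i in range(m):
            for j in range(n):
                for p in range(k):
                    c[i, j] += a[i, p] * b[p, j]
        return c

    # Domain-level form: one operation, schedule left to the implementation,
    # so the same source can be mapped to whatever hardware sits underneath.
    def matmul_domain(a, b):
        return a @ b

    a = np.random.rand(32, 32)
    b = np.random.rand(32, 32)
    assert np.allclose(matmul_loops(a, b), matmul_domain(a, b))

This is only an analogy at the language level, of course, but it captures why pairing DSLs with DSAs lets performance and portability improve together rather than trade off against each other.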

Looks like revisiting some older ideas will indeed lead us to a brighter future…


