Not All Software Is Like Elvis
Only some of it leaves the building.
January is traditionally my look-back and outlook month. Five years ago, my year-end wish was a census of software developers, and it is fascinating how software in the context of verification has evolved since then (more on this below). Also, most years I go into my garage and dust off my collection of IEEE Spectrum print editions from January five, ten, and fifteen years back to assess which of the predictions have withstood the test of time. Well, after I remodeled my garage into a gym and hobby room, I am now lucky that most of what I need is archived and digitized!
Looking back five years to IEEE Spectrum’s outlook issue from January 2012, the choice of top technologies included 3D chips as a means to extend Moore’s Law. This January 2017, the IEEE Spectrum outlook section also discusses the future of Moore’s Law, both in “Intel Finds Moore’s Law’s Next Step at 10 Nanometers” and in “The Ising on the Computer Chip.” It is fascinating to see how changes to known trends do not happen overnight. In the former article, IEEE Spectrum Associate Editor Rachel Courtland says:
“Intel says transistors […] will be cheaper than those that came before, continuing the decades-long trend at the heart of Moore’s Law—and contradicting widespread talk that transistor-production costs have already sunk as low as they will go.”
And it is equally fascinating to see the optical-computing milestone in the latter article, which deals with HPE’s 1,000-component optical processor built for challenges like the “traveling salesman problem,” truly exploring a future after Moore’s Law.
Another striking similarity between the 2012 and 2017 issues of IEEE Spectrum is the outlook on telecommunications as a driver for electronics. In the December 2011 issue, “Fantastic 4G” looked at how hundreds of telecoms would invest in 4G LTE networks, while “Here Comes 5G—Whatever That Is” deals with the next generation this year in 2017, discussing how Verizon and AT&T prepare to bring 5G to market by 2020. 5G has the potential to increase speeds 100x to 10 Gbps, reduce latency by 30-50x, and increase connection density 100x to 1 million connections per square kilometer. It will be a huge driver for electronics in the next couple of years, not even counting all the new applications it enables.
Adding other key drivers like autonomous cars, deeper and cheaper machine learning, and augmented reality into the picture, the future for electronics certainly looks bright! So will it play out in hardware, in software, or in both?
Looking back to what I wrote five years ago, I was concerned about software developers and their numbers. In “All I Want for Christmas … Is a Census of Software Developers!”, I mused about data I had seen from Jim Hogan (the inverse triangles) and a visualization I had built from numbers provided by Handel Jones of IBS.
A lot has happened since then:
- The number of developers who write software that “leaves the building” together with the hardware has grown even further. This is the software that Jim Ready, the founder of MontaVista, always referred to as “Elvis software,” for obvious reasons. It defines the product functionality, runs the user interfaces, and determines the user experience. Since 2012, the mix of engineers on a typical project has tipped even further towards software.
- Intricate knowledge of the hardware/software interaction has become a key skill, almost a new job description. Systems and systems on chips (SoCs) have grown so complex that no single person can understand all of the elements anymore. The person who understands the hardware/software interface is becoming the moderator between development teams and can also assume a central role in optimizing the design characteristics. For instance, in the server space, optimization for specific workloads has become critical, and detailed knowledge of the hardware/software interface is crucial to achieving it.
- There are millions of lines of code that actually never leave the building; they are used purely for testing and verification. For example, bare metal software is not only the software that serves as an abstraction layer underneath the actual “Elvis software.” There is also bare metal software written as automated tests that stress whether the hardware actually operates correctly at the SoC level; a minimal sketch of such a test follows this list. (Incidentally, my colleague Tom Anderson just wrote a great blog called “Bare Metal Tests and Hardware-Software Co-Verification.”)
- The actual operating system (OS) is usually visible to the users. However, with real-time Linux used as a test vehicle, and with tools like Kozio’s VTOS, there are definitely OSs out there that the user never sees and that are geared solely towards verification of the hardware/software interaction. At the highest level, this can be a set of tools following the Portable Stimulus working group’s suggestions, such as Cadence Perspec, or it can be a set of middleware verification routines, such as the OpenGL tests described by NVIDIA that make sure your graphics subsystem works correctly.
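To make the bare metal test idea concrete, here is a minimal sketch of the kind of walking-ones memory test that runs on an SoC long before any user-visible OS exists. Everything here is hypothetical for illustration: the SRAM base address, the region size, and the function name. On real silicon (or on an emulator), such a routine would be entered from boot code or a test harness rather than from a hosted main():

```cpp
#include <cstdint>

// Hypothetical physical base address of an on-chip SRAM under test.
static volatile std::uint32_t* const SRAM =
    reinterpret_cast<volatile std::uint32_t*>(0x20000000u);
static constexpr unsigned WORDS = 1024;  // check the first 4KB

// Walking-ones test: returns 0 on success, or 1 + index of the first
// failing word so the harness can report where the access misbehaved.
extern "C" unsigned sram_walking_ones_test() {
  for (unsigned i = 0; i < WORDS; ++i) {
    for (std::uint32_t bit = 1; bit != 0; bit <<= 1) {
      SRAM[i] = bit;         // drive exactly one data bit high
      if (SRAM[i] != bit)    // read back through the bus fabric
        return i + 1;        // stuck-at or shorted data line
    }
  }
  return 0;                  // every bit of every word toggled cleanly
}
```

Tests like this stress the whole path from the processor through the interconnect to the memory itself, which is exactly the hardware/software interaction the design needs to get right.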
What’s so exciting for verification in this context is that this software can run on different engines executing the hardware, as well as on different hybrid combinations:
- Pure TLM-based virtual platforms are geared towards a functional representation of the hardware for early software development (see the TLM sketch after this list).
- When RTL simulation is connected to TLM-based virtual platforms with processor models like the Fast Models from ARM, the resulting hybrid is geared towards lower-level driver verification, because the fast processor models keep overall execution speed acceptable while the RTL provides the accuracy.
- Emulation is used by itself for various types of software development. Emulation hybrids with virtual platforms – like the ones in which Palladium emulation is connected to Fast Models from ARM (see NVIDIA) – allow users to get to the point of interest even faster and enable early OS bring-up and software-driven verification.
- FPGA-based prototyping (like the Cadence Protium Platform) is geared towards software development that requires full accuracy after RTL has become stable, while reaching higher speeds than emulation.
- Similarly, development kits based on early chip samples are a key vehicle to bring up software at real hardware speed.
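To illustrate the first engine in the list above, here is a minimal sketch of a loosely-timed TLM-2.0 initiator/target pair, the style of model virtual platforms are assembled from. It assumes the Accellera SystemC/TLM library; the module names and the four-register target are hypothetical, and a real virtual platform would use a full processor model (such as a Fast Model) in place of the toy initiator:

```cpp
#include <cstdint>
#include <systemc>
#include <tlm>
#include <tlm_utils/simple_initiator_socket.h>
#include <tlm_utils/simple_target_socket.h>

using namespace sc_core;
using std::uint32_t;

// Target: a tiny memory-mapped register block standing in for a peripheral.
struct RegBlock : sc_module {
  tlm_utils::simple_target_socket<RegBlock> socket;
  uint32_t regs[4] = {0, 0, 0, 0};

  SC_CTOR(RegBlock) : socket("socket") {
    socket.register_b_transport(this, &RegBlock::b_transport);
  }

  // Purely functional: no pins or clocks, just a timed transaction.
  void b_transport(tlm::tlm_generic_payload& trans, sc_time& delay) {
    unsigned idx = (trans.get_address() / 4) % 4;  // toy address decode
    uint32_t* data = reinterpret_cast<uint32_t*>(trans.get_data_ptr());
    if (trans.is_write()) regs[idx] = *data;
    else                  *data = regs[idx];
    delay += sc_time(10, SC_NS);  // approximate access latency
    trans.set_response_status(tlm::TLM_OK_RESPONSE);
  }
};

// Initiator: stands in for the processor model executing the software.
struct Cpu : sc_module {
  tlm_utils::simple_initiator_socket<Cpu> socket;

  SC_CTOR(Cpu) : socket("socket") { SC_THREAD(run); }

  void run() {
    uint32_t value = 0xCAFE;
    tlm::tlm_generic_payload trans;
    sc_time delay = SC_ZERO_TIME;
    trans.set_command(tlm::TLM_WRITE_COMMAND);
    trans.set_address(0x4);
    trans.set_data_ptr(reinterpret_cast<unsigned char*>(&value));
    trans.set_data_length(4);
    trans.set_streaming_width(4);
    socket->b_transport(trans, delay);  // blocking, loosely-timed call
    wait(delay);                        // sync local time with simulation
  }
};

int sc_main(int, char*[]) {
  Cpu cpu("cpu");
  RegBlock regblock("regblock");
  cpu.socket.bind(regblock.socket);
  sc_start();
  return 0;
}
```

Because everything is a function call rather than signal-level activity, a model like this executes fast enough for early software development; the hybrids above connect exactly this kind of TLM world to RTL running in simulation or emulation.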
I have attempted to capture the sweet spots per engine, as well as the Elvis software relationship, in the graphic associated with this blog; part of the picture credit goes to my colleague Jose Fernandez. Bottom line: whether it is Elvis software that leaves the building and gets to the users, or non-Elvis software used for testing and verification, the set of engines for executing the hardware is broad, and hybrids between the engines are used frequently.
The future for hardware and software – together – definitely looks bright!
Frank Schirrmeister
Frank Schirrmeister is executive director, strategic programs, system solutions in Synopsys' System Design Group. He leads strategic activities across system software and hardware assisted development for industries like automotive, data center and 5G/6G communications, as well as for horizontals like AI/ML. Prior to Synopsys, Schirrmeister held various senior leadership positions at Arteris, Cadence Design Systems, Imperas, Chipvision and SICAN Microelectronics, focusing on product marketing and management, solutions, strategic ecosystem partner initiatives, and customer engagement. He holds an MSEE from the Technical University of Berlin and actively participates in cross-industry initiatives as chair of the Design Automation Conference's Engineering Tracks.