The Fine Art Of Compromise

Different requirements and limitations for hardware and software forced design teams to change their modeling strategies.


By Jon McDonald
Ask 10 people a question and you might get 10 different answers. Ask 10 software engineers what they need in a hardware platform and you might get more than 10 different answers because each probably will have a list of needs for the platform to deliver. Getting them to agree on acceptable targets may not be as difficult as a budget compromise, but project failure is a more personal motivator than the fiscal cliff.

Recently I was involved in project discussions around this topic. One aspect of the discussions was to understand what level of accuracy was needed in the models being created to support software development. The discussion was challenging because the perspectives and concerns of the individuals involved differed so widely. The hardware developers, who were responsible for creating the models of the platform, wanted to know what level of detail and performance the model required, but they were going to use the models in a very different way than the software team would. The software team just wanted to run their software without dealing with the complexities of the modeling. They wanted the software to behave as it would on the physical hardware when it became available, they wanted interactive runtime performance as close to the real hardware as possible, and they wanted it now.

These very open-ended and competing requirements from the software developers did little to focus the modeling effort. Through additional discussion of the questions and tasks the software developers needed to address, a number of characteristics of the modeling platform were identified. For a portion of the development, a functional model would suffice. While developing initial software functions and modules the focus was on logical correctness of the software, so the model would need to deliver a fast, interactive runtime experience to support the interactive debugging needed to develop those functions. The team had experience with transaction-level modeling at the loosely timed (LT) level and felt this would deliver a very good platform for this class of software development.
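
To make that concrete, here is a minimal sketch, not the team's actual code, of what a loosely timed initiator looks like in SystemC/TLM 2.0: blocking b_transport calls plus a quantum keeper for temporal decoupling, trading timing detail for the fast, interactive execution described above. The cpu_iss module name and the addresses are purely illustrative assumptions.

#include <cstdint>
#include <systemc>
#include <tlm>
#include <tlm_utils/simple_initiator_socket.h>
#include <tlm_utils/tlm_quantumkeeper.h>

// Illustrative LT initiator: issues blocking reads and only synchronizes
// with the SystemC kernel at the time quantum, which is what keeps the
// interactive runtime performance high.
struct cpu_iss : sc_core::sc_module {
  tlm_utils::simple_initiator_socket<cpu_iss> socket;
  tlm_utils::tlm_quantumkeeper qk;   // temporal decoupling

  SC_CTOR(cpu_iss) : socket("socket") {
    SC_THREAD(run);
    qk.reset();                      // global quantum is set elsewhere, e.g. in sc_main
  }

  void run() {
    for (uint64_t addr = 0; addr < 0x100; addr += 4) {
      tlm::tlm_generic_payload trans;
      uint32_t data = 0;
      trans.set_command(tlm::TLM_READ_COMMAND);
      trans.set_address(addr);
      trans.set_data_ptr(reinterpret_cast<unsigned char*>(&data));
      trans.set_data_length(4);
      trans.set_streaming_width(4);

      sc_core::sc_time delay = qk.get_local_time();
      socket->b_transport(trans, delay);   // blocking call; target adds its approximate latency
      qk.set(delay);
      if (qk.need_sync()) qk.sync();       // yield to the kernel only at the quantum boundary
    }
  }
};

Because the initiator runs ahead of simulated time between synchronization points, long stretches of software execute at close to native speed, which is exactly the interactive behavior the software team asked for.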

The system did have unique hardware content that was developed for a reason: There were specific targets for performance, throughput and power that had to be met, and those targets needed to be verified with the software running on the hardware. For this detailed verification they needed a cycle-accurate representation, and the only thing that would deliver the required accuracy for signoff was the RTL model of the hardware. The RTL was accurate and appropriate for final signoff, but its limitations (late availability, slow runtime performance and limited ability to make changes) made it inappropriate for use during development of the system, which included both the hardware and the software components.

For a significant portion of the system-design process, which included both hardware and software, the LT models would not provide the accuracy required and the RTL had too many limitations. Compromise is much easier once the realities of the choices are accepted. The compromise was to enhance the transaction-level models to a level of accuracy that supported the performance, throughput and power analysis needed to make the system design choices with confidence. The approximately timed (AT) level of modeling defined by TLM 2.0 provided the accuracy, early availability, flexibility and runtime performance required for this analysis. AT-level modeling is a compromise that sits between the pure runtime performance of an LT model and the implementation accuracy of the RTL. While the groups initially were reluctant to accept the compromise, once they understood the alternatives they realized AT delivers a unique set of capabilities that cannot be addressed by the other modeling levels. That made it a worthwhile addition to their development process.
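
For comparison, and again only as a hedged sketch under the same assumptions rather than the project's implementation, an AT-style initiator splits the same access into non-blocking request and response phases. Modeling the handshake explicitly is what lets an AT model expose bus contention, arbitration and pipelining effects that an LT model glosses over; the at_initiator name and the address are illustrative.

#include <cstdint>
#include <systemc>
#include <tlm>
#include <tlm_utils/simple_initiator_socket.h>

// Illustrative AT initiator: a simplified base-protocol flow
// (a complete implementation also handles TLM_UPDATED and END_REQ timing).
struct at_initiator : sc_core::sc_module {
  tlm_utils::simple_initiator_socket<at_initiator> socket;
  sc_core::sc_event resp_event;

  SC_CTOR(at_initiator) : socket("socket") {
    socket.register_nb_transport_bw(this, &at_initiator::nb_transport_bw);
    SC_THREAD(run);
  }

  void run() {
    tlm::tlm_generic_payload trans;
    uint32_t data = 0;
    trans.set_command(tlm::TLM_READ_COMMAND);
    trans.set_address(0x1000);
    trans.set_data_ptr(reinterpret_cast<unsigned char*>(&data));
    trans.set_data_length(4);
    trans.set_streaming_width(4);

    tlm::tlm_phase phase = tlm::BEGIN_REQ;
    sc_core::sc_time delay = sc_core::SC_ZERO_TIME;
    tlm::tlm_sync_enum status = socket->nb_transport_fw(trans, phase, delay);

    if (status != tlm::TLM_COMPLETED) {
      sc_core::wait(resp_event);            // BEGIN_RESP arrives on the backward path
      phase = tlm::END_RESP;                // close the handshake
      socket->nb_transport_fw(trans, phase, delay);
    }
  }

  // Backward path: the target reports request/response phase changes here.
  tlm::tlm_sync_enum nb_transport_bw(tlm::tlm_generic_payload& trans,
                                     tlm::tlm_phase& phase,
                                     sc_core::sc_time& delay) {
    if (phase == tlm::BEGIN_RESP) resp_event.notify();
    return tlm::TLM_ACCEPTED;
  }
};

The extra phase traffic is what costs runtime relative to LT, but it is also what makes the performance and throughput analysis credible without dropping all the way to RTL.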

—Jon McDonald is a technical marketing engineer for the design and creation business at Mentor Graphics.


