Verification And Validation Don’t Mean The Same Thing

The two tasks have different goals and require a different approach.


While often used interchangeably, verification and validation are quite different procedures with different goals and different means of achieving those goals.

There is no better way to clear up the confusion than to start with some definitions as stated by Wikipedia (https://en.wikipedia.org/wiki/Verification_and_validation):

“Verification is intended to check that a product, service, or system (or portion thereof, or set thereof) meets a set of design specifications.”

“Validation is intended to ensure a product, service, or system (or portion thereof, or set thereof) results in a product, service, or system (or portion thereof, or set thereof) that meets the operational needs of the user.”

As the definitions above make clear, the fundamental difference between verification and validation is the context in which tests are conducted. In a verification flow, the correctness of the design is tested against the design specifications. In a validation flow, the correctness of the design is tested against the needs of the targeted user. While the needs of the targeted user ought to have driven the design specifications, this should be explicitly validated.

In their article “How prototyping increases interoperability success” Hugo Faria and João Lucas from Synopsys illustrate that this is not always the case:

“At a recent plugfest, Synopsys was testing an HDMI 2.0 capable receiver, using an IP Prototyping Kit, with another company’s HDMI 1.4 transmitter. By connecting the cable to the receiver and reading the Extended Display Identification Data (EDID) from it, the transmitter acquires knowledge about the receiver and how to configure itself for compatibility. For example, if the preferred video mode of the receiver is 4K (the highest resolution video mode supported by our IP Prototyping Kit) and the transmitter is able to send only full HD 1080p, then the transmitter should configure itself for 1080p. However, during the plugfest, the transmitter system unexpectedly reconfigured itself for 480p DVI mode. …

… This unwanted behavior occurred despite the fact that none of the devices were acting contrary to the HDMI specification. It could only be discovered by trying multiple configurations in quick succession at the plugfest, which the IP Prototyping Kit facilitated.”
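The negotiation described in the quote can be sketched in a few lines. This is purely illustrative pseudocode, not Synopsys or HDMI reference code; the function and mode names are hypothetical. It shows why a fallback to 480p DVI can be spec-legal yet unwanted: if the transmitter's selection logic never finds a common mode it considers valid, it drops to its safe default instead of the best shared resolution.

```python
# Illustrative sketch (hypothetical names): how an HDMI transmitter might
# pick a video mode after reading the receiver's EDID.

# Modes this transmitter can drive, ordered best-first.
TX_MODES = ["1080p", "720p", "480p"]

def negotiate_mode(edid_modes, preferred):
    """Pick the best mode both sides support; fall back to a safe default."""
    # Ideal case: the receiver's preferred mode is one we can transmit.
    if preferred in TX_MODES:
        return preferred
    # Otherwise take the best transmitter mode the receiver also lists.
    for mode in TX_MODES:
        if mode in edid_modes:
            return mode
    # Spec-legal last resort -- the kind of surprise seen at the plugfest.
    return "480p DVI"

# Receiver prefers 4K but also advertises 1080p and 720p support:
print(negotiate_mode({"4K", "1080p", "720p"}, preferred="4K"))  # -> 1080p
```

The point of the sketch: every branch here is compliant with the specification, so only validation against a real link partner reveals which branch actually fires.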

Since the tasks are different, how to perform them is also different. Verification is a continuous process that typically runs throughout the development cycle, so it is important to enable many iterations, which in turn means the setup should be as easy as possible. That is why verification is typically done using simulators and emulators. Specifications can be checked by leveraging verification IP (VIP) and transactors, which drive protocol-correct traffic to the design under test to verify functional correctness. The performance of FPGA-based emulators using transactors also enables verification in the context of the actual software, which has become an absolute must in today's software-dominated technology world.
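The VIP idea described above boils down to a scoreboard: the same stimulus is fed to the design under test and to a golden model of the specification, and any divergence is flagged. A minimal conceptual sketch, with all names and the toy parity "spec" invented for illustration:

```python
# Minimal sketch of the scoreboard concept behind verification IP (VIP).
# All names and the toy "specification" (byte-sum parity) are illustrative.

def reference_model(txn):
    # Golden behavior as stated in the design specification.
    return sum(txn) % 2

def dut(txn):
    # Stand-in for the real design under test.
    return sum(txn) % 2

def scoreboard(transactions):
    """Compare DUT responses against the reference model per transaction."""
    mismatches = []
    for txn in transactions:
        expected = reference_model(txn)
        actual = dut(txn)
        if expected != actual:
            mismatches.append((txn, expected, actual))
    return mismatches

stimulus = [[1, 2, 3], [0, 0], [7]]
print(scoreboard(stimulus))  # empty list -> DUT matches the spec model
```

In a real flow, the transactor generates protocol-correct traffic at the interface and the scoreboard runs continuously, so every regression iteration re-checks the design against the specification.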

In contrast, hardware and software validation requires interaction with real world interfaces. And to enable realistic scenarios with these real world interfaces, they must run at their target speed, which is typically over 100MHz. This requires a different methodology supported by prototyping. FPGA-based prototypes can deliver the target performance and enable interaction with real world interfaces through PHY daughter boards. It is, however, important that the prototyping hardware and software are developed with real world IO and performance in mind. At these high speeds, every detail matters:

  • Managing the different clock ratios in the system and enabling the interface IP to run at 100+MHz.
  • Synchronization of clocks across multiple FPGAs.
  • Supporting high performance prototypes that span across multiple FPGAs by offering a high speed time division multiplexing (HSTDM) scheme that deals with multiplexing the many communication signals across FPGAs.
  • Minimizing delays across a prototype that spans multiple FPGAs by reducing the amount of FPGA crossings through direct cable connections between FPGAs across the entire prototype.

Only when a prototyping solution addresses the above requirements can complex designs with multiple interfaces be validated against the target use cases. This involves catching software issues, such as driver bugs triggered only when validating against actual PHYs; performance issues, such as unexpected contention or latencies arising from subsystem integration; hardware issues, such as signal integrity; and power management problems, such as power-up or power-down sequence deadlocks.

As with many aspects in life, it is important to use the right tool for the job. Verification and validation are not the same. Only prototyping in the context of real world IO can provide early confidence that hardware and software fulfill the end user’s needs.


