
Four Steps To Verifying An SSD Controller With Emulation

Emulation is well suited to address the complex challenges of new storage technologies.


By Ben Whitehead and Paul Morrison

Datacenters, cloud computing, the IoT, and all things electronic demand that huge amounts of data be stored securely and remain accessible anywhere, at any time. This requirement is driving the adoption of new storage technologies.

The capacity, size, and performance of solid state drives (SSDs) make them a compelling technology. SSDs offer higher performance, lower latency, lower energy costs, and higher reliability than hard disk drive (HDD) technology, among other advantages.

The primary differentiators of SSDs are their complex controllers, which must perform a myriad of tasks to receive, monitor, and deliver data accurately and reliably. To ensure these controllers are built optimally and delivered to market quickly, a growing number of controller design teams are turning to emulation-based verification methodologies.


Figure 1: SSD controllers add complexities of their own that need to be managed.

Hardware emulation is a smart option for many parts of the SSD verification flow. While simulation gives full visibility and is ideal for verifying design blocks, it has limitations in terms of speed. On the other hand, FPGA prototyping allows faster, more extensive testing, connections to external hardware, and firmware test development with real hardware, but has limited visibility for debug and is much less flexible.

Emulation spans the gap between simulation and FPGA prototyping, as it is faster than simulation, gives more visibility than prototyping, runs real production firmware, and enables the same setup for both pre- and post-silicon verification. In addition, hardware emulation works in concert with FPGA prototyping for more effective debug and root-cause analysis.

Let’s explore some basic steps for verifying an SSD controller using virtual emulation on the Mentor Veloce emulation platform. Running the emulator in virtual mode leverages models of the peripherals and devices rather than the physical components customary in in-circuit emulation (ICE) mode.

1. Create the Verification Environment
Several steps are required to convert an existing environment to test an SSD controller on an emulator, or to create a new environment for enhanced testing.

Host interface modification: The host interface likely requires the most modification to get an environment running. A host Veloce VirtuaLAB solution that connects into a QEMU environment allows users to run existing applications, including test scripts, performance measurement applications, and other host-related exerciser scripts. Users can also run off-the-shelf performance measurement software to measure performance and identify bottlenecks within the design. All of this produces a better first tapeout and reduces the likelihood of subsequent controller spins, or at least the number of spins necessary.

Host interface Veloce Transactor Library (VTL) designs are also available if engineers plan to use emulation of the SSD controller to enhance an existing verification environment. These are similar to the verification IP already used within the verification flow, although they typically support only the subset of verification IP features required for communication with a synthesized design.

Replacing the NAND memory with a model: Given that SSDs typically range from several hundred gigabytes to several terabytes, finding enough physical memory available to implement the full drive memory is challenging. We recommend using a Software Sparse memory model (a model that runs on a server connected to the emulator with dynamic allocation of the NAND memory as it is used). Full hardware memory models are also available.
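The idea behind a sparse memory model can be sketched in a few lines: pages are allocated lazily, so a multi-terabyte NAND array consumes host memory only for the pages a test actually touches. This is a conceptual illustration only; the class and parameter names (`SparseNand`, `page_size`) are invented for the sketch and are not a Veloce API.

```python
# Minimal sketch of a software sparse memory model. Pages are allocated
# on first write, so untouched regions of a huge address space cost
# nothing. Names and sizes here are illustrative assumptions.

class SparseNand:
    def __init__(self, page_size=4096, erased_byte=0xFF):
        self.page_size = page_size
        self.erased_byte = erased_byte
        self.pages = {}  # page_number -> bytearray, allocated on first write

    def write(self, page, data):
        assert len(data) <= self.page_size
        buf = self.pages.setdefault(
            page, bytearray([self.erased_byte] * self.page_size))
        buf[:len(data)] = data

    def read(self, page):
        # Unwritten pages read back as erased flash, with no storage cost.
        return bytes(self.pages.get(
            page, bytearray([self.erased_byte] * self.page_size)))
```

A drive model built this way can present terabytes of addressable NAND while allocating only the handful of pages a directed test exercises.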

Virtual JTAG: If the environment is validating an entire design, including firmware, a virtual JTAG host needs to be connected to the processor for debug and trace support. The Mentor Codelink product supports quick firmware debug.

Replace DRAM and NOR memories with models: Since these two memory implementations are typically much smaller than the NAND array of memory, hardware models that reside on the emulator are best. Also available are DRAM DFI models, which should connect to many DRAM controllers and remove the implementation and debug time required to get a working DRAM PHY into the emulator.


Figure 2. Emulation deployment on Veloce.

2. Run the Tests
After the design is ported to the emulator, the user can run a set of tests to check out the controller design. Many testcases should already exist, ranging from the verification environment used with a VTL host front-end, to customer-based validation tests that can check out the design in a full system.

While not as fast as real hardware in the lab, these tests run significantly faster than in a simulation-based verification environment. Many tests not even considered before because of runtime are now possible, running in a fraction of the time while still providing full visibility. Firmware can also be loaded and run on the emulator, allowing a production design to be tested long before it is available in the lab.

This environment supports development and debug of hardware designs, firmware designs, validation test scripts and customer test scripts — all before tapeout, reducing time-to-market as well as increasing the likelihood of working first-pass silicon.

3. Debug
The emulator has many ways to find the root cause of bugs. While all signals within a design can be captured in a waveform, the Veloce emulation platform also supports a capture mode where only those signals of interest are captured.

Codelink enables results to be captured and replayed independently of the emulator. Debug can be done by stepping forward and backward in the results to isolate and fix a bug. This “capture and replay” feature also works from a hardware perspective. Results for a test run on the emulator are captured and downloaded to a server. Engineers can rerun and debug the results while freeing up the emulator for other uses.
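The capture-and-replay idea described above can be illustrated with a toy model: events recorded once from the emulator are replayed offline, stepping forward or backward through the captured history without re-running the hardware. This is purely conceptual; the `ReplaySession` class and its event format are invented for the sketch and do not represent the Codelink or Veloce interfaces.

```python
# Toy capture-and-replay model: a sorted event log is navigated offline,
# forward or backward, freeing the emulator for other runs. Event format
# (cycle, signal, value) is an assumption for illustration.

class ReplaySession:
    def __init__(self, events):
        # events: list of (cycle, signal_name, value) tuples, any order.
        self.events = sorted(events)
        self.cursor = 0  # index of the next event to replay

    def step_forward(self):
        if self.cursor < len(self.events):
            ev = self.events[self.cursor]
            self.cursor += 1
            return ev
        return None  # past the end of the capture

    def step_backward(self):
        if self.cursor > 0:
            self.cursor -= 1
            return self.events[self.cursor]
        return None  # at the start of the capture
```

Because the log is static data, any number of engineers can step through the same captured run in parallel, which is the practical payoff of replaying off the emulator.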

4. A/B Testing
Testing different SSD configurations, specifically the amount and configuration of NAND memory connections, can be challenging in a typical lab environment. At a minimum, the existing memory must be replaced with the new configuration; in the worst case, a new printed circuit board must be created and parts soldered onto it. This all assumes the physical parts exist and work, which is not a given in cutting-edge development.

It’s possible that the NAND chips are being developed concurrently with the controller design, and they aren’t even available for prototype testing. A NAND memory model solves these problems. Even if a NAND chip is not available, a specification typically is. A NAND model is created based on that specification and used for pre-tapeout testing of a controller to ensure that it works as expected. If a feature in the NAND chip changes, the model is easily updated to match the new feature, and the testing is rerun.
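A specification-driven NAND model captures the behavioral rules a datasheet defines even before silicon exists. As a sketch, two rules common to NAND flash are modeled here: programming operates at page granularity and can only clear bits (1 to 0), while erase operates on whole blocks and restores every bit to 1. The class name and tiny geometry are assumptions for illustration, not any particular part's specification.

```python
# Illustrative behavior model of NAND program/erase rules, built from a
# hypothetical datasheet. Geometry is deliberately tiny for clarity.

PAGES_PER_BLOCK = 4
PAGE_SIZE = 8  # bytes; real parts use kilobyte pages

class NandBlock:
    def __init__(self):
        # Erased flash reads as all ones.
        self.pages = [bytearray([0xFF] * PAGE_SIZE)
                      for _ in range(PAGES_PER_BLOCK)]

    def program(self, page, data):
        # Programming can only clear bits; bitwise AND models that rule,
        # so re-programming without an erase corrupts data realistically.
        buf = self.pages[page]
        for i, b in enumerate(data):
            buf[i] &= b

    def erase(self):
        # Erase is block-granular and sets every bit back to 1.
        for p in self.pages:
            p[:] = bytes([0xFF] * PAGE_SIZE)
```

When the datasheet changes, such a rule goes from a documentation edit to a one-line model update, and the controller tests simply rerun.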

Most if not all controllers are designed to support multiple sizes and configurations of memory, including number of channels, number and size of blocks and pages, number of planes, and multiple other configuration options. Testing all of these possible configurations is much easier with an emulator, as only a parameter needs to be changed before recompiling the design.

Instead of replacing chips and possibly creating new printed circuit boards, engineers create a different top-level file that instantiates the new configuration, recompile the design, and run a new set of tests. This makes A/B testing with different configurations and optimizations much easier and faster, allowing the controller design team to make design tradeoffs and update their architecture much sooner if an unexpected hole in performance or support is discovered.
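The configuration-swap workflow above amounts to treating drive geometry as data. A sketch of that idea: a configuration object holds the geometry parameters, and an A/B run simply substitutes one object for another before regenerating the top level and recompiling. The field names and values are illustrative assumptions, not a specific controller's option set.

```python
# Sketch of parameterized drive geometry for A/B testing. Swapping the
# config object stands in for editing the top-level file and recompiling.

from dataclasses import dataclass

@dataclass(frozen=True)
class NandConfig:
    channels: int
    dies_per_channel: int
    planes: int
    blocks_per_plane: int
    pages_per_block: int
    page_size_bytes: int

    def capacity_bytes(self):
        # Raw capacity is the product of every level of the hierarchy.
        return (self.channels * self.dies_per_channel * self.planes *
                self.blocks_per_plane * self.pages_per_block *
                self.page_size_bytes)

# Two hypothetical candidate configurations for an A/B comparison.
config_a = NandConfig(8, 4, 2, 1024, 256, 16384)
config_b = NandConfig(16, 4, 2, 1024, 256, 16384)  # double the channels
```

On an emulator, moving from `config_a` to `config_b` is a parameter change and a recompile; in the lab it could mean a new board spin.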

Conclusion
As storage technology and use models continue to evolve, so do the verification tools needed to solve today’s challenges. Emulation is well suited to address these challenges, as proven by leading storage companies that use it today in their production environments. The advances in storage technologies also reveal new opportunities to use more flexible and powerful tools. Ultimately, the goal is faster time-to-market, and emulation is a significant contributor to reducing development, testing, and debug time.

To take a closer look at the challenges of the storage market, the evolution of SSD, and how an emulation-based verification methodology offers design teams a significant advantage—as well as how to implement an SSD controller in Veloce—please download the new whitepaper Using Emulation to Deliver Storage Market Innovations.

Paul Morrison is a technical marketing engineer in the MED Solution Marketing group at Mentor, a Siemens Business.


