
Software-Defined Test And Measurement

SDx is making inroads into 5G, automotive radar, and other new technology.


Software-defined radios, instrumentation and test are ramping up alongside a flood of new technologies related to assisted and autonomous vehicles, 5G, and military/aerospace electronics, breathing new life and significant change into the test and measurement market.

Software-defined test adds flexibility in markets where the products and protocols are evolving or still being defined, and where system architectures are being tweaked or replaced to deal with an explosion of data. In effect, the entire compute infrastructure across multiple markets is shifting, and the number of signals that need to be optimized and processed is rising significantly. Alongside that is software-based instrumentation, also known as virtual instrumentation, which builds in similar levels of flexibility rather than relying on the benchtop, handheld, and standalone instruments that have been a mainstay of the test and measurement business for decades.

“If you look at the smartphone you have sitting in front of you, that thing has multiple radios, testing multiple technologies, and the cost of that test, in theory, should be higher than it was 10 years ago because I’m actually testing probably 10 times more things than I was back then,” said Jason White, director of wireless test for National Instruments. “I’ve got more antennas, more radios. Besides cellular, I’m now testing Wi-Fi, GPS, NFC, wireless charging. All these things represent air interfaces on the product. The trends that you’re seeing in that space from a test equipment perspective affect the semiconductor value chain that goes into those devices, and they also affect the end devices’ manufacturing test.”

Software-defined test also provides the ability to do more tests using a single testing device.

“You’re seeing more and more pieces of test equipment that can test multiple standards,” said White. “You’re starting to see trends where the cost of test equipment has gone down. Or in some cases, the cost of test equipment may not have gone down. But the pressures around being able to utilize that test equipment more, and also having much faster test times than you had in the past, are all part of the buying decisions that consumers are making, both in semiconductor and on into end-device production test.”

NI isn’t alone in seeing this shift (see Fig. 1). The market for software-defined technology is picking up steam on a number of fronts as new end markets begin coming to grips with a significant increase in data generated by a proliferation of sensors. That data needs to be processed in multiple places, and it has to be moved around quickly both wirelessly and through copper and optical cabling.


Fig. 1: Partial list of companies offering software-defined instrumentation and software-defined radio products. Source: Vendor documentation/Semiconductor Engineering

This helps explain why there is so much attention focused on the Peripheral Component Interconnect Express (PCIe) bus standard and its related serial interface, which are key enablers for software-defined interfaces and test. The AXIe Consortium last year unveiled the Optical Data Interface specification, an initiative embraced by Intel, Keysight, and Xilinx, among others. The ODI spec supports high-speed instrumentation and embedded systems for work in 5G, advanced communications research, and mil/aero applications.

“The measurements really get defined on the computer, and there are fast data pipes that can exist between the instrument and the PC so that you aren’t tied to just one piece of software,” said Matthew Maxwell, Tektronix’s product manager for real-time spectrum analyzers. “For example, we have our USB spectrum analyzers that run the software we took out of our benchtop big box and now have it as standalone PC software. And because there’s a fast USB 3.0 connection that provides data rates of up to 280 megabytes a second, it saves a step on the instrumentation side. We don’t need to design the same kind of digital interface that we used to, which basically meant build a laptop type of PC into every instrument that’s automatically obsolete by the time we come out with it. Now we plug in any laptop that has USB 3.0 into a USB instrument and you’re already up to date. That’s a lower-cost approach. There’s a proliferation of USB instruments out there. On the high end, we’ve done something similar.”
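The quote above describes the core of PC-side measurement: the instrument streams raw samples, and the analysis lives in software on the computer. A minimal sketch of that idea, assuming NumPy and a synthetic sample block in place of a real USB capture (the function name and parameters are illustrative, not a vendor API):

```python
import numpy as np

def power_spectrum_db(iq, sample_rate_hz):
    """Compute a relative power spectrum from a block of complex baseband
    samples, the way PC-side software would process a streamed capture."""
    window = np.hanning(len(iq))                       # reduce spectral leakage
    spectrum = np.fft.fftshift(np.fft.fft(iq * window))
    power_db = 20 * np.log10(np.abs(spectrum) + 1e-12)
    freqs = np.fft.fftshift(np.fft.fftfreq(len(iq), d=1.0 / sample_rate_hz))
    return freqs, power_db

# Synthetic "capture": a single tone at 1 MHz in a 10 MS/s stream.
fs = 10e6
t = np.arange(4096) / fs
iq = np.exp(2j * np.pi * 1e6 * t)

freqs, power_db = power_spectrum_db(iq, fs)
peak_freq = freqs[np.argmax(power_db)]   # lands near 1 MHz
```

Because the measurement is just software, upgrading the analyzer's capability is a software release rather than a hardware redesign.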

Moving large amounts of data quickly is particularly important for military applications, as well. Antenna systems, Bluetooth connectivity, and the Internet of Things also benefit from these software-enabled instruments.

What is SDx?
Software-defined technology moves components into software that in the past typically were developed in hardware, either using a PC or an embedded system. This is particularly important with the proliferation of new technologies, protocols and feature sets that are still being defined. In the case of 5G, for example, there will be multiple revs of the technology, starting at what is essentially 4.5G and ultimately moving to millimeter-wave technology.

Trying to support all of these changes in hardware is too expensive and far too slow. By the time hardware is fully tested and released, protocols may have shifted. That means the products are obsolete by the time they hit the market, whether that is the wireless radio technology or the equipment used to measure and test the signals and various pieces of equipment.

Building this capability into software provides much more flexibility. Rather than changing the hardware, the software can be updated, patched or completely replaced. And due to constant improvements in the compute capabilities of PCs and other mobile devices—the result of multiple revs of Moore’s Law—there is plenty of compute horsepower to make all of this work. But it’s not always as simple as it sounds.
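The flexibility described above comes from treating protocol logic as a replaceable software component. A hypothetical sketch (the class and function names are illustrative): the same receiver object can be "field upgraded" from BPSK to QPSK demodulation by swapping one function, with no hardware change.

```python
def bpsk_demod(samples):
    # One bit per symbol, from the sign of the real part.
    return [1 if s.real > 0 else 0 for s in samples]

def qpsk_demod(samples):
    # Two bits per symbol, from the quadrant of the constellation point.
    bits = []
    for s in samples:
        bits += [1 if s.real > 0 else 0, 1 if s.imag > 0 else 0]
    return bits

class SoftwareDefinedReceiver:
    def __init__(self, demod):
        self.demod = demod          # the swappable "personality"
    def receive(self, samples):
        return self.demod(samples)

rx = SoftwareDefinedReceiver(bpsk_demod)
symbols = [1 + 1j, -1 - 1j]
assert rx.receive(symbols) == [1, 0]

rx.demod = qpsk_demod               # protocol rev: update software only
assert rx.receive(symbols) == [1, 1, 0, 0]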

“Designing the software is one of the most challenging aspects of developing software-defined instrumentation and software-defined radios,” said David Hall, senior group manager for test systems at NI. “It might seem obvious that the intent of both types of products is for engineers to be able to develop and customize the functionality of the device using software. However, what might be less obvious is that the most common users of these devices are not necessarily software engineers. Instead, systems engineers, test engineers, and advanced researchers , all with varying levels of software expertise, are often the primary users. Thus, one of the biggest challenges of delivering software-defined instrumentation and software-defined radios is delivering a software experience that is powerful enough to take advantage of advanced technical capabilities like heterogeneous computing and high-throughput data movement, while at the same time being simple enough that every engineer can easily use.”

Software-defined technology appears to be spreading into other areas, as well. For example, software-defined radio could replace cognitive radio, just as cognitive radio is beginning to gain traction, according to Maxwell. Cognitive radio allows transceivers to detect which channels are being used and dynamically utilize those that are vacant. It also can utilize the best channels in a given area to avoid interference and signal congestion. The technology is comparable to smart load balancing in virtualized server farms, but with the added focus on signal quality. This is particularly important for 5G, military/aerospace applications, and self-driving cars.
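The channel-selection behavior described above can be sketched in a few lines, assuming power measurements per channel are already available (the function name and threshold are illustrative, not from any standard):

```python
def pick_channel(channel_power_dbm, occupied_threshold_dbm=-90.0):
    """Cognitive-radio-style selection: prefer a vacant channel; if every
    channel is occupied, fall back to the least-congested one."""
    vacant = [i for i, p in enumerate(channel_power_dbm)
              if p < occupied_threshold_dbm]
    if vacant:
        return vacant[0]
    # All channels busy: pick the one with the lowest measured power.
    return min(range(len(channel_power_dbm)),
               key=lambda i: channel_power_dbm[i])

# Measured power per channel (illustrative values, dBm).
scan = [-60.0, -95.0, -72.0, -101.0]
assert pick_channel(scan) == 1                    # first vacant channel
assert pick_channel([-60.0, -65.0, -70.0]) == 2   # all busy: least congested
```

In a real cognitive radio this decision loop runs continuously against live spectrum measurements; the point is that the policy is software and can be tuned per market, whether 5G, mil/aero, or automotive.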

“Evolving wireless technologies like 5G and IoT are driving the capabilities of software-defined instrumentation and software-defined radios,” said Hall. “Especially with 5G, the requirements are driving radio designs to wider bandwidths and new frequency ranges. Although 100 MHz was once considered a ‘wideband’ instrument, the market now requires bandwidths of 1 GHz or more. In addition, technologies like 5G and even automotive radar are also driving the frequency ranges of these devices. For example, interest in the 71 to 76 GHz band for 5G and the 79 GHz band for radar have driven both instrument and SDR technologies toward supporting new millimeter-wave bands.”

Software-defined technology
5G has a particularly important role to play in software-defined technology. The higher-frequency wireless communication technology allows more data to be moved around and more signals to be processed, but it also will require more experimentation to optimize that traffic. With 5G, signals don’t travel as far, and they are subject to more interference than previous generations of wireless communication such as 4G or 3G. Building much of this capability into software, rather than trying to retrofit the hardware to deal with the software, allows for much more flexibility to accommodate change.

“There are smaller custom signals, but way more of them,” said Brian Durwood, R&D Engineer at Keysight Technologies. “In MIMO, quantum, and radar, there is a parallel emphasis on generating many signals while tracking many targets. Accordingly, racks of skinny little screen-less AWGs (arbitrary waveform generators) are replacing stacks of traditional box instruments. Data crunching is getting stressed, and that opens the door to possible cloud-based processing deployment and/or big data analytics.”
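Generating "many small custom signals" of the kind an AWG channel plays is itself a software exercise. A hedged sketch, assuming NumPy (the function name and frequency plan are made up for illustration): one waveform per channel across a rack of sixteen generators.

```python
import numpy as np

def multitone(freqs_hz, sample_rate_hz, n_samples):
    """Sum of sine tones, normalized to [-1, 1]: a simple custom stimulus
    of the kind an arbitrary waveform generator channel would play."""
    t = np.arange(n_samples) / sample_rate_hz
    wave = sum(np.sin(2 * np.pi * f * t) for f in freqs_hz)
    return wave / max(len(freqs_hz), 1)

# Sixteen parallel channels, each with its own two-tone waveform,
# mimicking a rack of screen-less AWGs tracking many targets.
channels = [multitone([1e6 * (k + 1), 2.5e6], 50e6, 2048) for k in range(16)]
assert len(channels) == 16
assert all(np.abs(w).max() <= 1.0 for w in channels)
```

The data-crunching pressure Durwood mentions follows directly: sixteen channels of samples at tens of megasamples per second quickly becomes a big-data problem.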

Rather than trying to deal with this using existing technology, system architectures need to be re-thought, and along with that so does the instrumentation and test.

“At the core of software-defined anything is the concept of end-to-end prototyping, wherein the developer starts with a framework that includes sources, analyzers, and some place in the middle to experiment with the parts that need to change,” Durwood said. “For these types of systems to be robust, there has to be ‘epic modularity,’ such that the system does not break when a component is modified. In this way, hardware-in-the-loop engineering begins to resemble Agile software engineering, with its emphasis on well-tested modules with robust APIs, so they can be modified and swapped out without crashing the rest of the system. In radar, an example module could entail kinematics. That’s a fancy word for being able to add in-line filters to simulate physical conditions such as rain, which can impact lower power, MIMO signals.”
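The "epic modularity" Durwood describes can be sketched as a processing chain in which every stage implements the same tiny interface, so an in-line impairment module can be inserted or swapped without breaking the rest of the system. The class names here are hypothetical, and the flat-loss `RainAttenuation` stage is only a stand-in for the kinematics modules he mentions:

```python
class Stage:
    """Common API every module must implement."""
    def process(self, samples):
        raise NotImplementedError

class Gain(Stage):
    def __init__(self, factor):
        self.factor = factor
    def process(self, samples):
        return [s * self.factor for s in samples]

class RainAttenuation(Stage):
    """Illustrative impairment: a flat loss standing in for rain fade."""
    def __init__(self, loss):
        self.loss = loss
    def process(self, samples):
        return [s * self.loss for s in samples]

class Chain(Stage):
    def __init__(self, stages):
        self.stages = list(stages)
    def process(self, samples):
        for stage in self.stages:
            samples = stage.process(samples)
        return samples

chain = Chain([Gain(2.0)])
assert chain.process([1.0, 2.0]) == [2.0, 4.0]

chain.stages.insert(1, RainAttenuation(0.5))   # add impairment in-line
assert chain.process([1.0, 2.0]) == [1.0, 2.0] # rest of chain unaffected
```

Because each stage hides behind the same `process` call, swapping a module is a local change, the Agile-style robustness the quote is after.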

Software-defined radio came into focus with LTE, military communications, satellites, and wireless connectivity. It is now being deployed in automotive, where the number of possible permutations is expanding and the operating environment is both challenging and highly competitive. Wireless remains the lion’s share of SDR deployments, with teams needing to insert hardware-in-the-loop into both ends of the network testing gear, along with military radar and Internet infrastructure. But there are new applications on the horizon that will require this kind of architectural flexibility, as well.

“Quantum computing is stretching the envelope on SDI with many synchronized signals, requiring racks of PXIe-based AWGs with user-programmable FPGAs,” said Andrew Westwood, a systems architect at Keysight. “Instead of the common instrumentation requirements of wireless and electronic warfare, which emphasize high-throughput and raw-data I/O, quantum requires 100% of the waveform construction and analysis to be performed within the FPGA, and then circulates very small data analysis results between the FPGAs on the generation and analysis ports. Waveforms are contained in, or constructed from, primitives and segments held in very small files. The analysis operates on relatively small amounts of data in a noisy environment, and what users are after is the analysis results, which could be as small as three-bit values. Those simply need to be presented back to the waveform construction engine.”

Each market has its own requirements. Automotive, for example, is very price-sensitive and open to using IP to track multiple objects in streaming real time. Military radar, in contrast, requires faster responses, more targets, more environments, and more intelligence on both ends. And in the datacenter, neural networks and machine learning are in an almost constant state of change. This is even more apparent in the cloud, where constant movement of data and processing requires almost constant model creation, execution, data centralization and data analytics.

This opens the door for SDR development and test.

“Conventionally, the upside includes more channels in MIMO, significant miniaturization for portable devices, better power characteristics,” said Keysight’s Durwood. “Unconventionally, SDR and SDI are part of the solution, too. Emerging test challenges are stimulating over-the-air sensors that eliminate the physical contact to the device under test. This is required at some frequencies at which physical probe contact changes the circuit performance. Real-time streaming is another military concept that we expect will trickle down more to commercial wireless.”

One of the key goals here is designing for non-obsolescence. “The user investment in software and gateware engineering can over time eclipse the hardware investment,” Durwood noted. “Intelligent planning of architectures, and selection of vendor-independent tools, can make quite a difference 5 years, 10 years, and in military applications, 20 years down the road as test gear life extends. You also need design services to close bleeding-edge timing. FPGAs are easy to use at 9/10ths. Xilinx and Intel PSG have introduced high-level synthesis tools that greatly improve the ability of software engineers to work in FPGA hardware. But it remains the province of the FPGA experts to squeeze 10/10ths performance. Quality of results has always been an issue with ‘automatic’ software-to-hardware compilation. It’s only as good as the designer’s ability to refactor HLL code to coarse-grained logic that efficiently compiles to FPGA. Plus, the software-to-hardware compiler is a bit of a black box, making assumptions that can impact the deployment in hardware. Teams are still relying on experts to close ultimate timing when the project is important.”

Conclusion
Evolving technology, an explosion in data, and a variety of new market opportunities are forcing changes throughout the test and measurement ecosystem. Creating hardware to match a particular use case no longer is sufficient, and given the compute power available in laptop computers and the ease of upgrades, it’s no longer necessary or even cost-effective.

SDI and SDR already span a broad set of uses. “Today, engineers are using SDI and SDR technology in an extremely wide range of applications, from general-purpose test and measurement applications to advanced wireless research,” said NI’s Hall. “In more traditional test and measurement applications, the technical benefits of a software-centric approach and the fast pace of change in the wireless industry are driving widespread adoption. More specifically, the fast evolution of new wireless technologies like 802.11ax and 5G are driving engineers to use instrumentation that is more flexible — especially because the cost of updating the instrument’s software is significantly less than that of purchasing new hardware.”

Software-defined instrumentation and test has been on the horizon for some time, but it is now a requirement across an increasing number of markets. That trend will only grow across new markets as the amount of data increases and the need for moving that data increases with it.

Related Stories
Wireless Test Faces New Challenges
The advent of 5G and other emerging wireless technologies make test more difficult. Over-the-air testing is one possible solution.
Looking At Test Differently
How test strategies are changing to adapt to smaller batches of more complex designs and new packaging technologies.
Testing Analog Chips
Increasing numbers of analog components could help perk up this market after years of steady but sleepy growth.
Auto Chip Test Issues Grow
Semiconductors used in cars have higher quality and reliability requirements than most chips, but they have the same cost and time-to-market pressures.


