MEMS Goes Mainstream

Connecting MEMS with SoCs is a hot topic of discussion; more standards are needed.


By Cheryl Coupé
Micro-electromechanical systems (MEMS) are well known for enabling innovative capabilities in devices that range from vehicles and gaming to smartphones and tablets—and increasingly in personal health and fitness, security, and environmental applications. As stacked die become more popular, MEMS also will become part of the integration challenge that chipmakers wrestle with as they seek to build customized chips for very specific market slices.

For device makers—the companies defining the specs for these new SoCs—understanding MEMS will be a requirement. Today’s smartphones already have as many as 14 sensor types, many based on MEMS technology. And there are often multiple sensors of the same type: the iPhone 5 has four MEMS microphones, which enable better voice and recording quality as well as capabilities such as noise reduction and voice recognition.

Smartphones typically include an accelerometer, magnetometer, gyroscope and pressure sensor, which, when used together, allow the accurate computation of the linear and angular position, velocity, acceleration and altitude of the device. The accelerometer—along with other sensors such as ambient light and proximity sensors—also can be used for system power savings by hibernating the CPU when the device is in a purse or pocket or is lying still on a desk. And combinations of sensors support new innovations in gesture recognition, health monitoring, contextual awareness, location-based services and augmented-reality applications.
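As a concrete (and deliberately simplified) illustration of the power-saving idea, the Python sketch below flags the device as stationary when the accelerometer magnitude barely varies over a short window. The sampling function is a simulated stand-in for a real driver and the threshold is invented, so this is a sketch of the concept rather than any vendor's implementation.

```python
import math
import random

STILL_VARIANCE = 0.02   # (g^2) invented threshold for "not moving"
WINDOW = 50             # number of samples to examine

def read_accel():
    # Simulated stand-in for a real accelerometer driver call:
    # a device lying flat reports ~1 g on the z axis plus a little noise.
    return (random.gauss(0, 0.01), random.gauss(0, 0.01), random.gauss(1.0, 0.01))

def device_is_still():
    mags = [math.sqrt(x*x + y*y + z*z) for x, y, z in (read_accel() for _ in range(WINDOW))]
    mean = sum(mags) / len(mags)
    variance = sum((m - mean) ** 2 for m in mags) / len(mags)
    # A stationary device shows a steady ~1 g of gravity and tiny variance.
    return variance < STILL_VARIANCE

if device_is_still():
    print("still: the system could hibernate the CPU")  # hand off to the power manager
```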

MEMS packaging trends and challenges
As part of the trend to multiple sensors, MEMS components have evolved from standalone devices to packages that typically include a microcontroller, an analog/mixed-signal ASIC and several MEMS sensors in a single package.

There are several ways that MEMS and electronics can be combined: at the process level (MEMS-on-CMOS or CMOS-on-MEMS), at the wafer-packaging level, or at the die level (multi-die assembled into a package). As is often the case, the choices are being driven by the cost, size and performance requirements of the consumer electronics market, and there are tradeoffs involved in each approach.

“There’s been a long-standing debate about whether a monolithic process or separate CMOS and MEMS process on different die is better,” said Stephen Breit, vice president of engineering at Coventor. “Generally, the trend has been towards different processes—that you do the MEMS process on one wafer and the CMOS process on another wafer—and then the question becomes how do you connect them. The crudest way at the moment is to have those separate die mounted side-by-side with wire bonds between them. Or you can move to more sophisticated things like bonding them together face-to-face like InvenSense is doing, or stacking them using TSVs [through-silicon vias] to get from MEMS to CMOS.”

All of these approaches are currently in use. Analog Devices produced the first fully integrated MEMS accelerometers for airbags in 1991. One problem with this approach is inefficient use of space. MEMS sensors have a fairly large footprint (up to half a millimeter square); once layers of CMOS are built under the MEMS, as much as a third of the total die area isn’t being used. Another approach—one used by InvenSense in its proprietary Nasiri fabrication platform called NF-Shuttle—builds the CMOS and MEMS processes on separate die and then bonds them together face-to-face in a small, cost-effective standard package. This approach can reduce the number of MEMS manufacturing steps, supports wafer-level testing, and can reduce back-end packaging and testing costs and improve yield.

Last year, STMicroelectronics became the first vendor to use TSVs for its high-volume MEMS production. TSVs replace traditional wiring with short, vertical interconnects between multiple silicon die stacked vertically in a single package, enabling a higher level of functional integration and performance in a smaller form factor than wire bonding or flip-chip stacking, but challenges remain in both manufacturing and test.

The evolution of sensor hubs
Today’s sensor-based applications require very high levels of computation to calculate input from multiple sensor axes, as well as information from many sensors and data sources, to improve the accuracy and value of the combined sensor data. For instance, a motion-sensor component with 10 degrees of freedom has several MEMS sensors in a single package, including a 3-axis accelerometer, a 3-axis gyroscope, a 3-axis magnetometer and a pressure sensor. Transforming the output from these sensors into the information needed by the application is a complex task that is handled by sensor fusion software. Sensor fusion also adjusts for calibration and drift (caused by time or temperature) and compensates for issues such as axial alignment between sensors, the location of specific sensors on the board and interference (magnetic or electrical) from other devices on the board.

10-axis frame of reference. Source: Freescale
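To make the fusion step concrete, the sketch below shows the classic complementary-filter idea for a single angle: integrate the gyroscope for short-term accuracy and blend in the accelerometer's gravity reference to pull out long-term drift. This is an illustrative textbook technique, not any particular vendor's algorithm, and the blend factor is invented.

```python
import math

ALPHA = 0.98  # blend factor: trust the gyro short-term, the accel long-term

def fuse_pitch(pitch_deg, gyro_rate_dps, accel_x, accel_z, dt):
    """One complementary-filter update for the pitch angle (degrees).

    pitch_deg     -- previous pitch estimate
    gyro_rate_dps -- gyroscope rate about the pitch axis, deg/s
    accel_x/z     -- accelerometer readings in g
    dt            -- time step in seconds
    """
    # Short-term: propagate the previous angle with the gyro rate.
    gyro_pitch = pitch_deg + gyro_rate_dps * dt
    # Long-term: recover an absolute angle from the gravity vector.
    accel_pitch = math.degrees(math.atan2(accel_x, accel_z))
    # Blend the two; the accelerometer term slowly corrects gyro drift.
    return ALPHA * gyro_pitch + (1.0 - ALPHA) * accel_pitch

# Example: device tilted ~10 degrees and held still.
pitch = 0.0
for _ in range(200):
    pitch = fuse_pitch(pitch, gyro_rate_dps=0.0,
                       accel_x=math.sin(math.radians(10)),
                       accel_z=math.cos(math.radians(10)),
                       dt=0.01)
print(round(pitch, 1))  # converges toward 10.0
```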

A common question is: Where do you do sensor fusion? According to Mike Stanley, systems engineer at Freescale, there are several possibilities. “You could say, ‘Well, if I have an apps processor in the phone, I’ll do it on the apps processor.’ The problem with that is that then your apps processor has to be awake all the time as your samples are coming in from your sensors. And those things will chew up a huge amount of power. Typically what you want to do is have a sensor hub.”

The most common sensor hub approach is a standalone microcontroller that does nothing but sensor fusion. It needs to be low-latency and low-power, and it needs to be able to shut down when it’s not in use. Some devices merge the microcontroller into the CMOS circuitry associated with the sensors, which reduces power even more because of the tight integration between the sensor circuitry and the CPU doing the calculations. Going forward, the sensor hub will likely move into the apps processor, with algorithms being computed in hard gates and in software while preserving the low-latency, low-power aspects of the device.

Sensor hub integration with mobile device apps processor. Source: Movea
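As a rough illustration of that division of labor, the Python sketch below outlines a sensor-hub loop that stays asleep until sensor samples arrive, runs fusion locally, and wakes the apps processor only when something meaningful changes. Every function passed in here is a hypothetical placeholder standing in for MCU firmware primitives, not a real hub API.

```python
# Illustrative sensor-hub main loop; the callables below are hypothetical
# placeholders for firmware primitives on the hub microcontroller.

ORIENTATION_DELTA = 2.0  # degrees of change worth waking the host for (invented)

def hub_main_loop(sleep_until_fifo_ready, read_fifo, run_sensor_fusion,
                  wake_apps_processor, enter_low_power_mode):
    last_reported = None
    while True:
        # The hub MCU sleeps until the sensor FIFO interrupt fires, so the
        # much hungrier apps processor can stay in deep sleep meanwhile.
        sleep_until_fifo_ready()
        samples = read_fifo()                 # batched accel/gyro/mag data
        orientation = run_sensor_fusion(samples)
        if last_reported is None or abs(orientation - last_reported) > ORIENTATION_DELTA:
            wake_apps_processor(orientation)  # only disturb the host on a real change
            last_reported = orientation
        enter_low_power_mode()
```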

System-level design with MEMS
As Freescale’s Stanley says, “Sensor fusion requirements are derived from an analysis of the overall system requirements.”

Stanley writes a popular blog where he covers common issues such as accelerometer and magnetometer placement on a product’s circuit board. For example, sensor fusion calculations are much simpler if the accelerometer is at the center of the rotating frame of reference. And magnetometer placement needs to be the first step in PCB layout, so the part can sit as far as possible from sources of hard and soft magnetic interference and from high-current traces. Depending on the sensor type, strains in the package resulting from mounting it on a PCB can be enough to throw an offset into MEMS devices, which needs to be simulated and calculated as pre-board-mount and post-board-mount offset ranges.
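For the magnetic-interference part of the problem, one standard compensation step is hard-iron calibration, which estimates a fixed magnetic offset (from nearby board components, for example) and subtracts it from the raw readings. The minimal sketch below uses the simple min/max method on samples collected while the board is rotated through many orientations; it is illustrative only, not Freescale's own algorithm.

```python
def hard_iron_offsets(samples):
    """Estimate a fixed (hard-iron) offset from magnetometer samples.

    samples -- list of (x, y, z) readings taken while the board is
               rotated through many orientations.
    Returns the per-axis offsets to subtract from raw readings.
    """
    offsets = []
    for axis in range(3):
        values = [s[axis] for s in samples]
        # On an interference-free part the readings trace a sphere centered
        # at zero; a fixed nearby magnetic source shifts that center.
        offsets.append((max(values) + min(values)) / 2.0)
    return tuple(offsets)

def compensate(raw, offsets):
    """Apply the estimated offsets to a raw (x, y, z) reading."""
    return tuple(r - o for r, o in zip(raw, offsets))
```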

Stanley describes a case in which Freescale was getting inconsistent measurements on a device under test. “They finally tracked it down to the effects of the graphite where we had written a serial number on the package—the charge in the graphite was affecting the sense circuits in the device. That’s because we’re sensing changes in capacitance on the order of femtofarads. The quantities we’re looking at are infinitesimal in comparison to what most engineers are used to working with.”

Standardization
One of the hottest topics around MEMS today is standards—or the lack thereof. Those standards will be a prerequisite to quickly modifying chips in a stacked die package, or even a system of chips that are connected using some high-speed communication network. Coventor’s Breit noted that most of the discussion is centered on the digital protocol used to communicate between the MEMS component and the SoC, as well as the expected output.

“Most industry observers suggest that the way forward is through standardization of hardware and software interfaces,” Breit said in a recent blog post. “On the hardware side, the system companies would like to see standardization at the digital interface level, such as compatibility with the I2C bus. To address this requirement, MEMS-based sensors such as microphones, accelerometers and gyroscopes are increasingly being labeled as digital, meaning they have a digital rather than an analog hardware interface.”
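For a flavor of what a "digital" hardware interface means in practice, the sketch below reads raw acceleration bytes from a sensor over I2C using the Python smbus2 package. The bus number, device address and register map here are hypothetical placeholders; a real part's datasheet values would be needed.

```python
from smbus2 import SMBus

I2C_BUS = 1            # host I2C bus number (platform-dependent)
ACCEL_ADDR = 0x1D      # hypothetical 7-bit device address
DATA_START_REG = 0x32  # hypothetical first data register (X axis, low byte)

def read_accel_counts():
    """Read six bytes (X/Y/Z, little-endian 16-bit) and return raw counts."""
    with SMBus(I2C_BUS) as bus:
        raw = bus.read_i2c_block_data(ACCEL_ADDR, DATA_START_REG, 6)

    def to_int16(lo, hi):
        value = lo | (hi << 8)
        return value - 65536 if value & 0x8000 else value

    x = to_int16(raw[0], raw[1])
    y = to_int16(raw[2], raw[3])
    z = to_int16(raw[4], raw[5])
    return x, y, z
```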

But there is also talk of standardizing the performance specifications of MEMS sensors, such as sensitivity, drift, zero offset and temperature stability. Intel and Qualcomm recently initiated an effort to standardize terms and datasheet measurements for the sensors themselves, rather than for the digital I/O pins.

“The idea is to tie every one of those datasheet parameters to a mathematical model of how the device operates wherever possible so it’s unambiguous,” Stanley says. Intel and Qualcomm have developed a draft proposal, which has received input from Bosch, InvenSense, Freescale and STMicroelectronics as well as other members of the MEMS Industry Group. The plan is to have a draft 1.0 by the end of the year.
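To illustrate what tying a datasheet parameter to a mathematical model could look like, here is a toy transfer-function sketch for one accelerometer axis. The parameter names and numbers are invented for illustration and are not drawn from the draft standard.

```python
# Toy sensor model: output counts = sensitivity * input + zero-g offset,
# with both terms drifting linearly over temperature. The numbers are
# invented; a datasheet tied to such a model would pin each one down.

SENSITIVITY = 1024.0   # counts per g at 25 C
SENS_TEMPCO = 0.0002   # fractional sensitivity change per degree C
ZERO_G_OFFSET = 12.0   # counts at 25 C
OFFSET_TEMPCO = 0.05   # counts of offset drift per degree C
T_REF = 25.0           # reference temperature, C

def model_output(accel_g, temp_c):
    """Predicted digital output for a given acceleration and temperature."""
    dT = temp_c - T_REF
    sensitivity = SENSITIVITY * (1.0 + SENS_TEMPCO * dT)
    offset = ZERO_G_OFFSET + OFFSET_TEMPCO * dT
    return sensitivity * accel_g + offset

# 1 g at 45 C: a spec written against the model is unambiguous about how
# much of the reading is signal, how much is offset, and how much is drift.
print(model_output(1.0, 45.0))
```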


