Reconsidering design approaches when implementing Cat-M and NB-IoT.
The Internet of Things is here and deployments are ramping today, but there’s still considerable work underway to optimize many aspects of the network. Not least among these are the access technologies that exist or are emerging to enable the ‘last mile’ connectivity for IoT connected objects. Wireless access broadly fits into two main areas: licensed band and unlicensed band. Short-range unlicensed band technologies, such as Bluetooth, Wi-Fi and 802.15.4, are already prevalent, and we are starting to see the emergence of unlicensed wide area communications such as LoRa, Weightless-P and 802.11ah that will all play an essential role in connecting devices.
Licensed band technologies have been somewhat slower to react to the IoT opportunity, in part because of the industry focus on mobile broadband services driven by the smartphone revolution. For decades we have seen so-called machine-to-machine (M2M) services deployed on 2G systems such as GSM. These services are generally deployed in scenarios that need wide area wireless networking coupled with mobility, such as transport, logistics and vehicle tracking. As the standards evolved through 3G and LTE, the energy and complexity requirements of the technologies far outstripped the needs of IoT devices. As a result, 3G and LTE are used in only a few niche use cases that require higher data rate connections, such as video surveillance.
The latest 3GPP release, Release 13, works to redress the balance and introduces enhancements to the standards that focus more on IoT use cases and on delivering energy-efficient connections over licensed bands. LTE-MTC (Machine Type Communications) aims to deliver power-efficient connections over existing LTE networks. With the inclusion of mobility, this technology promises a long-awaited upgrade path for the majority of today’s 2G M2M connections.
LTE Cat-M is a cellular standard with a number of benefits compared to the non-cellular technologies. One obvious benefit is the existing infrastructure for LTE, which operators around the world have been rolling out since 2009. According to GSA, there are now 480 LTE networks launched in 157 countries. In addition to Cat-M1, 3GPP has also defined a ‘clean slate’ standard called Narrowband IoT (NB-IoT), which is designed to operate in licensed bands providing wide area IoT connectivity. NB-IoT offers the prospect of further reducing the cost and complexity of the end devices while bringing the advantages of licensed band deployment.
Several other emerging wide area wireless technologies sit alongside LTE Cat-M, based on specifications such as SigFox, LoRa and Weightless. They target extremely low-power, low-throughput applications, most often in unlicensed spectrum. Although each solution has its pros and cons, an advantage of LTE Cat-M and NB-IoT over these clean-slate solutions is that they can leverage the existing network infrastructure. Indeed, base station providers claim that LTE Cat-M and NB-IoT require only a software upgrade to support the latest standards, so the network deployment cost will be almost zero.
Cost and power consumption of IoT end nodes are key requirements in many use cases, which assume battery-operated devices that should last for up to 10 years. The target for the new 3GPP IoT categories is to operate from two AA batteries for more than 10 years.
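As a rough sanity check (a back-of-the-envelope sketch, not a figure from the 3GPP work), two AA cells in series provide on the order of 2000 mAh of usable capacity, so a 10-year lifetime implies an average current budget in the low tens of microamps:

```c
/* Back-of-the-envelope battery budget (illustrative assumptions only):
 * two AA cells in series deliver ~2000 mAh usable capacity, and the
 * target lifetime is 10 years. */
#include <stdio.h>

int main(void)
{
    const double capacity_mah = 2000.0;                 /* assumed usable capacity */
    const double hours        = 10.0 * 365.25 * 24.0;   /* 10 years in hours       */
    const double avg_uA       = capacity_mah * 1000.0 / hours;

    printf("Average current budget: %.1f uA\n", avg_uA); /* roughly 23 uA */
    return 0;
}
```

Every element of the end node, radio and baseband included, has to fit within a budget of roughly that order.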
There are several key additions to the Cat-M specification in 3GPP Release 13 that deliver lower cost and power consumption. The figure below shows key parameters for the different LTE categories. The first LTE specification, in Release 8, defined device categories with Cat-4 supporting up to 150 Mbit/s in the downlink; modem complexity figures are normalized to this category. Cat-0 was specified in Release 12 as an intermediate step towards a competitive LTE specification for IoT applications. The complexity of LTE Cat-0 vs. LTE Cat-4 is estimated to be reduced by 40%, mainly due to lower data rates but also from the change in duplex mode: half duplex eliminates the need for a duplexer and so saves cost. LTE Cat-M is an IoT-optimized version of Cat-0 in which the major change is the reduction of system bandwidth from 20 MHz to 1.4 MHz. Another important change is the reduction of transmit power to 20 dBm, which eliminates the need for an external power amplifier and enables a single-chip solution, again reducing cost.
This evolution of the standard is welcome, but designers are then confronted with a challenge: how to implement LTE Cat-M while continuing to push down the cost, size and power curves.
One way is to re-think component selection. Traditionally, for example, a separate digital signal processor (DSP) has been chosen to execute algorithms such as transforms and filters, which require particularly intensive mathematical calculations. The architecture of a DSP is designed to provide optimized implementations of the mathematical operations found in these algorithms. Conversely, this means DSPs are not particularly well suited to the control tasks typically performed by standard microcontrollers.
Microcontrollers, on the other hand, are general purpose and are normally optimized for basic data and arithmetic processing, interfacing to standard peripherals, handling user interfaces, and providing connectivity via a network interface. Microcontrollers, however, are not usually good at executing mathematically intensive algorithms because they lack the registers and dedicated instructions that support these computations.
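To make the workload concrete, below is a minimal sketch (the function and buffer names are illustrative, not taken from any particular product) of a FIR filter written in portable C, the way a generic microcontroller has to execute it: the inner loop is nothing but multiply-accumulate operations, performed one at a time.

```c
/* Illustrative only: a straightforward FIR filter in portable C. On a
 * generic MCU each multiply-accumulate below is executed individually,
 * which is why such loops have traditionally been offloaded to a DSP. */
#include <stdint.h>

void fir_filter_q15(const int16_t *coeffs, const int16_t *samples,
                    int16_t *out, int num_taps, int num_samples)
{
    /* samples[] is assumed to hold num_samples + num_taps - 1 entries. */
    for (int n = 0; n < num_samples; n++) {
        int32_t acc = 0;
        for (int k = 0; k < num_taps; k++) {
            acc += (int32_t)coeffs[k] * samples[n + k];   /* MAC */
        }
        out[n] = (int16_t)(acc >> 15);                    /* rescale Q15 result */
    }
}
```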
A design team could shrink a multi-chip solution and keep the DSP and MCU functions separate, but does a multi-chip solution necessarily work in the real estate-, cost- and power-constrained IoT world? Not always, so other approaches need to be considered.
For example, a key feature of the ARM Cortex-M4 and Cortex-M7 processors is the addition of DSP extensions to the Thumb-2 instruction set. These instructions accelerate numerical algorithms and provide the opportunity to perform signal processing operations directly on a microcontroller without the need for an external digital signal processor.
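As a hedged sketch of what this enables (the tap count, block size and wrapper function names are illustrative assumptions; only the CMSIS-DSP calls are real library API), the same FIR task can be handed to ARM’s CMSIS-DSP library, whose fixed-point routines map the inner multiply-accumulates onto the Cortex-M4/M7 SIMD instructions:

```c
/* Sketch: the FIR task from above using ARM's CMSIS-DSP library on a
 * Cortex-M4/M7, which exploits the DSP/SIMD multiply-accumulate
 * instructions internally. Tap count and block size are arbitrary. */
#include "arm_math.h"

#define NUM_TAPS    16      /* q15 FIR init expects an even tap count >= 4 */
#define BLOCK_SIZE  64

static q15_t fir_coeffs[NUM_TAPS];                   /* filter design goes here */
static q15_t fir_state[NUM_TAPS + BLOCK_SIZE - 1];   /* library working state   */
static arm_fir_instance_q15 fir;

void filter_init(void)
{
    arm_fir_init_q15(&fir, NUM_TAPS, fir_coeffs, fir_state, BLOCK_SIZE);
}

void filter_block(const q15_t *in, q15_t *out)
{
    /* One call filters BLOCK_SIZE samples; the MACs run on the DSP extensions. */
    arm_fir_q15(&fir, in, out, BLOCK_SIZE);
}
```

The same library also provides FFT, matrix and statistics routines, which is what makes it realistic to pull some modem algorithms onto the microcontroller, as discussed below.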
The re-thinking exercise is also valuable when considering the modem design. A natural approach when designing a low-cost, low-power digital modem is to adapt existing architectures to the new low-complexity requirements. This approach results in a traditional architecture based on one or more general-purpose processors (GPPs), DSPs and accelerators.
There are a number of advantages to this approach, but also some disadvantages. The most important advantage is reuse: shrinking a full LTE modem so that it only supports Cat-M or even NB-IoT can reduce the total design effort. But the resulting architecture, being derived from a full LTE architecture, might not be optimal in terms of cost and power consumption, since a full LTE modem is fundamentally different from a Cat-M/NB-IoT modem. Reducing the number of GPP and DSP cores, using low-complexity ARM and DSP cores, and reducing clock frequency and memory are typical approaches when shrinking a full LTE modem towards a Cat-M/NB-IoT modem. To achieve really low power and low cost, a different method might be advantageous.
Thanks to the enhancements introduced in 3GPP for the upcoming LTE Cat-M standard, architectural simplifications are also possible. The traditional approach, which includes a GPP, a DSP and a number of accelerators, could also be simplified. Teams could consider, for example, using the ARM Cortex-M4 processor with its DSP instructions and pre-compiled, optimized signal processing libraries. Instead of using a dedicated DSP for the signal processing algorithms, such a processor is able to run some of the algorithms that are traditionally run on DSPs. The heaviest algorithms, in terms of signal processing, are still run on dedicated accelerators in order to meet timing and reduce power. Candidate algorithms for the processor include the FFT, channel estimation, equalization, modulation and demodulation; algorithms mapped to dedicated hardware include decimation, initial filtering, synchronization and decoding.
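As an illustration of this partitioning (the buffer names and framing are assumptions, not details of any specific modem design), the OFDM FFT is the kind of block that can stay on the Cortex-M4: Cat-M’s 1.4 MHz system bandwidth corresponds to a 128-point FFT, which CMSIS-DSP handles directly.

```c
/* Sketch only: running the OFDM FFT for one received symbol on the
 * Cortex-M4 via CMSIS-DSP. Cat-M's 1.4 MHz bandwidth uses a 128-point
 * FFT; samples are interleaved complex (re, im, re, im, ...). */
#include "arm_math.h"
#include "arm_const_structs.h"

#define FFT_LEN  128

static float32_t symbol[2 * FFT_LEN];   /* one symbol of time-domain samples */

void transform_symbol(void)
{
    /* In-place forward FFT (ifftFlag = 0) with bit-reversed output reordered. */
    arm_cfft_f32(&arm_cfft_sR_f32_len128, symbol, 0, 1);

    /* symbol[] now holds the frequency-domain subcarriers, ready for the
     * channel estimation and equalization steps also earmarked for the CPU. */
}
```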
IoT is a brave new world for designers, full of opportunity and challenge. With IoT use cases demanding ultra-low power consumption coupled with low cost, the design parameters and standards for connectivity are changing to meet these new demands. There’s no need to abandon the past completely, however. Careful consideration of new approaches, new architectures and component choices, made with an eye toward power and cost benefits, can help evolve standards such as LTE and carry teams successfully into this new world.
Michal Stala, CEO, and Magnus Midholt, Co-Founder, of Mistbase (an ARM partner that delivers wireless IoT solutions), and I have written a white paper that dives more deeply into this topic. You can download it here.