The majority of devices that constitute the Internet of Things (IoT) will use wireless machine-to-machine communications technologies to communicate with each other and with IoT applications typically running in the cloud. Applications that require global coverage and/or mobility will rely on cellular technologies. Currently, mainly 2G and 3G technologies are used, but the future will belong to new technologies such as enhanced Machine Type Communications (eMTC) and Narrowband IoT (NB-IoT). With these technologies, mobile operators will be able to address a much wider share of the overall wireless IoT market.

Figure 1 Growth of cellular M2M connections. Source: Cisco.1

Features like power saving mode (PSM), extended DRX cycles (eDRX) and coverage enhancement (CE) will be used to tune the wireless interface to the different needs of IoT applications. In order to meet all performance and availability requirements, all communication layers (the physical layer, the signaling layer, the IP layer and the application layer) have to work smoothly together. Consequently, there is a need for improved end-to-end application testing in order to optimize, for example, power consumption and reaction times.

IoT applications that depend on mobility or global accessibility make use of satellite or cellular mobile radio technologies. About 86 percent of today's cellular IoT devices use second or third generation mobile communications technologies.1 Typical applications include fleet management, container tracking, coffee vending machines, ATM banking services and personal health monitoring. For the most part, these applications generate little data traffic, often needing only an SMS service for transmission. Figure 1 shows the expected increase of cellular M2M connections by year.

The fourth generation of mobile communications has played a smaller role to date. Because LTE is primarily optimized for the mobile broadband market, the IoT has generated little demand for 4G technology. Moreover, the cost of a typical LTE modem is still relatively high in comparison to a GSM modem, and the global coverage of 2G/3G networks is still unbeatable. Some aspects of LTE, however, make it increasingly attractive. One of these is global accessibility: according to GSMA, 4G LTE networks covered more than a third of the global population by year-end 2015, and developed countries are expected to reach 'full' coverage by the end of the decade.

LTE offers additional technological advantages with respect to spectral efficiency, latency and data throughput. The long-term availability of LTE is another consideration. Second generation networks have been in operation for more than 25 years, and even though some future evolution provisions have been introduced in the specifications, operators may discontinue service on these networks in the long term. The industry is therefore looking for LTE solutions that are competitive with today's 2G solutions in terms of cost, power consumption and performance.

3GPP STANDARDIZATION FOR IoT

The need for optimized solutions for the IoT market was also recognized in 3GPP standardization, and specific enhancements for machine type communication have been developed. For example, the committee defined features in Rel. 10/11 intended to protect the mobile network against overload. Network operators need to be armed against the possibility of several thousand devices trying to connect to the network at the same time. This could happen after a sudden event such as the power grid coming back online after a power failure. Overload mechanisms and options for reducing the signaling traffic have been introduced to handle these types of occurrences. Many IoT applications (sensor networks, for example) only rarely send data and do not need to operate precisely to the second. These devices can report to the network that they are prepared to accept longer delays during connection setup (delay tolerant access).

Rel. 10 includes a process that permits the network to initially reject connection requests from these devices and delay them until a later time (extended wait time). With Rel. 11, access to the cellular network can be controlled by means of access classes. In this case, a device may set up a connection only if it is assigned a class that is currently permitted by the network. The network transmits a bitmap, called the extended access barring (EAB) bitmap, that identifies which classes are permitted access. These processes introduced in Rel. 10 and 11 ensure reliable and stable operation of the IoT applications and devices of today and tomorrow within cellular networks without endangering the mobile broadband service.

Figure 2 PSM and extended DRX.

LOW COST, LOW POWER DEVICES

Still missing were optimized solutions for IoT devices addressing requirements like low data traffic, low power consumption and low cost. The committee started on those in Rel. 12. It quickly became clear, however, that there will be no single, simple solution for all applications. The requirements for applications such as container tracking, waste bin management, smart meters, agricultural sensors and sports and personal health trackers are too varied. Rel. 12 therefore concentrates on the areas of reduced power consumption and cost-effective modems. The results are a power saving mode (PSM), which is especially important for battery-operated devices, and a new LTE device category 0, which should have only 50 percent of the complexity of an LTE category 1 modem. The basic approach is to sacrifice features for the sake of less complex hardware, enabling a lower cost design and more energy efficient operation.

The PSM process starts after a data link is terminated or after the periodic Tracking Area Update (TAU) procedure completes (see Figure 2). The device first goes into idle mode, in which it periodically switches to receive mode in order to receive messages (discontinuous reception). As a result, it remains reachable via paging. After timer T3324 expires, the device enters power saving mode. In this mode, the device is always ready to send messages because it remains registered in the network.

However, the receiver is switched off entirely, so the device is not accessible via paging. PSM is thus suited for sensor networks in which data only rarely needs to be sent to the device. This mode is not suitable for applications that require a quick response from the sensor or expect a time-critical reaction. Applications that use PSM must tolerate this behavior, and the design process must include the specification of optimal timer values for idle mode and power saving mode.
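The timer values themselves are requested by the device via the standard AT+CPSMS command defined in 3GPP TS 27.007, with T3324 (active time) and T3412 (periodic TAU) encoded as 8-bit strings per 3GPP TS 24.008. The following Python sketch shows the idea for T3324 only; the helper function and the chosen values are illustrative, not taken from any particular module.

    T3324_UNITS = [        # (unit bits, seconds per increment) per TS 24.008 GPRS Timer 2
        ("000", 2),        # multiples of 2 seconds
        ("001", 60),       # multiples of 1 minute
        ("010", 360),      # multiples of 6 minutes (decihours)
    ]

    def encode_t3324(seconds: int) -> str:
        """Encode a requested active time into the 8-bit string used by AT+CPSMS."""
        for unit_bits, step in T3324_UNITS:
            count = seconds // step
            if seconds % step == 0 and count <= 31:
                return unit_bits + format(count, "05b")
        raise ValueError(f"{seconds} s is not representable as a T3324 timer")

    # Request PSM with a periodic TAU (T3412 ext) of 2 hours and an active time
    # (T3324) of 60 s; "00100010" encodes 2 x 1 hour per the TS 24.008 GPRS Timer 3 format.
    print(f'AT+CPSMS=1,,,"00100010","{encode_t3324(60)}"')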

The introduction of LTE category 0 was a first attempt at permitting significantly less expensive LTE modems for the IoT market. To achieve this, the complexity of the modem was reduced by bringing the supported data rate down to 1 Mbps. This minimizes the requirements for processing power and memory. Manufacturers can also eliminate full duplex mode, i.e., simultaneous transmission and reception, as well as multiple antennas. As a result, the device does not require the duplex filters that would otherwise be necessary to prevent interference between the transmitter and receiver. LTE category 0 was an intermediate step towards LTE category M1, introduced in Rel. 13. With category M1, additional cost-reduction measures were implemented, especially lower bandwidths in the uplink and downlink, lower data rates and reduced transmit power.

A new standard called NB-IoT was developed in parallel with LTE category M1. The requirements profile for this standard includes extremely low power consumption, very low cost, improved reception in buildings and support for an enormous number of devices with very little data traffic. NB-IoT has a bandwidth of just 180 kHz and can be deployed in unused LTE resource blocks (in-band), in free spectrum between neighboring LTE carriers (guard band) or stand-alone, for example in unused GSM carriers. With NB-IoT, 3GPP has created a new cellular air interface that is fully adapted to the requirements of typical machine type communications. Table 1 gives an overview of the different LTE categories which meet diverse IoT application requirements.

Table 1

An additional feature for reducing power consumption was also introduced. With eDRX in connected or idle mode, the time interval between the instants at which the modem enters receive mode to receive paging messages and system information is extended. The DRX cycle determines how often this occurs. Currently, the shortest idle mode DRX cycle is 2.56 seconds. That is fairly frequent for a device that, for example, expects data only every 15 minutes and has relaxed delay requirements.
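To put these cycle lengths in perspective, a trivial calculation shows how many paging wake-ups the receiver performs within the 15-minute reporting interval from the example above. The longer cycles are chosen here purely for illustration as multiples of 2.56 seconds.

    REPORTING_INTERVAL_S = 15 * 60   # device expects data every 15 minutes

    # 2.56 s is the shortest idle DRX cycle; the longer values are illustrative
    # eDRX-style cycles (32x and 256x the minimum).
    for cycle_s in (2.56, 81.92, 655.36):
        wakeups = REPORTING_INTERVAL_S / cycle_s
        print(f"DRX cycle {cycle_s:7.2f} s -> {wakeups:6.1f} paging wake-ups per 15 min")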

The main differences between PSM and eDRX are the length of time the device is permitted to stay in a quasi power off state and the procedure for switching into receive mode. A device using PSM must first enter connected mode to become reachable and afterwards stays in idle mode for a certain time. A device using eDRX can stay in idle mode and simply switch briefly into receive mode, without any additional signaling.

For example, a device may expect very infrequent, spontaneous messages from the server (e.g., once per day), while the application requires an answer in less than 10 minutes. If the device uses PSM, it has to leave PSM at least every 10 minutes and perform a TAU, followed by a short time in idle mode. If it uses eDRX, on the other hand, the device just enters receive mode every 10 minutes, which consumes much less power and generates less signaling load; a rough numeric comparison is sketched below. Conversely, for a sensor device that sends data once per day and essentially never needs to be contacted in between, PSM is probably the most appropriate power saving feature. In some cases it can be useful to combine several power saving features, such as eDRX in connected mode, eDRX in idle mode and PSM.
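A back-of-the-envelope model makes the difference concrete. All current drains and durations in this sketch are assumptions chosen for illustration; real values depend on the chipset, the network configuration and the RF conditions.

    I_SLEEP = 0.005      # mA, deep sleep (PSM or between eDRX wake-ups), assumed
    I_IDLE_RX = 8.0      # mA, receiver on and listening for paging, assumed
    I_CONNECTED = 120.0  # mA, connected mode (TAU signaling), assumed

    def avg_current_psm(period_s, tau_s=3.0, idle_s=20.0):
        """Average current if the device leaves PSM every period_s seconds,
        spends tau_s seconds in connected mode for the TAU, then idle_s
        seconds in idle mode before re-entering PSM."""
        awake = tau_s * I_CONNECTED + idle_s * I_IDLE_RX
        asleep = (period_s - tau_s - idle_s) * I_SLEEP
        return (awake + asleep) / period_s

    def avg_current_edrx(cycle_s, rx_window_s=0.08):
        """Average current if the device wakes every cycle_s seconds for a
        short paging window of rx_window_s seconds (assumed value)."""
        awake = rx_window_s * I_IDLE_RX
        asleep = (cycle_s - rx_window_s) * I_SLEEP
        return (awake + asleep) / cycle_s

    period = 600  # the 10-minute reachability requirement from the example
    print(f"PSM, TAU every 10 min: {avg_current_psm(period):.3f} mA average")
    print(f"eDRX, 10 min cycle:    {avg_current_edrx(period):.3f} mA average")

With these assumptions, the eDRX device averages roughly two orders of magnitude less current than the PSM device for the same 10-minute reachability, which is exactly the qualitative difference described above.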

Moreover, eMTC and NB-IoT introduce coverage enhancement features to cover use cases like smart meters installed in the basement of a house. One principle is redundant transmission, i.e., repeatedly sending the same data over a period of time that depends on the actual coverage conditions. But transmitting the same data several times obviously takes more time and consequently has an impact on overall power consumption; the simple calculation after Figure 3 illustrates this. As depicted in Figure 3, a number of parameters defined by the design, but sometimes dependent on the network configuration or the actual network condition, influence the battery lifetime of the device.

Figure 3 Parameters influencing the battery lifetime of a device.
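The power cost of repetitions can also be sketched with a simple calculation. The transmit current and the single-shot transmission time below are assumed values for illustration only; the point is that the charge drawn from the battery scales linearly with the repetition count.

    I_TX_mA = 250.0       # assumed transmit current
    BASE_TX_TIME_S = 0.5  # assumed time to send the payload once

    for repetitions in (1, 4, 16, 64):
        t_on_air = BASE_TX_TIME_S * repetitions
        charge_mAh = I_TX_mA * t_on_air / 3600
        print(f"{repetitions:>3} repetitions: {t_on_air:5.1f} s on air, {charge_mAh:6.3f} mAh")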

END-TO-END APPLICATION TESTING

Theoretical calculations of battery lifetime, based on assumptions about the communication behavior of the application and its parameters, are a good starting point. But applications may behave quite differently in reality, and this behavior can also change over time depending on the actual situation. For example, a sensor may report its value only when a certain threshold is reached, but report periodically for as long as the value stays above that threshold. In general, the overall communication behavior of the end-to-end application, including communication triggers (client initiated, server initiated, periodic), delay requirements, network configuration, data throughput and mobility, needs to be considered (see Figure 4); a minimal lifetime model is sketched after the figure.

Figure 4 End-to-end aspects to be considered.
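As a starting point for such an analysis, a communication model can be reduced to a few lines of code. The sketch below is a minimal lifetime estimator; every number in it is an assumption chosen for illustration, and a real model would use measured currents and the actual traffic pattern of the application.

    BATTERY_mAh = 5000.0  # assumed battery capacity

    def lifetime_years(reports_per_day, tx_s=2.0, tx_mA=250.0,
                       idle_s=20.0, idle_mA=8.0, sleep_mA=0.005):
        """Estimate lifetime from per-report transmit and idle phases plus a
        constant sleep floor; all defaults are assumed values."""
        active_mAh_per_day = reports_per_day * (tx_s * tx_mA + idle_s * idle_mA) / 3600
        sleep_mAh_per_day = 24 * sleep_mA  # approximation: sleep current around the clock
        return BATTERY_mAh / (active_mAh_per_day + sleep_mAh_per_day) / 365

    print(f" 1 report/day : {lifetime_years(1):5.1f} years")
    print(f"24 reports/day: {lifetime_years(24):5.1f} years")

Even this crude model shows how strongly the reporting frequency dominates the result, which is why the real communication behavior, and not just data sheet figures, has to drive the parameter choices.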

PSM and eDRX are just tools with slightly different characteristics that can help to meet the battery lifetime requirements. The challenge for the device and application developer is to use these tools in the most efficient way. This requires an understanding and analysis of all aspects influencing the power consumption. It starts with the applications running on the device and on the server side, and also includes the mobile network's behavior as well as the IP network characteristics.

This situation motivates the evaluation of parameters like RF performance, battery consumption, protocol behavior and application performance. The process typically starts with a detailed analysis on paper, based on communication models, selecting different features and varying different parameters. In the end, however, it is very useful to verify the results under well controlled, simulated, yet realistic network conditions. This not only verifies the model assumptions but also reveals the impact of imperfect network conditions. Scenarios in which the network does not support a feature or uses different timers can also be verified. And ultimately, a better understanding of the overall application behavior is gained.

A UNIQUE TEST SOLUTION

There is a growing demand for test, verification and optimization of end-to-end applications, which goes far beyond pure RF and protocol testing. Manufacturers of test and measurement equipment are addressing this demand. Rohde & Schwarz, for example, offers a solution based on the R&S CMW500/290 multi-radio communication test platform and the R&S CMWrun sequencer tool. It allows a detailed view of different parameters like mobile signaling traffic, IP data traffic or power consumption on one platform. In real networks, it is not possible to reliably reproduce and test end-to-end application requirements. The test platform, however, simultaneously emulates, parameterizes and analyzes wireless communication systems and their IP data throughput.

The sequencer tool allows straightforward configuration of test sequences without requiring specific programming knowledge of how to remotely control the instrument. It also provides full flexibility when configuring parameters and limits for the test items. One of the key differentiators of this solution is the intuitive way the user can combine and run applications in parallel, with common event markers derived from signaling or IP activities.

For example, in end-to-end application tests, synchronized traces show the current drain and the IP data throughput. During analysis, synchronized event markers indicating signaling events or IP status updates are displayed in both graphs. This enables a deeper level of testing, in which the user can see the impact of a signaling or IP event on the current drain and IP throughput, helping to understand the dependencies and to optimize the application parameters.

The starting point could be to look at the overall communication behavior, e.g., the number of IP connections, transmitted messages, or communication and signaling events. It may also be interesting to see the power consumption in the different activity states, or in eDRX or PSM. Later, it is useful to tune the related parameters for eDRX or PSM and possibly the application behavior. Finally, it can be helpful to analyze different scenarios reflecting possible real-world situations. Thus, end-to-end application testing becomes more and more important in order to meet such challenging application requirements as a 10-year battery lifetime.

Reference

  1. Cisco Visual Networking Index: Global Mobile Data Traffic Forecast Update, 2015–2020, White Paper, 2016.