Tradeoffs include power, performance, and security. Each standard has its own benefits and drawbacks.
“The good thing about standards is that there are so many to choose from.” – Andrew S. Tanenbaum
The extended version of that quote adds “furthermore, if you do not like any of them, you can just wait for next year’s model.”
That could not be truer when it comes to IoT and wireless connectivity. Every standards group is rushing to create new versions of existing standards that use less power, support more connections, strike a better balance between range and bandwidth, or are easier to integrate into an SoC. So far there are no clear winners, making it difficult for edge device manufacturers to optimize their product offerings.
Most IoT edge products want to be in control of their wake-up schedule to reduce power. “Consider a tire pressure monitoring system (TPMS),” says Jeff Miller, product marketing manager for Tanner products within Mentor, a Siemens Business. “It takes a pressure measurement at some appropriate interval and may scale the time interval based on whether the tire is moving or not to save power. There is no point sending data to the car when it is turned off or not listening. You only need to use the radio when there is a genuine need.”
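The motion-dependent schedule Miller describes can be sketched in a few lines. The interval values and the RPM threshold below are illustrative assumptions, not figures from any real TPMS:

```python
# Hypothetical sketch of a TPMS-style adaptive wake-up schedule: the
# sensor lengthens its measurement interval when the tire is stationary,
# so the radio and measurement circuitry stay powered down longer.

MOVING_INTERVAL_S = 30        # assumed sample period while driving
PARKED_INTERVAL_S = 3600      # assumed sample period while parked

def next_wake_interval(wheel_speed_rpm: float) -> int:
    """Return seconds to sleep before the next pressure measurement."""
    return MOVING_INTERVAL_S if wheel_speed_rpm > 0 else PARKED_INTERVAL_S

print(next_wake_interval(800))  # tire spinning: sample often
print(next_wake_interval(0))    # parked: sample rarely
```

The point is that the schedule is decided locally, so the radio is only powered when there is genuinely something to say.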
That presumes you are in control of your own measurement schedule and you do not have an actuation that is externally commanded. “When there is externally commanded actuation, you do have to think about the tolerable latency, and it costs power to listen,” says Miller. “It requires having the radio powered up, and there probably will be some front-end processing just to listen for an incoming command.”
Choosing a protocol
For many applications, such as the TPMS, you have no choice about the protocol that you use. You have to talk the standard automotive protocol at the frequencies that are used around the world. But the choice is less clear for many edge devices.
“Wireless protocol is dictated by the infrastructure but there are a number of layers to this question,” points out Richard Edgar, director of communications technology for Imagination Technologies’ Ensigma group. “If the wireless infrastructure is already in place, then it is likely that the system used will be one that utilizes the existing infrastructure. It’s important to remember that the wireless infrastructure could include WWAN (wireless wide area network) technologies such as cellular (3G or 4G), or LPWAN (low-power WAN), which although they are not part of your infrastructure can be utilized. Today, more and more infrastructure devices are available that support multiple wireless protocols, such as Wi-Fi, Bluetooth and 802.15.4 solutions.”
The right protocol is dictated by a number of features, including range, bandwidth, latency, overhead, number of sensors connecting to an access point (AP) and of course, the protocols supported by the AP.
“WiFi is an attractive protocol for a lot of devices, but it consumes quite a lot of power just to be part of the network,” says Miller. “For long-running battery-operated devices, WiFi is not a very good choice, even though it may be quite natural given the existence of an existing infrastructure. This is also true of some of the higher-tier Bluetooth protocols, even though they were designed for battery-operated devices. More recently there is Bluetooth low energy, where you can get years of battery life and really optimize how often they talk, and can schedule when they listen for new commands.”
Ron Lowman, strategic marketing manager for IoT at Synopsys, also sees problems with the adoption of WiFi. “WiFi uses 10X to 100X the amount of power compared to Bluetooth low energy. There are companies that are trying to lower the power for WiFi because it is so ubiquitous. But it will never get to the levels of the other protocols. I believe it will always be 3X to 10X.”
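A back-of-the-envelope calculation shows why that 10X to 100X gap matters so much for battery life. The average-current figures below are illustrative assumptions only; the point is the relative gap, not the absolute numbers:

```python
# Rough battery-life estimate from average current draw. The currents
# are assumed values chosen to reflect the article's 10x-100x ratio,
# not measurements of any specific Wi-Fi or BLE chipset.

BATTERY_MAH = 220  # e.g. a CR2032 coin cell

def battery_life_days(avg_current_ma: float) -> float:
    """Days of operation for a given average current."""
    return BATTERY_MAH / avg_current_ma / 24.0

ble_avg_ma = 0.01   # assumed BLE average with aggressive sleep
wifi_avg_ma = 1.0   # assumed Wi-Fi average while associated

print(f"BLE:   {battery_life_days(ble_avg_ma):.0f} days")   # years
print(f"Wi-Fi: {battery_life_days(wifi_avg_ma):.1f} days")  # about a week
```

With these assumed currents, the same coin cell lasts roughly 100 times longer on the lower-power radio, which is why a 3X to 10X residual gap still decides the protocol choice for many edge devices.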
But that is where waiting pays off. “An emerging version of WiFi, 802.11ah, has a number of technical features that are targeting low power and can support many devices,” says Straff Wentworth, vice president of engineering for Adapt-IP. “In 802.11ac there is a beacon and devices stay awake and listen to that beacon. But with ah, a device can tell the AP that it expects to communicate in two days. The AP will register that, and when the device wakes up, communications can start. The device itself can control its battery usage based on its needs to report data. In addition, 802.11ah operates down in the sub-GHz range (902MHz to 928MHz). When you try to move bits really fast, you use a lot of power. So they dropped the frequency.”
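The scheduling arrangement Wentworth describes can be modeled as a simple buffering contract between device and AP. This is a hypothetical sketch of the idea, not the 802.11ah protocol itself; the class and method names are invented for illustration:

```python
# Hypothetical model of an 802.11ah-style wake agreement: the device
# registers its next wake time with the AP, sleeps, and the AP buffers
# any downlink traffic until the device reappears.

import heapq

class AccessPoint:
    def __init__(self):
        self.wake_schedule = []   # min-heap of (wake_time_s, device_id)
        self.buffered = {}        # device_id -> pending downlink frames

    def register_wake(self, device_id: str, wake_time_s: float):
        """Device announces when it will next listen."""
        heapq.heappush(self.wake_schedule, (wake_time_s, device_id))
        self.buffered.setdefault(device_id, [])

    def queue_downlink(self, device_id: str, frame: str):
        """Hold traffic for a sleeping device instead of dropping it."""
        self.buffered.setdefault(device_id, []).append(frame)

    def on_wake(self, device_id: str):
        """Deliver everything buffered while the device slept."""
        return self.buffered.pop(device_id, [])

ap = AccessPoint()
ap.register_wake("tpms-7", wake_time_s=172_800)  # "in two days"
ap.queue_downlink("tpms-7", "config-update")
print(ap.on_wake("tpms-7"))
```

The device, not the network, owns its duty cycle, which is the property that lets it manage its own battery budget.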
Much of this depends on how much data needs to be sent. “If you are only sending a small amount of data, then Zigbee and Thread can be lower power because these are very simple messages,” said Lowman. “But if you have more data, you could send higher data through Bluetooth and turn off sooner. There are tradeoffs, and the different protocols are there for different reasons. Each has specific niches for those purposes. All of the protocols have survived because of that.”
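Lowman's tradeoff can be made concrete with a simple energy model: a slow, simple radio wins for tiny messages, while a faster radio wins for large ones because it finishes sooner and powers down. All power, rate, and overhead figures here are assumed values for illustration:

```python
# Illustrative "send slowly" vs "send fast and shut off" comparison.
# The radio parameters are assumptions, not datasheet numbers.

def energy_per_report_mj(payload_bits: float,
                         data_rate_bps: float,
                         tx_power_mw: float,
                         overhead_s: float) -> float:
    """Energy = power * (airtime + fixed wake/connection overhead)."""
    airtime_s = payload_bits / data_rate_bps
    return tx_power_mw * (airtime_s + overhead_s)

# Small message: the simpler, lower-power radio wins.
print(energy_per_report_mj(128, 250e3, 30, 0.002))  # Zigbee-like
print(energy_per_report_mj(128, 1e6, 45, 0.003))    # BLE-like

# Large message: the faster radio wins despite higher peak power.
print(energy_per_report_mj(1e6, 250e3, 30, 0.002))
print(energy_per_report_mj(1e6, 1e6, 45, 0.003))
```

The crossover point depends entirely on payload size and per-connection overhead, which is why each protocol has kept its niche.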
Wentworth points out another factor to consider. “802.11ah will support over 8,000 IoT devices at a distance of 1km (0.62 miles), encompassing an area of 490 acres, and it can adjust throughput based on range.”
In comparison, Bluetooth has a range of about 100m and incurs additional latency as more connections are added.
“Thread and Zigbee support more of a peer-to-peer configuration,” says Lowman. “Everything can be OFF and wake every once in a while to see if there is a signal for it. Bluetooth tends to flood messages, so it is ON more of the time and has to go to everybody. Bluetooth recently adopted a mesh technology, whereas the other two have had that capability since inception.”
Thread and Zigbee are based on the same radio standard, 802.15.4, and have similar payloads. (see Fig. 1.)
There are decisions to be made even after a protocol has been selected. “Transmitter power is not independent of the protocol,” points out Miller. “It is often a very critical part of their design. Bluetooth low energy can be configured for longer ranges than would be typical for something like a headset. But range and power are traded off against each other, and often are optimized dynamically. You could broadcast at what you think is an appropriate level, and if you don’t get acknowledged, you know that you have to go to a higher power level, and you negotiate the transmission power until you get a good response.”
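The negotiation loop Miller describes amounts to stepping through the radio's power levels until the peer acknowledges. This sketch assumes a hypothetical set of power steps and a `link_ok` callback standing in for the real ACK path:

```python
# Hypothetical dynamic Tx-power negotiation: start low, step up until
# an acknowledgement comes back. POWER_LEVELS_DBM is an assumed set of
# radio power steps, not any particular chipset's table.

POWER_LEVELS_DBM = [-20, -12, -4, 0, 4]

def negotiate_tx_power(link_ok):
    """Return the first (lowest) power level at which the ACK arrives."""
    for level in POWER_LEVELS_DBM:
        if link_ok(level):
            return level
    return None  # out of range even at maximum power

# Toy channel: this link needs at least 0 dBm to close.
print(negotiate_tx_power(lambda dbm: dbm >= 0))   # settles at 0 dBm
print(negotiate_tx_power(lambda dbm: dbm >= 10))  # unreachable: None
```

Settling at the lowest level that works is what lets the device spend no more transmit energy than the channel actually demands.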
Bluetooth 5 can go up to 20dBm. “Our IP goes up to 6dBm without an amplifier, but you can add one that would take it up to 20dBm legally,” says Lowman. “That dictates the distance that they can cover. A lot of the distance is designed up front, so they need to understand the capabilities. There are options to change power on chip, and Bluetooth low energy will be adding direction finding (the angle of departure), so those features will help, particularly with indoor navigation. That will be available later this year. This enables them to figure out how far apart they are from each other.”
Frequency can make a considerable difference, as well. “Lower data rates will have a longer range (for the same frequency and Tx power) than higher data rates,” explains Edgar. “Also, the lower the spectrum, the longer the range (for the same Tx power). For example, a Wi-Fi 11n system will have approximately twice the range in 2.4GHz than it will in 5GHz (for the same Tx power).”
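Edgar's rule of thumb falls directly out of the free-space path loss formula, which grows with both distance and frequency. For the same loss budget, halving the frequency roughly doubles the range:

```python
# Free-space path loss: FSPL(dB) = 20*log10(4*pi*d*f/c).
# Because loss depends on the product d*f, the same link budget at
# 2.4 GHz reaches (5/2.4) ~ 2x the distance it does at 5 GHz.

import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB for an isotropic link."""
    c = 3e8  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

loss_100m_5g = fspl_db(100, 5e9)       # loss budget at 100 m, 5 GHz
range_24 = 100 * (5e9 / 2.4e9)         # distance giving the same loss at 2.4 GHz

print(round(range_24))                              # roughly 208 m, ~2x
print(fspl_db(range_24, 2.4e9) - loss_100m_5g)      # difference is ~0 dB
```

Real deployments see less than free-space ideal range because of walls and interference, but the frequency scaling holds in the same direction.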
Another tradeoff being explored is the amount of processing to do locally. “Initially, everyone was talking about IoT, especially those developing high-end chips for the cloud, and expected all data to be sent to the cloud,” says Lowman. “But that is very costly. You have to do smart things at the edge to decide what to send, when to send it, and if there is local processing that can be done, which would reduce the amount of data that has to be sent.”
That adds another set of tradeoffs. “More edge processing may reduce the amount of communication, but that may cost power,” points out Miller. “You are trading off scarce resources, which are power and bandwidth.”
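Whether local processing pays off depends on how the per-bit cost of computing compares to the per-bit cost of transmitting. The energy coefficients below are assumed values for illustration; the qualitative result holds whenever the radio is much more expensive per bit than the CPU:

```python
# Illustrative edge-compute vs. transmit tradeoff. Energy-per-bit
# figures are assumptions, not measurements of any real device.

def total_energy_uj(raw_bits: float,
                    sent_bits: float,
                    tx_uj_per_bit: float = 0.2,
                    cpu_uj_per_bit: float = 0.01) -> float:
    """Local processing touches every raw bit; the radio sends the rest."""
    compute = cpu_uj_per_bit * raw_bits if sent_bits < raw_bits else 0.0
    return compute + tx_uj_per_bit * sent_bits

raw = 10_000
print(total_energy_uj(raw, raw))        # send everything raw: ~2000 uJ
print(total_energy_uj(raw, raw // 10))  # filter 10:1 locally: ~300 uJ
```

With these assumed coefficients, spending compute energy to cut the payload 10:1 reduces total energy by more than 6x, but if the radio were cheap relative to the CPU the conclusion would flip, which is exactly the scarce-resource tradeoff Miller describes.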
Many IoT devices are made up of standard components that are integrated on a board today. This makes it fairly easy to swap out the radio sub-system, but it is a costly solution in terms of power. “In one case study, a design migrated to a custom chip rather than off-the-shelf components and achieved a 70% reduction in power consumption,” explains Miller. “They went from 12 chips to 1.”
In the past, it was common to buy a chip that included a radio, and to integrate it at the board level. “Now we are seeing more single chip solutions emerge,” Lowman says. “The radio has to be on-chip from a power perspective and security perspective. The process technology nodes have also aligned. We see 40nm and 55nm becoming popular nodes for these low-power wireless solutions.”
Many radios are becoming available at these technology nodes. “Bluetooth low energy IP is often available in 40nm, where you still have a good tradeoff with the digital capability combined with reasonable analog libraries and a reasonable selection of RF devices,” says Miller. “For some protocols, you may have to go to 180nm to get higher voltage or higher power for the RF. You would not do this with 7nm finFET. It depends upon the application. If you are doing image processing, audio or machine learning, then you may need to go to a smaller node and possibly have multiple die in the package.”
Another decision is the type of IP to buy. While the radio portion itself is likely to be a hard macro, the digital circuitry provides more flexibility. “There is the PHY portion and there is the baseband part, which converts from analog to digital. And then there is the MAC component, which does the work for packet retry and fragmentation reassembly,” explains Wentworth. “Using high-level synthesis allows us to move faster with less money. It also enables the addition of things such as encryption.”
Security is becoming a bigger concern. “A lot of devices are hacked because no security was put in at all,” says Lowman. “This is often because of the complexity of the software, or they just don’t want to take the time to do it. All of the wireless technologies today have security options. It is a matter of whether they are turned on and used. We have optional hardware acceleration that can be added to our wireless solutions. Typically, we see them not adopt it, but they are beginning to ask more questions, such as the ability to have the accelerators on chip. Newer chipsets are more likely to add these. It can be done in several ways. You do not have to add accelerators. You can do it in software. You could use simple encryption or fully trusted execution environments that are built into the SoC.”
There is a price to pay, however. “Higher security in the communication demands longer keys and better encryption schemes,” points out Roland Jancke, head of the department for design methodology for Fraunhofer’s Engineering of Adaptive Systems Division. “This contradicts the need for low power in mobile devices. Other parameters that influence the choice of a suitable protocol are latency until a communication link is established, detection and correction of transmit failures, as well as immunity against other radio traffic.”
“Ultimately, you have to analyze the threat levels and assess the needs and see what they can afford in terms of die size or power,” concludes Lowman.