
11 Myths About IoT Messaging

Dec. 14, 2017
Many application-development myths emerge in the corporate IoT market due to issues of real-time data handling and delivery. This article looks to pop those myth balloons.


Everyone loves a good myth. It makes for entertaining dinner-table conversation. But sometimes myths get in the way of solving problems, and unfortunately in the technology world, this can mean wasted time, money, and resources.

The corporate Internet of Things (IoT) market is a competitive place where application-development myths proliferate. Many of these myths involve real-time data handling and delivery—the heart of successful corporate IoT application development. The challenges of IoT application development are speed, scale, and reliability for data exchanged among people, machines, sensors, and devices. Let’s debunk some of those myths…

1. Latency is a function of distance.

Latency is a big challenge when it comes to the IoT. Consider first responders in an emergency, or a more consumer-focused example such as Google Glass. When latency is high, data arrives too late for useful action or response. Therefore, the efficacy of many IoT applications depends on the data being sent and received in real time. One solution, according to a Wired article, is that “organizations will need to put their data and computing infrastructure in close proximity to users and the devices, and be able to connect directly to their trading partners and digital supply chain.”

However, this isn’t always possible. One cannot put a data center in the middle of the ocean (even though Google did discuss this a few years back) to reduce latency for marine shipping, for example, to handle RFID updates from containers on ships or to communicate with planes flying overhead. And in these situations, if there’s an emergency, the information must still be transmitted as quickly as possible.

Computing location is only part of the story. Getting the right data from the right device at the right time isn’t just about hardware and sensor location; it’s about data intelligence. If systems can understand the data and, at the application level, distribute only what’s important, that is more powerful than any amount of hardware thrown at the problem.

A network alone can’t prioritize data because it doesn’t understand data; it just moves it. This prioritization of data should be done at the application level, where there’s logic. Combine this with data caching at the network edge and you have a solution that reduces latency.

To achieve low latency, IoT applications require a combination of intelligent data distribution and an architecture designed to put the data as close to the end user as possible—whether that’s a machine, device, or person.
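To make that concrete, here is a minimal sketch (in Python, with hypothetical thresholds and callbacks) of application-level prioritization combined with edge-side caching: urgent readings take the low-latency path immediately, while routine readings are buffered near the source and sent in batches.

```python
from collections import deque

# Hypothetical priority rules: readings outside these bounds are urgent.
ALARM_THRESHOLDS = {"temperature_c": (0, 85), "pressure_kpa": (90, 110)}

edge_cache = deque()   # routine readings buffered at the network edge
BATCH_SIZE = 100       # flush routine data in batches to save bandwidth

def is_urgent(reading):
    """Classify a reading at the application level, where the logic lives."""
    low, high = ALARM_THRESHOLDS.get(reading["metric"], (float("-inf"), float("inf")))
    return not (low <= reading["value"] <= high)

def handle_reading(reading, send_now, send_batch):
    """Forward urgent data immediately; cache and batch everything else."""
    if is_urgent(reading):
        send_now(reading)                 # low-latency path
    else:
        edge_cache.append(reading)        # held close to the data source
        if len(edge_cache) >= BATCH_SIZE:
            send_batch(list(edge_cache))  # bulk path, tolerant of latency
            edge_cache.clear()

# Example: a reading of 120 degrees C exceeds the threshold and is sent right away.
handle_reading({"metric": "temperature_c", "value": 120}, print, print)
```

The specific rules are placeholders; the point is that the decision about what matters is made where there is application logic, not in the network.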

2. The IoT is just like mobile data distribution.

Distributing data over a mobile network can be an issue because a huge amount of data is sent from a server to a device somewhere in the world. Unfortunately, the network can be unreliable, and this causes speed and performance issues. For the IoT, that model is often reversed: a huge volume of data from many devices, machines, or sensors comes in over an unreliable network to a few aggregation locations for analysis and action.

IoT application development cannot be approached in the same manner as mobile application development. For IoT, you need: a strategy for collecting all of the data from “things” (people, devices, machines, sensors) on a huge scale over unreliable networks, the intelligence to only pass on what is relevant or what has changed, the resilience to manage the incoming data deluge, and efficiency to avoid exceeding available bandwidth. 

In many applications, most of the data is sent to a warehouse for storage, auditing, or reporting. However, some of the data needs to go through a complex-event-processing (CEP) engine or other real-time tools to be acted upon immediately, as in fraud prevention or risk detection for credit-card processing. Once processed, the time-sensitive data then needs to be distributed in a manner similar to a mobile strategy.
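As an illustration only (the article doesn’t prescribe a particular engine, and the rule below is hypothetical), the warehouse/real-time split might look like the following sketch: every message is archived, and messages that match a time-sensitive rule are also forked onto a fast path for immediate action.

```python
def route_message(msg, warehouse_sink, realtime_sink, is_time_sensitive):
    """Archive everything; fork time-sensitive events onto a fast path."""
    warehouse_sink(msg)          # bulk path: storage, auditing, reporting
    if is_time_sensitive(msg):
        realtime_sink(msg)       # fast path: CEP-style immediate action

# Hypothetical rule: large card transactions need immediate screening.
def suspicious_transaction(msg):
    return msg.get("type") == "card_txn" and msg.get("amount", 0) > 10_000

# Simple stand-ins for the real sinks.
archive, alerts = [], []
route_message({"type": "card_txn", "amount": 25_000},
              archive.append, alerts.append, suspicious_transaction)
print(len(archive), len(alerts))   # 1 1 -- stored and flagged
```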

The objective of these IoT applications is to understand, at speed, which data must be processed and distributed in real time. Examples include telling first responders to alter their route to a crisis because of traffic congestion, or changing traffic-light signals to improve flow.

3. I’ll just use enterprise messaging; it’s fine for the IoT.

Many data-communication technologies are simply messaging systems that blindly send large amounts of data back and forth—an inefficient and expensive approach to data transmission. The specific demands of IoT preclude effective use of generalist data-transmission technology solutions that work adequately for less-demanding operational environments, such as chat or social media.

Many companies attempt to shoehorn inefficient messaging technology into their software, or take open-source components and try to build their own. Unfortunately, these organizations are trying to solve speed, scalability, and reliability issues with traditional techniques and solutions that were not purpose-built for the IoT world. These technologies don’t scale reliably.

4. IoT data is outdated and the application is thus useless. 

Some IoT users report that the data in their applications, sent from IoT devices, is often out of date, making the application useless. It’s true that if you don’t receive the right data at the right time, your application will be useless. However, that’s a failure of the data-distribution strategy, not of the application itself.

For the IoT to be successful, IoT applications must maintain a consistent and reliable flow of data in both directions and function in real time. Static applications, such as mobile news apps, load a page once and are done; conversational and action-oriented applications must be constantly updated with current data.

For instance, if an end user wants to find out the temperature of the gas stove in his or her house, the application must provide accurate and current information. A smart-city application requires real-time information on the current capacity of trash bins: how full are they? Otherwise, time and money are wasted on inefficient route and truck-deployment planning for bin collection. No matter the application, feeding it current and accurate data is important, and often critical, to useful operation.
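A simple way to avoid presenting stale data as current (a sketch under assumed names and thresholds, not a specific product API) is to timestamp the last update from each source and refuse to treat anything older than a freshness window as live.

```python
import time

MAX_AGE_S = 30      # assumed freshness window for this application
latest = {}         # source id -> (value, time of last update)

def on_update(source_id, value):
    """Called whenever a device pushes a new reading."""
    latest[source_id] = (value, time.monotonic())

def read_current(source_id):
    """Return a value only if it's recent enough to act on."""
    if source_id not in latest:
        return None, "no data"
    value, ts = latest[source_id]
    if time.monotonic() - ts > MAX_AGE_S:
        return value, "stale"    # don't present old data as current
    return value, "fresh"

on_update("stove_temp_c", 180)
print(read_current("stove_temp_c"))   # (180, 'fresh') right after the update
```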

5. Sensors/small devices won’t put pressure on the internet.

The myth here is that most of the IoT uses sensors and small devices that only update periodically with small amounts of data, so the bandwidth pressure on the internet will not be a challenge. However, with a trillion connected people, devices, sensors, and machines sending and receiving 2.5 billion Gbytes of data every day (source: IBM Corp.), alongside all of the other computing resources using the internet for transport, bandwidth pressure on the internet is constant.

Companies face scalability issues when all of the “things” connect back to their servers at the same time. In addition, if a message is sent to hundreds of thousands of “things” at the same time, the application must scale in real time to effectively deliver that message. All of the sensors and small devices will put pressure on the internet—it’s a myth to think otherwise.

6. The cloud is the answer for IoT messaging.

Not really. The fundamental issue facing IoT is that network power remains very centralized. Even in the era of the cloud, when you access data and services online, you’re most often communicating with relatively few massive data centers that may or may not be located conveniently close to you.

That works when you’re not accessing a lot of data and when latency isn’t a problem. However, it doesn’t work in the IoT, where, for example, you’re monitoring traffic at every intersection in a city to more intelligently route cars and avoid gridlock. In that instance, if you wait for the data to be sent to a data center hundreds of miles away, processed, and then have commands sent back to the traffic lights, it’s too late: the light has already changed.

7. My sensors aren’t accessible to the internet; I don’t need to worry about the network.

Some argue that because their sensors (for example, valve sensors in a nuclear power plant) will never access the internet, there’s no need to worry about network latency, unpredictable network availability, or bandwidth issues. This is a myth.

Wi-Fi networks, although they may offer higher bandwidth than mobile networks, suffer from some of the same reliability problems, and the available bandwidth fluctuates with position and environmental factors. For critical devices that require constant monitoring and real-time responses, it’s therefore essential that updates and alerts are delivered to the appropriate devices and applications while consuming as little network bandwidth as possible.

Private satellite links used for devices in remote locations, such as mid-ocean or at 36,000 ft. in the air, suffer from severe latency, and their bandwidth is hugely expensive. Ensuring that device communication is optimally efficient is fundamental. This requires the ability to distribute data reliably across these networks while understanding the data and sending only what has changed.
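Over a metered satellite link, one common pattern for sending only what has changed is delta reporting: remember the last state that was transmitted and send only the fields that differ. A minimal sketch with hypothetical field names:

```python
last_sent = {}   # device id -> state the far end is known to have

def delta_report(device_id, current_state, send):
    """Transmit only the fields that changed since the last report."""
    previous = last_sent.get(device_id, {})
    changed = {k: v for k, v in current_state.items() if previous.get(k) != v}
    if changed:                           # send nothing if nothing changed
        send({"device": device_id, "changes": changed})
        last_sent[device_id] = dict(current_state)

sent = []
delta_report("valve-17", {"position": "open", "temp_c": 41}, sent.append)
delta_report("valve-17", {"position": "shut", "temp_c": 41}, sent.append)
print(sent[-1])   # only the changed field crosses the expensive link
```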

8. We can’t keep up with IoT data.

The IoT is producing an avalanche of data. According to IBM, over 1 trillion connected objects and devices generate 2.5 billion Gbytes of data every day. However, not all of that volume needs to be communicated to end-user applications, such as real-time operational-intelligence applications, because much of the chatter generated by devices is redundant and doesn’t represent a change in state.

The applications are only interested in state changes, e.g., a light being on or off; a valve being open or shut; a traffic lane being open, closed, or clogged. Rather than bombarding applications with everything the connected devices and objects produce, the system should push updates in real time only when the state changes. Therefore, data transmission must be intelligent and “data-aware.”
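This update-only-on-state-change approach is often called report by exception. A minimal sketch, assuming simple discrete states such as a valve being open or shut:

```python
last_state = {}   # object id -> last state forwarded to applications

def on_raw_reading(object_id, state, publish):
    """Suppress redundant chatter; forward only genuine state transitions."""
    if last_state.get(object_id) == state:
        return                   # same state as before: drop it
    last_state[object_id] = state
    publish({"object": object_id, "state": state})

# A device repeating "open" every second produces exactly one update.
events = []
for _ in range(60):
    on_raw_reading("valve-42", "open", events.append)
on_raw_reading("valve-42", "shut", events.append)
print(len(events))   # 2: one for "open", one for "shut"
```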

9. The data center is where “all of the magic happens.”

Some argue that the data center is where all of the magic happens for the IoT. The data center is absolutely important; after all, it’s where the data is stored. But it isn’t where all of the magic happens. What about the network? The IoT is nothing without the internet actually supporting the distribution of information.

Certainly, storage and/or analysis can occur in a data center. However, if the data can’t get to the data center in the first place, is too slow getting there, or the data center can’t respond back in real time, there is no IoT.

10. Unreliable networks will be the death of the IoT.

The reality of the IoT is that if data can be distributed from the “thing” in real time, even over unreliable networks, the IoT application will operate successfully. This requires intelligent data distribution. To lighten the load on the network by reducing bandwidth usage, the IoT service must have data handling that understands which data is important and which is redundant and doesn’t affect operation.

By understanding the data, intelligence can be applied to distribute only what’s relevant or what has changed, so only small pieces of data are sent across the congested network. The result is IoT applications with accurate, up-to-date information that operate effectively at scale, because they can cope with millions of connected devices, sensors, and machines without being hit by the huge volumes of data that can shut down services.
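One way intelligent distribution copes with an unreliable network (a sketch of the general idea, not any specific product’s behavior) is conflation: while the connection is down, keep only the latest value per topic, then send that compact snapshot on reconnect instead of replaying every missed message.

```python
pending = {}   # topic -> latest value accumulated while the link is down

def publish(topic, value, connected, send):
    """Send immediately when online; otherwise conflate to the latest value."""
    if connected():
        flush(send)              # drain anything held during the outage
        send(topic, value)
    else:
        pending[topic] = value   # older values for this topic are discarded

def flush(send):
    for topic, value in pending.items():
        send(topic, value)
    pending.clear()

# During an outage, 1,000 temperature updates collapse to a single message.
for t in range(1000):
    publish("sensor/7/temp_c", t, connected=lambda: False, send=print)
print(len(pending))              # 1
```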

11. Messaging is a niche technology. Who cares?

Incorrect. The IoT market is maturing, and with that comes the realization that network-efficient, high-volume data streaming and messaging is critical for corporate applications and analytics. Simply put, companies using IoT devices need solutions that increase reliability while reducing bandwidth and infrastructure requirements.

This equates to intelligent data distribution and management, reliable operation at huge and often fluctuating scale, and an architecture that has been designed to put the data as close to the end user as possible—be it a machine, device, or person.
