The Rise of Edge Computing
NEWS
Within an IoT deployment, edge computing can be performed on the gateway, which has relatively high compute capability. Despite their resource constraints, small footprint sensor devices such as microcontrollers can also perform edge computing, a capability made possible by software frameworks like TinyML that are designed for resource-constrained deployments. Edge computing is often regarded as a substitute for cloud computing, but the two technologies can be complementary in a “distributed architecture” setup. Edge computing use cases include Industrial IoT (IIoT) on resource-constrained devices, which are often battery-powered. Because edge compute fits the resource-constrained nature of Operational Technology (OT) manufacturing devices, implementing edge computing enhances the capabilities of IoT deployments.
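To make the TinyML point concrete, the sketch below shows the kind of workflow such frameworks support: shrinking a trained model into a quantized artifact small enough to run on a microcontroller. It assumes TensorFlow Lite as the toolchain; the function name and calibration data are illustrative, not a reference to any specific deployment.

```python
# Minimal sketch (assuming TensorFlow Lite as the TinyML toolchain) of preparing a
# small model for a resource-constrained microcontroller target.
import tensorflow as tf

def convert_for_mcu(keras_model, representative_data):
    """Convert a trained Keras model to an int8-quantized TFLite flatbuffer for an MCU."""
    converter = tf.lite.TFLiteConverter.from_keras_model(keras_model)
    converter.optimizations = [tf.lite.Optimize.DEFAULT]        # enable quantization
    converter.representative_dataset = representative_data      # calibration samples (generator)
    converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
    converter.inference_input_type = tf.int8
    converter.inference_output_type = tf.int8
    return converter.convert()                                   # bytes to flash onto the device
```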
Edge computing is also possible in enterprise IoT (such as telematics), where higher compute sensors and gateways allow for on-device Machine Learning (ML) inference and predictive analytics. For example, edge computing is expected to play a significant role in telematics use cases, where both gateways and sensors face fewer battery constraints (given that the vehicle supplies significant energy from its fuel). Key players pivoting toward edge compute-based device management solutions include cloud service providers, gateway manufacturers, and software suppliers. Enterprises are widely expected to incorporate edge computing, with many customers seeking a device management solution spanning edge to cloud. The emergence of next-generation connectivity protocols, such as ultra-reliable low-latency communications (URLLC) and low-power wide-area networks (LPWANs), has reduced latency and enabled real-time communication at the network's edge.
How and Why are Edge Compute and Device Management Mutually Interdependent?
IMPACT
Firstly, device management services help to enable edge computing. For example, when device-to-cloud connectivity fails, a business rule (i.e., a contingency plan) set up on the device (the sensor itself) relays the data that previously went to the cloud servers to a gateway instead, where it is cached. The device management service facilitates edge computing by playing a connection-monitoring role, recognizing that connectivity with the cloud has been lost, and a configuration role, redirecting the data to a gateway instead. The data is then filtered, processed, and stored on the gateway (at the edge) rather than being sent over a long distance to a data center (in the cloud). So, the ability of device managers to implement a rules engine that sends data to an edge endpoint is critical to the edge compute value proposition of improving reliability by reducing the risk of downtime. Device management services encapsulate the internal plumbing of any IoT deployment, which involves monitoring, diagnosing, and configuring IoT sensors and detectors. Edge computing can be performed on relatively large footprint gateways or on relatively small footprint IoT devices, such as microcontroller units (MCUs), the latter typically deployed with a real-time operating system (RTOS). The greater computing power available on gateways (rather than MCUs) enables more complex ML algorithms, but comes with greater resource consumption and a more complex setup.
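As a simplified illustration of such a contingency rule, the sketch below shows a device-side routine that probes cloud connectivity and falls back to a local gateway when the probe fails. The endpoints and helper names are placeholders, not any vendor's actual rules engine.

```python
# Illustrative contingency rule: relay data to the gateway cache when the cloud is unreachable.
import json
import socket

CLOUD_ENDPOINT = ("telemetry.example-cloud.com", 443)   # hypothetical cloud endpoint
GATEWAY_ENDPOINT = ("192.168.1.1", 8883)                 # hypothetical local gateway

def send(endpoint, payload: bytes):
    """Open a connection to the given endpoint and transmit the payload."""
    with socket.create_connection(endpoint, timeout=5.0) as conn:
        conn.sendall(payload)

def cloud_reachable(timeout=2.0) -> bool:
    """Connection-monitoring role: check whether the cloud endpoint can be reached."""
    try:
        with socket.create_connection(CLOUD_ENDPOINT, timeout=timeout):
            return True
    except OSError:
        return False

def relay(reading: dict):
    """Configuration role: route the reading to the cloud, or to the gateway cache instead."""
    payload = json.dumps(reading).encode()
    if cloud_reachable():
        send(CLOUD_ENDPOINT, payload)      # normal path: straight to the cloud servers
    else:
        send(GATEWAY_ENDPOINT, payload)    # contingency rule: cache and process at the edge
```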
Secondly, edge computing helps to enable device management services. For example, suppose an edge compute process concludes that connectivity has improved in the locality where the IoT solution is deployed. In response to this new information, the ML inference determines that all the sensors should send the network messages that test connectivity with the cloud every 2 minutes rather than every 30 seconds, enhancing battery life. Updating how frequently this message is sent involves the device manager remotely configuring the sensor through a visual dashboard (or a command-line interface), based on analysis performed in real time at the edge of the network. Therefore, edge computing provides the insight that then influences device management, as the operator uses a device manager to optimize system configurations remotely.
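A minimal sketch of that reconfiguration step is shown below. The device manager client and its set_config method are hypothetical stand-ins for whatever remote configuration interface a given platform exposes; the thresholds are illustrative.

```python
# Illustrative remote reconfiguration driven by an edge analytics insight.
HEARTBEAT_DEFAULT_S = 30    # default connectivity-test interval (seconds)
HEARTBEAT_RELAXED_S = 120   # relaxed 2-minute interval to conserve battery

def apply_edge_insight(device_manager, sensor_ids, link_quality: float):
    """If edge analysis reports a healthy link, relax how often sensors ping the cloud."""
    interval = HEARTBEAT_RELAXED_S if link_quality > 0.9 else HEARTBEAT_DEFAULT_S
    for sensor_id in sensor_ids:
        # device_manager.set_config is a hypothetical remote-configuration call,
        # standing in for a dashboard or command-line driven update.
        device_manager.set_config(sensor_id, {"heartbeat_interval_s": interval})
```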
What Does the Edge Compute Revolution Mean for Device Management Customers and Suppliers?
RECOMMENDATIONS
The costs of developing device agents for small footprint devices in IoT use cases (such as industrial manufacturing), and the costs of running edge compute, must be outweighed by the benefits of the additional functionality that edge computing provides. Development costs mount because skilled programmers are required, as the libraries are typically based on C/C++, Python, and Java. Enterprises will invest in IoT solutions only if the value proposition is attractive, and suppliers are all too aware that device management services enhanced with edge compute capabilities can make the difference between an IoT solution being stuck in proof-of-concept or entering deployment. Device agents are required to enable edge compute functionality on a low compute sensor that uses a real-time operating system (RTOS), which does not have these capabilities built in natively. An example is a microcontroller storing data in a ‘cache’ when the data cannot be sent to the cloud immediately because network connectivity is poor. By contrast, device agents are not necessarily required for edge compute functionality on gateways, since these higher compute Linux-based devices have the capability to process data at the edge natively built in if device-to-cloud connectivity is poor. Commercially licensed edge compute Application Programming Interfaces (APIs) are available as an alternative to using an open-source development toolkit, and these APIs help to exchange data between platforms. For example, the data gathered by a device management service is sent to an analytics platform, where the resource (dataset) is processed. Such resources (for example, over-the-air (OTA) updates) are often sent from the edge server to the client device, or from the client device to the edge server (i.e., monitoring/usage data).
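The kind of platform-to-platform exchange described above might look like the sketch below, which posts a dataset gathered by a device management service to an analytics platform over a REST API. The URL, field names, and authentication scheme are hypothetical placeholders, not a particular commercial API.

```python
# Illustrative only: forwarding device management data to an analytics platform via a REST API.
import requests

def forward_to_analytics(dataset, api_key):
    """Exchange a dataset from the device management platform into the analytics platform."""
    resp = requests.post(
        "https://analytics.example.com/v1/datasets",   # hypothetical endpoint
        json={"records": dataset},
        headers={"Authorization": f"Bearer {api_key}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()
```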
Typically, edge computing is enabled on industrial (ruggedized) gateways, which connect IoT devices such as sensors to each other and to the cloud, and which often use containers, such as Docker. Another method is to run edge native applications on these gateways, which often use a Linux OS capable of deploying edge native applications instead of containerized ones. For customers, the edge computing revolution is therefore facilitated by deploying either natively or through Docker. This shift toward edge computing means that more IoT solutions move from proof-of-concept to deployment, as device management platforms can deliver core services (OTA updates, remote monitoring, and diagnostics) more efficiently in terms of preserving battery life and saving network connection costs.
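For the containerized route, the sketch below shows a gateway pulling and starting an edge analytics container, assuming the Docker SDK for Python is available on the gateway. The image name and restart policy are illustrative, not a reference to a specific product.

```python
# Minimal sketch (assuming the Docker SDK for Python) of running a containerized
# edge application on a ruggedized gateway.
import docker

client = docker.from_env()
client.images.pull("registry.example.com/edge-analytics:latest")   # hypothetical image
client.containers.run(
    "registry.example.com/edge-analytics:latest",
    detach=True,
    restart_policy={"Name": "unless-stopped"},   # keep the edge workload alive across reboots
    name="edge-analytics",
)
```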
For suppliers, one way to increase the number of value-added services in device management is to include edge computing on the device manager; northbound services like data orchestration and data management also benefit from extracting, filtering, and processing the data at the edge. Customers can also benefit from an edge compute-based device management platform through southbound IoT solution cost savings: network, energy, and maintenance costs are saved when a southbound edge compute implementation at the microcontroller level remotely configures devices and optimizes their energy and network resource usage.
In summary, it is common for suppliers to say that their device management platforms are industry agnostic; however, the reality that many vendors specialize in niche industry verticals suggests otherwise. Among the suppliers adjusting to edge computing, high-compute gateway suppliers are best placed to do so, but northbound hyperscalers (with AWS IoT Greengrass and Azure IoT Edge) and southbound chipset providers are following suit. Connectivity Management Platforms (CMPs), meanwhile, are also device management service providers but are not directly impacted by edge computing, since their core service operates neither at the hardware level (required to allow edge computing) nor at the software level (showing results from edge computing). Instead, the network connectivity plane is isolated, so their device management customers are incentivized to build an a la carte or “modular” solution in which edge compute capabilities are sourced from other vendors rather than from network connectivity suppliers.