Advantage at the Edge

Editorial Type: Feature Date: 2020-03-01 Tags: Networking, Edge computing, Data Centres, Cloud, Traffic Management, AI, Zizo
To benefit from edge computing it is necessary to understand its strengths. Peter Ruffley, CEO at Zizo, suggests that there are three competitive advantages that need to be considered and understood.

Advances in hardware and software technology are bringing data centre and cloud power to the Edge. With edge computing, the practice of analysing data where it is created, fast gaining momentum, organisations are discovering how they can quickly access only the most valuable data, in real time. This will prove to be mission critical to their business.

There are three key reasons why data should be processed and analysed at the edge: reduced data traffic, lower infrastructure and hardware costs, and distributed processing for increased overall throughput.

Let us imagine a traffic management system where we are trying to manage the signals to optimise the traffic flow. Scattered across the city are many IoT devices that generate an event whenever a car passes. These events are collected in local hubs and then transmitted to a central server that analyses them, learns from them, and plans a better strategy for the following day.

If we initially consider the non-edge example, then the function of the local hubs is merely to forward the events to the central server. We might further assume that at peak times an individual sensor generates 100 events a minute, or 6,000 an hour. If we have 1,000 sensors, then we could be transmitting six million events in the peak hour. The centralised AI has to process and learn from a single day's experience of tens of millions of events.

If we consider the bandwidth and cost associated with this scenario, we might assume each sensor sends a time, location, and direction of travel for each vehicle that passes, which might amount to five bytes per event, equating to 30Mbytes of data sent in the peak hour. Not unreasonable, but still a lot of cost and bandwidth on a cellular network, and once the packets are padded with checksums and other protocol overhead, that cost can only grow. Potentially, the cellular cost and bandwidth can be offset by having wired connections from the nodes, but this inevitably increases the infrastructure and maintenance costs.
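The arithmetic above can be checked with a short sketch. The figures are the article's illustrative assumptions (1,000 sensors, 100 events per minute per sensor at peak, five bytes per event), not measurements from a real deployment:

```python
# Back-of-envelope bandwidth for the non-edge scenario.
# All figures are the article's assumptions, not measurements.
SENSORS = 1000           # IoT devices scattered across the city
EVENTS_PER_MINUTE = 100  # per sensor, at peak
BYTES_PER_EVENT = 5      # time + location + direction of travel

events_per_hour = SENSORS * EVENTS_PER_MINUTE * 60
payload_mb = events_per_hour * BYTES_PER_EVENT / 1_000_000

print(f"{events_per_hour:,} events in the peak hour")  # 6,000,000 events in the peak hour
print(f"{payload_mb:.0f} MB of raw payload")           # 30 MB of raw payload
```

Note that this is payload only; framing, checksums, and retransmissions on a cellular link would push the real figure higher.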

The second issue is whether this data is actually useful to the AI, or whether it is too fine grained. The traffic on any one day is never exactly repeated, because many factors control traffic flow. Realistically, the AI should perform some form of aggregation or histogramming to turn these events into something better suited to its training: for example, the number of vehicles passing a location in a given direction within a five-minute window. This is far more likely to reveal patterns that the AI can use.
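As a minimal sketch of that aggregation, assuming each raw event is a (timestamp, location, direction) tuple (the field names and location IDs are hypothetical), an edge node might bucket events into five-minute counts like this:

```python
from collections import Counter

WINDOW_SECONDS = 5 * 60  # five-minute aggregation window

def aggregate(events):
    """Count vehicle pass-by events per (window start, location, direction)."""
    counts = Counter()
    for timestamp, location, direction in events:
        window_start = timestamp - (timestamp % WINDOW_SECONDS)
        counts[(window_start, location, direction)] += 1
    return counts

# Three events at one junction: two in the first window, one in the second.
events = [(12, "junction-7", "N"), (180, "junction-7", "N"), (305, "junction-7", "N")]
print(aggregate(events))
# Counter({(0, 'junction-7', 'N'): 2, (300, 'junction-7', 'N'): 1})
```

The counts, rather than the raw events, become the training input for the central AI.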

So, in an Edge computing scenario, processing is performed on the edge nodes, and only the aggregated data is transmitted to the central server. The first saving is a massive reduction in bandwidth. Instead of transmitting thousands of messages a minute, just one or two messages per device are sent every five minutes. So, rather than megabytes of data being sent, each edge node transmits just a few kilobytes to the central server.
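The scale of the saving can be sketched with the same assumed figures; the 100-byte size of an aggregate message is a hypothetical value chosen for illustration, not a number from the article:

```python
# Peak-hour traffic, non-edge vs edge, using the article's assumed figures.
SENSORS = 1000
RAW_BYTES = SENSORS * 100 * 60 * 5   # five-byte event per vehicle pass
AGG_MESSAGE_BYTES = 100              # assumed size of one aggregate message
WINDOWS_PER_HOUR = 60 // 5           # twelve five-minute windows

edge_bytes = SENSORS * WINDOWS_PER_HOUR * AGG_MESSAGE_BYTES
per_node_kb = WINDOWS_PER_HOUR * AGG_MESSAGE_BYTES / 1000

print(f"raw: {RAW_BYTES / 1e6:.0f} MB, aggregated: {edge_bytes / 1e6:.1f} MB")
print(f"each edge node sends {per_node_kb:.1f} KB per hour")
```

Under these assumptions the system-wide traffic drops from 30MB to roughly 1.2MB in the peak hour, comfortably within cellular data budgets.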

This means that it will certainly be cost effective to use the cellular network for this traffic which, in turn, reduces the infrastructure costs for the system. Finally, using the processing power of the edge nodes reduces the processing requirement at the central node. Because it no longer needs the capacity to store and analyse millions of data points, the central node could be a simple piece of commodity hardware, or even a cloud service that just analyses a manageable volume of data. NC