Edge Computing: Why the Future Is Now

The IT industry keeps changing with new technologies and trends gaining importance almost every year. This is true when it comes to distributed computing as well. In this article, Robert High, IBM Fellow, VP, CTO IBM Edge Computing, discusses what led to the advent of edge computing, the key elements that define it, and why the future is now.

The IT industry has often been referred to as a ‘fashion industry’. What was in vogue last year is of no interest this year. What was old is new again. We’ve seen that pendulum swing with distributed computing over the history of computing.

The Computing Pendulum

In the beginning, there were mainframes – room-sized machines that ran our most critical applications, from payroll to banking transactions to climate modeling. During the ’80s, with the arrival of mid-range and personal computers, we saw a movement toward decentralized computing and the introduction of client-server computing, where small, distributed computers could provide immediate access and improved user experiences while still gaining the benefits of data and transactions hosted on mainframe servers.

But the client-server paradigm proved too expensive to administer; IT organizations struggled to keep their ‘clients’ up to date. And with the advent of the World Wide Web and Internet browsers, the pendulum swung back in favor of web-server computing, which avoided the difficulties of keeping PCs up to date.

And then mobile computing came along. Apple, and later Google, proved the administrative problem could be handled by shifting responsibility for keeping client software (apps) up to date onto the users themselves, and by making that easy enough – through the App Store or Play Store – that the benefits of localized experiences were put back in users’ hands.

For the last decade, we’ve also seen the movement to cloud computing – essentially a massive centralization of computing driven primarily by the economic benefits of commoditization, normalized software architecture, and elastic scalability of virtually infinite resources. 

But cloud computing, even when combined with mobile computing, has left a major gap. In industrial and enterprise computing scenarios, the users are running equipment and processes that do not resemble mobile devices (at least, not in the way that Apple and Google tend to represent them). There is no App Store or Play Store for these users and the equipment they use. Yet their needs are just as strong as in mobile computing: high-fidelity user experiences; access to the results of computing decisions; protection of the data they create or use from leaving their possession; and the ability to keep operating even when the network is ‘down’.

This has led to the advent of the edge computing era. In some ways, this era has been ushered in by the transition of Operations Technologies (OT), commonly used in industrial production processes, to more generalized Information Technologies (IT), as well as by the maturation of the Internet of Things (IoT). Both drove connectivity to billions of sensors and evolved to localize more intelligent processing for those sensors. But edge computing is more than that. In fact, edge computing isn’t always about human users. As machines have become more intelligent and autonomous, they too need software – AI or other algorithmic logic – to provide that intelligence. Increasingly, that intelligence is built with general-purpose computing.

Edge computing is defined by three key elements:

    1. Edge computing moves the software to where the data is being created and where decisions must be made – that is, with the users (or machines) who need to do a job and use tools and equipment to do their work.
    2. Edge computing leverages the gold standard of cloud-native development practices – that is, componentization, containerization, loose coupling, separation of concerns, microservices, and agile development methods; essentially all the things that have enabled cloud computing to scale (a minimal sketch follows this list). 
    3. Edge computing is autonomously managed at scale – that is, the task of ensuring the software used at the edge is right and up to date is even easier and more secure than what we’ve come to expect from mobile computing software.
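
To make the second element concrete, here is a minimal sketch, in Python, of a loosely coupled edge microservice. Everything in it – the simulated sensor, the endpoint, the port – is our own illustration under assumed names, not any particular product’s API. Packaged in a container, a small component like this composes with others on the same device much the way cloud-native services compose in a data center.

    # A minimal sketch of a loosely coupled edge microservice (hypothetical
    # names throughout): it samples a local 'sensor', keeps the latest reading
    # in memory, and exposes it over HTTP so other components on the same
    # device can consume it through a small, stable contract.
    import json
    import random
    import threading
    import time
    from http.server import BaseHTTPRequestHandler, HTTPServer

    latest_reading = {"value": None, "timestamp": None}

    def sample_sensor_forever(interval_s=1.0):
        """Stand-in for a real sensor driver: record a fresh reading each second."""
        while True:
            latest_reading["value"] = random.uniform(20.0, 25.0)  # e.g. temperature
            latest_reading["timestamp"] = time.time()
            time.sleep(interval_s)

    class ReadingHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Serve the most recent reading as JSON; peer services depend only
            # on this endpoint, not on how the reading is produced.
            body = json.dumps(latest_reading).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        threading.Thread(target=sample_sensor_forever, daemon=True).start()
        HTTPServer(("0.0.0.0", 8080), ReadingHandler).serve_forever()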

With edge computing, if you are a factory worker on an assembly line, a driver on your delivery route, a cashier in a grocery store, a barista in a coffee shop, a loader in a distribution center, or a lineman maintaining power equipment, the equipment you use – the fabrication machine you operate, the delivery van you drive, the point-of-sale terminal you use, the coffee machine you use to brew the perfect cup, or the drone you operate – will be made better by the software that runs on it, or near it. Cameras will be able to recognize the quality of the work being performed, alerting you in real time to changes you need to make to correct a problem. Point of Sale (PoS) terminals will be able to make offers to your customer to improve their shopping experience. Machines will be able to predict their own need for service, directing your attention to problems before they occur and shut down your operations.
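
That last capability – a machine predicting its own need for service – can begin with something as simple as watching a sensor stream for drift. The following is a toy illustration in Python, not any vendor’s product: it flags vibration readings that stray far from the recent baseline, the kind of signal that would direct a worker’s attention before a failure stops the line.

    # A toy illustration (not any vendor's product) of a machine 'predicting
    # its own need for service': flag vibration readings that drift far from
    # the recent baseline, so a worker is alerted before a failure stops the line.
    from collections import deque
    from statistics import mean, stdev

    def drifted(readings, window=50, threshold=3.0):
        """Yield (index, value) for readings more than `threshold` standard
        deviations away from the trailing window's mean."""
        history = deque(maxlen=window)
        for i, value in enumerate(readings):
            if len(history) >= 10:  # wait until a minimal baseline exists
                mu, sigma = mean(history), stdev(history)
                if sigma > 0 and abs(value - mu) > threshold * sigma:
                    yield i, value
            history.append(value)

    # Example: steady vibration around 1.0 with one anomalous spike at index 60.
    samples = [1.0 + 0.01 * (i % 5) for i in range(100)]
    samples[60] = 4.2
    for index, value in drifted(samples):
        print(f"sample {index}: {value} looks anomalous - schedule service")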

The Future Is Now

We have been anticipating this shift for the last few years. Gartner has estimated that 50% of enterprise data will be created and processed at the edge within the next couple of years. The growth in the number of cameras alone – and the resulting spike in video data generated at our places of business – is already creating the conditions for that prediction to come true. We have seen estimates of upwards of 50 billion edge devices in the market within the next few years. Compare that to the 3.25 billion mobile devices in the market today, and it is clear this market is poised to grow.

But, in fact, that market is here now. All the pieces needed to build a successful edge solution are available today and are already bringing advantages to numerous use cases. Let’s delve into one here.

The Mayflower Autonomous Ship

As some of you may remember from history class, the Mayflower left Plymouth, England, 400 years ago on a journey to the New World to found a new colony. To celebrate that anniversary, the city of Plymouth wanted to do something special. Brett Phaneuf proposed that instead of enshrining the past, we should celebrate the future by doing something that would be as memorable 400 years from now as the Mayflower’s sailing was 400 years ago: build a ship that would sail from Plymouth, England, to North America entirely on its own – without anyone on board, and without anyone to drive it. A fully autonomous ship that can take in its surroundings, understand the rules of the sea, and navigate and propel itself – in essence, a ship able to make its own decisions about how to safely navigate the waters from one world to another.

That turns out to be much harder than it sounds. The oceans are treacherous – a huge range of weather, tidal, and wave conditions; shores and other land hazards; crossing ship traffic; marine life on the surface, and sometimes just below it; flotsam and jetsam; and a long way from home, help, or even communication. Oh, and did we mention icebergs and hurricanes? It’s difficult to navigate the oceans even when there are humans aboard to make the myriad decisions that need to be made, let alone for a ship that must do it entirely on its own.

The challenge was to build a boat that could propel itself for very long distances and instrument it with enough sensors and compute to perceive its environment, identify obstacles, make decisions, optimize amongst competing alternatives, and execute those decisions in both fair and difficult conditions. In essence, the ship needed to be an edge device – something that can bring computers close to where the data is created and decisions are made. 

This challenge was extended further when the team adopted an even greater purpose for the ship. The oceans play an essential role in our changing global climate. They are burdened with absorbing much of the rising CO2 in our atmosphere, and ocean temperatures are being driven by atmospheric temperatures. These changes are having a profound impact on marine life, chemistry, and topography. To understand climate change, we must also understand how the oceans are changing.

Like all sciences, marine science relies on the collection of massive amounts of data. But sending scientists out to sea to collect that data comes with all the hazards of being at sea – not to mention the discomforts and inconveniences of being in remote, isolated locations for days and weeks at a time.

But an autonomous ship – one that can navigate on its own, that does not need any human presence, and that can sustain itself for weeks or even months at a time without maintenance or supplies – would be an ideal platform for gathering marine data. And so the Mayflower Autonomous Ship was outfitted with additional sensors and software designed specifically to sample water temperature, salinity, and chemistry; to measure wave energy and direction; to listen for whale song; to gauge the abundance of phytoplankton; and to detect microplastics, among other data.
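
Collecting that data mid-ocean means living with a link that comes and goes. Here is a hedged sketch of the store-and-forward pattern a platform like this might use – the instrument names and simulated uplink are our own assumptions, not the Mayflower team’s actual design: readings are buffered locally and uploaded only when connectivity is available.

    # A sketch of the store-and-forward pattern (instrument names and the
    # simulated uplink are hypothetical): buffer readings locally, upload
    # only when a connection window opens, and keep sampling either way.
    import random
    import time

    def read_instruments():
        """Stand-in for real instrument drivers: water temperature and salinity."""
        return {
            "timestamp": time.time(),
            "water_temp_c": random.uniform(8.0, 12.0),
            "salinity_psu": random.uniform(34.0, 36.0),
        }

    def uplink_available():
        """Mid-ocean satellite links come and go; simulate ~20% availability."""
        return random.random() < 0.2

    def send(batch):
        print(f"uploading {len(batch)} buffered readings")  # real code: satellite TX

    buffer = []
    for _ in range(20):  # one short sampling session
        buffer.append(read_instruments())
        if uplink_available():
            send(buffer)
            buffer.clear()
        time.sleep(0.1)
    print(f"{len(buffer)} readings still buffered for the next connection window")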

All this data will be collected on its maiden voyage and will prove the potential for the Mayflower Autonomous Ship to be a platform for future marine science. 

More immediately, however, the Mayflower Autonomous Ship is a proof point that edge computing is here now. The ship is the equipment, with its own compute. The software has been developed using cloud-native practices. And it is managed autonomously: workloads keep running even in the presence of failures and are kept up to date under changing conditions and intermittent connectivity, putting the right software in the right place at the right time, securely.
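
What might that autonomous management look like in miniature? A sketch under invented assumptions – the control plane, version strings, and failure rates are all ours, not IBM’s actual implementation: a small agent that restarts its workload when it fails and pulls updates only when the network allows.

    # A sketch, with invented names, of an agent that manages an edge workload
    # autonomously: self-heal locally when the workload fails, and pull updates
    # only when the (hypothetical) control plane happens to be reachable.
    import random
    import time

    running_version = "1.0.0"
    workload_healthy = True

    def fetch_desired_version():
        """Ask the control plane what should be running; None if unreachable."""
        if random.random() < 0.5:  # simulate intermittent connectivity
            return None
        return "1.1.0"

    def restart_workload(version):
        global running_version, workload_healthy
        running_version, workload_healthy = version, True
        print(f"workload restarted at version {version}")

    for _ in range(10):  # one short supervision window
        if not workload_healthy:  # self-heal without waiting for the network
            restart_workload(running_version)
        desired = fetch_desired_version()
        if desired and desired != running_version:
            restart_workload(desired)  # update only when connectivity allows
        workload_healthy = random.random() > 0.1  # simulate occasional crashes
        time.sleep(0.1)

Multiply that loop across thousands of devices, vans, terminals, and ships, and you have the management problem – and the opportunity – that edge computing platforms exist to solve.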
