Edge To Be an $800B Industry by 2028, Reveals State of the Edge 2021 Report

essidsolutions

Linux Foundation’s State of the Edge 2021 report reveals edge computing is poised for high growth in upcoming years, owing to open source collaboration and emerging technologies.

Contrary to popular belief, the COVID-19 pandemic may just have accelerated the adoption of edge (and not only cloud), so much so that the State of the Edge 2021 report by the Linux Foundation’s LF Edge found that expertise in legacy data centers could become obsolete in just the next few years.

LF Edge, an umbrella organization under the non-profit Linux Foundation, currently maintains nine projects, including State of the Edge. LF Edge’s mission is to “establish an open, interoperable framework for edge computing independent of hardware, silicon, cloud, or operating system,” while State of the Edge is a project aimed at understanding and quantifying the edge computing market and at consolidating industry standards for different use cases.

LF Edge estimates that cumulative capital expenditure on new IT server equipment and edge computing infrastructure, both for first-time deployments and for the replacement of existing ones, will reach up to $800 billion over the 10 years from 2019 to 2028.

What is Edge Computing?

According to Cloudflare, edge computing is a networking philosophy focused on bringing computing as close to the source of data as possible in order to reduce latency and bandwidth use, and by extension response time. In other words, data processing is carried out at the source, the application, or the user, that is, right where it is needed.

The technology improves efficiency, first by running fewer processes in the cloud and second by executing them on devices such as a computer, an edge server, or an IoT device. Bringing computation to the network’s edge minimizes the amount of long-distance communication that has to happen between a client and a server.
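To make the idea concrete, here is a minimal, hypothetical Python sketch of edge-side processing (not taken from the report): a device aggregates a window of raw sensor readings locally, so only a compact summary would ever need to cross the network. The read_sensor() function and the payload comparison are illustrative stand-ins.

```python
# Minimal, hypothetical sketch of edge-side processing: aggregate raw sensor
# readings locally and emit only a compact summary, instead of streaming every
# reading to a central server. read_sensor() is an illustrative stand-in.
import json
import random
import statistics


def read_sensor() -> float:
    """Stand-in for a real sensor read (e.g., temperature in Celsius)."""
    return 20.0 + random.random() * 5.0


def summarize(window: list[float]) -> dict:
    """Reduce a window of raw readings to a few locally computed statistics."""
    return {
        "count": len(window),
        "mean": round(statistics.mean(window), 3),
        "min": round(min(window), 3),
        "max": round(max(window), 3),
    }


if __name__ == "__main__":
    window = [read_sensor() for _ in range(1_000)]  # processed on the device itself

    raw_bytes = len(json.dumps(window).encode())                 # cost of streaming everything
    summary_bytes = len(json.dumps(summarize(window)).encode())  # what actually leaves the device

    print(f"raw: {raw_bytes} bytes vs summary: {summary_bytes} bytes sent upstream")
    # In a real deployment, only the summary would be sent to a central service.
```

The point of the sketch is the ratio between the two payload sizes: local processing turns a steady stream of raw readings into an occasional, much smaller upstream message.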

Edge computing offers the decentralization that present-day cloud and legacy infrastructure increasingly require, mitigating the shortcomings of centralized computing such as latency, bandwidth constraints and, more importantly, concerns around the privacy and autonomy of user data.

“You used to be able to store all of the data and then analyze it later, and now you can’t. There’s just too much of it. We’re in an era where we have data flows that never stop,” said Simon Crosby, CTO of SWIM.AI. “Businesses must constantly process and analyze streaming data to get continuous intelligence, to make your organization more responsive, or whatever your need happens to be. The cloud has been tremendously successful, but stateless computing is a million times slower than the CPU, which means you’re getting results in hours versus milliseconds.”

Moreover, edge computing broadens the scope of the cloud and supports IoT implementation. IoT devices, network gateways, and other user devices such as smartphones and smart appliances generate 1.7 MB every second, or 1.145 trillion MB in a single day. Imagine having to move all of that to centralized computing infrastructure, process it, and move the results back to the device.

Edge computing has multiple applications, such as autonomous or self-driving cars, consumer IoT devices, medicine, security systems, content delivery networks, and manufacturing. It can improve performance, application throughput, and responsiveness, save costs, and scale well, while its distributed nature provides a cushion against complete system failures.

See Also: Developers to Increasingly Turn to Edge Computing in 2021

Edge Taxonomy

LF Edge divides edge computing into two distinct tiers, the Service Provider Edge and the User Edge, spread across physical infrastructure that the Linux Foundation refers to as a continuum. This continuum comprises centralized data centers, the internet, and devices.


Edge Continuum | Source: Linux Foundation

The far right of the diagram shows centralized data centers, which represent cloud-based computing. Central data centers remain necessary for economies of scale and for a degree of flexibility that devices cannot provide: centralized cloud infrastructure (compute, storage, and networking) can be provisioned almost without limit, whereas the compute resources available on devices are constrained.

Then comes the Service Provider Edge, which is distributed and owned by service providers rather than by its users. The Service Provider Edge runs on shared resources, is relatively more secure than the User Edge, and is more standardized, since its use cases mainly revolve around service delivery.

The User Edge, located at the other end of the continuum, is not directly connected to the centralized data centers. It comprises private devices such as smartphones and consumer wearables, as well as gateway devices such as IoT aggregators and switching and routing equipment.
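As a rough, illustrative condensation of this taxonomy (not an official LF Edge schema), the continuum can be summarized in a few lines of Python:

```python
# Illustrative condensation of the LF Edge continuum described above; not an official schema.
from dataclasses import dataclass


@dataclass
class Tier:
    name: str
    ownership: str   # who owns or operates the resources
    resources: str   # how constrained the compute is
    examples: str


CONTINUUM = [
    Tier("Centralized Data Center",
         "cloud providers",
         "virtually unlimited compute, storage, and networking",
         "hyperscale cloud regions"),
    Tier("Service Provider Edge",
         "service providers (shared, not user-owned)",
         "distributed, shared, relatively standardized",
         "telco and regional edge sites geared to service delivery"),
    Tier("User Edge",
         "end users (private devices and gateways)",
         "constrained and highly diverse",
         "smartphones, wearables, IoT aggregators, switches, routers"),
]

if __name__ == "__main__":
    for tier in CONTINUUM:
        print(f"{tier.name}: {tier.examples}")
```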

Report: Key Insights

Now in its fourth year, LF Edge’s State of the Edge report focuses more on insights from industry leaders than on a data-driven approach to what the future holds for edge computing.

For instance, Dean Bubley, founder and director of tech business advisory firm Disruptive Analysis, shares his view on how differently people perceive what exactly the edge is. “When I start a conversation about the edge, I always calibrate where people are on the scale of things. Some think of the edge as a megawatt data center in a Tier 3 city. Other people think the edge is a milliwatt processor on a sensor. And there’s another bifurcation. For some use cases, microseconds matter. For many others, as long as this year’s latency is better than last year’s latency, that’s good. I think the edge has maybe nine orders of magnitude in both latency time and power, about all of which people say, ‘That’s the edge.’ Different magnitudes of edge apply in different conversations.”

So what does edge encompass?

There are a lot of components to edge computing, some of which are discussed in LF Edge’s State of the Edge report. Data centers, spanning hyperscale, regional, interconnection, micro-modular, and street-side cabinet facilities, are one of the critical pieces of infrastructure for edge computing.

But besides data centers, newer critical infrastructure includes wireless towers, fiber optic connections, satellite connectivity, their interconnections, and the vendors of such infrastructure.

Chetan Venkatesh, CEO and co-founder of Macrometa Corp, a cloud services company, said, “Applications need a robust data infrastructure and dependable data layer. At the edge, the problem is how to make this data layer reliable in a fully distributed way, across potentially hundreds of locations. I see three parts of the industry that are rapidly maturing in parallel, converging towards a singularity that’s going to create an explosion of value.”

The report also discusses the unique hardware and software requirements for building edge networks, which according to IDC will represent over 60% of all deployed cloud infrastructure by 2023. Hardware such as SoCs, GPUs, hyper-converged systems, AI processing chips, and SmartNICs (smart network interface controllers) is all in the works. Hardware is central to edge computing, since IoT devices, and even non-processing devices such as CCTV cameras, need processing power close at hand.

Jeffrey Ricker, CEO of IoT and cloud data services company Hivecell, explained, “A lot of edge data may never even reach the cloud. Cargo ships have gone from hundreds of sensors on board to tens of thousands. We will never push all that data to the cloud over a satellite connection. We need cloud-like compute power on the ship that can run machine learning models.”

He added, “But wait, it gets more fun… Cargo ships are only in port a day or two at most, at the whim of weather and port traffic. How will you schedule a technical crew to install the hardware? The answer is, you can’t. Data center hardware will not work in this edge use case.”

And since 75% of data is expected to be created and processed within the edge environment by 2025, that’s where the appropriate hardware needs to be. As such, the Linux Foundation is bringing in Open19, an open-source standard for physical and operational hardware at the infrastructure edge.

Besides processing power, hardware is also essential for storage needs. And then there is Software-Defined Networking (SDN) and Network Functions Virtualization (NFV), both of which are technologies for virtualizing network functions in distributed environments.

Both SDN and NFV are part of an evolving network architecture that edge networks require. Key considerations for a suitable edge network include latency, jitter, hop count, and tail latency, not to mention raw throughput; a sketch of how such metrics can be measured follows the diagram below.

Evolving Network Architectures | Source: Linux Foundation
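To ground those terms, here is a small, hypothetical Python sketch that derives average latency, a simple jitter proxy, and p99 tail latency from a set of round-trip-time samples; the sample values are made up for illustration.

```python
# Hypothetical sketch: derive the edge-network metrics mentioned above
# (average latency, a jitter proxy, p99 tail latency) from round-trip-time samples.
# The sample values are made up for illustration.
import statistics

rtt_ms = [4.1, 3.9, 4.3, 4.0, 5.2, 4.1, 3.8, 4.4, 9.7, 4.0]  # round-trip times in ms


def p99(samples: list[float]) -> float:
    """99th-percentile (tail) latency using the nearest-rank method."""
    ordered = sorted(samples)
    rank = max(0, round(0.99 * len(ordered)) - 1)
    return ordered[rank]


avg_latency = statistics.mean(rtt_ms)   # typical delay
jitter = statistics.pstdev(rtt_ms)      # simple proxy for jitter: spread of the delays
tail = p99(rtt_ms)                      # the delay the slowest requests actually experience

print(f"avg latency: {avg_latency:.2f} ms, jitter: {jitter:.2f} ms, p99: {tail:.2f} ms")
```

For latency-sensitive edge use cases, the tail value often matters more than the average, which is why tail latency appears alongside latency and jitter in the report’s list of considerations.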

Victor Bahl, Ph.D., Technical Fellow & CTO at Azure for Operators at Microsoft, said, “The newer direction of edge computing is about making computing part of the networking infrastructure fabric. Edge computers that make things like autonomous driving, AR and VR experiences, fast action cloud gaming, IoT analytics, live video analytics, and many more such applications possible will now also provide core networking services.”

“The pace of innovation is incredibly fast, but it takes time for paradigm-shifting ideas to seep in especially when they’re possibly disruptive. The cloud was a big idea. Edge is an equally big and perhaps an even bigger idea. It’s about enabling ubiquitous low-latency computing. Thankfully, I no longer have to convince people on the need for edge computing,” he added.

Like hardware, software is the other big enabler of edge computing: it is the means to deliver and manage workloads and operations. The software stack comprises three layers for workload orchestration, namely the systems layer, the operating system, and the hypervisor.

Despite multiple new software innovations, Ashok Iyengar, Garage Solution Engineering for Network Edge Computing at IBM Cloud, believes that we are still lacking a proper edge app that fulfills all expectations, and that AI and 5G will be at the center of it.

Iyengar said, “While there are many edge applications, people are still looking for that killer app. Why is that? Is it a case of unrealistic expectation or did we over promise? I believe we just have to find the right recipe of combining AI, 5G, analytics, and hitherto unheralded technology.”

To speed up the process, VMware’s Open Source Lead for IoT and Edge, Malini Bhandaru, suggested opening up edge development to more people. “As we try to reach a larger landscape of applications and solutions that have different demands, the needs change. There’s a lot of engineering effort going into building protocols that can be standardized,” she explained.

“The approach is a stone soup: let’s come together, build it together and then leverage it together. By making these open-source projects and the infrastructure available, you’re opening the floodgates to many more adopters, to many more applications and solutions coming to market sooner.”

See Also: Close to the Edge: Automated Data Pipelines for Real-time Processing

Closing Remarks

Edge certainly is the next big thing after the cloud, expected to spawn an $800 billion industry over 10 years. However, it isn’t yet developed enough for large-scale adoption, even though that is exactly where the industry is heading. Before edge is adopted at industry scale, and whether or not it is developed as an open-source model, two things need to happen:

1. Edge security needs to be revisited to build confidence. This includes physical, logical, and application security.
2. Deployment costs need to be reduced.

Some of the other LF Edge projects include Akraino and EdgeX Foundry (both in Stage 3: Impact); EVE, Fledge, Home Edge, and Open Horizon (Stage 2: Growth); and Baetyl and Secure Device Onboard (Stage 1: At Large). State of the Edge is currently in the Growth stage.

Let us know if you enjoyed reading this news on LinkedIn, Twitter, or Facebook. We would love to hear from you!