AI, Cloud, and Telcos – A Tantalizing Interplay


AI will be pivotal in presenting 5G in its full, unvarnished form. The cloud-native 5G network will facilitate the gradual entry of AI into the management of real-time RAN algorithms, explains Kaustubha Parkhi, Principal Analyst, Insight Research.

5G is upon us, albeit in a truncated form. Sooner rather than later, though, telcos will seek to unleash the full force of 5G. It will include the standalone (SA) mode, new radio (NR), and the complete range of use-cases: enhanced mobile broadband (eMBB), ultra-reliable low-latency communications (URLLC), and massive machine-type communications (mMTC). Unveiling these features is not straightforward. It requires a complete reimagination of networks, including the network functions (NFs) themselves. 5G demands a network structure that is supremely agile.

This is where the cloud-native network functions (CNFs) come in.

CNFs present the next major step in decoupling network functions, principally the radio access network (RAN), from proprietary hardware. CNFs were preceded by virtualized network functions (VNFs), which sought, with some success, to port the virtual machine (VM)-hypervisor model to the NF stack. CNFs move a step further, ushering the container philosophy into the largely well-set world of NF engineering.

The effects of these changes have been far-reaching.

Practically every established telco equipment vendor — be it Ericsson, Nokia, Huawei, or ZTE — offers virtualized or cloud-native versions of their products. And then there are companies like Mavenir, Athonet, Baicells, CCN, Phluido, and Quortus, that offer virtualized or cloud-native versions of the RAN and the packet core as their principal product. Clearly, the reign of hardware-driven, proprietary RAN and packet core is facing a serious challenge.

The interplay of CNFs and 5G brings to the table a more radical prospect: the role of artificial intelligence (AI) and machine learning (ML) in the core network architecture. It is worth pausing here to make a few important distinctions.

The predominant discussion surrounding AI and telcos, especially in the context of 5G, straddles two strands:

  • How disparate AI applications will benefit greatly from the spectacular improvements in data throughput and network slicing abilities of 5G.
  • How AI will help telcos make sense of the overload of data that 5G-enabled use-cases will generate. The data in question mainly pertains to billing and customer care.

This article does not dive into what 5G can do for AI; nor does it focus on how AI is impacting billing and customer care, functions that are not exclusive to telco operations. Instead, it attempts to trace and project the journey of AI deeper into the heart of the cloud-native 5G network.


What Does the Cloud-Native Nature of 5G Mean?

CNFs ride on containers, and containers lend themselves favorably to AI and ML constructs. Given their portability and agility, containers are ideal vehicles for ensuring that the computing power required for running distributed AI applications reaches the destination in an agile and flexible manner. 

Individual microservices each contribute a strand of logic to a conventional software application. AI can be viewed in a similar manner: an application consisting of primitive algorithms, which can be broken down along individual strands. Typical AI primitives include pattern recognition, prediction, and regression, among other functions.
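As a toy illustration of this decomposition (the function names and the pipeline are invented for this sketch, not drawn from any vendor's stack), a pattern-recognition primitive and a prediction primitive can each live behind its own microservice boundary, with a thin orchestration layer composing their outputs:

```python
# Hypothetical sketch: an AI application broken into primitive strands,
# each of which could be deployed as its own microservice.

def detect_pattern(samples, threshold=0.8):
    """Pattern-recognition primitive: flag sample indices above a threshold."""
    return [i for i, s in enumerate(samples) if s > threshold]

def predict_next(samples, window=3):
    """Prediction primitive: naive moving-average forecast."""
    recent = samples[-window:]
    return sum(recent) / len(recent)

def run_pipeline(samples):
    """Compose the primitives the way an orchestrator would chain
    independently deployed microservices."""
    return {
        "anomalies": detect_pattern(samples),
        "forecast": predict_next(samples),
    }

result = run_pipeline([0.2, 0.5, 0.9, 0.4, 0.6])
```

Because each primitive is self-contained, either one could be retrained, rescaled, or replaced without touching the other, which is exactly the property that makes the microservice analogy attractive.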

Consider the methodologies for wrapping AI/ML models in containers. Though once novel and challenging, they are now well understood. DevOps lifecycle tools, such as GitLab and Jenkins build jobs, can be used for building the ML model and the container wrapper. Container registry and service tools, as well as cloud-based storage services, are mature products. ML platforms, such as AWS SageMaker, facilitate autotuning of models by enabling single-click training on petabyte-scale datasets.

The task of distributing AI/ML build and tuning updates is thus practically seamless. In this context, the isolation provided by microservices lends itself favorably to the differential evolution rates of individual primitives. AI microservices need to be reusable and organized hierarchically to facilitate their orchestration sequence.
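The orchestration sequence mentioned above can be sketched as a dependency-ordered start-up of reusable AI microservices. The service names and dependency graph below are invented for illustration; the point is only that a hierarchy of primitives implies a computable launch order:

```python
# Hypothetical sketch: ordering reusable AI microservices for
# orchestration from their declared dependencies (a simple
# topological sort; service names are invented).

def orchestration_order(services):
    """Return a start order in which every microservice runs only
    after the services it depends on."""
    order, resolved = [], set()
    pending = dict(services)
    while pending:
        ready = [name for name, deps in pending.items()
                 if set(deps) <= resolved]
        if not ready:
            raise ValueError("circular dependency among microservices")
        for name in sorted(ready):  # deterministic tie-break
            order.append(name)
            resolved.add(name)
            del pending[name]
    return order

services = {
    "feature-extraction": [],
    "pattern-recognition": ["feature-extraction"],
    "prediction": ["feature-extraction"],
    "policy": ["pattern-recognition", "prediction"],
}
launch_order = orchestration_order(services)
```

Here "policy" necessarily starts last, after both primitives it consumes, mirroring the hierarchical organization the paragraph describes.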


What Role Can AI Play at the Network Architecture Level?

We know that containers and AI are no strangers. What does their comfort level mean to the telcos?

As the RAN stack becomes increasingly cloud-native, AI can realistically graduate from managing customer experience or billing-related functions to network planning, management, and troubleshooting. Indeed, in the near future, AI can be called upon to play a role in the design and management of the RAN stack itself.

The potential upside in the troubleshooting function is vast. Imagine AI being able to spot anomalies in spectrum usage or network traffic patterns. In March 2020, Nokia launched the AI-based, Microsoft Azure-hosted AVA 5G Cognitive Operations framework to help telcos anticipate, predict, and mitigate network failures up to seven days in advance.
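As a much simpler illustration of the anomaly-spotting idea (this is a generic z-score detector sketched for this article, not Nokia's AVA implementation; the traffic numbers are invented):

```python
import statistics

# Illustrative sketch: flag traffic samples whose z-score exceeds a
# threshold as anomalies. Real systems use far richer models; this
# only shows the shape of the problem.

def traffic_anomalies(samples, z_threshold=2.0):
    """Return indices of samples far from the mean, in standard deviations."""
    mean = statistics.mean(samples)
    stdev = statistics.pstdev(samples)
    if stdev == 0:
        return []  # perfectly flat traffic has no outliers
    return [i for i, s in enumerate(samples)
            if abs(s - mean) / stdev > z_threshold]

# Steady load (Mbps, say) with one burst at index 5.
load = [100, 102, 98, 101, 99, 400, 100, 103]
```

Predictive frameworks extend the same principle forward in time: instead of flagging the burst after it happens, they learn the pattern that precedes it.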

Let us dig deeper and look at the role AI can play in managing the RAN stack itself.

The RAN is dynamic. RAN management, therefore, involves updating and tweaking different parameters at different update frequencies. Typically, design-level parameters require less frequent updates, and actual operating algorithms require updates on a millisecond basis. The more frequent the updates, the more attractive AI is as a management tool. 

Thus, ML is likely to find traction in manipulating the power control, radio resource management, beamforming, quality-of-experience optimization, modulation, scheduling, carrier aggregation, and multi-connectivity functions, among others. Some of these functions require updates at millisecond levels. This will lead to obvious challenges.
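The spread of update cadences can be made concrete with a small scheduling sketch. The parameter names come from the functions listed above, but the periods are invented, purely illustrative values:

```python
# Hypothetical sketch: RAN functions grouped by update cadence. A
# tick-driven loop picks the functions due for an AI-driven update.
# Periods are illustrative, in milliseconds.

UPDATE_PERIOD_MS = {
    "scheduling": 1,            # per-slot, millisecond-level
    "power_control": 10,
    "beamforming": 10,
    "carrier_aggregation": 100,
    "cell_design": 60_000,      # design-level, infrequent
}

def due_for_update(tick_ms):
    """Functions whose update period divides the current tick."""
    return sorted(name for name, period in UPDATE_PERIOD_MS.items()
                  if tick_ms % period == 0)
```

The faster the cadence, the more updates an AI controller must produce per second, which is where the article's "obvious challenges" begin.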


Challenges of Integrating AI With Network Functions

The challenge of integrating AI/ML with NFs is a subset of the challenges that confront CNFs. CNFs ride on finely defined strands of microservices, which take NF customization, optimization, and troubleshooting to a whole new level. Microservices are not without their own challenges, latency and inspection overload being the key areas.

In the context of AI and ML, the fine-grained nature of microservices creates an overload of parameters to monitor. One of the reasons CNFs find traction in 5G is the increasingly complex nature of the 5G NR, which, by some estimates, brings about an increase of several orders of magnitude in network operational data compared to 4G.

While AI is eminently positioned to handle these challenges, it needs to be fed with the proper data sets. Designers need to relook at the very core of their network functionality rules as well as their data collection and storage practices. These rules and practices need to be made amenable to being modelled for and read by AI tools.

An additional level of challenge arises based on where in the RAN stack the AI application is employed. The farther the application sits from the baseband unit, the more constraining the latency challenges become. It can thus be counterproductive to engage AI for managing real-time configurable algorithms.
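A back-of-the-envelope check makes the constraint visible. The `fits_budget` helper and all numbers below are invented for illustration; the point is that transport delay is paid twice per control decision:

```python
# Hypothetical sketch: does consulting a remote AI model fit within a
# RAN function's control-loop latency budget? Numbers are invented.

def fits_budget(inference_ms, transport_ms, budget_ms):
    """Round trip to the model (there and back) plus inference time
    must fit inside the control loop's budget."""
    return inference_ms + 2 * transport_ms <= budget_ms

# A model hosted far from the baseband unit (5 ms each way) cannot
# serve a 1 ms real-time scheduling loop, but it can comfortably
# retrain slower-cadence parameters with a 100 ms budget.
realtime_ok = fits_budget(inference_ms=0.5, transport_ms=5.0, budget_ms=1.0)
training_ok = fits_budget(inference_ms=0.5, transport_ms=5.0, budget_ms=100.0)
```

This asymmetry is what motivates the intermediate approach described next: keep AI away from the tightest loops and let it feed them instead.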

An intermediate approach will involve engaging AI in training parameters that are updated less frequently but require the crunching of complex and voluminous datasets. In most cases, these parameters are the variables that feed into the “real-time” algorithms. The training can be facilitated by setting up programmable APIs in the RAN stack at the centralized unit (CU).

AI and its application in RAN will require validation and testing methodology of a different kind – one that is capable of ascertaining that AI is living up to its promised results. This is a major opportunity for network testing solution vendors.

AI is thus poised to play a pivotal role in cloud-native 5G RAN design and management. The speed and the intensity of its progress will be decided by how quickly the RAN datasets lend themselves to AI application training. On a larger note, a broader consensus on the framework of a cloud-native RAN is also essential in making it AI-friendly.


Disclaimer: This article is an analysis of data available in the public domain. The author makes no claims of original research. References are voluminous and can be made available on request.