How the Slowdown of Moore’s Law Has Fueled the Rise of Computational Storage


In this article, Hao Zhong, CEO of ScaleFlux, explains that as Moore's Law winds down, organizations are struggling to find ways to quickly process the growing amount of data being produced. With the need to bring compute closer to the data, many are turning to computational storage.

Moore's Law has served as a framework for predicting the trajectory of technology for over 50 years. The principle states, "the number of transistors in a dense integrated circuit (IC) doubles about every two years." In simpler terms, the compute power of personal computers and other devices roughly doubles every two years, while the cost per transistor falls. A true win-win. It's this rate of acceleration that has made possible some of the most life-changing and promising technologies in existence today.

But all good things must come to an end. The volume of data being generated today is unprecedented; approximately 90% of it has been created in the past two years alone. We're living in very different times than Gordon Moore did. And as technologies like artificial intelligence (AI) accelerate the pace of advancement, Moore's Law is slowing down significantly. Some experts, like MIT computer scientist and parallel computing pioneer Charles Leiserson, have even gone so far as to say Moore's Law is over. But whether it's over or just slowing down, it's critical that we focus on building infrastructure that facilitates continued innovation and adapts to the new challenges of our changing world.

Today, central processing units (CPUs) simply can no longer sustain the historical performance growth rate of Moore's Law. For CPUs to keep up with the current explosion in data, modern infrastructure must adequately support them with additional computational capability. This need has created extraordinary new opportunities for innovation within the storage industry, including computational storage.

Let’s take a look at how technology is shifting to accommodate these changes. 

Supporting the Data Center

Networking and storage are two major building blocks for data center infrastructure. It’s essential to increase both the computational capability and intelligence of these components to combat the challenges impacting CPUs as a result of the slowdown of Moore’s Law. 

Graphics processing units (GPUs), tensor processing units (TPUs), and field-programmable gate arrays (FPGAs) are effective at offloading AI workloads from CPUs. Additionally, SmartNICs are helping CPUs by moving more network-related computation into the NIC chipset. Computational storage plays an integral role in supporting CPUs by offloading storage-related compute functions.

The Benefits of Computational Storage

Transitioning storage-related computation functions from CPUs to storage devices provides three notable advantages. 

Firstly, it reduces data movement. With computational storage, there is no need to move all of an organization's data from solid-state drives (SSDs) to dynamic random-access memory (DRAM) and CPUs, since much of it can be processed directly on the storage devices.
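
To make that data-path difference concrete, here is a minimal Python sketch. The SimulatedComputationalDrive class is purely a stand-in for a drive that can run a filter next to the data; real products expose this capability through vendor drivers or NVMe command sets, not a Python object.

```python
# Illustrative stand-in for a computational storage device (not a real API).
class SimulatedComputationalDrive:
    """Simulates an SSD that can run a filter next to the data."""

    def __init__(self, records):
        self._records = records          # data "resident on the drive"

    def read_all(self):
        # Conventional path: every record crosses the bus into host DRAM.
        return list(self._records)

    def filtered_read(self, predicate):
        # Computational-storage path: the filter runs on the device,
        # so only matching records are moved to the host.
        return [r for r in self._records if predicate(r)]


drive = SimulatedComputationalDrive(range(1_000_000))

# Host-side filtering: move 1,000,000 records, then discard most of them.
hot_rows_host = [r for r in drive.read_all() if r % 1000 == 0]

# On-device filtering: only ~1,000 matching records ever leave the "drive".
hot_rows_device = drive.filtered_read(lambda r: r % 1000 == 0)

assert hot_rows_host == hot_rows_device
```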

Secondly, it markedly accelerates computing speed. Some functions, such as compression and encryption, can run orders of magnitude faster and accelerate application workloads if implemented in application-specific integrated circuits (ASICs) on the storage-device side rather than in CPUs.
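
For a rough sense of the host-side cost that a drive-side compression engine removes, the sketch below times software compression on the CPU using Python's standard zlib module. The payload and compression level are arbitrary, and absolute numbers will vary by machine.

```python
import time
import zlib

# Arbitrary, highly compressible payload (~11 MB) held in host DRAM.
payload = b"computational storage " * 500_000

start = time.perf_counter()
compressed = zlib.compress(payload, level=6)   # CPU-bound work on the host
elapsed = time.perf_counter() - start

print(f"compressed {len(payload):,} -> {len(compressed):,} bytes "
      f"in {elapsed:.3f}s of host CPU time")
```

With computational storage, this work happens in-line on the drive as data is written, freeing those host CPU cycles for the application.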

And finally, computational storage increases compute parallelism. The most abundant slots in a server are for SSDs. Because of this, whenever it’s possible to offload storage-associated computation to SSDs, it dramatically increases compute parallelism and thus scalability. 
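
The sketch below illustrates that fan-out, again with an in-process stand-in for a drive-resident compute engine rather than any real device API: each "drive" counts matching records in its own shard concurrently, so adding SSD slots adds workers.

```python
from concurrent.futures import ThreadPoolExecutor

class DriveShard:
    """Stand-in for one SSD with an on-drive filter/count engine."""

    def __init__(self, shard_id, records):
        self.shard_id = shard_id
        self.records = records

    def offloaded_count(self, predicate):
        # Work that would run on the drive's own compute engine.
        return sum(1 for r in self.records if predicate(r))

# One shard per SSD slot; a dense 2U server can expose 24 or more such slots.
shards = [DriveShard(i, range(i * 100_000, (i + 1) * 100_000)) for i in range(24)]

with ThreadPoolExecutor(max_workers=len(shards)) as pool:
    counts = pool.map(lambda s: s.offloaded_count(lambda r: r % 7 == 0), shards)

print(f"matching rows across 24 drives: {sum(counts)}")
```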

Computational Storage in Action

Computational storage is providing value for relational databases like MySQL, PostgreSQL, and MariaDB by reducing data storage costs and simultaneously improving database performance. 

Big data and data warehousing software such as Spark and Snowflake rely on the technology to improve data analytics performance by offloading pre-processing and filtering operations. Other technologies, like content delivery networks (CDNs), employ computational storage to accelerate security features for flash storage, while AI workloads benefit from the increased usable capacity of SSDs.
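
As a point of reference, the pattern being offloaded is the familiar filter pushdown shown in the PySpark sketch below. The dataset path and column names are made up for illustration; Spark here pushes the predicate down to the file-format scan, and computational storage extends the same idea down to the drive itself.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("pushdown-sketch").getOrCreate()

errors = (
    spark.read.parquet("/data/events")           # hypothetical dataset path
         .filter(F.col("status") == "error")     # predicate eligible for pushdown
         .select("event_id", "event_time")
)

# The physical plan lists PushedFilters on the scan node when pushdown applies.
errors.explain()
```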

Despite the slowdown of Moore's Law, there are still ways to keep technological advancement on the fast track. Computational storage is doing its part by supporting the data center in a major way, thereby facilitating technological breakthroughs as well as advancing the day-to-day technologies we've all come to depend on. And as computational storage continues to accelerate data center workloads, it can also be leveraged for edge computing, where storage and compute are in equally high demand.
