How to Thrive in the Extreme Data Economy


“A data-powered business operates at hyper-scale, with hyper-complexity, and at hyper-speed.”

Grab a cup of coffee, clear a little space on your calendar, and take the time to read this essential and immersive interview on the shape of the Extreme Data Economy. Nima Negahban, CTO and co-founder of Kinetica, talks about big data, extreme data, and how best to navigate this time when businesses are faced with an explosion of data from all sources.

How have you seen the C-suite’s approach to data as a business driver change and evolve over the last several years? What have been the major milestones in the journey?

Businesses used to use data to validate their decisions. Then came the big data era, when business leaders turned to data to inform decisions that affected their overall strategies. Now, faced with a massive increase in the volume, complexity, and unpredictability of data driven by the Internet of Things (IoT) and digital transformation, we’re in the post-big data era: what’s called the Extreme Data Economy.

With so much data on hand, organizations, and even countries, are turning to data to power their operations and to serve as a new source of revenue. The data generated through business activities has often become more valuable than the activities themselves because of the rich intelligence and context it provides.

A key milestone along this data journey has been the introduction of graphics processing units (GPUs) to the data center. The unique architecture of GPUs offers massive parallel compute capability and enables businesses to analyze huge amounts of complex, fast-moving data.

The next key development has been the emergence of technologies that leverage GPUs to analyze extreme data. For example, a GPU-accelerated engine can help companies perform streaming data analytics, gain location-based insights, and streamline machine learning, all within a single platform. The ability to ingest, analyze, visualize, and act on billions of rows of data to uncover patterns and anomalies, while delivering actionable insights in milliseconds for real-time response, is critical in the Extreme Data Economy.
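
As a rough illustration of the kind of query such an engine parallelizes, here is a toy, CPU-bound sketch in Python; the column names, bounding box, and data are all hypothetical, and a GPU engine would run the equivalent filter-and-aggregate over billions of rows across thousands of cores:

    # Toy stand-in for a GPU-engine query: filter fast-moving sensor rows by
    # a geographic bounding box, then aggregate. All names and values here
    # are hypothetical, for illustration only.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 5_000_000  # a GPU engine would handle billions of such rows
    lat = rng.uniform(-90.0, 90.0, n)
    lon = rng.uniform(-180.0, 180.0, n)
    value = rng.normal(100.0, 15.0, n)

    # Equivalent of: WHERE lat BETWEEN 40 AND 41 AND lon BETWEEN -74.5 AND -73.5
    in_box = (lat >= 40.0) & (lat <= 41.0) & (lon >= -74.5) & (lon <= -73.5)

    # Equivalent of: SELECT COUNT(*), AVG(value) over the filtered rows
    print(f"rows in box: {in_box.sum():,}  avg value: {value[in_box].mean():.2f}")
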
As more business leaders come to understand the value of the data they possess, they need to be equipped with tools and technologies that allow them to monetize and maximize this valuable asset. Extreme data requires new solutions, as those created for big data can’t keep up.

Tell us more about your vision for the extreme data economy. Why is it crucial for business leaders across industries?

In 2011, the World Economic Forum identified data as a new asset class. How organizations utilize data will determine their success or failure moving forward. In the future, businesses may even be valued on the data they have and how they’re able to use it to power their business. As new data sources continue to emerge, the importance of data analysis has increased exponentially. It’s up to business leaders and technologists to evolve their thinking around data to the point where data shapes business strategies, drives investments, and enables hyper-growth.

Extreme data is a challenge that stretches across all industries. For example, in financial services, asset risk calculations are becoming harder, with hundreds of variables to simulate. In logistics, streaming data analysis for real-time fleet management and optimization is becoming increasingly complex. In the retail industry, challenges like micro-segmentation and micro-personalization mean solving a much more difficult data problem. And, in the telecom sector, analyzing exploding network and device usage is getting more complicated each day.
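
To make the financial-services point concrete, here is a minimal, hypothetical Monte Carlo value-at-risk sketch in Python; the portfolio weights and volatilities are invented, and the point is only that compute cost grows with the number of simulated paths times the number of risk factors:

    # Minimal Monte Carlo value-at-risk sketch (illustrative only; weights
    # and volatilities are made up). Cost grows with paths x factors, which
    # is why simulations over hundreds of variables strain CPU-bound systems.
    import numpy as np

    rng = np.random.default_rng(1)
    n_paths, n_factors = 100_000, 200            # hundreds of risk factors
    weights = rng.dirichlet(np.ones(n_factors))  # hypothetical portfolio
    daily_vol = rng.uniform(0.005, 0.03, n_factors)

    shocks = rng.standard_normal((n_paths, n_factors)) * daily_vol
    pnl = shocks @ weights                       # portfolio P&L per path
    var_99 = -np.percentile(pnl, 1)              # 99% one-day VaR
    print(f"99% one-day VaR: {var_99:.4%}")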

What are the typical challenges large and enterprise business leaders face when it comes to competing in an extreme data economy?

A data-powered business operates at hyper-scale, with hyper-complexity, and at hyper-speed. Coupled with the unpredictable and perishable nature of extreme data, this creates several significant challenges for business leaders. Organizations across many industries need:

Instant analysis of streaming data: Ingest data from any source and get insights simultaneously (a rough sketch of this follows the list)

Visual insights: Visualize billions of rows of temporal, geospatial, and streaming data to reveal new patterns and opportunities

Streamlined machine learning: Integrate machine learning and data analysis workflows to operationalize artificial intelligence
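
As a toy illustration of the first need, the sketch below keeps a sliding window over a simulated stream and flags anomalies as events arrive; the source, window size, and threshold are hypothetical, and a production engine does this continuously at vastly higher volume:

    # Toy sliding-window analysis of a simulated stream: keep the last N
    # readings and flag outliers as they arrive. Source, window size, and
    # threshold are hypothetical stand-ins for a real streaming engine.
    from collections import deque
    import random
    import statistics

    random.seed(0)
    window = deque(maxlen=500)

    for i in range(5_000):                       # stand-in for an event stream
        reading = random.gauss(50.0, 5.0)
        window.append(reading)
        if len(window) == window.maxlen:
            mean = statistics.fmean(window)
            stdev = statistics.pstdev(window)
            if abs(reading - mean) > 3 * stdev:  # simple anomaly rule
                print(f"event {i}: anomalous reading {reading:.1f}")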

If a CTO or CIO of a large organization in a ‘non-tech savvy’ industry is tasked with building a competitive data strategy for the business, where would she begin and what should she prioritize?

One of the first priorities would be to identify the data-intensive use cases the organization is trying to address. Does it involve real-time analysis of streaming data? A need to gain location intelligence at global scale? Or maybe a way to make machine learning algorithms accessible not only to data scientists, but to business analysts as well? Once the use case is defined, they can then outline the obstacles to implementation and success. Often a significant challenge is the presence of legacy systems that can slow digital transformation initiatives. Organizations should then consider which new technologies they will need to invest in to execute their data strategy and deliver solutions for their use cases.

The scale and complexity of the data challenge the organization faces will be an initial indication of the magnitude of change needed to compete and remain relevant in a data-first world. For a non-tech-savvy organization looking to significantly improve its use of data, this may mean considering cutting-edge technologies that can accelerate its transformation into a data-powered business.

Understanding and deploying data solutions that can provide the right insights in the least amount of time will be a game-changer for businesses and their customers.

One of the reasons technology leaders become less relevant is that they do not fully understand the language of functional leaders. What are your personal tips for CTOs and CIOs trying to better understand the language of business and to demonstrate the business impact of technology?

Tech leaders, such as CTOs and CIOs, should focus on getting business teams maximum ROI from investments in innovative technology. For example, a large AdTech organization needed to sweep through vast volumes of complex streaming data in milliseconds in order to create, target, and deliver ads with incredible speed and precision. To do so, it invested in additional technology: a GPU-based engine with artificial intelligence capabilities, allowing it to optimize auctioning by discovering patterns and uncovering hidden insights in sub-second time. The ability to run ad-decision algorithms allowed the company to easily target the right audience and display the ads likeliest to appeal to that audience: a tangible and substantial business benefit from investing in innovative technology.
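
A deliberately simplified sketch of that ad-decision step might look like the following; the features, weights, and 10 ms latency budget are hypothetical, standing in for a model running over far richer data:

    # Deliberately simplified ad-decision step: score each candidate ad for
    # an incoming request and pick the best one within a latency budget.
    # Features, weights, and the 10 ms budget are all hypothetical.
    import time

    WEIGHTS = {"relevance": 0.6, "historical_ctr": 0.3, "bid_price": 0.1}

    def score(ad: dict) -> float:
        return sum(WEIGHTS[k] * ad[k] for k in WEIGHTS)

    def choose_ad(candidates: list, budget_s: float = 0.010) -> dict:
        deadline = time.monotonic() + budget_s
        best, best_score = None, float("-inf")
        for ad in candidates:
            if time.monotonic() > deadline:      # never blow the auction SLA
                break
            s = score(ad)
            if s > best_score:
                best, best_score = ad, s
        return best

    ads = [{"id": i, "relevance": (i % 7) / 7, "historical_ctr": 0.02 * (i % 5),
            "bid_price": 0.5 + i % 3} for i in range(1000)]
    print(choose_ad(ads)["id"])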

Additionally, the ability to operationalize an entire ML pipeline, such as with GPU-optimized engines, will make it possible to bring AI and IoT to business intelligence cost-effectively. That, in turn, will enable an organization to begin realizing a satisfactory ROI on these and prior technology investments.
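
As one minimal sketch of what operationalizing a pipeline can mean in practice, the same fitted artifact that was trained is versioned, persisted, and reloaded for serving; scikit-learn and the file name are used purely for illustration:

    # Minimal sketch of operationalizing an ML pipeline: train once, persist
    # the fitted pipeline as a versioned artifact, then reload it to score
    # new data. scikit-learn is used purely for illustration.
    import pickle
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    X, y = make_classification(n_samples=1_000, n_features=20, random_state=0)
    pipeline = make_pipeline(StandardScaler(), LogisticRegression())
    pipeline.fit(X, y)

    with open("model_v1.pkl", "wb") as f:        # versioned artifact
        pickle.dump(pipeline, f)

    with open("model_v1.pkl", "rb") as f:        # what the serving side loads
        served = pickle.load(f)
    print(served.predict(X[:5]))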

If AI-powered BI is the future, and the complexity and velocity of data sources are only increasing, what mindset and skillset changes are needed today, across core functions, in order for large organizations to remain competitive tomorrow?

To remain competitive, it is incumbent upon business leaders and technologists to evolve their thinking around data to the point where data shapes business strategies, drives investments, and enables hyper-growth. In short, they need to become data-powered businesses.

In addition, it is important for leaders to focus on education: retraining and upskilling workers as they implement transformational technologies, such as AI and ML, enabled by the accelerated parallel processing of GPU-powered engines.

What technologies and trends are you tracking in data management as we go into 2020 and beyond?

Trend #1

Organizations Demand a Return on Their IoT Investments: Companies continue to invest in IoT initiatives, but the next step is IoT monetization. While collecting and storing IoT data is a good start for enterprises, what is more meaningful is understanding it, analyzing it, and leveraging the insights to improve efficiency. The focus on location intelligence, predictive analytics, and streaming data analysis use cases will increase dramatically to drive a return on IoT investments.

Trend #2

Enterprises Will Move from AI Science Experiments to Truly Operationalizing It: Enterprises have spent the past few years educating themselves on various AI frameworks and tools. But as AI goes mainstream, it will move beyond small-scale experiments to being automated and operationalized. As enterprises move forward with operationalizing AI, they will look for products and tools to automate, manage, and streamline the entire machine learning and deep learning life cycle. Investments in AI lifecycle management will increase, and technologies that house the data and supervise the process will mature.

Trend #3

Beginning of the End for the Traditional Data Warehouse: The traditional data warehouse is struggling to manage and analyze the volume, velocity, and variety of today’s data. While in-memory databases have alleviated the problem to some extent by providing better performance, data analytics workloads continue to become more compute-bound. We predict that enterprises will start to seriously rethink their traditional data warehousing approach and look at moving to next-generation databases that leverage memory, advanced processor architectures (GPU, SIMD), or both.
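
A small illustration of why these workloads reward SIMD- and GPU-style execution: the same filtered aggregation run row at a time versus vectorized over a whole column (NumPy's vectorized path benefits from SIMD under the hood; a GPU database extends the same columnar idea across thousands of cores):

    # The same filtered aggregation, row at a time vs. vectorized over a
    # column. The vectorized path benefits from SIMD; GPU databases push
    # the same columnar idea across thousands of cores.
    import time
    import numpy as np

    col = np.random.default_rng(0).uniform(0.0, 100.0, 2_000_000)

    t0 = time.perf_counter()
    total = 0.0
    for x in col:                                # interpreted, one row at a time
        if x > 50.0:
            total += x
    t1 = time.perf_counter()

    vec_total = col[col > 50.0].sum()            # vectorized columnar scan
    t2 = time.perf_counter()

    print(f"loop: {t1 - t0:.3f}s  total={total:.1f}")
    print(f"vectorized: {t2 - t1:.4f}s  total={vec_total:.1f}")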

Trend #4

Building Safer Artificial Intelligence with Audit Trails: AI is increasingly being used in applications like drug discovery and the connected car, where an incorrect decision can have a detrimental impact on human life. Enterprises will start to look at detecting exactly what caused an incorrect final decision that led to a serious problem. Auditing and tracking every input and every score that a framework produces will help trace a problem back to the human-written code that ultimately caused it.
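
A bare-bones sketch of such an audit trail: every scoring call appends its inputs, model version, output, and timestamp to an append-only log so that any later decision can be traced; the model, field names, and file path here are hypothetical:

    # Bare-bones AI audit trail: every scoring call appends inputs, model
    # version, output, and a timestamp to an append-only JSON-lines log,
    # so any later decision can be traced back. All names are hypothetical.
    import json
    import time

    MODEL_VERSION = "risk-model-v1"              # hypothetical model id

    def model(features: dict) -> float:          # stand-in for the real model
        return 0.7 * features["dosage"] + 0.3 * features["weight_kg"]

    def audited_score(features: dict, log_path: str = "audit.jsonl") -> float:
        result = model(features)
        record = {
            "ts": time.time(),
            "model_version": MODEL_VERSION,
            "inputs": features,
            "score": result,
        }
        with open(log_path, "a") as f:           # append-only by construction
            f.write(json.dumps(record) + "\n")
        return result

    print(audited_score({"dosage": 1.2, "weight_kg": 70.0}))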

About Nima Negahban:

Nima is the Chief Technology Officer and the original developer and software architect of the Kinetica platform. Leveraging his unique insight into data processing, he established the core vision and goal of the Kinetica platform. Early in his career, Nima was a Senior Consultant with Booz Allen Hamilton. He holds a B.S. in Computer Science from the University of Maryland.

About Kinetica:

Kinetica is the insight engine for the Extreme Data Economy. The Kinetica engine combines artificial intelligence and machine learning, data visualization and location-based analytics, and the accelerated computing power of a GPU database across healthcare, energy, telecommunications, retail, and financial services. Kinetica has a rich partner ecosystem, including NVIDIA, Dell, HP, and IBM, and is privately held, backed by leading global venture capital firms Canvas Ventures, Citi Ventures, GreatPoint Ventures, and Meritech Capital Partners. For more information visit kinetica.com.