Setting Big Data Standards to Improve Data Collection and Usability

The importance of data management to commercial success may be an old chestnut in the evolutionary confluence of IT, information management and business that has come together in the Internet of Things. But for many companies, the big data that drives advanced artificial intelligence and machine learning technologies remains the Gordian knot – and it’s becoming increasingly critical for business intelligence as the volume of data expands.

The looming implementation of 5th generation mobile communications technology in the US, Europe and other developed markets promises still more profound change in a field that’s seen its fair share of digitally-driven disruptions – including the latest, the EU’s General Data Protection Regulation (GDPR), which takes effect in May.

Thanks to its ability to shift greater volumes of information at faster speeds, across a wider spectrum of frequencies, than the 4G standard it will replace, 5G will expand the IoT in ways that add depth and breadth to the “data lakes” generated by connected devices, from which companies must tease out operational and customer information in pursuit of greater efficiency and revenue growth.

5G as a Data Management Pain Point

However, just as it’s predicted that around half of companies that do business in the EU will fail to comply with GDPR notification and reporting requirements when the legislation comes into force, the arrival of 5G appears likely to be a pain point for enterprises already struggling with information overload.

According to an annual benchmarking study by credit-scoring service Experian last November, businesses are looking harder than ever at data as a tool for operational problem-solving and for meeting their customers’ demands. However, they’re finding that big data’s so-called six V’s – volume, variety, variability, veracity, velocity and value – are overwhelming their data management systems and analysis applications.

Small wonder that half the US companies surveyed plan to undertake data integration projects in 2018 aimed at providing a consistent view of data from disparate systems to the whole of the enterprise.

However, the solutions offered by vendors – focused on hardware, software and cloud-based platform and service delivery – take almost as many forms as the silos in which enterprise data is generated. Capturing information from different layers and refining that raw intake with rules-based analytics engines is their common formula for generating actionable insights from the data deluge.
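
As a rough illustration, a rules-based analytics engine of this kind can be sketched in a few lines of Python; the rule names, fields and thresholds below are hypothetical, and a production engine would typically load its rules from configuration rather than hard-coding them.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Rule:
    """A rule pairs a predicate over a record with the insight it should trigger."""
    name: str
    predicate: Callable[[Dict], bool]
    insight: str

# Hypothetical rules, for illustration only.
RULES: List[Rule] = [
    Rule("high_value_customer",
         lambda rec: rec.get("annual_spend", 0) > 50_000,
         "flag for account-management follow-up"),
    Rule("stale_record",
         lambda rec: rec.get("days_since_update", 0) > 365,
         "queue for data-quality review"),
]

def evaluate(record: Dict) -> List[str]:
    """Return the insights triggered by a single record."""
    return [f"{rule.name}: {rule.insight}" for rule in RULES if rule.predicate(record)]

if __name__ == "__main__":
    sample = {"customer_id": "C-102", "annual_spend": 72_000, "days_since_update": 12}
    print(evaluate(sample))  # -> ['high_value_customer: flag for account-management follow-up']
```

The point of the pattern is that the rules, not the plumbing, carry the business logic, so new insights can be added without rewriting the pipeline that captures and refines the data.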

The Human Factor in Information Management

Meanwhile, management consultants advise leaders to centralize ownership of data management and add expertise. Creating the human infrastructure needed for communicating strategy and monitoring progress is the organizational route to integrating systems and technologies that have evolved to solve specific business problems.

It’s also critical to creating and implementing standards for data gathering, analysis and reporting across the enterprise – all of which is essential for achieving efficiencies and realizing opportunities from 5G that boost the bottom line.

Neither prescription is cost-free. And, as with the considerable spend on GDPR compliance, addressing security concerns is vying for budgetary priority with building use cases for 5G and the IoT that might generate positive returns from investments in database management, analytics and reporting.

A 2016 study by management consultancy Capgemini and cloud software specialist Informatica found that more than half of the big data projects underway at the 200 companies surveyed failed to deliver a positive return on investment, and budgetary constraints topped the list of reasons why fewer than a third did more than pay for themselves.

Holistic Approach to Data Analysis and Management

The competition for resources between big data initiatives and data governance and security highlights the importance of adopting a holistic approach to data management. Doing so ahead of the introduction of 5G will enable companies to assimilate the packets of information emitted by networks of IoT sensors, arriving on a scale of terabytes per day for industrial users, with the structured customer and operational data that populates warehouses and databases.

It will also help them fold flows of unstructured data, such as the text, e-mail and video content that will get a traffic boost from the 5G standard’s capacity to enhance mobile communication, into those repositories.
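
A minimal sketch of what that assimilation might look like, assuming JSON sensor packets keyed by a device identifier and a structured asset table to join them against (both invented for illustration):

```python
import json
from typing import Dict, Iterable, Iterator

# Stand-in for structured warehouse records, keyed by device (hypothetical data).
ASSET_TABLE: Dict[str, Dict] = {
    "sensor-001": {"site": "Plant A", "owner": "Operations"},
    "sensor-002": {"site": "Plant B", "owner": "Maintenance"},
}

def enrich(packets: Iterable[str]) -> Iterator[Dict]:
    """Attach structured asset context to each raw JSON sensor packet."""
    for raw in packets:
        reading = json.loads(raw)
        context = ASSET_TABLE.get(reading.get("device_id"), {})
        yield {**reading, **context}

if __name__ == "__main__":
    stream = [
        '{"device_id": "sensor-001", "temp_c": 71.4, "ts": "2018-03-01T10:00:00Z"}',
        '{"device_id": "sensor-002", "temp_c": 64.9, "ts": "2018-03-01T10:00:05Z"}',
    ]
    for row in enrich(stream):
        print(row)
```

At industrial scale the same join would run on a streaming platform rather than an in-memory dictionary, but the principle is unchanged: the raw packet gains business meaning only once it is tied back to the structured records the company already holds.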

In the technology stack, that holistic approach focuses on improving quality through the capture, profiling and validation of data against pre-determined standards, using defined processes. Analyzed for insights and enriched with third-party information from customers and suppliers, centrally stored data can be made accessible across the enterprise for use at the edges, in business processes, event generation and reporting.
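
A hedged sketch of that capture-profile-validate step, with the required fields and allowed values invented for illustration, might look like this:

```python
from collections import Counter
from typing import Dict, Iterable, List

# Pre-determined standards for an incoming customer feed (field names are hypothetical).
REQUIRED_FIELDS = ["customer_id", "email", "country"]
ALLOWED_COUNTRIES = {"US", "DE", "FR", "GB"}

def profile(records: Iterable[Dict]) -> Counter:
    """Basic profiling pass: count how often each required field is missing."""
    missing = Counter()
    for rec in records:
        for field in REQUIRED_FIELDS:
            if not rec.get(field):
                missing[field] += 1
    return missing

def validate(rec: Dict) -> List[str]:
    """Return the violations of the defined standards found in a single record."""
    issues = [f"missing {f}" for f in REQUIRED_FIELDS if not rec.get(f)]
    if rec.get("email") and "@" not in rec["email"]:
        issues.append("malformed email")
    if rec.get("country") and rec["country"] not in ALLOWED_COUNTRIES:
        issues.append("unrecognised country code")
    return issues

if __name__ == "__main__":
    feed = [
        {"customer_id": "C-17", "email": "jane.doe@example.com", "country": "US"},
        {"customer_id": "", "email": "no-at-sign", "country": "XX"},
    ]
    print("missing-field profile:", dict(profile(feed)))
    for rec in feed:
        print(rec.get("customer_id") or "<blank id>", "->", validate(rec) or "passes standards")
```

Records that meet the standards can flow straight into the central store; the rest are routed back for correction, which is where the defined processes come in.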

Conducted against a background of continuous improvement in analytics and metrics, this work enhances data persistence and usability, helping to offset the costs of management and warehousing.

Governance and Stewardship in Business Intelligence

Within the organization, delegating data management to digital officers is a step toward better governance – and doing so at board level even more so, as it communicates top-level sponsorship of big data projects.

So, too, is the installation of data stewards who straddle IT and operations. By working with the business-line managers in the C-suite who own the relevant data and with those charged with handling it at the coalface, stewards can monitor the consistency and delivery of information across departmental and operational lines and adjust strategy based on business outcomes.

The resulting transparency is useful not only for improving accessibility; it also facilitates compliance with regulatory initiatives like the GDPR, which will require companies to gain consent from consumers before using their data, to enhance levels of security and to notify authorities promptly when breaches occur.

The new EU regime is already prompting companies to revisit their data management strategies ahead of its introduction and to create end-to-end reporting trails around uses of personal information.
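
As a purely illustrative sketch of what such a consent gate and reporting trail could look like (the register, purposes and field names here are assumptions, not a prescribed GDPR implementation):

```python
from datetime import datetime, timezone
from typing import Dict, Optional

# Hypothetical consent register: one entry per data subject and purpose.
CONSENT_REGISTER: Dict[str, Dict[str, bool]] = {
    "subject-001": {"marketing": True, "profiling": False},
}

def use_personal_data(subject_id: str, purpose: str, payload: Dict) -> Optional[Dict]:
    """Process the payload only when consent for this purpose is on record,
    recording the decision to support an end-to-end reporting trail."""
    consented = CONSENT_REGISTER.get(subject_id, {}).get(purpose, False)
    audit_entry = {
        "subject": subject_id,
        "purpose": purpose,
        "allowed": consented,
        "at": datetime.now(timezone.utc).isoformat(),
    }
    print("audit:", audit_entry)  # in practice this would be written to an audit store
    return payload if consented else None

if __name__ == "__main__":
    print(use_personal_data("subject-001", "marketing", {"email": "jane@example.com"}))
    print(use_personal_data("subject-001", "profiling", {"email": "jane@example.com"}))
```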

Gearing Up for the GDPR

These strategic revisions extend beyond integration exercises. Experian’s latest annual benchmarking survey found an across-the-spectrum rise in data management activity among 1,000 respondents in US and global markets, with fully half undertaking data management projects in the current year.

Big Four accountancy EY, in association with the International Association of Privacy Professionals, put spending by the Fortune Global 500 on GDPR compliance at $7.8 billion as of last November. Those companies anticipate an average of almost 250 GDPR enquiries per month, according to US-based digital security firm Senzing, extending across more than 40 databases and requiring almost 1,260 hours of employee search time to resolve.

As a result, projects aimed at boosting the pace of data extraction and transformation carry an underlying goal of monetizing that data to cover their costs. Standards that allow larger data volumes to be embraced are especially useful in recovering revenues lost to billing errors and omissions, in detecting fraud and in combating piracy.
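
For instance, a simple reconciliation of usage records against billing records, sketched below with made-up accounts and units, is the kind of check that becomes feasible once larger volumes can be processed consistently:

```python
from typing import Dict, List

# Hypothetical usage and billing feeds keyed by account (units are illustrative).
USAGE: Dict[str, float] = {"acct-1": 120.0, "acct-2": 75.5, "acct-3": 40.0}
BILLED: Dict[str, float] = {"acct-1": 120.0, "acct-2": 0.0}  # acct-3 was never invoiced

def reconcile(usage: Dict[str, float], billed: Dict[str, float]) -> List[str]:
    """Flag accounts where billed amounts fall short of recorded usage."""
    findings = []
    for acct, used in usage.items():
        invoiced = billed.get(acct)
        if invoiced is None:
            findings.append(f"{acct}: no invoice found for {used} units (omission)")
        elif invoiced < used:
            findings.append(f"{acct}: billed {invoiced} of {used} units (underbilling)")
    return findings

if __name__ == "__main__":
    for finding in reconcile(USAGE, BILLED):
        print(finding)
```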

Turning Big Data into Profit

Better data also helps raise conversion rates and lower costs through improvements in CRM, business intelligence and marketing. It does so through the AI and machine learning technologies that are poised to receive a powerful boost from the introduction of 5G.

When that torrent starts in 2020, companies that can leverage data analysis as much for process efficiency with IoT as for transformation of operational models will enjoy a decided advantage.

While most companies may not be able to slice through the knots of bigger and faster data flows at a stroke, addressing both the technical and business aspects of bringing data together at the enterprise level is the best way to profit from the changes around the corner.