In this article, Emily Washington, executive vice president of product management at Infogix, tells us that organizations wanting to maximize the value of their data take a business-friendly approach to data governance that prioritizes data understanding. Still, business users won’t leverage information they do not trust, making data quality a key element of any data governance program.
Companies of every size and across every industry are regularly using data analytics to fuel innovation, improve business outcomes and achieve critical objectives for competitive advantage. Business users are increasingly on the front line of these analytics initiatives, spending hours sifting through data in search of valuable customer insights, market trends and opportunities to enhance their business. That’s why organizations that wish to maximize the value of analytics insights must also create data-savvy business users who understand and appropriately leverage enterprise data assets.
Data governance is a critical tool that can increase data understanding among data consumers and help build a data-driven culture. Yet many organizations continue to view data governance as an IT initiative or struggle to establish enterprise adoption that encourages open communication across disparate business units and IT resources.
Companies that embrace a business-first enterprise data governance strategy not only promote communication between IT and business, but also foster collaboration between business and technical data users to build business-friendly governance tools like data catalogs. Data catalogs provide both technical information and business context for available data assets, along with business terms, definitions and data lineage. When organizations prioritize business user understanding, they are empowered to quickly turn data assets into actionable business insights. Nonetheless, business users won’t utilize information they do not trust, making data quality a crucial component of any data governance effort.
Merging Data Governance and Data Quality Efforts
Complex regulatory compliance, growing reliance on analytics and the ever-increasing speed and scale of data make data quality more critical than ever before. As data moves across the data supply chain, it is continuously subject to new uses, processes and transformations, exposing it to additional data integrity risk. Companies must monitor, track and reconcile data within and between every system and environment to solve quality issues before bad data creates significant business problems or generates faulty insights.
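The tracking and reconciliation described above can be sketched in a few lines of code. This is a minimal illustration, not Infogix’s product logic: it assumes simple dictionary-shaped records, and the field names (`id`, `amount`) and the two example systems are hypothetical.

```python
def reconcile(source_records, target_records, key="id"):
    """Compare record sets from two systems and report records that
    failed to arrive or changed in transit. Field names are illustrative."""
    source = {r[key]: r for r in source_records}
    target = {r[key]: r for r in target_records}
    # Records present at the source but never delivered downstream.
    missing = sorted(source.keys() - target.keys())
    # Records delivered, but with values altered somewhere along the way.
    mismatched = sorted(
        k for k in source.keys() & target.keys() if source[k] != target[k]
    )
    return {"missing_in_target": missing, "mismatched": mismatched}

# Hypothetical example: a billing system feeding a data warehouse.
billing = [{"id": 1, "amount": 100}, {"id": 2, "amount": 250}, {"id": 3, "amount": 75}]
warehouse = [{"id": 1, "amount": 100}, {"id": 2, "amount": 245}]
print(reconcile(billing, warehouse))
# {'missing_in_target': [3], 'mismatched': [2]}
```

In practice this comparison runs between every producing and consuming system in the data supply chain, so that a dropped or altered record is caught at the hop where it occurred rather than in a downstream report.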
Data governance comprises the people, technologies and processes that enable data users to easily understand, access and use enterprise data for business purposes, and it provides a framework for protecting data integrity across the entire data landscape.
Data governance also informs data consumers about data usage, meaning and quality levels, to build trust and encourage data utilization. But data trust requires comprehensive data quality rules to ensure reliable and accurate information.
Instituting Robust Data Quality Rules
To ensure data integrity enterprise-wide, organizations need to establish data quality checks as an integral part of a data governance framework, including:
Data profiling to ensure the completeness, consistency and conformity of data. Data profiling validations are critical for measuring the quality of data being used for different analytics projects.
Balancing and reconciliation to ensure that data arrives accurately at the appropriate location. These validations are used to monitor critical business processes and to make sure missing or inaccurate data doesn’t hurt a company.
Timeliness checks are necessary to make sure files arrive on time and to flag any late or missing files. Timeliness validations are especially important when dealing with outside data, to ensure third-party information is not negatively impacting a business.
Statistical checks are vital to validate data sets based on statistical values, like expected standard deviation or other industry-defined methods to locate any data abnormalities.
Reasonability checks are essential to verify data values meet expected thresholds. Both statistical and reasonability validations allow businesses to find data inconsistencies that otherwise would go unnoticed by traditional checks.
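The last two checks in the list can be sketched briefly. This is a simplified illustration, assuming a plain list of numeric values: the statistical check flags values far from the mean in standard-deviation terms, and the reasonability check flags values outside an expected threshold. The sample data and the two-deviation cutoff are assumptions for the example.

```python
import statistics

def statistical_check(values, max_deviations=2.0):
    """Flag values more than `max_deviations` standard deviations
    from the mean of the data set."""
    mean = statistics.mean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:
        return []  # all values identical; nothing to flag
    return [v for v in values if abs(v - mean) / stdev > max_deviations]

def reasonability_check(values, lower, upper):
    """Flag values outside the expected [lower, upper] threshold."""
    return [v for v in values if not (lower <= v <= upper)]

# Hypothetical daily transaction totals; 480 is an abnormality.
daily_totals = [102, 98, 105, 101, 99, 100, 480]
print(statistical_check(daily_totals))                    # [480]
print(reasonability_check(daily_totals, lower=0, upper=200))  # [480]
```

A traditional completeness or format check would accept 480 as a perfectly valid number; only the statistical and reasonability validations reveal that it is inconsistent with the rest of the data, which is exactly the gap these two checks are meant to close.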
With data quality-powered data governance, organizations can easily score and monitor data integrity, preventing data issues and building trust among business users. In addition, it gives both technical and non-technical users a full view of their data, leading to increased revenue, customer retention, competitive advantage and an enterprise-wide culture of data.