This case study is the first in a series of enterprise excellence whitepapers.
Recent years have witnessed a significant push to deploy data and quality governance (DQG). While some initiatives were successful, most were not! Why?
At the core, data and quality governance (DQG) is about ensuring that enterprise performance is optimally managed and risks are mitigated. Several components predicate governance success, and all of them promise nothing less than changing the very way we conduct business. Easy enough? In this paper, we discuss the core premises for DQG success; future installments in the series will examine each premise in greater detail.
First of all, why all the buzz for DQG? In the wake of extensive ERP and CRM deployments, BI initiatives quickly followed—all with the promise of streamlining enterprise performance and delivering just-in-time (JIT) information management. Similarly, CDI and MDM were often deployed in parallel with BI solutions. While these large investments in many instances yielded an integrated view of the customer and of key functions, information quality and accuracy were a different matter!
Design premises for DQG, as the name implies, must cater to understanding the data flow, the information needs of the respective stakeholders, and the critical success drivers before we begin to think about governance. Therefore, and here we go: “I submit that data governance will fall short unless the following design premises are made part of the whole:”
Case in point: here are a few examples from the subject of this case study, a $40-billion global manufacturing company.
Premise 1: “Data & quality governance is a holistic enterprise transformation”
First, the DQG team proposed a data vision for manufacturing operations:
“Enhance confidence, agility and accuracy in decision making by empowering internal and external customers with consistent, reliable and relevant information”
In practice, this translated into the following:
1. Confidence, consistent, reliable information: install two-tiered metrics, with (1) quality confidence metrics in support of (2) key functional metrics (see the sketch after this list)
2. Agility: reengineer the way we input and enrich information by using a single information source with JIT views for stakeholders
3. Empowering internal and external customers: reengineer critical customer-facing and internal information paths to cater to both the external and internal Voices of the Customer (VOC). In other words, what information matters most to these stakeholders? This brings us to premise 2.
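Before we do, here is a minimal sketch of the two-tiered metrics from item 1: a quality confidence score published alongside the functional metric it supports. It is illustrative only; the field names, completeness and validity shares, and the on-time-delivery figure are hypothetical, not the manufacturer's actual metrics.

```python
# A minimal sketch of two-tiered metrics: a quality confidence score
# reported next to the functional metric it supports.
# Field names and all values below are hypothetical illustrations.

def quality_confidence(field_scores):
    """Tier 1: average of completeness x validity across critical fields."""
    scores = [c * v for c, v in field_scores.values()]
    return sum(scores) / len(scores) if scores else 0.0

# Per-field (completeness, validity) shares for the critical data elements.
critical_fields = {
    "ship_date":     (0.98, 0.95),
    "promised_date": (0.97, 0.99),
}

# Tier 2: the functional metric (e.g., on-time delivery) is published
# together with the confidence in the data behind it.
on_time_delivery = 0.92  # hypothetical functional metric value
confidence = quality_confidence(critical_fields)
print(f"On-time delivery: {on_time_delivery:.0%} (data confidence: {confidence:.0%})")
```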
Premise 2: “DQG is about business process improvement; technology is the enabling arm, not the other way around!”
Second, the DQG team's execution steps built on the following guidance:
> “Leverage current state assessment to identify data management and data governance priorities
> Define and establish data governance processes, policies and organization
> Build and automate data quality scorecard for key operational metrics”
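The third guidance item, an automated data quality scorecard for key operational metrics, could take a form like the minimal sketch below. The metric names, scores, and red/amber/green thresholds are hypothetical placeholders for whatever the governance body actually tracks, not the team's implementation.

```python
# Illustrative sketch of an automated data quality scorecard for key
# operational metrics. Metric names, scores, and thresholds are
# hypothetical and would normally be fed by upstream quality checks.

THRESHOLDS = {"green": 0.97, "amber": 0.90}  # below amber -> red

def status(score):
    if score >= THRESHOLDS["green"]:
        return "GREEN"
    if score >= THRESHOLDS["amber"]:
        return "AMBER"
    return "RED"

def scorecard(metric_scores):
    """Render a simple scorecard line per metric: name, score, RAG status."""
    return "\n".join(
        f"{name:<30} {score:>6.1%}  {status(score)}"
        for name, score in sorted(metric_scores.items())
    )

# Hypothetical quality scores for a few operational metrics.
print(scorecard({
    "order_accuracy": 0.985,
    "inventory_record_match": 0.93,
    "supplier_master_completeness": 0.88,
}))
```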
For process optimization, building on best practices such as the DMAIC (Define, Measure, Analyze, Improve, Control) methodology is bound to accelerate the DQG cycle:
For the Define, Measure and Analyze phases, define the current and desired future state, including data attributes and system relationships; assign success measures to the future state, then analyze the “as is” points of failure, such as a broken data flow or an incomplete or ill-conceived policy.
The Improve phase will cater to your analysis findings. Improvements in data flow may include data sourcing, standardizing, validating, matching, and consolidating, to name a few.
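As an illustration of what an Improve phase might automate, the sketch below (Python with the pandas library) walks through standardize, validate, and match-and-consolidate steps on a toy record set. The column names and business rules are hypothetical assumptions, not the case-study company's actual pipeline.

```python
# Illustrative sketch of Improve-phase data flow steps: standardize,
# validate, then match and consolidate. Columns and rules are hypothetical.
import pandas as pd

records = pd.DataFrame({
    "customer_name": [" Acme Corp ", "ACME CORP", "Globex"],
    "country_code":  ["us", "US", "XX"],
})

# Standardize: trim whitespace and normalize case.
records["customer_name"] = records["customer_name"].str.strip().str.upper()
records["country_code"] = records["country_code"].str.upper()

# Validate: flag rows that break a simple business rule.
valid_countries = {"US", "CA", "MX"}
records["is_valid"] = records["country_code"].isin(valid_countries)

# Match and consolidate: collapse records that agree on the standardized key.
consolidated = (
    records[records["is_valid"]]
    .drop_duplicates(subset=["customer_name", "country_code"])
)
print(consolidated)
```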
Premise 3: “Execution of DQG succeeds in proportion to the degree to which organizational adoption and maturity models are understood”
Third, cross-functional participation for governance included the following in the DQG team charter:
“Establish a functioning governance body to drive data priorities, decisions and metrics across manufacturing operations”
The journey of data evolution often refers to the following maturity stages:
(1) Undisciplined → (2) Reactive → (3) Proactive → (4) Optimized (with governance)
Start with building a realistic, time-bound transformation roadmap with milestones for each phase. Because data and quality governance is abstract to many stakeholders, adoption risk is often mitigated by selecting one functional area and building a prototype first.
The governance quest: governance is the last phase in the data maturity continuum, and it cannot be accomplished without understanding and integrating a change management framework that supports iterative organizational adoption.
Define clear accountability for a process and its data elements. You may have heard some of these terms: Data Steward, Data Custodian, Metrics Steward, and Data Trustee. When defined clearly and executed as part of a performance management system, all of these roles play a pivotal part in cultivating a culture of cross-functional data accountability; we will expand on them in the next installment. Key components of the data governance phase may include the following:
I. Data Quality process: Ensures clear data requirements with a single source of truth (SSOT)
II. Data Profiling: Ensures consolidation of data (parse, match, validate)
III. Data Monitoring: Provides proactive monitoring and notification of “process X” data health (see the sketch after this list)
IV. Data Governance: Enables clear accountability for “process X, Y, Z” data by defining roles for Data Stewards and Trustees
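To make the monitoring component concrete, here is a minimal sketch of a proactive data health check that profiles required fields and notifies the accountable steward when completeness drops below a threshold. The field names, threshold, and the notify() stub are hypothetical assumptions, not the team's implementation.

```python
# Minimal sketch of proactive data monitoring: profile a data set,
# score its health, and notify the accountable steward when the score
# falls below a threshold. Names, threshold, and notify() are hypothetical.

def profile(rows, required_fields):
    """Simple profile: completeness share per required field."""
    total = len(rows) or 1
    return {
        field: sum(1 for r in rows if r.get(field) not in (None, "")) / total
        for field in required_fields
    }

def notify(steward, message):
    # Stand-in for an email, chat, or ticketing integration.
    print(f"[ALERT to {steward}] {message}")

def monitor(rows, required_fields, steward, threshold=0.95):
    stats = profile(rows, required_fields)
    health = min(stats.values()) if stats else 0.0
    if health < threshold:
        notify(steward, f"Data health {health:.0%} below {threshold:.0%}: {stats}")
    return stats

rows = [{"part_no": "A-1", "plant": "TX"}, {"part_no": "", "plant": "TX"}]
monitor(rows, ["part_no", "plant"], steward="process-X data steward")
```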
Leading practices in this domain continue to evolve. CobiT, for instance, while I would argue it remains incomplete, nonetheless offers good insights into governance frameworks.
Key takeaways for planning and executing a DQG framework:
1. Start with defining your data vision and execution strategy; this is best done by involving multiple functional leads. Build a realistic roadmap.
2. Borrow from a proven reengineering approach. Understand the data flow paths and the organizations they touch. Draw on existing frameworks, for example the DMAIC methodology and PMI’s risk management practices; other practices such as CobiT and ITIL provide additional insights.
3. Adopt a change management approach. Harvard Business Review offers many case studies and whitepapers on this topic; see, for example: http://www.amazon.com/Harvard-Business-Review-Change-Paperback/dp/0875848842
We will take a deeper dive in the follow-up whitepaper, using this and other case studies.
See you soon,
--Terry Jabali
About the author: Terry has over 20 years of enterprise leadership experience, with a key emphasis on data, process, and technology transformations. He is a Six Sigma Master Black Belt and a best-practices contributor to CMMi® and PMI® certification standards. He is the managing director at Applied Enterprise Dynamics and can be reached at tjabali@pmoiq.net.