Derailers to Data Management Success

Data Management Evolution

A current-state assessment of how the data management industry has evolved over the past ten years uncovers some major derailers to success, apparent even in some of the leading organizations practicing data management.

Using the principle of What and How can help simplify the identification of accomplishments and gaps. Two primary components of data management are the Data Framework and the Data Management Practice. Each has a What side and a How side. Interestingly, the accomplishments and gaps of the two are inverted. The objective of this paper is to explore the root cause of the gaps and propose a path for closing them.

  • Data Framework – the structure to define and organize data
  • Data Management Practice – the development and execution of architectures, policies, processes, and procedures to manage the information lifecycle needs of an organization

Data Framework Gap: Who is Accountable?

Historically, business executives have unfairly deferred to the Technology function of the organization to “manage our data for us.” Business leaders have not accepted their accountability for the data generated from their business processes. They have erroneously viewed data as an output of a technology process rather than an output of the business process that the technical environment supports. They have not wanted to be encumbered with data management activities, seeing them as a distraction from their core business: selling a loan, treating a patient, or managing the finances and risk of the business.

Data has continued as a Technology responsibility even as data is increasingly recognized as a critical component for managing the business, meeting regulatory requirements, and, most importantly, delivering innovation and differentiation to customers. Early implementations of a formal data management function have mostly been aligned to the Technology function. In those organizations, the new role of Chief Data Officer typically reported to the Chief Information Officer or Chief Technology Officer.

Not to let the Technology executives entirely off the hook, many have been eager to keep accountability for data. In an environment where data is not governed, anyone who needed data would call Technology to gain access to it. As the keeper of the database, Technology became the default authority for granting access to the data. By accepting accountability for data, Technology continued to give the business a hall pass instead of demanding robust data requirements. When the business does not define requirements for data as part of designing the business process, the result will always be poor-quality data – yet this has traditionally been tagged as a failure of Technology when it is a failure of the business process design.

The Business is the owner of the business process that produces the data and thus must accept accountability to define and govern the data accordingly. This is the What side of the data framework. Essentially, the Business must specify what they want the data to do for them, with business rules to ensure the data captured is appropriate for all its intended uses. Once the data is defined with requirements that are integrated into the business process, Technology can deliver the How side of the data – how to control the capture, storage, and transport of data through the data ecosystem. Only then are people, process, data, and technology aligned to capture and maintain high-quality data.

Data Management Practice Gap: Standard Process Design

The Enterprise Data Management Council (EDM Council – an industry association founded to elevate the practice of data management as a business and operational priority) has developed the Data Management Capability Assessment Model (DCAM™). DCAM™ is a tool that defines the What: it describes the core component capabilities that should exist in a comprehensive data management practice. These capabilities can be categorized as follows:

  • Data Strategy
  • Data Program
  • Data Governance
  • Data Content
  • Data Quality
  • Data Collaboration

In addition to capability models like DCAM, several other inputs are defining the What. In some industries, regulators have described their expectations of a data management practice. Organizations have established data management standards along with data management strategies. Collectively, these represent the What of data management practice. Converting the What into tactical How executions, however, has been a series of one-off exercises, with little effort to design standard data management processes even across an organization, let alone across the industry.

These instances of one-off process design and execution in a business domain have a positive impact on the quality of the siloed data in that domain. However, the result is the creation of arbitrary barriers to working across domains in the organization. These hurdles arise not because one domain designed the process right and the other wrong, but because each simply designed a different process, both achieving a similar or acceptable result:

  • Different requirements for the same data.
  • Different governance processes.
  • Different data quality remediation processes.
  • Different technology and data management toolsets.
  • Etc.

These differences prevent working seamlessly not only across domains in the organization but also across organizations in an industry. This will only be accentuated as more organizations attempt to consume Big Data into their data ecosystems.

Closing the Gaps

There is now an opportunity to leverage the best practices of these various one-off instances and formally integrate them into a comprehensive set of standardized data management processes. Ideally, the process design would be vetted openly with data management practitioners from multiple organizations and industries through an industry group like the EDM Council. This effort could establish standard level 2-3 processes (on an L1-L5 scale, with 5 being the most granular). Industry-open design would then give individual organizations a solid starting point to fast-track custom level 5 designs that meet their unique execution needs while maintaining linkage to the standard designs.
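
As a minimal sketch of this idea, and assuming a simple parent-child representation of process levels, the example below keeps a custom level 5 procedure linked back to the standard level 2 and 3 processes it descends from. The process names are hypothetical.

  from dataclasses import dataclass, field
  from typing import Optional

  @dataclass
  class ProcessNode:
      """One node in a data management process hierarchy (L1 most general, L5 most granular)."""
      level: int                                   # 1 through 5
      name: str
      standard: bool                               # True if taken from an industry-standard design
      parent: Optional["ProcessNode"] = None
      children: list["ProcessNode"] = field(default_factory=list)

      def add_child(self, child: "ProcessNode") -> "ProcessNode":
          """Attach a more granular process beneath this one and return it."""
          child.parent = self
          self.children.append(child)
          return child

  # Hypothetical example: standard L2/L3 designs extended with a custom, organization-specific L5 design.
  l2 = ProcessNode(2, "Govern Data Quality", standard=True)
  l3 = l2.add_child(ProcessNode(3, "Remediate Data Quality Issues", standard=True))
  l5 = l3.add_child(ProcessNode(5, "Route loan-origination defects to the servicing work queue", standard=False))

  # The custom design retains its linkage to the standard designs through the parent chain.
  assert l5.parent is l3 and l3.parent is l2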

In addition to developing standardized processes, an individual organization can close the gaps in its data framework and data management practice only if the owner of the business process that produces the data becomes accountable for data management. Two critical tools are required for an organization to achieve adoption of this accountability:

  • Define the formal relationship between the business and technology in data management
  • Develop an operating model that appropriately engages all stakeholders in data management

Blackline Between the Business and Technology

As the owner of the business process that produces data, the Business must be accountable not only for the data but also for the practice of data management. The responsibility to execute the data management processes, however, is a partnership with Technology.

Within the data management practice, the Business executes the processes supporting the data What, and Technology executes the processes supporting the data How. Adopting the concepts of a Business Element as the What and a Data Element as the How helps clarify this.

A Business Element is a unit of information that has a distinct business meaning in the context of a business process or collection of processes within a business domain. It does not physically exist in the data ecosystem of a domain. Instead, it is a concept that, when given structure through a comprehensive set of requirements, can be physically executed in the form of a Data Element.

A Data Element is a specific unit of data that has precise meaning or precise semantics.  The Data Element is the physical execution of the requirements of the Business Element with additional supporting technical controls to ensure data quality is achieved and maintained as an element is captured and moves through the data ecosystem.

This is the basis for the blackline between the Business and Technology in an operating model. Stated simply, when executing data management processes, the Business is responsible for the Business Element and Technology is responsible for the Data Element. On the surface this appears oversimplified, but all data management processes revolve around the Business Element and the Data Element.
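
A minimal sketch of this blackline, assuming simple record structures: the Business owns the definition, business rules, and ownership of the Business Element, while Technology owns the physical naming, typing, and technical controls of the Data Element that executes it. The loan-amount example and every field name here are illustrative assumptions, not prescribed structures.

  from dataclasses import dataclass, field

  @dataclass
  class BusinessElement:
      """The What: defined, owned, and governed by the Business."""
      name: str
      business_definition: str
      business_rules: list[str] = field(default_factory=list)  # requirements covering all intended uses
      business_owner: str = ""                                  # the accountable business process owner

  @dataclass
  class DataElement:
      """The How: the physical execution of a Business Element, delivered by Technology."""
      implements: BusinessElement                               # linkage back to the Business Element
      physical_name: str                                        # e.g., a column in the system of record
      data_type: str
      technical_controls: list[str] = field(default_factory=list)  # capture, storage, and transport controls

  # Hypothetical example: a loan amount specified by the Business and executed by Technology.
  loan_amount = BusinessElement(
      name="Loan Amount",
      business_definition="The principal amount committed to the borrower at origination.",
      business_rules=["Must be greater than zero", "Must not exceed the approved product limit"],
      business_owner="Loan Origination",
  )
  loan_amount_column = DataElement(
      implements=loan_amount,
      physical_name="LOAN_AMT",
      data_type="DECIMAL(15,2)",
      technical_controls=["Not null at capture", "Limit check enforced before persistence"],
  )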

Data Ecosystem Stakeholder Operating Model

The data ecosystem within an organization, or even within an individual business domain, is a complex network of processes and stakeholders – Business and Technology. A stakeholder is an individual or group engaged in a data ecosystem with a vested interest in either the creation or the consumption of the ecosystem's data. It is accurate to categorize the stakeholders at a high level as business and technology functions; however, there are important subgroups within each of those categories.

[Diagram: Data Ecosystem Stakeholder Operating Model – business-oriented stakeholders (purple, first column), data management functions (center column), and technology-oriented stakeholders (green, third column)]

The purple section of the diagram contains the business-oriented stakeholders focused on the Business Element. Conversely, the green section contains the Technology-oriented stakeholders focused on the Data Element. The pivotal bridge between the two foci is the data architecture function.

The first column of the diagram contains three types of business process domain stakeholders.

Business Process – every business process domain is a data producer.

Data Consumers – some business process domains are also data consumers. They consume data from another business process domain as an input to their business process.  During the execution of their process, the consumed data is transformed or used to derive new data.

Data Analysis – a specific type of data consumer whose purpose is data analysis. This can range from simple data analytics and reporting, to more formal business intelligence, to highly sophisticated data science processes.

All three types of business process domains across the ecosystem impose What requirements on the business process and data of the data producer. These requirements must be integrated into the data set captured by the data producer's business processes.

The third column of the diagram contains two technology process stakeholders.

Technology Architecture – responsible for How the requirements across the ecosystem are achieved through the structure and design of the technical environment.

Application Technology – responsible for How the design, build, and run of the technical environment achieves the ecosystem requirements.

The center column of the diagram contains the drivers of the data management process.

Business Data Management – the business-aligned function accountable for facilitating, across the business stakeholders, the definition and governance of the Business Element.

Technology Data Management – the technology-aligned function accountable for facilitating, across the technology stakeholders, the fulfillment of the Business Element requirements through the technology environment.

Data Architecture – the bridge between the Business Element requirements and the Data Element execution, accountable for providing the framework to organize the data.
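
To summarize the operating model in one place, the sketch below maps each stakeholder group described above to its primary focus. The dictionary representation is an assumption made for illustration; the groupings themselves come from the model.

  from enum import Enum

  class Focus(Enum):
      BUSINESS_ELEMENT = "Business Element"  # the What
      DATA_ELEMENT = "Data Element"          # the How

  # Stakeholder groups from the operating model, mapped to the element(s) they focus on.
  OPERATING_MODEL = {
      # Business-oriented stakeholders (purple section, first column)
      "Business Process (data producer)": {Focus.BUSINESS_ELEMENT},
      "Data Consumers": {Focus.BUSINESS_ELEMENT},
      "Data Analysis": {Focus.BUSINESS_ELEMENT},
      # Drivers of the data management process (center column)
      "Business Data Management": {Focus.BUSINESS_ELEMENT},
      "Data Architecture": {Focus.BUSINESS_ELEMENT, Focus.DATA_ELEMENT},  # the bridge between the two foci
      "Technology Data Management": {Focus.DATA_ELEMENT},
      # Technology-oriented stakeholders (green section, third column)
      "Technology Architecture": {Focus.DATA_ELEMENT},
      "Application Technology": {Focus.DATA_ELEMENT},
  }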

Conclusion

The CxOs and their leadership teams should honestly assess whether they have empowered or required the owner of the business process that produces the data to be accountable for the data and the data management practice.  They should use the stakeholder operating model as a guide to evaluate their current stakeholder engagement.

If found to be lacking, it is time for an overhaul of their data management program.

  • Obtain CxO adoption of the blackline between the Business and Technology establishing data producer accountability.
  • Leverage CxO leadership and the stakeholder operating model to engage all required stakeholders.
  • Realign the office of data management model (central enterprise level and federated local domains) to the stakeholder operating model, with defined execution roles across all stakeholders.
  • Support industry organization efforts to develop mid-level (L2-L3) standard data management processes.
  • Modify the industry standard processes to align with the organizational culture and structure.
  • Complete level 5 data management process designs for both the central and federated processes.

Closing these gaps is critical to achieving a data ecosystem that can efficiently and consistently deliver the business strategies of the organization.

Join the Conversation

Please provide your feedback on any points raised in this paper. Specifically, if you believe that you and your organization could benefit from engaging with other industry practitioners to share best practices and develop standard data management processes, please respond with your voice of support and ideas for how to structure an open source process design framework. Share this with other practitioners – let’s get the crowd moving.

The wisdom of many is greater than one.

About the Author

Mark’s unique value proposition is a business versus technology perspective on process design and data management. After 20+ years of experience with a major financial institution in business-driven process design and data management executive roles, he now provides consulting services to help data management practitioners establish a business-centric data management program. He has also been engaged by the EDM Council to support the development of the How-side standard data management processes for the industry.


Mark Thomas McQueen
Founder, FutureDATA Consulting
+1.615.308.6465
mcqueen@futuredataconsulting.com
