Six Crucial Refinements to Data Strategy Conventional Wisdom

Although data strategy involves many best practices, some aspects of the conventional wisdom can be refined to be more effective in real life

Most data management professionals have read one or more books and articles on devising a data strategy and may have noticed some repeated themes. Although the guidance conveyed in these themes may be correct, it can also be dangerously vague, even when supported by hundreds of pages of elaboration. Many data management leaders have followed both the spirit and the letter of widely repeated principles and veered far off track, resulting in data and analytics initiatives that take too long, cost too much, or are severely misaligned with company priorities. For a leader who has followed a strategy based on rigorous study of well-regarded literature, these results can be confusing and disheartening – and extremely costly.

However, some subtle refinements to the conventional wisdom can produce a dramatic difference in results. After encountering the standard advice many times, it’s easy to interpret similar – but crucially different – suggestions as a repetition of the same old ideas. There are real differences between what is commonly known and the more specific guidance that will help create a successful data strategy.

Conventional Wisdom Refinement #1

Conventional Wisdom:

All data strategy efforts start by identifying the business value of data and analytics.

Refinement:

It is better to support a funded business initiative with good quality data and analytics.

Why it makes a difference:

Supporting funded business initiatives does something important for the organization – by definition. Proposing the business value of data and analytics projects independent of other funded initiatives can result in a competition with those other initiatives instead of contributing directly to their success.

Meanwhile, the other initiatives will be left on their own to collect the data for their analytic needs. Virtually all major initiatives need reporting and analytics, whether embedded within applications or delivered through specialized tools, and they could use help collecting the data effectively. The goal should be to provide this help by contributing to and reusing a shared data resource that advances in breadth and quality each time it supports a funded business initiative.

Conventional Wisdom Refinement #2

Conventional Wisdom:

A strong executive sponsor is essential for the data and analytics program.

Refinement:

The minimum requirement for executive support is an executive sponsor for data and analytics plus at least one sponsor of a funded business initiative that needs the data and analytics.

Why it makes a difference:

If there is an executive sponsor responsible for a funded business initiative willing to defend a project, then the effort is positioned appropriately, consistent with the previous principle. If it has been possible to recruit only an executive sponsor of the data and analytics initiative itself, it’s an indication that the effort is not aligned to the strategic value-producing priorities of the company. Having a single, strong executive sponsor can even have a downside – he or she can keep a doomed program alive for too long.  This can be evident when the one sponsor is distracted by other interests, takes on a new role, or leaves the company, and there is no sponsor of a funded business initiative willing to confirm that the data is absolutely needed.

Conventional Wisdom Refinement #3

Conventional Wisdom:

It is critical to regard the program as a continuing journey, not a destination.

Refinement:

Institutionalize the program by embedding data and analytics planning, implementation, and operation into the machinery of the organization.

Why it makes a difference:

In any large organization, new data and analytics requirements will emerge for as long as there are changes in business and technology. In other words – forever. But to meet these requirements effectively, it’s not enough to have a sponsor, a team, and a series of projects. To make the “journey” more systematic, a program needs to be ensconced in the organization by linking to existing functions outside of the data and analytics program itself.

For example, the organization’s strategic planning process should include identification of data needs so that new data initiatives are created in support of and at the same time as strategic business initiatives. The project funding process should examine all projects – not just “data and analytic” projects – to verify or recommend the role of shared data resources to serve the projects, preferably as part of a broader enterprise architecture function that offers shared resources of all kinds, including data resources. Project methodologies should establish standard tasks for proactive data management such as data profiling, logical and physical data modeling, and data quality monitoring, along with the role of the data steward and other business and IT data management roles. With this approach, the data and analytics program becomes an inherent part of the way the organization does business.
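To make one of these standard tasks concrete, the sketch below shows a bare-bones data profiling check in Python with pandas; the file name, columns, and 95% completeness threshold are illustrative assumptions rather than part of any particular methodology.

```python
# A minimal data profiling sketch using pandas. The file name and the 95%
# completeness threshold are illustrative assumptions only.
import pandas as pd

def profile(df: pd.DataFrame, completeness_threshold: float = 0.95) -> pd.DataFrame:
    """Summarize each column and flag those that fall below a completeness threshold."""
    summary = pd.DataFrame({
        "dtype": df.dtypes.astype(str),
        "non_null_ratio": df.notna().mean(),
        "distinct_values": df.nunique(),
    })
    summary["flag_incomplete"] = summary["non_null_ratio"] < completeness_threshold
    return summary

if __name__ == "__main__":
    customers = pd.read_csv("customer_extract.csv")  # hypothetical source extract
    print(profile(customers).sort_values("non_null_ratio"))
```

Run as a routine project task, a check like this helps turn data quality monitoring from an aspiration into a repeatable, auditable step.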

Conventional Wisdom Refinement #4

Conventional Wisdom:

Build an iterative plan – the “big bang” approach doesn’t work.

Refinement:

Deploy data just-in-time and just-enough to meet the needs of in-scope applications.

Why it makes a difference:

It’s true that the “big bang” approach to data and analytics doesn’t work. It’s also true that nobody does that. There are plenty of projects that are much too large in scope, but those projects are always either one-off projects or part of an even larger iterative plan. So, take this planning a step further. Scope the iterative projects so that each data element to be deployed and each data issue to be resolved are essential to meet the needs of named, targeted, in-scope application projects within supported business initiatives.

Without specific needs identified – in a rigid dependency relationship – there is no basis for controlling scope, even within an iterative block of work delivering just one or a few data subjects. If the objective is to deliver Customer data, for example, which of the potentially hundreds of possible attributes will be collected? How does the team determine the level of data quality needed? Is perfection possible? Should the team just ask the end users what they want? Aligning to supported application projects makes scope control much easier and ensures that every action related to data can be traced to a real business need, not just a desire to have “good” data.
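As a simple illustration of just-in-time, just-enough scoping, the hypothetical sketch below derives the Customer attributes to deploy from the requirements of named application projects; the project names and attribute lists are invented for the example.

```python
# A hypothetical illustration of "just-enough" scoping. The project names and
# attribute lists are invented for the example, not a recommended template.
CUSTOMER_ATTRIBUTES_BY_PROJECT = {
    "campaign_management_release_1": {"customer_id", "email", "consent_flag"},
    "credit_risk_scoring_phase_2": {"customer_id", "date_of_birth", "credit_limit"},
}

def in_scope_attributes(projects: dict[str, set[str]]) -> set[str]:
    """Deploy only the attributes that a named, funded application project requires."""
    scoped: set[str] = set()
    for required in projects.values():
        scoped |= required
    return scoped

if __name__ == "__main__":
    print(sorted(in_scope_attributes(CUSTOMER_ATTRIBUTES_BY_PROJECT)))
    # Everything else among the "hundreds of possibilities" waits for a project that needs it.
```

Any attribute that no funded project requires simply waits until one does, which keeps each iteration small without starving the shared data resource.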

Conventional Wisdom Refinement #5

Conventional Wisdom:

Integrate data with each project, contributing to coherent data resources iteratively.

Refinement:

Architect and design shared data resources using principles of scalability and extensibility.

Why it makes a difference:

One reason data management leaders believe they need to deliver entire data subjects one by one, rather than targeting specific applications, is that they are afraid they will miss something and have to do excessive rework when new requirements emerge.

But it doesn’t have to be that way. By applying principles of scalability and extensibility, it is possible to deliver only the data needed by in-scope applications, treating each data delivery as a puzzle piece that fits with all the others in the shared data resource.

For example, practices that promote scalability and extensibility include modeling data at the lowest level of detail, obtaining data from original sources, and building right-time and adjustable integration processes. Also, developing enterprise data governance and metadata management policies and standards can support multiple initiatives simultaneously. Each of these approaches promotes the highest level of reuse and minimizes rework, even if each data delivery project requires at least some new work to meet the needs of new applications. The more the efforts take this approach, the less work is needed to meet the needs of new applications because most of the data is already available, with more being added all the time.
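As one small, hypothetical example of what modeling data at the lowest level of detail makes possible, the sketch below stores atomic transactions once and derives each application’s view from them; the column names and values are invented for illustration.

```python
# A small sketch of "modeling data at the lowest level of detail": store atomic
# transactions once and derive each application's view from them. The column
# names and values are invented for illustration.
import pandas as pd

transactions = pd.DataFrame({  # atomic, reusable detail in the shared data resource
    "customer_id": [1, 1, 2, 2, 2],
    "txn_date": pd.to_datetime(
        ["2024-01-05", "2024-01-20", "2024-01-07", "2024-02-02", "2024-02-15"]),
    "amount": [120.00, 35.50, 80.00, 42.00, 19.90],
})

# A marketing application needs monthly spend per customer...
monthly_spend = (
    transactions
    .groupby(["customer_id", transactions["txn_date"].dt.to_period("M")])["amount"]
    .sum()
)

# ...while a later risk application reuses the same detail for a different view.
max_single_txn = transactions.groupby("customer_id")["amount"].max()

print(monthly_spend, max_single_txn, sep="\n\n")
```

Because the shared resource keeps the atomic detail, the second application reuses it with no rework – only a new query.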

Conventional Wisdom Refinement #6

Conventional Wisdom:

Enable self-service for data and analytics as much as possible.

Refinement:

Carefully differentiate between experimentation and production and treat each approach accordingly.

Why it makes a difference:

Qualified end users should have the ability to import raw data, leverage production data, and experiment with analytic methods to test hypotheses, build prototypes, and take immediate business action based on their analysis when there’s an opportunity to do so. But deploying production data is IT’s job – especially data that is needed across many business use cases. There should be a healthy interplay between end user experimentation and production delivery, with a careful articulation of who is responsible for what and when.

In too many cases, end user-developed systems become de facto production systems, which defeats the purpose of self-service because the end users become burdened with excessive maintenance and support. Worse, many end users perform redundant work managing the same data in slightly different forms across the organization. It’s much better for IT to deploy shared production data so the work is consolidated, and shared data resources benefit from the ever-increasing quality and integration requirements of multiple initiatives. If IT has challenges with production data planning and delivery, those should be addressed directly. Getting around the problem by deploying data haphazardly across the enterprise ultimately makes the underlying problem much worse.

Conclusion

When the author started the journey to build a data and analytics program, the process included reading all the well-known books on the subject and applying the core principles to the extent possible. Yet, even with all that study and careful planning, many significant and sometimes painful course corrections were required while learning what works in real life. Carefully considering these refinements to the conventional wisdom can help sidestep common struggles and produce a practical data strategy that works the first time, no matter how large or complex the organization.


Kevin M. Lewis

Kevin M. Lewis is Consulting Director for Teradata’s data strategy practice where he works across all major industries, helping clients establish and improve large-scale data and analytics programs. Kevin provides guidance in all areas of data and analytics including end-to-end methodology, organizational structure, governance, and business alignment. Kevin’s approach, based on his background in enterprise data architecture, guides clients to support the most important business initiatives of the enterprise while simultaneously contributing to a coherent foundation with every step.
