Agile Program Framework for Data and Analytics


It is important to connect program-level agile frameworks with data and analytics delivery, and with the variety of application programs that will benefit from agile, flexible development.

As organizations move beyond agile for individual projects, they transition to program-level agile frameworks.  Although these frameworks provide some guidance on supporting agile programs with shared services from enterprise architecture, they do not directly explain the role of enterprise data and analytics.  More specifically, while it is easy to find ideas for applying agile to data and analytics in isolation, the widely adopted program-level frameworks do not explain how to connect data and analytics delivery with the variety of application programs that will benefit from these resources and capabilities.  As a result, many data and analytics leaders remain on the periphery of the transition, reluctant to dive in and use agile methods.

The hesitancy is understandable.  How can a data and analytics leader serve multiple iterative programs with the data users need – and keep the data integrated along the way – when the whole idea of agile is to be flexible, allowing for uncertainty as requirements gradually evolve?  Won’t this inevitably exacerbate the data proliferation problem?

Agile Data Management and Analytics Principles

It is counterintuitive, but adopting a program-level agile approach for data and analytics programs, in partnership with other agile programs and projects within the enterprise, can actually improve the ability to deliver data coherently.  Why?  Because the principles that make enterprise data and analytics successful within an agile program are essentially the same principles that have always applied – they simply become even more important.  These principles include:

  • Building a central team to deploy data while maintaining an interdependent relationship with application teams that require the data, with an appropriate division of responsibilities between the two
  • Deploying only the data needed when it is needed in direct support of applications and analytics that require the data
  • Architecting and designing with extensibility in mind so that each data deployment contributes to a shared resource, rather than proliferating and splintering data
  • Allowing for experimentation with the option to “fail fast” – without confusing this activity with rigorous production development (“fail fast and correct fast”).

Popular program-level agile frameworks offer mechanisms and a vocabulary to realize these principles, often for the first time.  For example, SAFe® for Lean Enterprises, without addressing data and analytics directly, includes several specific concepts that can be used to institutionalize good principles for deploying enterprise data and analytics.  In particular:

  • Agile Release Trains (ARTs) are long-lived groups of teams that deliver functional or enabling capabilities.  To coordinate activity across programs effectively, there should be a single ART (or a single agile “team” for smaller programs) to provide shared data to the ARTs that develop business functionality.
  • Enablers are backlog items that support functional backlog items.  Deployment of shared data should be positioned as enabler items, not functional items.  If there is no functional backlog item that requires the data, either the link is missing, or the data is being deployed prematurely and should be deferred.
  • Architectural Runway describes the deployment of enabling architecture on a just-in-time and just-enough basis.  Shared data should be deployed as part of the architectural runway in service to applications that require the data.
  • Lean Startup Cycle allows for testing hypotheses before committing to developing a production solution.  For data and analytics, this cycle promotes experimenting with new data sources and analytic techniques while preventing these experiments from becoming unstable “de facto” production solutions.
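The enabler rule above – shared data is deployed only when a functional backlog item requires it – is a traceability check that a program could automate during backlog refinement.  The following is a minimal sketch, assuming a simplified backlog representation (the `BacklogItem` structure, field names, and sample items are illustrative, not part of SAFe® itself):

```python
# Hypothetical backlog check: every shared-data enabler item should trace to
# at least one functional backlog item. If it does not, either the link is
# missing or the data deployment is premature and should be deferred.
from dataclasses import dataclass, field
from typing import List

@dataclass
class BacklogItem:
    name: str
    kind: str                                      # "functional" or "enabler"
    supports: List[str] = field(default_factory=list)  # functional items served

def unjustified_enablers(backlog: List[BacklogItem]) -> List[str]:
    """Return names of enabler items with no supporting functional item."""
    return [item.name for item in backlog
            if item.kind == "enabler" and not item.supports]

# Illustrative backlog for a data ART
backlog = [
    BacklogItem("customer 360 dashboard", "functional"),
    BacklogItem("deploy customer data", "enabler",
                supports=["customer 360 dashboard"]),
    BacklogItem("deploy supplier data", "enabler"),  # no consumer yet: defer
]

print(unjustified_enablers(backlog))  # → ['deploy supplier data']
```

In practice this kind of check would run against a real backlog tool rather than in-memory objects, but the rule is the same: a shared-data deployment with no functional consumer is a signal to defer, not to build ahead.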

Figure 1: SAFe® for Lean Enterprises Essential SAFe; Copyright © Scaled Agile, Inc.

There are many other elements of SAFe® and similar large-scale agile approaches that can be used to make the role of the data and analytics program more explicit within the larger organizational context.

This approach also helps to avoid the common extremes that lead to difficulties.  At one extreme, data proliferation is avoided by deploying data as a highly visible, shared resource to other programs.  At the other extreme, excessively large “foundational” projects are avoided by deploying only the data that delivers direct value, and no more.  In many cases, training in these and related concepts can support an agile data and analytics initiative.


Agile development is about flexibility, iterative value, rapid course correction, and effective collaboration within and across teams.  A program-level approach to agile development is exactly what organizations need for successful delivery of enterprise data and analytics.

A version of this article appeared on the Agile Data Strategy website.


Kevin M. Lewis

Kevin M. Lewis is Consulting Director for Teradata’s data strategy practice where he works across all major industries, helping clients establish and improve large-scale data and analytics programs. Kevin provides guidance in all areas of data and analytics including end-to-end methodology, organizational structure, governance, and business alignment. Kevin’s approach, based on his background in enterprise data architecture, guides clients to support the most important business initiatives of the enterprise while simultaneously contributing to a coherent foundation with every step.

© Since 1997 to the present – Enterprise Warehousing Solutions, Inc. (EWSolutions). All Rights Reserved
