A fact qualifier matrix can identify the specific data and usage needs for data delivery. However, a successful fact qualifier matrix relies on robust data definitions.
Many organizations attempt to provide DW/BI/analytics users with data delivery capabilities, but the products they deploy do not offer sufficient insight into the data sources, data usage, and meaning. As a result, the landscape is littered with project failures, cost and schedule overruns, and solutions that do not meet business value goals.
Technical and business resources are trained to deliver solutions, with a focus on the definition of requirements. One favorite solution is a fact/qualifier matrix. This tool was developed to help IT and the business identify the specific data and usage needs that will lead to a successful solution. In its simple form, it is considered an excellent way to manage requirements identification. Adding business data definitions increases the value of the matrix, especially if the definitions come from the organization's enterprise data model.
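To make the idea concrete, here is a minimal sketch of a fact/qualifier matrix in code. All fact and qualifier names below are hypothetical, not taken from any real model: facts (measures) form the rows, qualifiers (dimensions) form the columns, and a mark records that the business needs that fact analyzed by that qualifier.

```python
# Hypothetical fact/qualifier matrix: which facts must be sliced by which qualifiers.
qualifiers = ["Time", "Region", "Product", "Customer"]
matrix = {
    "Gross Sales":   {"Time", "Region", "Product", "Customer"},
    "Units Shipped": {"Time", "Region", "Product"},
    "Return Rate":   {"Time", "Product"},
}

# Render the matrix as a simple grid: an "X" marks a required fact/qualifier pairing.
for fact, quals in matrix.items():
    row = ["X" if q in quals else "." for q in qualifiers]
    print(f"{fact:14s} " + " ".join(row))
```

Even in this toy form, the grid makes gaps visible: if two teams mark the same fact against different qualifiers, that disagreement surfaces immediately instead of emerging after delivery.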
Enterprise Data Model and Fact Qualifier Matrix
If the organization has developed and populated an enterprise data model, and the appropriate metadata is available to explain it, completing a matrix is quick and efficient. If not, it is often an exercise in futility. Finalizing the matrix takes a long time, and by the time the business has adopted the matrix’s structure and content, the fields and derivatives often have evolved significantly from the initial matrix.
The challenge with the simple fact matrix (absent an enterprise data model) arises from the fact that the analyst builds it with someone from a specific part of the organization, trying to understand how that team wants to see data; ultimately, the matrix reflects that team's perspective. Other people may use the same terms with completely different meanings. Since the resulting solutions cannot be reconciled without a lot of effort, the problem gets covered up by the statement "our data is just too complex." No, it is not the nature of the data that is complex, it is the nature of the usage.
Using Technology to Solve Terminology Problems
Invariably, although much focus is put on the technologies, they will only be effective if the data definitions are established clearly. If an enterprise's data is clearly defined, managed, and governed, applying BI-type toolsets to manage it is an easy and extremely effective task. Think of the number and variety of front-end BI tools in the marketplace today. They range from simple reporting tools to analytics, dashboards/scorecards, and even heavy-duty data mining. All have their loyal customers who swear by, and sometimes at, their specific tools. Thus, ask if there is really only one good tool that will serve the solution, or if the tool is just a method of delivery.
It is easy to capture a user's imagination of the value of any technology through demonstrations. However, if the data were clearly defined and available on the appropriate platform, providing actual value would be straightforward. First, define the specifics of the rule that needs to be executed. Simple "if / then" logic is all it takes to begin to monitor that rule (not to mention there are several good rules engines that work extremely well against sound data). Yet, few organizations are able to realize this because they have not captured the full complement of their metadata, which is really a business, not a technical, issue.
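The "if / then" monitoring idea can be sketched in a few lines. This is a hypothetical example, not a reference to any particular rules engine; the field names and threshold are assumptions chosen purely for illustration, and the point is that once a metric has one governed definition, the rule itself is trivial to express.

```python
# Hypothetical sketch: monitoring a simple business rule with if/then logic.
# Assumes "net_sales" already has a single, governed definition upstream.
def check_rule(record, threshold=0.0):
    """Return an alert message if the record breaches the rule, else None."""
    if record["net_sales"] < threshold:
        return f"ALERT: {record['id']} net sales below {threshold}"
    return None

orders = [
    {"id": "A-1", "net_sales": 120.0},
    {"id": "A-2", "net_sales": -35.0},  # breaches the rule
]
alerts = [msg for r in orders if (msg := check_rule(r)) is not None]
print(alerts)
```

The hard part is not this logic; it is agreeing on what "net_sales" means before the rule ever runs.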
A specific example could be in the finance industry, centered on the definition of net sales. While sales viewed this as <gross – SGA expenses>, marketing looked at it as <gross – SGA – commissions>, and finance viewed it as <gross – SGA – commissions – rebates>. Through talking with various business leaders, it became apparent that this was an issue they had been dealing with for years. Since each area referred to it as "net sales" and they didn't come together to resolve the differences, it created deeper and wider distances among these business units. The truth was simply that there were three different outcomes they were looking for, and all three had measurable value with different definitions.
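Making the three formulas explicit, with distinct names, shows how easily the conflict dissolves once each definition is governed separately. The figures below are invented purely to demonstrate the divergence.

```python
# The three "net sales" definitions from the example above, named per business unit.
def net_sales_sales_view(gross, sga):
    return gross - sga

def net_sales_marketing_view(gross, sga, commissions):
    return gross - sga - commissions

def net_sales_finance_view(gross, sga, commissions, rebates):
    return gross - sga - commissions - rebates

# Identical inputs yield three different "net sales" numbers:
gross, sga, commissions, rebates = 1000.0, 200.0, 50.0, 30.0
print(net_sales_sales_view(gross, sga))                          # 800.0
print(net_sales_marketing_view(gross, sga, commissions))         # 750.0
print(net_sales_finance_view(gross, sga, commissions, rebates))  # 720.0
```

All three numbers are legitimate; the failure mode is giving them the same name.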
Do Not Build a Data Dump
An enterprise data model, with sound and complete metadata and strong business data governance, takes significant time to establish. Business leaders often demand specific solutions, yet they are strong thinkers and generally willing to discuss options. When IT and the business truly collaborate on how to satisfy business needs both now and in the future, the reality of "doing more with less" can be realized. One measure of IT leadership success should be the ability to properly inform the business of the complexities, challenges, and importance of a sound, scalable design. Yet most often, the solution is to grab whatever satisfies the immediate need and promise that all of these "one-off" solutions will magically fit together in the future.
Most solutions highlight the tools and technologies, not the design. Most large companies have re-architected their data warehousing / business intelligence solutions many times for this exact reason. When they try to understand why they spend so much every year maintaining their various solutions, they return to the drawing board, and ironically, the enterprise data management initiative is the first one cut. It is equivalent to IT acknowledging to the business that it will continue to deliver data solutions held together with duct tape and band-aids. We have all heard that the definition of "insanity" is doing the same thing over and over and expecting different results. In some cases, starting with a new enterprise data management approach may be the right way for your organization, but managing data and information as strategic assets is always a good objective.
Focus On Data Definition
By starting at the beginning and clearly understanding all of the information across the organization, leaders can establish a holistic baseline for all analytics. This usually results in significant business impact and changes that improve the quality of the data captured in all transactional systems.
By combining an Enterprise Data Governance program run by the business and comprehensive metadata that is incorporated into the delivery solutions, the organization now has the clarity to unlock the analytics and measurements required to impact strategy.
When questions of the definition, timeliness, and validity of data arise, the tools and business authority will be there to provide assurance. For a successful enterprise architecture, make sure there is a solid data architecture at the foundation. Then, services architecture becomes an exercise in development, not a never-ending battle to get agreement on terms and valid values.
One last key is to avoid data model / terminology perfection – this is the epitome of the “boiling the ocean” concept. Getting everyone to agree on every aspect of data terminology is a black hole that will eventually swallow all projects / programs. Getting executive business support and keeping this risk at the forefront of these efforts will maintain focus on the major goals – progress and collaboration – not perfect conformity.
Many organizations struggle to connect data definitions to requirements, and an enhanced fact qualifier matrix based on the organization's enterprise data model can support applications, integrations, and DW/BI/analytics for many years. With data governance and metadata management programs as the foundation, this tool can be very valuable.