Discovering Usage Patterns for Analytical Applications

It is important to determine expected usage patterns for business intelligence and analytical applications before design, and to evaluate their actual usage after implementation to gauge success.

Building an analytical solution can be an arduous task, and many factors must come together to deliver it successfully: good planning, a solid architecture and design, proper installation and use of the front-end delivery tools, and validation that the retrieved data is accurate.  Yet however challenging the build, value is realized only when users actually leverage the system.

Too often, the project to provide the analytical solution is deemed a success, yet the usage never materializes.  In the short term, a project may well be judged by whether it was on schedule and on budget.  In the end, however, every effort is judged far more by the level of usage and the value received from the solution.  So how does a project team determine whether the analytical system is delivering value?

Audit Activity

Start by looking at the real usage that is occurring.  It is one thing to estimate, during the requirements phase, what activity users will have with the database.  It is quite another to see the level of activity that actually occurs at 3, 6, 9, or 12 months and beyond.  Following are some of the many important reasons for keeping an audit trail of all usage data:

  1. Security Auditing – Protection of the data in the system is usually a concern.  Often, security is defined to control access to the data, but no one is assigned to monitor the access that actually occurs, whether by people or by other systems.  Being able to prove that personal data is accessed only in a secure, authorized manner is crucial.
  2. Performance Monitoring – There are many minor enhancements that can greatly improve the overall experience of the users.  Simply adding an index or an aggregation table for commonly submitted requests can often be the difference between a frustrating user experience and business resources singing the praises of the application.  Knowing which queries run really long, which run frequently, and what data is most often requested will allow administrators to target performance enhancements.
  3. Usage Auditing – This is probably the most challenging category, but also the most rewarding.  One key is to leverage usage auditing for the same reason users rely on business intelligence – to discover how to improve the service provided with information.  Examine who is using the system (what type of user), for what type of activity, what data they are interested in, and so on.  This requires more than numbers; it requires analysis by data stewards and watching the user experience unfold side by side with the users.  Systems analysts may discover an application used for purposes beyond its strengths – reports pressed into service as dashboards, summarized data stretched into data mining, and the like.  (A minimal sketch of capturing such an audit trail follows this list.)
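To make the auditing concrete, the sketch below shows one way to capture an audit trail at the query layer.  It is a minimal illustration only, assuming a Python front end and SQLite; the table and column names (usage_audit, user_id, query_text, and so on) are hypothetical, not part of any particular product.

```python
# Minimal sketch of query-level usage auditing (hypothetical schema).
import sqlite3
import time

def init_audit(conn: sqlite3.Connection) -> None:
    """Create a simple audit-trail table if it does not already exist."""
    conn.execute("""
        CREATE TABLE IF NOT EXISTS usage_audit (
            user_id     TEXT    NOT NULL,
            query_text  TEXT    NOT NULL,
            started_at  REAL    NOT NULL,   -- epoch seconds
            duration_s  REAL    NOT NULL,
            row_count   INTEGER NOT NULL
        )
    """)

def audited_query(conn: sqlite3.Connection, user_id: str, sql: str, params=()):
    """Run a query and record who ran it, what it was, how long it took,
    and how much data it returned."""
    start = time.time()
    rows = conn.execute(sql, params).fetchall()
    conn.execute(
        "INSERT INTO usage_audit VALUES (?, ?, ?, ?, ?)",
        (user_id, sql, start, time.time() - start, len(rows)),
    )
    conn.commit()
    return rows
```

Even a log this simple supports all three goals above: who touched what data (security), which statements run long or often (performance), and which users and data are active (usage).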

Installing the applications and tracking results so that auditing is possible is only the first step.  If the audit trail is to be of any value, administrators must actively engage the appropriate resources to monitor the results.  Many areas and techniques can be explored to help ensure the success of the business intelligence or analytics solution.

Query Numbers Low – Why?

Administrators may wonder what to do when user access and query levels fall short of the defined requirements.  Reacting well to this situation matters far more than the shortfall itself.  First, look at the possible causes for the lack of usage.  Several techniques can be used to investigate; most importantly, solicit feedback from the entire user community – active and inactive.

  • A survey is an optimal way to discover who is using the system and who really sees potential in it.
    • If a survey receives little or no attention, this could indicate a requirements gap or a communications gap.  Perhaps leadership saw great potential benefits in an analytical solution but never shared the strategy and ideas with the people actually targeted to use the application.  Few organizations deliberately take a “build it and they will come” approach, so requirements – and the demand behind them – should exist.
    • If the survey receives many responses and they are all negative, that shows people really care and see potential value in the solution.  Specific nuances are probably frustrating them and driving their displeasure.  As long as the data is accurate, there will always be ways to improve the delivery mechanism.
    • If there is a small number of responses with varying levels of acceptance, recognize that this is a strong position.  Leverage those who like the solution to share with others the techniques and results that made them early adopters.  Also, use the negative comments to find opportunities for performance improvement or increased usage.
  • Observe user interaction with the solution.  This is a favorite method for many researchers, since they can see first-hand what users are doing and the results they are receiving.  The observer will quickly discover what users really like – critical knowledge, so that no change negatively affects something very important.  The observer will also see areas of difficulty that the development team assumed would be simple or intuitive.  Look at the data users retrieve, how they format their output, and with whom they share results and why.  Observe one of the heaviest-volume users, a medium-volume user, and a light or non-user.
  • Talk to the business managers responsible for business intelligence and / or analytics.  Try to discover their views, their staff’s challenges with the solution, what they are doing to encourage usage, and so on.  Perhaps other initiatives leave them without the time or resources to get optimal use from the solution.
  • Examine database access activity.  Have some of this analysis in hand before meeting with management and users: it provides specific usage rates and patterns for discussion and helps target whom to observe.  It can also identify requests that everyone is running, which can become a common report or part of a dashboard or other analytical item, and it may surface points that become requirements for enhancements.  (A sketch of this kind of analysis follows this list.)
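Before those conversations, it can help to mine whatever access log exists.  The sketch below, which assumes the hypothetical usage_audit table from the earlier example, pulls two of the most useful cuts: the statements everyone runs (candidates for a common report, index, or aggregate) and the users who have gone quiet (the people to survey and observe).

```python
# Sketch of mining the hypothetical usage_audit table for usage patterns.
import sqlite3

def top_queries(conn: sqlite3.Connection, limit: int = 10):
    """Most frequently run statements, with average duration."""
    return conn.execute("""
        SELECT query_text,
               COUNT(*)        AS runs,
               AVG(duration_s) AS avg_seconds
        FROM usage_audit
        GROUP BY query_text
        ORDER BY runs DESC
        LIMIT ?
    """, (limit,)).fetchall()

def inactive_users(conn: sqlite3.Connection, since_epoch: float):
    """Users with no recorded activity since the given time -- prime
    candidates for surveys and side-by-side observation."""
    return conn.execute("""
        SELECT DISTINCT user_id FROM usage_audit
        WHERE user_id NOT IN (
            SELECT user_id FROM usage_audit WHERE started_at >= ?
        )
    """, (since_epoch,)).fetchall()
```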

One Example – Usability

As an example, examine usability.  Listen to the complaints and witness the actual usage patterns; together these will help pinpoint the reasons for low usage and the possible solutions.

One of the first aspects of usability to check is whether the application has adopted the right front-end delivery mechanism for the actual need.  By watching and listening to what users want to be able to do, the analyst should be able to determine whether the front-end application is the right tool for the job.  Many users ask for a dashboard or scorecard when what they really wanted was reports, and vice versa.  Ideally, the development team catches these issues during requirements and design, but too often the actual need only becomes apparent after users have had access to the solution for a few months.

Next, look at how users navigate the front-end access tools and data, and at the frustrations that become obvious.  Any number of things could become apparent:

  • The tool itself is hard to navigate – this could come from a poor front-end design, or from using the wrong solution for the job.
  • Too many clicks – having many options can be wonderful; having to click many times for a simple request is frustrating (the sketch below shows one way to measure this).
  • The tool / front end is too technical – it takes a rocket scientist to use it.
  • No metadata – users do not know what the data means and do not know where to go to learn more about its definition, heritage, and lineage.
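Complaints like “too many clicks” can be quantified rather than debated.  The sketch below assumes the front end emits a click-event log as (user, task) records – an entirely hypothetical format – and computes the average number of clicks each task takes, so the most click-heavy tasks can be prioritized for redesign.

```python
# Sketch: average clicks per task from a hypothetical front-end event log.
from collections import defaultdict
from statistics import mean

def clicks_per_task(events):
    """events is an iterable of (user_id, task_name) click records."""
    counts = defaultdict(int)
    for user_id, task_name in events:
        counts[(user_id, task_name)] += 1      # clicks per user per task
    by_task = defaultdict(list)
    for (_, task_name), n in counts.items():
        by_task[task_name].append(n)
    return {task: mean(ns) for task, ns in by_task.items()}

# Example: flag tasks averaging more than five clicks for review.
events = [("ann", "export_report")] * 9 + [("bob", "export_report")] * 7
print({t: c for t, c in clicks_per_task(events).items() if c > 5})
```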

Conclusion

Organizations build analytical solutions to improve business and operations: to apply tactics, change strategies or approaches, and monitor and measure the impact.  The designers and other technical resources must provide the same support and the same desire for improvement that end users expect.  By searching through the activity taking place against these systems and staying closely engaged with users, technical staff can best serve them and their organizations.  When the project is over, the real work starts.  How designers handle the rollout and ongoing improvement of the analytical system often has an equal or greater impact on its success.  That achievement will build a tremendous partnership with the business and provide lasting value.


Bruce D. Johnson

Bruce D. Johnson is an experienced IT consultant focused on data and application architecture and IT management, mostly relating to data warehousing. His work spans the healthcare, finance, travel, transportation, and retail industries. Bruce has successfully engaged business leadership in understanding the value of enterprise data management and in establishing the backing and funding to build enterprise data architecture programs for large organizations. He has taught classes to business and IT resources and speaks at conferences on a variety of data management, data architecture, and data warehousing topics.

