Demystifying Analytics Audit

We suggested in our previous post that it makes sense to conduct an objective third-party audit of Analytics projects, an Analytics Audit, even if it costs some extra money. We also enumerated the outcomes of such an Analytics Audit.

In many situations, organizations have data and so they develop statistical models, with little regard for the choice of models, the appropriateness of those models, or the robustness of the process adopted to develop them. An Analytics Audit provides an objective third-party assessment of these data-based Analytics projects with respect to the choice of statistical models, their appropriateness for the business context, and the robustness of the model-development process.

Also, some organizations hire external consultants to build statistical/Analytics models for them as inputs to specific applications. Later, when they need to transition to in-house team(s), they may not be in a position to validate these models' relevance and may not know how to keep them current through periodic evaluation. Here again, the services of an experienced Analytics Audit professional would help in internalizing the Analytics models and in setting up a rhythm for periodically evaluating the models' relevance and currency.
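As a concrete illustration of what such a periodic evaluation rhythm might look like, here is a minimal sketch (a hypothetical example, not from any specific engagement) that recomputes a deployed model's error on fresh data and flags it for re-audit when the error drifts beyond a tolerance. The function names, the threshold, and the sample figures are all assumptions chosen for illustration.

```python
import statistics

def mean_absolute_error(actuals, predictions):
    """Average absolute gap between observed values and model output."""
    return statistics.fmean(abs(a - p) for a, p in zip(actuals, predictions))

def needs_revalidation(baseline_mae, actuals, predictions, tolerance=0.25):
    """Flag the model for re-audit if its error on fresh data has drifted
    more than `tolerance` (25% by default) above the error recorded when
    the model was originally accepted (hypothetical policy)."""
    current_mae = mean_absolute_error(actuals, predictions)
    return current_mae > baseline_mae * (1 + tolerance)

# Hypothetical monthly check: error accepted at handover was 2.0
fresh_actuals = [10.0, 12.5, 9.0, 11.0]
fresh_predictions = [9.0, 15.5, 6.0, 14.5]
if needs_revalidation(2.0, fresh_actuals, fresh_predictions):
    print("Model has drifted: schedule an Analytics Audit review")
```

The exact tolerance and error metric would be set per model and per business context; the point is simply that "keeping models current" can be reduced to a scheduled, repeatable check rather than an ad hoc judgment.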

But what exactly is an Analytics Audit, and how should it be done? In our inaugural post, Why Analytics Audit, we delineated the steps involved in a typical Analytics projects approach. In this post, we link the Analytics Audit steps to that approach to highlight how an objective audit of Analytics projects would typically be carried out. The steps involved in an Analytics Audit are listed in Table 2, along with some of the questions that need to be asked and answered at each step to ensure that Analytics projects achieve the organizational objectives for which they were undertaken in the first place!

Table 2 – Analytics Audit Steps Involved in a Typical Analytics Projects Approach

| S. No. | Typical Analytics Projects Approach | Analytics Audit Steps | Questions to be Asked |
|---|---|---|---|
| 1. | Harvesting in-house data | Checking quality, completeness and comprehensiveness of data | Have we looked into all available in-house sources of data? |
| 2. | Data driven decisions (3D) identification | Validating business requirement for those data driven decisions (3D) | What are the decisions we want to make based on available data? |
| 3. | Organizational metrics delineation | Validating appropriateness of organizational metrics | Do the metrics measure what they are supposed to measure? |
| 4. | Metrics of interest prioritization | Validating business logic for metrics prioritization | Which are the pressing issues we want to address using data? |
| 5. | Base-lining of priority metrics | Validating descriptive statistics of prioritized metrics | What is the current state of the prioritized metrics? |
| 6. | Analytical / statistical modeling | Validating the statistical methods chosen for modeling; the appropriateness of those chosen models; and the robustness of the statistical modeling process | Which are the most appropriate statistical modeling methods for the available data? Were the models developed following a standard modeling process? |
| 7. | Modeling results evaluation through 3D | Validating modeling outcomes in the context of the decisions of interest | Does the modeling output help in better decision-making? |
| 8. | Operationalization of modeling outcomes | Validating incorporation of modeling output into actual decision-making | Are the decisions truly data driven? Have we incorporated the Analytics projects' output into the decision-making process? |
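To make audit step 5 (base-lining of priority metrics) concrete, the sketch below shows the kind of descriptive statistics an auditor might recompute and compare against the figures reported by the project team. It is a minimal, standard-library-only illustration; the metric and the sample values are hypothetical.

```python
import statistics

def baseline_metric(values):
    """Descriptive statistics an auditor can compare against the
    figures reported for a prioritized metric (audit step 5)."""
    return {
        "count": len(values),
        "mean": statistics.fmean(values),
        "median": statistics.median(values),
        "stdev": statistics.stdev(values),  # sample standard deviation
        "min": min(values),
        "max": max(values),
    }

# Hypothetical prioritized metric: weekly order-fulfilment times (days)
fulfilment_days = [2.0, 3.5, 2.5, 4.0, 3.0, 2.5, 5.0, 3.5]
print(baseline_metric(fulfilment_days))
```

Re-deriving the baseline independently, rather than accepting the reported numbers, is what makes this an audit step rather than a reporting step: any mismatch points to data-quality or definition issues worth investigating before the modeling results are trusted.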

Analytics Audit Disclaimer

All links provided in blog posts on the Analytics Audit site are representative links. The author(s) neither endorse nor disapprove of any particular website or vendor. No attempt is made to provide comprehensive details of any topic discussed or listed. Readers are advised to carry out their own research if they are looking for definitive solutions or approaches. For objective audits of Analytics projects, however, you may contact the principal author of this blog site at vikas_iimb@hotmail.com.

Why Analytics Audit?

Before we establish that we really need an Analytics Audit, we need to discern whether Analytics is an Art or a Science! An artistic work cannot be validated; it can only be appreciated. A scientific work, on the other hand, must be validated before it is accepted as truly scientific!

Let's evaluate the typical Analytics projects approach against the constituents of the Scientific Method as described by Wikipedia, to ascertain whether such projects really qualify to be called scientific. Along similar lines, data-mining and other Analytics projects have established methodologies such as CRISP-DM and SEMMA. The KDnuggets website also carries an overview of the main methodologies for analytics, data mining, and data science projects.

A comparison of a typical Analytics projects approach with CRISP-DM and the Scientific Method is presented in Table 1 below. As seen from Table 1, the typical Analytics projects approach pretty much mirrors the scientific method. So we may conclude with fair confidence that following this approach should lead to scientifically verifiable results. But then, why would we need an Analytics Audit? Well, a little extra effort does no harm! It can provide Analytics Assurance and ensure Analytics projects' fidelity.

 

Table 1 – Comparison of a Typical Analytics Projects Approach with CRISP-DM and Scientific Method

| S. No. | Scientific Method | CRISP-DM Phases | Typical Analytics Projects Approach |
|---|---|---|---|
| 1. | Make observations | Data understanding | Harvesting in-house data |
| 2. | Think of interesting questions | Business understanding | Data driven decisions (3D) identification |
| 3. | Formulate hypotheses | Business understanding | Organizational metrics delineation |
| 4. | Develop testable predictions | Data understanding | Metrics of interest prioritization |
| 5. | Gather data to test predictions | Data preparation | Base-lining of priority metrics |
| 6. | Refine, alter, expand, or reject hypotheses | Modeling | Analytical / statistical modeling |
| 7. | Develop general theories | Evaluation | Modeling results evaluation through 3D |
| 8. | Develop general theories | Deployment | Operationalization of modeling outcomes |