Demystifying Analytics Audit

We suggested in our previous post that it makes sense to conduct an objective third-party audit of Analytics projects, aka an Analytics Audit, even if it costs some extra money. We also enumerated the outcomes of such an Analytics Audit.

In many situations, organizations have data and simply develop statistical models on top of it, with little regard for the choice of models, the appropriateness of those models, or the robustness of the process adopted to develop them. An Analytics Audit provides an objective third-party assessment of such data-based Analytics projects on exactly these dimensions: the choice of statistical models, their appropriateness for the business context, and the robustness of the model development process.
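
To make this concrete, here is a minimal sketch (with entirely synthetic data) of one check an auditor might run when validating the choice of models: comparing candidate models on held-out data via cross-validation rather than accepting in-sample fit at face value. The models and scoring choices here are illustrative assumptions, not a prescribed audit procedure.

```python
# A minimal, hypothetical sketch of one audit check: compare candidate
# models on held-out data instead of trusting in-sample fit.
# All data below is synthetic; the models are illustrative choices.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
X = rng.normal(size=(500, 5))             # stand-in for the audited features
y = 2.0 * X[:, 0] + rng.normal(size=500)  # stand-in for the target metric

for name, model in [("linear", LinearRegression()),
                    ("forest", RandomForestRegressor(random_state=0))]:
    scores = cross_val_score(model, X, y, cv=5, scoring="r2")
    print(f"{name}: mean CV R^2 = {scores.mean():.3f} (+/- {scores.std():.3f})")
```

An auditor would of course go further (residual diagnostics, stability across time periods, and so on), but even this simple comparison surfaces cases where a complex model adds nothing over a simple one.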

Also, some organizations hire external consultants to build statistical/Analytical models for them as inputs to specific applications. Later, when they need to transition to in-house team(s), they may not be in a position to validate these models for relevance and may not know how to keep them current through periodic evaluation. Here again, the services of an experienced Analytics Audit professional would help in internalizing the Analytics models and in setting up a rhythm for periodically evaluating the models' relevance and currency.
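
As one illustration of what such a rhythm for periodic evaluation could include, here is a minimal sketch of a Population Stability Index (PSI) check, a commonly used measure of whether a model's inputs or scores have drifted since development. The data is synthetic, and the thresholds mentioned in the comments are conventional rules of thumb, not standards.

```python
# Hypothetical sketch: Population Stability Index (PSI) between the sample a
# model was developed on and a current sample. Rough rule of thumb:
# PSI < 0.1 stable, 0.1-0.25 watch, > 0.25 investigate / re-validate.
import numpy as np

def psi(expected: np.ndarray, actual: np.ndarray, bins: int = 10) -> float:
    """PSI of `actual` against `expected`, using quantile bins of `expected`."""
    edges = np.quantile(expected, np.linspace(0, 1, bins + 1))
    edges[0], edges[-1] = -np.inf, np.inf        # catch out-of-range values
    e_pct = np.histogram(expected, edges)[0] / len(expected)
    a_pct = np.histogram(actual, edges)[0] / len(actual)
    e_pct = np.clip(e_pct, 1e-6, None)           # avoid division by / log of 0
    a_pct = np.clip(a_pct, 1e-6, None)
    return float(np.sum((a_pct - e_pct) * np.log(a_pct / e_pct)))

rng = np.random.default_rng(0)
dev_scores = rng.normal(0.0, 1.0, 10_000)        # scores at model build time
new_scores = rng.normal(0.3, 1.2, 10_000)        # drifted scores today
print(f"PSI = {psi(dev_scores, new_scores):.3f}")
```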

But what exactly is an Analytics Audit, and how should it be done? In our inaugural post on Why Analytics Audit, we delineated the steps involved in a Typical Analytics Projects Approach. In this blog post, we link the Analytics Audit steps with that Analytics projects approach to highlight how an objective audit of Analytics projects would typically be carried out. The steps involved in an Analytics Audit are listed in Table 2, along with some of the questions that need to be asked and answered at each step to ensure that Analytics projects achieve the organizational objectives for which they were undertaken in the first place! (A code sketch of the first audit step follows the table.)

Table 2 – Analytics Audit Steps Involved in a Typical Analytics Projects Approach
| S. No. | Typical Analytics Projects Approach | Analytics Audit Steps | Questions to be Asked |
| --- | --- | --- | --- |
| 1. | Harvesting in-house data | Checking quality, completeness and comprehensiveness of data | Have we looked into all available in-house sources of data? |
| 2. | Data driven decisions (3D) identification | Validating business requirement for those data driven decisions (3D) | What are the decisions we want to make based on available data? |
| 3. | Organizational metrics delineation | Validating appropriateness of organizational metrics | Do the metrics measure what they are supposed to measure? |
| 4. | Metrics of interest prioritization | Validating business logic for metrics prioritization | Which are the pressing issues we want to address using data? |
| 5. | Base-lining of priority metrics | Validating descriptive statistics of prioritized metrics | What is the current state of prioritized metrics? |
| 6. | Analytical / statistical modeling | Validating the statistical methods chosen for modeling; the appropriateness of those chosen models; and the robustness of the modeling process | Which are the most appropriate statistical modeling methods for the available data? Are the models developed following a standard modeling process? |
| 7. | Modeling results evaluation through 3D | Validating modeling outcomes in the context of decisions of interest | Does the modeling output help in better decision-making? |
| 8. | Operationalization of modeling outcomes | Validating incorporation of modeling output in actual decision-making | Are the decisions truly data driven? Have we incorporated Analytics projects' output into the decision-making process? |
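
As an illustration of step 1 in Table 2, here is a minimal sketch of the kind of completeness check an auditor might run over an in-house extract. The file name and the 20% missing-data cut-off are hypothetical placeholders, not audit standards.

```python
# Hypothetical sketch of Table 2, step 1: profile an in-house extract for
# data quality and completeness before any modeling on it is trusted.
import pandas as pd

df = pd.read_csv("in_house_extract.csv")     # placeholder file name

report = pd.DataFrame({
    "dtype": df.dtypes.astype(str),
    "missing_pct": (df.isna().mean() * 100).round(1),
    "n_unique": df.nunique(),
})
print(report.sort_values("missing_pct", ascending=False))

# Flag columns too incomplete to support the decisions identified in step 2.
suspect = report[report["missing_pct"] > 20].index.tolist()
print("Columns needing remediation or exclusion:", suspect)
```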

Analytics Audit Disclaimer

All links provided in the blog posts on the Analytics Audit site are representative links. The author(s) do(es) not endorse or disparage any single or particular website/vendor. There is no attempt to provide comprehensive details of any topic discussed or listed. Readers are advised to carry out their own research if they are looking for definitive solutions or approaches. For objective audits of Analytics projects, you may contact the principal author of this blog site at vikas_iimb@hotmail.com.

Analytics Maturity Models

We recognized in our previous post that, just as the academic world subjects journal articles to double-blind peer review before they are published, we should have an independent review of Analytics projects as well, aka an Analytics Audit!

This also parallels the CMM (Capability Maturity Model) for the software development process, under which organizational capabilities are audited. Even though those capabilities are defined at the organizational level, the audit has to drill down to the most basic process level to ensure that all relevant processes are at the minimum requisite maturity level.

Similarly, we have the PCMM (People Capability Maturity Model) for organizations, which goes beyond any specific product or process, deep into the realm of how we develop our people assets and how mature the processes are for developing and growing our most valuable resources, our human resources.

In the international banking domain, we have the Basel norms. Actually, we have more than one set of norms, with progressively deeper auditing of banks' risk practices. These audits must cover banks' processes and IT systems as well as their quantitative risk models, so that risk capital is adequately provided for!
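
For a flavour of what auditing a quantitative risk model involves, consider the simplest such model: one-day historical Value-at-Risk. An auditor would typically re-derive the number independently and backtest it against realized losses. The sketch below does this on synthetic, fat-tailed returns; it is illustrative only and in no way a Basel-compliant calculation.

```python
# Hypothetical sketch: one-day 99% historical Value-at-Risk (VaR) on synthetic,
# fat-tailed returns, plus the exception count a backtest would examine.
# Illustrative only; this is not a Basel-compliant calculation.
import numpy as np

rng = np.random.default_rng(7)
returns = rng.standard_t(df=4, size=1000) * 0.01   # synthetic daily returns

var_99 = -np.quantile(returns, 0.01)               # loss threshold at 99% level
exceptions = int((returns < -var_99).sum())        # days losses exceeded VaR
print(f"1-day 99% VaR: {var_99:.4f}")
print(f"Backtest exceptions: {exceptions} of {len(returns)} days "
      f"(about {0.01 * len(returns):.0f} expected)")
```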

In the Analytics domain, we have multiple maturity models measuring the capabilities of organizations across various dimensions. Most of these Analytics Capability Maturity Models (ACMMs) are propagated by vendors of Analytics products and services. There are, though, other Analytics Maturity Models (AMMs) that are offered by non-profit organizations like INFORMS, or postulated by researchers and practitioners.

Then there is the International Institute for Analytics, which provides Analytics research and advisory services. It has its own DELTA model, led by Thomas H. Davenport, for assessing the Analytics maturity of organizations.

All these AMMs carry an underlying assumption that an organization's Analytics model building capabilities are commensurate with its assessed level of maturity. It is implicit in these AMMs that if we provide for the elements they identify, then we will have good Analytics model building capabilities. This may be true for some organizations, but it may not hold for others. To elaborate, the most common statistical inference anyone would draw is about the correlation between two sets of data, and that can be quite misleading! If someone doesn't agree, she may look at the spurious and absurd correlations listed here!
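
To see how easily correlation misleads, the minimal sketch below (synthetic data) correlates two completely independent random walks; such trending series often show a sizable sample correlation, which differencing promptly destroys.

```python
# Hypothetical sketch: two *independent* random walks often show a sizable
# sample correlation, purely because both trend; differencing exposes this.
import numpy as np

rng = np.random.default_rng(1)
walk_a = np.cumsum(rng.normal(size=500))   # e.g. a sales metric over time
walk_b = np.cumsum(rng.normal(size=500))   # e.g. an unrelated web metric

r = np.corrcoef(walk_a, walk_b)[0, 1]
print(f"Correlation between two unrelated series: {r:.2f}")

r_diff = np.corrcoef(np.diff(walk_a), np.diff(walk_b))[0, 1]
print(f"Correlation after differencing: {r_diff:.2f}")   # near zero
```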

So, what is the way out? As suggested in our previous posts, it makes sense to conduct an objective third-party audit of Analytics projects. If the outcome of the Analytics Audit confirms that the Analytics models are good, that gives us confidence in the organization's Analytics model building capabilities. If the outcome suggests that the models are not good enough and need improvement, that gives us a handle for the next iteration to make them good! It may cost a bit extra to run third-party objective Analytics audits, but it saves a lot of heartburn and expense downstream!

Analytics as Decision Science

We acknowledged in our previous post that, just as in Accounting, Analytics projects, in which data science professionals apply human ingenuity to data, need checks and balances through an independent validation of those projects, aka an Analytics Audit!

We consider Analytics to be Data Driven Decision (D3) making in the business world, which is the empirical application of Decision Theory. The science in Decision Sciences comes from the “body of knowledge and related analytical techniques of different degrees of formality designed to help a decision maker choose among a set of alternatives in light of their possible consequences.”
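
To make that definition concrete, here is a toy decision-theoretic calculation with entirely invented numbers: choosing among alternatives by expected payoff under uncertain states of the world.

```python
# Hypothetical toy example of decision making under uncertainty: pick the
# alternative with the highest expected payoff. All numbers are invented.
import numpy as np

p = np.array([0.6, 0.4])                 # assumed P(demand_low), P(demand_high)

payoffs = {                               # payoff of each alternative per state
    "small_plant": np.array([8.0, 10.0]),
    "large_plant": np.array([2.0, 20.0]),
}

for name, pay in payoffs.items():
    print(f"{name}: expected payoff = {pay @ p:.2f}")
# A risk-neutral decision maker picks the maximum; decision *science* also
# asks whether expected value is even the right criterion for this decision.
```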

As per INSEAD, “the area of Decision Sciences includes risk management, decision making under uncertainty, statistics and forecasting, operations research, negotiation and auction analysis, and behavioural decision theory.” Interestingly, even at my own alma mater, the Indian Institute of Management Bangalore (IIMB), my area of Quantitative Methods and Information Systems (QMIS), in which I did my Fellowship, has been renamed Decision Sciences and Information Systems (DSIS)! By the way, my first Analytics job was in the Decision Sciences group in a corporate setting!

The Decision Sciences Journal has been in publication since 1970, well before anyone would have heard or used the term ‘Analytics’. This Journal seeks research papers which address contemporary business problems primarily focused on operations, supply chain and information systems, and which simultaneously provide novel managerial and/or theoretical insights.

Research papers in academic journals are always double-blind peer-reviewed before they are accepted for publication, even when they are written by established and renowned researchers and scientists. Then why do we not follow a similar practice in the scientific realm of Analytics? Should we not have an independent review of Analytics projects, aka an Analytics Audit?

Why Analytics Audit?

Before we establish that we really need an Analytics Audit, we need to discern whether Analytics is an Art or a Science! An artistic work can't be validated; it can only be appreciated. A scientific work, on the other hand, must be validated before it is accepted as truly scientific work!

Let’s evaluate the typical Analytics projects approach against the constituents of the Scientific Method, as described by Wikipedia, to ascertain whether such projects really qualify to be called scientific. Along similar lines, for data-mining and other Analytics projects, we have established methodologies like CRISP-DM and SEMMA. The KDnuggets website also contains an overview of the main methodologies for analytics, data mining, or data science projects.

A comparison of a Typical Analytics Projects Approach with CRISP-DM and the Scientific Method is presented in Table 1 below. As seen from Table 1, the typical Analytics projects approach pretty much mirrors the scientific method. So we may conclude, with reasonable confidence, that following this approach should lead to scientifically verifiable results. But then why would we need an Analytics Audit? Well, a little extra effort does no harm! It can provide Analytics Assurance and ensure Analytics projects' fidelity.

Table 1 – Comparison of a Typical Analytics Projects Approach with CRISP-DM and Scientific Method
| S. No. | Scientific Method | CRISP-DM Phases | Typical Analytics Projects Approach |
| --- | --- | --- | --- |
| 1. | Make observations | Data understanding | Harvesting in-house data |
| 2. | Think of interesting questions | Business understanding | Data driven decisions (3D) identification |
| 3. | Formulate hypotheses | Business understanding | Organizational metrics delineation |
| 4. | Develop testable predictions | Data understanding | Metrics of interest prioritization |
| 5. | Gather data to test predictions | Data preparation | Base-lining of priority metrics |
| 6. | Refine, alter, expand, or reject hypotheses | Modeling | Analytical / statistical modeling |
| 7. | Develop general theories | Evaluation | Modeling results evaluation through 3D |
| 8. | Develop general theories | Deployment | Operationalization of modeling outcomes |