Andrew D. Banasiewicz: Evidence-Based Decision-Making; How to Leverage Available Data and Avoid Cognitive Biases

If reliance on intuition is too ingrained in human nature, then we should focus our efforts on making sure that the informational basis of intuition is sound. While intuition is inherently subjective, it can nonetheless be shaped by objective facts.

When we look at organizational choice-making, which so often falls short of systematically leveraging the wealth of readily available information, it is frequently because those data are too voluminous and too variable to make sense of.

When confronted with a task of quickly making sense of a large and diverse set of situational stimuli, the brain often makes use of sensemaking heuristics, or shortcuts, producing what is commonly referred to as intuition. Defined as the ability to understand something immediately and without the need for conscious reasoning, those nearly instantaneous conclusions feel very natural, and typically very right, but can ultimately turn out to be unwarranted or outright incorrect.

The human brain can be described as three pounds of very soft and highly fatty tissue, made up of some 80-100 billion nerve cells known as neurons. Each neuron can form several thousand connections, which in aggregate (roughly 80 billion neurons times 2,000 connections each) translates into a staggering 160+ trillion synaptic connections. The brain’s storage capacity can grow, but it does not decrease. What drops off, at times precipitously, is retrieval strength, especially when memories – including the semantic and procedural knowledge critical to abstract thinking – are not reinforced. One of the amazing qualities of the brain is its ability to rewire itself, a phenomenon known as neuroplasticity or brain plasticity. Though generally a desirable feature supporting ongoing learning, neuroplasticity has some distinct disadvantages, one of which is the ongoing alteration of our recollection of past events. In the end, computers prevail not because they are faster or have greater memory, but because they make better use of objective evidence.

When we talk about learning, we can ask ourselves how we know what we know and how we learn. Learning can be characterized as modifying information already stored in memory based on new input or experience. Long-term memories can be grouped into two broad types: declarative (explicit) and non-declarative (implicit). In decision-making, the reasoning process can be deductive or inductive. The four learning paradigms are observational, theoretical, computational and simulational. With the rise of computational and simulational learning, we are introducing man-made devices that are themselves capable of learning.

NLP (Natural Language Processing) starts with text and, through lexical analysis, syntactic analysis, semantic analysis and output transformation, turns it into data.
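To make this concrete, here is a minimal sketch of the text-to-data idea using the spaCy library (my choice of tool; the book names none): each token is reduced to data points produced by lexical and syntactic analysis.

```python
# A minimal sketch, assuming the spaCy library and its small English model
# (pip install spacy && python -m spacy download en_core_web_sm).
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("Evidence-based decisions beat unaided intuition.")

# Lexical analysis (lemma) and syntactic analysis (POS tag, dependency label):
# unstructured text transformed into rows of structured data.
for token in doc:
    print(token.text, token.lemma_, token.pos_, token.dep_)
```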

Our conception of what constitutes knowledge can be understood in terms of two separate sets of considerations: sources, which capture the how aspect of what we know or believe to be true, and dimensions, which represent the what facet, in the sense of the type of knowledge. The source of knowledge can take the form of formal learning, commonly labeled explicit knowledge, or informal learning, also known as tacit knowledge.

There are three dimensions of knowledge: semantic, which encapsulates abstract ideas and facts; procedural, which captures behavioral abilities to perform specific tasks; and episodic, which encompasses an array of hedonic or emotive memories.

We can look at the schematic:

  • Knowledge
    • Sources
      • Explicit
        • Factual & objective
        • Community-shared & intermittent
        • Exposure, absorption and availability-shaped
      • Tacit
        • Interpretive & subjective
        • Individually-held & constantly accruing
        • Exposure, experience and perspective-shaped
    • Dimensions
      • Semantic
        • Ideas, facts & concepts
        • Not related to specific experiences
        • Domain-specifics
      • Procedural
        • Behaviors, habits & skills
        • Implicit or unconscious
        • Process-specifics
      • Episodic
        • Events, experiences & emotions
        • Recall-based
        • Context-specifics
  • Taken together, these constitute effective topical knowledge

Effective topical knowledge is what we draw on when presented with a task or a choice to be made. Since knowledge is a slippery field, we need to check its validity, which can be broadly characterized as truthfulness (encompassing factual accuracy as well as logical correctness), and its reliability, which is an expression of the dependability, or stability, of what we know. Basically, we can say that all models are wrong, but some are useful. Knowledge can be thought of as a library – a collection of semantic, procedural, and episodic remembrances acquired via explicit and tacit learning. A decision, on the other hand, can be characterized as a commitment to a course of action, intended to serve a specific purpose or accomplish a specific objective. A person makes, on average, about 35,000 decisions a day.

As mentioned, knowledge is very fragile. Cognitive biases are patterns of deviation in judgment that can lead to irrational choices. But we should not mistake an error in judgment for a biased judgment. The sources of cognitive bias can be reasoning or emotions.

Expanding the structure of cognitive biases:

  • Cognitive bias
    • Influence
      • Peers
        • Bandwagon effect
        • Cheerleader effect
        • Availability cascade
        • Third person effect
        • Not invented here
      • Experience
        • Confirmation bias
        • Forer effect
        • Impact bias
        • Negativity bias
        • Social comparison bias
      • Values
        • Projection bias
        • Loss aversion
        • Selective perception
        • Endowment effect
        • Zero-sum bias
      • Reference
        • Decoy effect
        • Contrast effect
        • Pareidolia
        • Exaggerated expectations
        • Social comparison bias
      • Prior choices
        • Gambler’s fallacy
        • Survivorship bias
        • Hot-hand fallacy
        • Semmelweis reflex
        • Hindsight bias
    • Effects
      • Overconfidence
        • Curse of knowledge
        • Overconfidence effect
        • Recency illusion
        • Ostrich effect
        • Frequency illusion
      • Irrationality
        • Omission bias
        • Disposition effect
        • Duration neglect
        • Distinction bias
        • Rhyme-as-reason effect
      • Uncertainty
        • Ambiguity effect
        • Illusion of validity
        • Normalcy bias
        • Congruence bias
        • Neglect of probability
    • Perception
      • Inferences
        • Belief bias
        • Framing effect
        • Information bias
        • Illusory correlation
        • Optimism bias
      • Estimation
        • Availability heuristic
        • Dunning-Kruger effect
        • Planning fallacy
        • Sample size insensitivity
        • Social desirability bias
      • Anecdotes
        • Conjunction fallacy
        • Base rate fallacy
        • Anchoring
        • Berkson’s paradox
        • Weber-Fechner law

Learning organizations, knowledge management, emotional intelligence, total quality management and employee engagement are all examples of quick-fix, templated management frameworks: easy to communicate and comprehend, seemingly applicable to everyone’s problems everywhere, and in tune with the zeitgeist.

The different manifestations of validity can be grouped as follows:

  • Construct truthfulness assessment:
    • Face validity
    • Content validity
    • Discriminant validity
  • Observable-latent specification assessment:
    • Convergent validity
    • Predictive validity
    • Concurrent validity

Reliability captures the repeatability or consistency of a particular operationalization (a computational sketch of the split-half method follows the list below).

  • Internal reliability
    • Split-half method
  • External reliability
    • Test-retest
    • Inter-rater
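As an illustration, here is a minimal sketch of the split-half method; the respondents-by-items score matrix is invented, and the Spearman-Brown correction is the standard psychometric adjustment rather than something the book spells out.

```python
import numpy as np

def split_half_reliability(items: np.ndarray) -> float:
    """Internal reliability via the split-half method.

    `items` is a respondents x items score matrix: the items are split
    into two halves, the half-scores are correlated, and the Spearman-Brown
    formula adjusts the correlation for the halved test length.
    """
    first_half = items[:, 0::2].sum(axis=1)   # items 1, 3, 5, ...
    second_half = items[:, 1::2].sum(axis=1)  # items 2, 4, 6, ...
    r = np.corrcoef(first_half, second_half)[0, 1]
    return 2 * r / (1 + r)  # Spearman-Brown adjustment

rng = np.random.default_rng(0)
scores = rng.integers(1, 6, size=(50, 10))  # 50 respondents, 10 Likert items
print(split_half_reliability(scores))
```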

A good story is the verbal equivalent of a picture that is worth the proverbial thousand words. Anecdotes have the power to capture attention and deliver a clear message, and any informal account or testimonial can become anecdotal evidence.

Benchmarks and best practices are another potential source for decision-making, but with benchmarks you have to be careful about peer group composition and sample size. Best practices can be derived from experience or research, but the main question is whether something that worked somewhere else will also work for you. Usually both are more appropriate for not being left behind than for attaining leadership.

When we talk about data as an asset, it is more correct to treat it as a potential asset. There are three key, sequentially dependent steps in using data as a source of decision-guiding insights: data selection, data analysis and result interpretation. One of the core aspects of data selection is embodied in the notion of sampling, which follows a universe-to-population-to-initial-sample-to-final-sample progression (sketched below). Another important aspect of sampling is recency, and you also have to be careful about sample composition and sample sizing.
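A minimal sketch of that progression using pandas, with invented customer records standing in for the universe:

```python
import pandas as pd

# Universe: every record the organization could conceivably consider.
universe = pd.DataFrame({"customer_id": range(10_000),
                         "active": [i % 4 != 0 for i in range(10_000)]})

# Population: the subset the research question actually targets.
population = universe[universe["active"]]

# Initial sample: a random draw from the population.
initial_sample = population.sample(n=1_000, random_state=42)

# Final sample: whatever survives eligibility and quality checks
# (here, trivially, dropping incomplete records).
final_sample = initial_sample.dropna()

print(len(universe), len(population), len(initial_sample), len(final_sample))
```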

The term insight extraction itself is taking on a broader meaning: it is no longer synonymous with the generation of decision-guiding inputs into human decision-making, but now also includes the curation of multi-source data used to power automated and even autonomous systems.

Working with data, you have to be careful about how the data are used. This is especially important for PII (personally identifiable information).

Business organizations tend to view database infrastructure as a competitive necessity, investing heavily in ever larger and more complex data capture, storage, curation, management and analysis systems. Overall, the database revolution did more for the fortunes of data service suppliers than it did for the competitiveness of the average database-using organization. We can talk about information-savvy organizations that have data-analytical know-how; companies that lack it are data-rich, information-poor organizations.

We can categorize data by source, usage, type and situation, and we can divide them into events (behaviors) and attributes. When we analyze them, we look for what happened and why it happened. One of the most important research elements in data is therefore causality: differentiating cause-and-effect relationships from mere co-occurrence.

Creation of objective knowledge can be either theory-laden or data-laden.

Reporting, which is used more and more in organizations, can be a problem because it focuses only on the what dimension, not the why. Because data can be either signal or noise, creating a data-driven culture requires resolving this dilemma, and data analyses need to be integrated into workflows across the company. It is also important that either analysts become decision-makers or decision-makers become more analytical. Business analytics has three segments: predictive & prescriptive analytics, BI, and measurement & optimization. Some of the BI tools on the market: MicroStrategy, Tableau, Sisense, QlikView, Dundas BI, Business Objects, Oracle BI, SAS BI, IBM Cognos and Easy Insight.

BI tools:

  • Consume data from any source through file uploads, database querying and application connectors
  • Provide an architecture to transform data into a useful and relatable model
  • Support data modeling, blending and discovery processes
  • Create reports and visualizations with business utility
  • Create and deploy internal analytics applications

Evidence-based decision-making is hard to introduce as a standard framework, although that was attempted with the EBP (evidence-based practice) movement. It is hard to say that management has a standard body of knowledge, so it is hard to establish standardized practices in this area. Approaches like business process re-engineering, total quality management, knowledge management or the learning organization framework are today almost forgotten, while ideas like talent management, emotional intelligence, employee engagement or agile management are, on the other hand, quite alive. In the United States, the idea of EBP was first connected with medicine. It was an attempt to create a body of knowledge from different sources, using research review and synthesis tools: systematic reviews (SR) and meta-analyses (MA). An SR has several stages: research question definition, search for relevant data, extraction of relevant data, assessment of data quality, and data analysis and amalgamation.

While the field of medicine could benefit from a common approach, managers have no incentive to contribute to a common-to-all database of knowledge. Human rationality is bounded by mental skills, habits and reflexes. The efficacy of managerial decision-making can be improved by learning-based enhancements of a decision-maker’s mental skills and by providing informationally compelling evidence that can overcome potentially biased habits and reflexes.

EBP means making decisions through the conscientious, explicit and judicious use of the best available evidence from multiple sources, by (the 6 As):

  • Asking
  • Acquiring
  • Appraising
  • Aggregating
  • Applying
  • Assessing

Defining evidence as information, facts or data supporting a claim or hypothesis, evidence may come from four distinct sources:

  • The scientific literature
  • The organization
  • Practitioners
  • Stakeholders

Good decision-making should be based on a combination of critical thinking and the best available evidence. Organizational management is about making choices under conditions of uncertainty, and the judicious and explicit use of the best available evidence offers the best hope of making the best possible choices. One’s ability to consistently reduce decision-making uncertainty is tied to being able to consistently lean on the totality of available statistical, research and anecdotal evidence.

We can group evidence:

  • Evidence
    • Empirical
      • Operational data
      • Empirical research
    • Experiential
      • Expert judgment
      • Norms and standards

One of the intellectual foundations of the Western scientific tradition is reliance on the scientific method, a set of procedures consisting of systematic measurement (by observation or experiment) and the formulation, testing and modification of hypotheses, as the basis of knowledge creation. Scientific research encompasses a wide and rich array of approaches and techniques, best summarized with the help of several dualities: theoretical/applied, quantitative/qualitative, exploratory/explanatory and observational/experimental.

Talking about operational data, we can see data as signal (the informative element) and noise (the informationless element). A wide range of important decisions are based on statistical inferences drawn from operational data, so we should be careful about the independence of data and data-derived insights.

When we look at norms and standards, usually it is about benchmarks and best practices.

Another potential insight can come from anecdotal evidence, but if left unchecked it can amplify certain biases.

Coming back to uncertainty, a decision-maker’s uncertainty can be a result of unfamiliarity with decision-pertinent information or an inability to make use of probabilistic inferences. Becoming a thoughtful and confident user of probabilistic evidence calls for a rudimentary competency in three dimensions of probabilistic thinking: computational, inferential (statistical) and evidentiary. Approaches to probability estimation are either Bayesian or frequentist. The three main types of probability are marginal, joint and conditional, as the sketch below illustrates.
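A minimal sketch of the three probability types, computed from an invented contingency table of customer churn versus promotional offer:

```python
import numpy as np

# Invented counts: rows = churned (no, yes), columns = received offer (no, yes).
counts = np.array([[400, 250],
                   [200, 150]])
total = counts.sum()

joint = counts / total                 # joint: P(churn, offer)
p_churn = joint.sum(axis=1)            # marginal: P(churn)
p_offer = joint.sum(axis=0)            # marginal: P(offer)

# Conditional: P(churn = yes | offer = yes) = joint / marginal.
p_churn_given_offer = joint[1, 1] / p_offer[1]

print(joint)
print(p_churn, p_offer, p_churn_given_offer)
```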

When we use data analyses, errors can come from the data, the methods or the interpretation. To improve accuracy we can use pooling methods, but we should be careful how we do it. The systematic review is one form of pooling: built around a predetermined, detailed and comprehensive plan and search strategy, SRs are expressly focused on reducing selection bias by identifying, reviewing and synthesizing all relevant studies on a particular topic.

Operational data are either batch or streaming data. Because of their rich diversity, we should use metadata and sampling to help us arrive at good insights.

To have decision-guiding value, all available evidence needs to be synthesized into a singular conclusion. Though individual informational inputs might suggest somewhat different courses of action, when considered jointly, conclusive convergence should be the goal.

CEBMa’s evidence-based management practice framework includes organizational stakeholders as one of the four sources of evidence. There are at least six discernible groups of organizational stakeholders: shareholders, employees, regulators, customers, suppliers and creditors.

The Empirical & Experiential Evidence (3E) framework is an attempt to give management practice an operational framework. It places operational data at its core, but also sees theoretical research as informationally valuable. The 3E process begins with within-source steps: identification of distinct sources, assessment of available and applicable data, and aggregation of type-specific evidence. It then moves to cross-source steps: defining weights, agglomeration, and incorporation of the evidentiary conclusion into the decision-making process. As mentioned, every informational source on its own can be imperfect, so we use informational triangulation to reach better conclusions.

Schema of 3E: (the book’s schematic is not reproduced here)

Operational data are the most important source of evidence for organizational evidence-based decision-making systems, but we should be careful, since past behavioral patterns are not always the best predictors of future behaviors. Probabilistic analyses can be either confirmatory or exploratory. One of the important techniques is the SST (statistical significance test), designed to differentiate material effects from spurious ones.
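As an illustration, here is a minimal sketch of one common significance test, a two-sample t-test on two invented campaign variants; the book does not prescribe a particular test.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Invented operational data: order values under two campaign variants.
variant_a = rng.normal(loc=100.0, scale=15.0, size=200)
variant_b = rng.normal(loc=104.0, scale=15.0, size=200)

# Two-sample t-test: is the observed difference material or spurious?
t_stat, p_value = stats.ttest_ind(variant_a, variant_b)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
# A small p-value (conventionally < 0.05) suggests the difference
# is unlikely to be mere noise.
```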

Confirmatory Analysis:

  • Need Identification
    • Informational Needs
    • Organizational Objectives
  • Knowledge Creation
    • Data Mobilization
    • Data Analysis
    • Knowledge Construction
    • Analytic Planning
  • Utilization & Validation
    • Knowledge Dissemination
    • Impact Assessment

Systematic review steps:

  • Define research question
  • Select databases
  • Select articles
  • Extract data from studies
  • Assess the quality of studies
  • Combine data
  • Review & discuss

Meta-analysis steps (a pooling sketch follows the list):

  • Define research question
  • Select databases
  • Select articles
  • Extract data from studies
  • Analyze the data
  • Report the results
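To show how meta-analysis pools quantitative results, here is a minimal sketch of fixed-effect, inverse-variance weighting, one standard estimator; the per-study effect sizes and variances are invented.

```python
import numpy as np

# Invented per-study effect sizes and their variances.
effects = np.array([0.30, 0.45, 0.12, 0.38])
variances = np.array([0.02, 0.05, 0.01, 0.04])

weights = 1.0 / variances                        # inverse-variance weights
pooled = np.sum(weights * effects) / np.sum(weights)
pooled_se = np.sqrt(1.0 / np.sum(weights))

print(f"pooled effect = {pooled:.3f}, 95% CI half-width = {1.96 * pooled_se:.3f}")
```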

When we look at benchmarks and best practices, to make sure they bring value to the organization we can check them for utility, credibility, independence and impartiality.

Years of education tend to imprint human minds with a variety of abstract notions while also conditioning the human psyche to accept as true a wide range of ideas. When a particular claim violates what one deems reasonable, the result is cognitive dissonance: most find it difficult to accept such an assertion as true, even if the underlying rationale and the empirical method both seem acceptable and correct. As Arthur Conan Doyle put it: “Once you eliminate the impossible, whatever remains, no matter how improbable, must be the truth.”[1] Even the most deeply held beliefs should be re-assessed when objective evidence is made available.

Decision-makers typically seek information that might help to reduce choice-related ambiguity, but to be of value, decision-aiding insights need to be as recent and as specific as possible. The average person is able to simultaneously hold in attention only about 7±2 informational chunks, which can prevent decision-makers from considering all available pertinent information. When historical data can be considered reliable and the decision at hand can be categorized as routine, data-driven automated decision engines can not only equal but even surpass the efficacy of human decision-making.

Operational data usually sit on database infrastructure. From the standpoint of informational content, the most pertinent aspects of a database are its scope (data mart vs. data warehouse), content (form of encoding: text, numeric or multimedia) and organization (structure of the database: entity-relationship, relational or object-oriented).

An analytical dataset is a subset of all available data, properly structured, organized and cleansed. Factors such as human error, occasional technical glitches, inapplicability to some of the records, or imperfect data capture methods virtually guarantee that all data types exhibit some degree of incompleteness, so we always have to do some data cleansing and normalization, along the lines sketched below.
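A minimal sketch of cleansing and normalization in pandas, on invented records; min-max scaling is only one of several normalization choices.

```python
import pandas as pd

raw = pd.DataFrame({"revenue": [120.0, None, 95.5, 410.0],
                    "region": ["EU", "EU", None, "US"]})

clean = raw.dropna(subset=["region"])                         # drop unusable records
clean = clean.fillna({"revenue": clean["revenue"].median()})  # impute gaps

# Min-max normalization onto the [0, 1] interval.
clean["revenue_norm"] = (clean["revenue"] - clean["revenue"].min()) / (
    clean["revenue"].max() - clean["revenue"].min())

print(clean)
```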

Exploratory data analysis (EDA) is also called data mining; one of the popular approaches to it is data visualization. Confirmatory analysis can be grouped into dependence and interdependence statistical methodologies.

Data = Smooth + Rough, where the smooth is the relationship between two or more variables and the rough is the remaining residuals, or what is left over after all the patterns have been extracted from a dataset (the noise). The steps of EDA can be defined as: review of previous findings, examination of data quality and analysis of the data.
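A minimal sketch of the Smooth + Rough decomposition, assuming (for illustration) that the smooth is a simple linear trend:

```python
import numpy as np

rng = np.random.default_rng(7)
x = np.linspace(0, 10, 100)
y = 3.0 * x + 5.0 + rng.normal(scale=2.0, size=x.size)  # pattern plus noise

coeffs = np.polyfit(x, y, deg=1)  # the "smooth": a fitted linear trend
smooth = np.polyval(coeffs, x)
rough = y - smooth                # the "rough": residuals after the fit

print(coeffs)       # close to the true slope 3.0 and intercept 5.0
print(rough.std())  # close to the true noise level 2.0
```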

Knowledge creation is a cumulative process; the MECE (mutually exclusive and collectively exhaustive) principle can be used as an intellectual guide to the development and maintenance of such a knowledge base. Individual variables should exhibit appropriate availability, scaling, distributional and interpretational characteristics. Numeric data can be broadly grouped into categorical or continuous measures; continuous metrics can always be re-coded into categorical ones, but the reverse is not possible. Categorical variables are described using counts and frequencies, whereas continuous variables are best characterized with the help of measures of central tendency, variability and spread. Correlation is an important feature of continuous variables; the two main coefficients are Pearson’s product-moment correlation coefficient and Spearman’s rank correlation coefficient, contrasted in the sketch below.
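A minimal sketch contrasting the two coefficients on an invented monotone but non-linear relationship, where they deliberately disagree:

```python
import numpy as np
from scipy import stats

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = x ** 3  # monotone but non-linear

print(stats.pearsonr(x, y))   # below 1: Pearson measures linear association
print(stats.spearmanr(x, y))  # exactly 1: Spearman measures rank (monotone) association
```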

Confirmatory analysis matters here because dependence analysis works on specific cause-and-effect relationships, while interdependence analysis is about general (correlational) relationships.

Perhaps management academics should be more application-minded, and perhaps management practitioners should be more theoretically inquisitive. In organizational decision-making, meta-analysis techniques are valuable for their quantitative effect-size estimates. When we talk about the value of using standards and norms, we could use this evaluation and summarization framework:

  • Evaluation guidelines
    • Evidence gathering philosophy
    • Evidence gathering plan
    • Enumeration of competencies
  • Evaluation conduct
    • Scope and objectives
    • Timeliness and intentionality
    • Methodology
    • Communication and dissemination
  • Evaluation outcomes
    • Quality assurance
    • Quality control

Regarding expert judgments, we can say that pooling helps bring more credibility. Knowing the right answer is often a product of instinctively understanding what information can be discarded as unimportant, an ability intuition can have. According to Einstein, the intuitive mind is a sacred gift and the rational mind a faithful servant. And as John Wanamaker quipped, half the money he spent on advertising was wasted; he just did not know which half. The Delphi method is a systematic, interactive forecasting technique that relies on a panel of independent experts; it was originally developed at the beginning of the Cold War to forecast the impact of technology on warfare. A simplified sketch of one Delphi aggregation round follows.
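The sketch below shows only the aggregation step of a single round, with invented panel estimates; real applications add structured feedback and explicit stopping rules.

```python
import numpy as np

# Invented round-one forecasts from six independent experts (e.g., % growth).
round_1 = np.array([4.0, 5.5, 3.0, 9.0, 5.0, 4.5])

median = np.median(round_1)
q1, q3 = np.percentile(round_1, [25, 75])
print(f"feedback to panel: median = {median}, interquartile range = [{q1}, {q3}]")

# Outlying experts (here, the 9.0 estimate) are asked to justify or revise
# in the next round; rounds repeat until the spread stabilizes.
```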

When choosing the expert approach, and perhaps the Delphi method, we first need to define who the experts are:

  • Technical credentials
  • Relevant experience
  • Independent contribution
  • Forecasting ability
  • Willingness to engage

And when we ask the panel to make estimations we should:

  • Plan to pretest
  • Avoid vague and imprecise terms
  • Avoid complex sentences
  • Avoid double-barreled questions
  • Provide appropriate frames of reference
  • Avoid questions using leading, emotional or evocative language

In the age of relentless self-promotion, fake news and unsubstantiated claims, the truly knowledgeable and authoritative voices are often drowned out by what is, in an informational sense, nothing more than background noise.

In a more general sense, an organization-unique combination of structure, culture, and the founders’ or managers’ philosophy creates the context within which organizational choices are made. At their core, organizations are groups of people joined together in pursuit of shared goals. Organizations can be informal or formal, and formal organizations can be defined by type of grouping: function, beneficiaries or power.

  • Function
    • Production/economic
    • Political/managerial
    • Integrative
    • Adaptive
    • Maintenance
  • Beneficiaries
    • Mutual benefit associations (shared goal)
    • Business organizations (owners)
    • Service organizations (clients)
    • Commonwealth organizations (public at large)
  • Power
    • Authority
    • Compliance

The combined effect of globalization, hyper-competition, rapid technological change and environmental and market turbulence is changing the nature of work, effectively forcing organizations to become more adaptive. That means de-emphasizing formal hierarchies in favor of informal networks, or moving away from specialized departments toward improvised processes and temporary project teams.

Typology-wise, organizational structures fall into two subsets:

  • Mature
    • Functional
    • Divisional
    • Matrix
  • Newer, emergent
    • Flatarchies
    • Holacracy

The key to discerning an organizational culture is understanding its source. To be fully understood, business-related decision-making needs to be examined within the context of the organizational stakeholders’ ecosystem. A publicly owned corporation can have up to seven distinct sets of constituents (shareholders, employees, regulators, customers, creditors, suppliers and competitors), all of whom have a constituent interest in the corporation’s decisions. There are two classes of employees (executive and non-executive) and two types of regulators (business-general and industry-specific). Usually corporate decision-making is controlled by corporate managers. Group dynamics is also important to understanding the decision-making process.

McKinsey’s schema of decision types:

  • Big bet decisions (broad scope, infrequent, unfamiliar)
  • Cross-cutting decisions (broad, frequent, familiar)
  • Delegated decisions (narrow scope, frequent, familiar)
  • Ad-hoc decisions (narrow, unfamiliar, infrequent)

Empirical evidence carries more weight in group settings; experiential evidence has more effect on individual decision-making. Groups can have more decision confidence, but that does not mean the decision is always better; think of groupthink, the tendency to conform to the group’s decision. It is important to have strong individual critical thinking inside the group decision-making process. The combination of ownership bias, the preference effect and poor pooling of unshared information can materially degrade the efficacy of group decision-making.

Learning paradigms are:

  • Observational – a product of sensory experiences
  • Theoretical – deductive in nature
  • Computational – born of the rapid growth of electronic transaction processing and communication infrastructure
  • Simulational – an evolution of computational learning

Organizational learning encompasses behavior (act of doing) and cognition (act of knowing). It is shaped by: culture, strategy, structure and environment. Types of organizational learning are: adaptive, generative and transformative.

There are no universal, off-the-shelf shortcuts to reducing the organizational decision-making uncertainties brought about by uncontrollable environmental factors. Looking at outside factors and the potential sources of information that can be used to reduce the uncertainty they cause, we can draw these parallels:

  • Market and competition – operational data
  • Societal – empirical research
  • Political/legal – norms and standards
  • Technology – expert judgment

The ERM (enterprise risk management) framework is built on the identification, estimation and mapping of, and response to, individual risks. An illustrative sketch follows.
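A minimal sketch of those four steps as a risk register; the scoring scheme and response threshold are assumptions, not the book’s.

```python
# Identification: an invented register of individual risks.
risks = [
    {"name": "supplier default", "likelihood": 0.10, "impact": 500_000},
    {"name": "data breach",      "likelihood": 0.05, "impact": 2_000_000},
]

# Estimation: expected exposure = likelihood x impact.
for risk in risks:
    risk["exposure"] = risk["likelihood"] * risk["impact"]
    # Response: a simple (assumed) threshold rule.
    risk["response"] = "mitigate" if risk["exposure"] > 50_000 else "accept"

# Mapping: rank risks to prioritize the response plan.
for risk in sorted(risks, key=lambda r: r["exposure"], reverse=True):
    print(risk["name"], risk["exposure"], risk["response"])
```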


[1] Quoted in the book on page 145.