
Bill Schmarzo: The Economics of Data, Analytics and Digital Transformation

Digital transformation

Improving decisions in a world of constant change will only happen if we create a culture of continuous exploring, learning and adapting. Only when the learnings gleaned from detailed data can be quickly codified, disseminated and assimilated will we drive more accurate and more relevant policy and operational decisions. But data without analytics is an empty promise. There is no value in just having data. If you want to change the game, change the frame.

Value driven

In the same way that oil fueled the economic growth of the 20th century, data will be the catalyst for the economic growth of the 21st century. Data may be the new oil or the most valuable resource in the world, but it is the customer, product and operational analytic insights (propensities) buried in the data that will determine the winners and losers in the 21st century. “Propensities” are inclinations or natural tendencies of customers, products and operations to behave or act in predictable ways.

Let’s deep dive into each phase of the Big Data Business Model Maturity Index (BDBMMI).

  • Phase 1: Business Monitoring: The Business Monitoring phase seeks to monitor and report on “What’s Happened” with respect to the operations of the business. If organizations want to cross the Analytics Chasm to become more predictive and prescriptive in their business operations, then they need to embrace an economics mindset, not a technology mindset.
  • Phase 2: Business Insights: This phase seeks to uncover actionable customer, product and operational insights buried within and across the organization’s data.
  • Phase 3: Business Optimization: The Business Optimization phase seeks to embed prescriptive analytics (recommendations and propensity scores) into the operational systems in order to automate the optimization of the organization’s key operational processes.
  • Phase 4: Insights Monetization: Organizations are realizing that the best way to monetize their data isn’t to sell it, but instead to leverage the customer, product and operational insights (propensities) that have been gathered throughout the Business Insights and Business Optimization phases to create new revenue or monetization opportunities.
  • Phase 5: Digital Transformation: The final phase of the BDBMMI has as much to do with culture as it does with data and analytics.

Transition from Phase 1: Business Monitoring to Phase 2: Business Insights:

  • Identify an organizational Strategic Business Initiative.
  • Identify, validate, value and prioritize the organization’s key business and operational Decisions.
  • Capture, cleanse, normalize, transform, enrich and make available the relevant data sources in a Data Lake.
  • Create an analytics (data science) Sandbox.
  • Deploy and use Predictive Analytics to uncover potentially actionable and predictive customer, product and operational insights.
  • Train business users to “Think Like A Data Scientist”.
  • Create “right-time” analytics capabilities.
  • Master Data Science capabilities.
  • Master Design Thinking capabilities – Personas, Stakeholder Maps, Envisioning, Facilitation and Hypothesis Development.
  • Develop Business Case with financial and business justification.

Transition from Phase 2: Business Insights to Phase 3: Business Optimization

  • Evaluate the customer, product and operational Analytic Insights.
  • Develop Prescriptive and Preventative Analytics.
  • Deploy a Data Lake with full data management capabilities.
  • Capture or store the customer, product and operational insights in Asset Models (Analytic Profiles for humans and Digital Twins for devices and machines).
  • Leverage DevOps disciplines.
  • Measure Decision Effectiveness.
  • Master Data Science capabilities.
  • Master Design Thinking capabilities such as customer journey maps, mockups, prototyping and storyboards.
  • Master Value Engineering to quantify the economic value of data and analytics.

Transition from Phase 3: Business Optimization to Phase 4: Insights Monetization

  • Aggregate, cluster and classify the customer, product and operational insights.
  • Identify new revenue or Monetization opportunities.
  • Create customer and operational Journey Maps.
  • Explore new customer and market “as a service” consumption models.
  • Apply Data Science concepts such as Asset Models, Analytic Profiles and Digital Twins.
  • Apply Design Thinking concepts such as Personas, Prototypes, Customer Journey Maps and Storyboarding.
  • Test, validate and operationalize data management.
  • Create a Business Plan.

Transition from Phase 4: Insights Monetization to Phase 5: Digital Transformation

  • Drive business decisions by leveraging the Economic Value of Data.
  • Leverage Design Thinking techniques to create a Collaborative Value Creation Culture.
  • Create composable, reusable, continuously learning Analytic Modules.
  • Update the Key Performance Indicators (KPIs) and Metrics.
  • Create an analytics-enabled 3rd-party Co-creation Ecosystem.
  • Create Intelligent Apps, Smart Places and Smart Things.
  • Create a culture that leverages Deep Reinforcement Learning and AI.

Value Engineering

The Data Science Value Engineering Framework provides a simple yet effective process for exploiting the economic value of your data and analytic assets.

Strategic business initiatives focus on business outcomes that have articulated financial value such as optimizing operational efficiency, reducing costs, improving revenues and profits, enhancing customer value creation, mitigating risk and creating new revenue opportunities. A critical part of understanding your strategic business initiative is to identify the metrics and Key Performance Indicators (KPIs).

Once we have identified the targeted business initiative and the metrics against which we are going to measure progress and success, next we want to identify the Business Stakeholders.

The next step in the Value Engineering process is to brainstorm the Decisions that each of the different stakeholders needs to make in support of the targeted business initiative. My finding is that if you identify the right set of stakeholders in Step 2, then the brainstorming and prioritizing of decisions flows quickly and naturally.

At this stage of the Value Engineering process, we need to aggregate decisions into Use Cases or clusters of decisions around a common subject area that have measurable financial ramifications.

The Prioritization Matrix process provides a framework for driving organizational alignment around the relative value and implementation feasibility of each of the organization’s use cases.
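
To make the Prioritization Matrix concrete, here is a minimal Python sketch (not from the book) that scores hypothetical use cases on business value and implementation feasibility and orders the roadmap accordingly; the use case names, scores and weights are illustrative assumptions.

```python
# Hypothetical prioritization matrix: each use case is scored 1-10 on business value
# and implementation feasibility (typically via stakeholder voting), then ranked.
use_cases = {
    "Reduce customer churn":     {"value": 9, "feasibility": 6},
    "Predictive maintenance":    {"value": 7, "feasibility": 8},
    "Optimize inventory levels": {"value": 6, "feasibility": 9},
}

def priority(scores, value_weight=0.6, feasibility_weight=0.4):
    """Weighted score used to order the use case roadmap (weights are assumptions)."""
    return value_weight * scores["value"] + feasibility_weight * scores["feasibility"]

for name, scores in sorted(use_cases.items(), key=lambda kv: priority(kv[1]), reverse=True):
    print(f"{name}: priority {priority(scores):.1f}")
```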

Now that we know our top-priority use case, we want to identify the predictive and prescriptive analytics that support the targeted use case. We start with a question and then convert it into a predictive statement. Next, we ask the stakeholders: if they had those predictions, how would they use them to make operational decisions? We then brainstorm with the business stakeholders what data might be needed, and we complete the brainstorming session between the business stakeholders and the data science team by creating a matrix of ranked data sources.

Finally, we’ll need a modern architecture with state-of-the-art technologies (likely with lots of open-source options) upon which we can build a solution that delivers the business value.

If “what” your organization seeks is to exploit the potential of data science to power your business models, then the Data Science Value Engineering Framework provides the “how”.

Basic Economics concepts

Economics is the branch of knowledge concerned with the production, consumption and transfer of wealth or value. It is the scientific study of human action and behaviors, particularly as it relates to human choice and the utilization of scarce assets to achieve certain outcomes.

The Economic Value Curve measures the relationship between a dependent variable and independent variables to achieve a particular business or operational outcome, such as retaining customers, increasing operational uptime or reducing inventory costs. The challenge with the Economic Value Curve is the Law of Diminishing Returns.

The Law of Diminishing Returns is a measure of the decrease in the marginal (incremental) output of production as the amount of a single factor of production (input) is incrementally increased, while the amounts of all other factors of production stay constant.
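
As a simple illustration of the Law of Diminishing Returns (the figures are hypothetical, not from the book), the Python sketch below holds all other inputs constant and shows the marginal output shrinking as a single input is increased:

```python
# Hypothetical production figures: total output after adding 0, 1, 2, ... units of one input,
# with all other factors of production held constant.
output_by_input_units = [0, 100, 180, 240, 280, 300]

for units in range(1, len(output_by_input_units)):
    marginal = output_by_input_units[units] - output_by_input_units[units - 1]
    print(f"Input unit {units}: marginal output = {marginal}")
# Marginal output falls from 100 to 80 to 60 to 40 to 20 as the input is increased.
```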

Companies are realizing that it isn’t the volume of data that one monetizes; it is the granularity, at the level of the individual, that one monetizes.

Analytic Profiles are an asset model for capturing analytic insights (propensities) about the organization’s most valuable assets in a way that facilitates the refinement and sharing of those analytic insights across multiple use cases. An Analytic Profile consists of propensity scores, predictive indicators, clustering, segmentation, and business rules that codify the behaviors, preferences, inclinations, tendencies, interests, associations and affiliations for the organization’s key business entities.
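
A minimal sketch of how such an Analytic Profile might be represented in code follows; the entity, field names and example propensities are illustrative assumptions rather than the book’s specification.

```python
from dataclasses import dataclass, field

@dataclass
class AnalyticProfile:
    """Illustrative container for the propensities captured about one key business entity."""
    entity_id: str
    propensity_scores: dict = field(default_factory=dict)    # e.g. {"churn": 0.72, "upsell": 0.31}
    predictive_indicators: dict = field(default_factory=dict)
    segments: list = field(default_factory=list)             # clustering / segmentation results
    business_rules: list = field(default_factory=list)       # codified behaviors and preferences

customer_profile = AnalyticProfile(
    entity_id="customer-123",
    propensity_scores={"churn": 0.72, "upsell": 0.31},
    segments=["frequent_traveler", "mobile_first"],
)
```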

The Law of Supply and Demand dictates the relationship between the quantity of a commodity that producers wish to sell at various prices and the quantity that consumers wish to buy.

The Economic Multiplier Effect refers to the increase in value arising from any new injection of usage. The size of the multiplier effect depends upon Marginal Propensity to Consume (MPC). The MPC measures the impact of a change in output (production) as a ratio to the change in input (investment).
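
For readers unfamiliar with the underlying macroeconomic arithmetic, the classical multiplier is 1 / (1 - MPC); the sketch below uses hypothetical figures and that standard formula, not a calculation from the book.

```python
# Classical multiplier arithmetic with hypothetical numbers.
mpc = 0.8                         # marginal propensity to consume
injection = 100                   # new injection of spending / usage
multiplier = 1 / (1 - mpc)        # = 5.0
total_effect = injection * multiplier
print(multiplier, total_effect)   # 5.0 500.0 -> the injection's effect is multiplied
```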

The ability to reuse data and analytic assets across multiple use cases at a near-zero marginal cost is truly a business game-changer.

Sunk Costs are costs that have already been incurred and cannot be recovered.

Through the reuse of the organization’s data and analytic assets, marginal costs can be flattened (since the costs of acquiring those assets are now considered sunk costs) while marginal revenue (value) can continue to increase through the reuse of those data and analytic assets.

Scarcity refers to resources being finite and limited. Scarcity means we have to decide how and what to produce from these limited resources. Scarcity is at the heart of the economics discussion because organizations do not have unlimited financial, human or time resources.

And there are always more opportunities than there are resources (the real essence of the scarcity dilemma).

Postponement Theory is an economic strategy that maximizes possible benefits and minimizes risks by delaying a decision in order to gain additional data or analytic insights.

For most organizations, data as capital gets converted into revenue in four ways:

  • Driving the on-going optimization of key operational and business use cases.
  • Mitigating security, compliance, regulatory and governance risks.
  • Uncovering new revenue opportunities.
  • Delivering a more compelling customer experience.

Price Elasticity of demand is the degree to which the effective demand for some item or service changes as its price changes.
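
A one-line worked example (with hypothetical figures) makes the definition concrete:

```python
# Price elasticity of demand = (% change in quantity demanded) / (% change in price).
pct_change_quantity = -0.25   # quantity demanded falls 25%...
pct_change_price = 0.10       # ...when the price rises 10%
elasticity = pct_change_quantity / pct_change_price
print(elasticity)             # -2.5 -> |elasticity| > 1, so demand is elastic
```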

Data confidence costs are the costs associated with instilling user confidence in the data. If your users do not have confidence in the data, then no matter how cheap the storage and computational costs, and no matter how easy-to-use the data access and exploration tools, the users just won’t use the data in a way that can derive and drive value to the organization.

The Economic Utility of a good or service directly influences the demand and the price of that good or service.

Marginal Utility is the utility gained by consuming an additional unit of a good or service.

University of San Francisco research paper

Data as an asset exhibits unusual characteristics when compared to other balance sheet assets. Most assets depreciate with usage. However, data appreciates or gains more value with usage.

Organizations need a framework — what we will call the collaborative value creation platform — that maximizes the economic value of data and analytic assets across the organization.

Step 1: Prioritizing Business Use Cases

Step 2: Role of Analytic Profiles

Step 3: Role of the Data Lake. A data lake is a data repository that holds large amounts of structured and unstructured data in its native format. The data lake becomes the organization’s “collaborative value creation” platform by facilitating the capture, refinement and reuse of the organization’s data and analytic assets across multiple business use cases.

Scores are a rating system that aids in comparisons, performance tracking, and decision-making. Scores are used to predict the likelihood of certain actions or outcomes. Scores are actionable, analytics-based measures that support the key decisions your organization is trying to make.

Identifying and capturing the analytics is a three-step process:

  • We first list the decisions needed to support the targeted use case.
  • Next, we identify or brainstorm the recommendations that need to be developed to support the decisions.
  • Finally, we identify the scores and the metrics that comprise the score, that support the recommendations and decisions.

NCE stands for Normal Curve Equivalent and is a way of standardizing scores onto a 0–100 scale. Beta is a measure of the volatility, or rapid change, in the NCE score.
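
To illustrate how scores, the NCE scale and Beta fit together, here is a minimal Python sketch; the raw scores are hypothetical, the NCE transformation shown (NCE = 50 + 21.06 × z) is the standard one, and computing Beta as the standard deviation of recent NCE values is an assumption rather than the book’s formula.

```python
import statistics

raw_scores = [62, 70, 55, 80, 75, 90, 68]   # hypothetical composite scores over time
mean = statistics.mean(raw_scores)
stdev = statistics.stdev(raw_scores)

# Standardize onto the 0-100 Normal Curve Equivalent scale (clamped to the scale's ends).
nce_scores = [max(0, min(100, 50 + 21.06 * (s - mean) / stdev)) for s in raw_scores]

# Assumed proxy for Beta: volatility (standard deviation) of the most recent NCE values.
beta = statistics.stdev(nce_scores[-5:])

print([round(n, 1) for n in nce_scores], round(beta, 1))
```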

Value of Data Theorems

Accounting uses a “Value in Exchange” methodology for determining asset valuation based upon the acquisition cost of an asset. Economics uses a “Value in Use” methodology for determining asset valuation. When we change our frame from an accounting perspective to an economics perspective, understanding how to determine the value of one’s data and analytic assets almost becomes self-evident.

It isn’t the data itself that’s valuable; it’s the trends, patterns and relationships (insights) gleaned from the data about your customers, products and operations that are valuable.

Instead of trying to sell your data, organizations should focus on monetizing the customer, product and operational insights that are gleaned from the data.

It is the quantification of trends, patterns and relationships that drives predictions about what is likely to happen. The best way to codify and ultimately monetize those trends, patterns and relationships is through the use of Analytic Profiles.

Predictions drive monetization opportunities through improved (optimized) business and operational use cases.

The ability to reuse the same data sets across multiple use cases at near-zero marginal cost is the real economic game-changer.

The ability to easily reuse data sets is highly dependent upon the creation of a modern data lake comprised of both raw data and curated data.

The “collaborative value creation platform” drives organizational alignment around identifying and prioritizing the data sources that power the use case roadmap.

Trying to optimize across a diverse set of objectives can yield more granular, higher fidelity business and operational outcomes that enable “doing more with less”.

The Economic Value of a Dataset (EvD) equals the sum, across all supported use cases, of the Attributed Financial Value (Use Case FV) that the dataset provides to each of those use cases.
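
A minimal Python sketch of the calculation follows; the use cases, financial values and attribution percentages are hypothetical, chosen only to show the mechanics of summing a dataset’s attributed value across use cases.

```python
# Hypothetical financial value of each use case and the share of that value
# attributed to one particular dataset.
use_case_financial_value = {"churn_reduction": 1_000_000, "predictive_maintenance": 600_000}
dataset_attribution = {"churn_reduction": 0.30, "predictive_maintenance": 0.15}

evd = sum(use_case_financial_value[uc] * dataset_attribution[uc]
          for uc in use_case_financial_value)
print(evd)  # 390000.0 -> economic value of this dataset across the use case portfolio
```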

Artificial Intelligence

Using Artificial Intelligence (AI), you can create assets that appreciate in value (rather than depreciate) the more these assets are used.

Orphaned analytics are one-off analytics developed to address a specific business need but never “operationalized” or packaged for reuse across the organization.

The modern organization needs an overarching analytic module framework to help it capture, share, reuse and refine its analytic Intellectual Property (IP). For organizations to succeed in the 21st century, they must invest the data science and engineering time and discipline to master the development of composable, reusable, continuously learning analytic modules.

Analytic modules are composable, reusable, continuously learning analytic assets that deliver pre-defined business or operational outcomes. These modules have the following capabilities (see the sketch after this list):

  • Pre-defined data input definitions and data dictionary.
  • Pre-defined data integration and data transformation algorithms.
  • Pre-defined data enrichment algorithms to create higher-order metrics.
  • Algorithmic models.
  • A layer of abstraction above the AI, ML and DL frameworks.
  • Orchestration capability to “call” the most appropriate ML or DL framework.
  • Pre-defined outputs (APIs) that feed the analytic results to the downstream operational systems.
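
A minimal Python skeleton of such an analytic module is sketched below; the class, method names and the toy churn logic are illustrative assumptions, intended only to show how the capabilities listed above fit together.

```python
class AnalyticModule:
    """Illustrative composable analytic module with pre-defined inputs, transformations and outputs."""

    # Pre-defined data input definitions (a stand-in for a fuller data dictionary).
    input_schema = {"customer_id": str, "tenure_months": int, "monthly_spend": float}

    def transform(self, record):
        """Pre-defined data integration, transformation and enrichment step."""
        enriched = dict(record)
        enriched["spend_per_tenure"] = record["monthly_spend"] / max(record["tenure_months"], 1)
        return enriched

    def score(self, record):
        """Algorithmic model behind a layer of abstraction; a production module would
        orchestrate a call to the most appropriate ML or DL framework here."""
        features = self.transform(record)
        return min(1.0, 0.1 + 0.5 * (features["spend_per_tenure"] < 2.0))

    def output(self, record):
        """Pre-defined output fed to downstream operational systems (e.g. via an API)."""
        return {"customer_id": record["customer_id"], "churn_propensity": self.score(record)}

module = AnalyticModule()
print(module.output({"customer_id": "c-1", "tenure_months": 3, "monthly_spend": 4.0}))
```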

DL is a set of algorithms that analyze massive datasets using a multi-layered neural network structure, where each layer is comprised of numerous nodes, to train and learn to recognize and codify patterns, trends and relationships buried in the data … without human intervention.

Reinforcement Learning is a type of ML algorithm that seeks to “learn” by taking actions within a controlled environment with the goal of maximizing rewards while minimizing costs. RL uses trial-and-error techniques to map situations to actions to do so.

Transfer Learning (TL) is a technique whereby one neural network is first trained on one type of problem and then reapplied to another, similar problem with only minimal training. TL is key to accelerating neural network model reuse and to accelerating time-to-value.
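
A minimal PyTorch sketch of the Transfer Learning idea follows, assuming a recent version of torchvision is available; the five-class target problem is hypothetical.

```python
import torch.nn as nn
from torchvision import models

# Reuse a network trained on one problem (ImageNet classification) for a new,
# similar problem with minimal additional training.
model = models.resnet18(weights="IMAGENET1K_V1")   # pretrained feature extractor
for param in model.parameters():
    param.requires_grad = False                    # freeze the learned representations

model.fc = nn.Linear(model.fc.in_features, 5)      # new head for a hypothetical 5-class task
# Only the new head needs training, which is why transfer learning accelerates time-to-value.
```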

In order to keep up with this influx of data and expedite the evolution of its machine learning engine, Google open-sourced that engine, TensorFlow.

Tesla’s FSD AI-based Autopilot brain is a composable, reusable, continuously learning analytic module; Tesla cars become “smarter”, and consequently more valuable, with every mile driven.

The Holy Grail of AI is the creation of autonomous products, processes, policies and systems that continuously learn and adapt with no or little human intervention.

The Schmarzo Economic Digital Asset Valuation Theorem

Economies of Scale manifest themselves in cost advantages that enterprises can enjoy due to their scale of operation. Economies of Scale not only spread large fixed costs over high volumes of output to drive down costs per unit but also create a formidable barrier to entry that hinders new competitors from easily entering markets.

Organizations can exploit the Economies of Learning in how they manage their data science capabilities and the associated development of their data and analytic assets by deploying data science projects on a use-case-by-use-case basis.

The sharing and reapplication of learnings is a powerful concept — that the Economies of Learning are more powerful than the Economies of Scale.

Three economic effects power the Schmarzo Economic Digital Asset Valuation Theorem (a numerical sketch follows the list):

  • Reusing “curated” data and analytic modules reduces the marginal costs for future use cases. “Curated” data is a dataset in which investments have been made to improve the data’s cleanliness, completeness, accuracy, granularity and latency. If organizations are seeking to drive down their data lake costs via the sharing and reuse of data across multiple use cases, then putting curated data into the data lake is critical.
  • Sharing and reusing data and analytic modules accelerates use case time-to-value and de-risks use case implementation.
  • The continuous predictive and operational improvements of a specific analytic module lifts the value of all use cases that have used that same analytic module. This economic value acceleration cannot occur if the applications are not built as composable, reusable, continuously learning Analytic Modules that can be shared, reused, adapted and “refined” (for continuous learning).
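
The first two effects can be illustrated with a simple, hypothetical calculation: once the data has been acquired and curated, that spend is a sunk cost, reuse adds only a small marginal cost, and each additional use case still contributes value.

```python
# Hypothetical figures, for illustration only.
sunk_cost = 500          # initial data acquisition and curation
marginal_cost = 20       # near-zero incremental cost per additional use case
value_per_use_case = 300

for n in range(1, 6):
    total_cost = sunk_cost + marginal_cost * n
    total_value = value_per_use_case * n
    print(f"{n} use case(s): cost={total_cost}, value={total_value}, net={total_value - total_cost}")
# Net value turns positive as the same data and analytic assets are reused across more use cases.
```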

The investments that organizations must make to exploit the Schmarzo Economic Digital Asset Valuation Theorem include:

  • Creating a single centralized curated data lake or collaborative value creation platform.
  • Investing in the data science engineering work to create composable, reusable, continuously learning analytic modules.

Organizations must also stomp out the killers of the economic value of their data and analytic assets, including:

  • Data silos
  • Orphaned analytics
  • Management ignorance

The 8 Laws of Digital Transformation

Digital Transformation is the creation of a continuously learning and adapting, AI-driven and human-empowered business model that seeks to identify, codify and operationalize actionable customer, product and operational insights (propensities) in order to optimize (reinvent) operational efficiency, enhance customer value creation, mitigate operational and compliance risk and create new revenue opportunities.

The “Digital Transformation (DX) Laws” are laws based on repeated observations that describe or predict a range of natural phenomena.

Digital Transformation is about reinventing and innovating business models, not just optimizing existing business processes.

Digital Transformation is about creating and leveraging new digital assets (data, analytics and insights or propensities about customers, products and operations) to reinvent your business model and create new sources of competitive differentiation. Any industry that relies on customers should be concerned about digital transformation.

Digital Transformation is about empathizing, ideating, validating and quantifying the creators and inhibitors of customer value.

Digital Transformation is about creating new digital assets — Analytic Profiles and analytic modules. Analytic Profiles and Analytic Modules are assets that economically behave like no other asset we have ever seen. These digital assets never wear out, never depreciate and can be used across an unlimited number of use cases at near-zero marginal cost. And on top of that, these digital assets become more valuable the more they are used.

Digital Transformation is about predicting what’s likely to happen, prescribing recommended actions, and continuously learning and adapting (autonomously) faster than your competition.

The three levels of analytics maturity — from reporting to autonomous:

  • Level 1: Insights and Foresight. The goal of Level 1 is to quantify cause-and-effect, determine confidence levels and measure the goodness of fit with respect to the predictive insights.
  • Level 2: Optimized Human Decision Making. The goal of Level 2 is to create analytics that can learn and codify trends, patterns and relationships in order to build predictive models and deliver prescriptive recommendations and actions.
  • Level 3: The Learning and Intelligent Enterprise. These analytics seek to model the world around them — based upon the objectives as defined in the AI Utility Function.

These technologies are only the enablers of value creation; by themselves, they do not provide any value.

What if organizations could replace static if-then policies with AI-based, continuously learning and adapting algorithms that learn and evolve based upon the constantly changing state of the environment in which the business operates? “AI-driven Policies” are operational policies maintained by AI agents that take actions.
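
As an illustration of the contrast (a sketch under simple assumptions, not the book’s implementation), a static if-then rule could be replaced by an epsilon-greedy agent that keeps re-estimating which action works best as the environment changes:

```python
import random

actions = ["offer_discount", "send_reminder", "do_nothing"]   # hypothetical policy actions
value_estimates = {a: 0.0 for a in actions}
counts = {a: 0 for a in actions}
epsilon = 0.1   # fraction of decisions spent exploring rather than exploiting

def choose_action():
    if random.random() < epsilon:
        return random.choice(actions)                          # explore
    return max(actions, key=lambda a: value_estimates[a])      # exploit best-known action

def update(action, reward):
    counts[action] += 1
    # Incremental average: the policy keeps adapting as observed rewards drift over time.
    value_estimates[action] += (reward - value_estimates[action]) / counts[action]
```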

The heart of Digital Transformation is the ability to identify, codify and operationalize (scale) the sources of customer, product and operational value within an environment that is continuously learning and adapting to ever-changing customer and market needs.

Digital Transformation requires organizations to master four fundamental aspects of value creation:

  • Fundamental # 1: Identify Sources of Value Creation.
  • Fundamental # 2: Codify Sources of Value Creation.
  • Fundamental # 3: Operationalize Sources of Value Creation.
  • Fundamental # 4: Continuously Learn and Adapt to Sources of Value Creation.

Most organizations do two things very poorly — prioritize and focus. Organizations don’t fail due to a lack of “strategic” initiatives; organizations fail because they have too many.

The 3 Horizons Framework:

  • Horizon # 1: Optimize Your Current Operations. It employs descriptive and explorative analytics. Horizon 1 is focused on making money today.
  • Horizon # 2: Digitalize Your Current Operations. Horizon 2 applies predictive and prescriptive analytics. Horizon 2 strives to master the creation of new digital assets (Analytic Profiles and Modules).
  • Horizon # 3: Digitally Transform (Reinvent) Your Business Model. Horizon 3 exploits the game-changing potential of automation and autonomous analytics. Horizon 3 seeks to create “economic moats”.

Culture

Ambiguity — the quality of being open to more than one interpretation — is the key to human, societal and organizational evolution.

Persona profiles (to personalize the customer’s challenge), customer journey maps (to understand the customer’s journey towards a solution), and stakeholder maps (with personal win conditions for each stakeholder) are just a few of the Design Thinking tools and techniques that you can use to understand your customers, their jobs, the value propositions and their associated gains (benefits) and pains (impediments).

Organizational Improvisation yields flexible and malleable teams that can maintain operational integrity while morphing the team’s structure and execution in response to the changing needs of the situation.

The Game Boy© game Final Fantasy Legend II™ is a surprisingly fabulous management tool that yields several valuable lessons in creating empowered teams, including:

  • It takes a team to win the game.
  • Discovery is a highly non-linear process.
  • You must test different hypotheses throughout the game to find the ones that win.
  • Failing is a natural way to learn.
  • Everyone takes a turn leading.
  • Embrace “unlearning”.
  • Be prepared to start all over.
  • Embrace diversity of perspectives.
  • Nurture strong collaboration across the ecosystem.

Organizations that are seeking to drive their digital transformation must replace their “OR” mentality with an “AND” mentality.
