Four major technological forces — cloud computing, big data, artificial intelligence, and the internet of things — are causing a mass extinction event in industry after industry.
Artificial intelligence will have a profound role in determining national military capabilities and relations among the world’s leading powers.
Post-Industrial Society
Bell introduced the concept of the Post-Industrial Society and went on to predict a fundamental change in the structure of human economic and social interaction — a change with impact on the order of the Industrial Revolution — a change that he called “The Information Age.”
The term post-industrial society was used to describe a series of macro-economic and social changes in the global economic structure on the order of magnitude of the Industrial Revolution. Bell developed his theory in the context of the history of economic civilization, positing three constructs: Pre-Industrial, Industrial, and Post-Industrial.
Bell described pre-industrial society as a game against nature. The transformative energy is human. In pre-industrial societies, power is held by those who control the scarcest resources, in this case land.
Bell described goods-producing industrial societies as a game against fabricated nature. In industrial societies, the scarcest resource is access to various forms of capital, especially machinery. The transformative energy is mechanical.
A post-industrial society is about the delivery of services. It is a game between people. It is powered by information, not muscle power, not mechanical energy. In a post-industrial society the primary resource is knowledge.
It’s hard to overstate the scale of Bell’s vision for the Information Age. “If tool technology was an extension of man’s physical powers,” he wrote, “communication technology, as the extension of perception and knowledge, was the enlargement of human consciousness.”
Punctuated Equilibrium
Darwinian evolution is a force of continuous change — a slow and unceasing accumulation of the fittest traits over vast periods of time. By contrast, punctuated equilibrium suggests that evolution occurs as a series of bursts of evolutionary change. These bursts often occur in response to an environmental trigger and are separated by periods of evolutionary equilibrium. The reason this idea is so compelling is its parallel in the business world: Today we are seeing a burst of evolutionary change — a mass extinction among corporations and a mass speciation of new kinds of companies.
Evolutionary biologist and paleontologist Stephen Jay Gould introduced the theory of punctuated equilibrium, developed with Niles Eldredge.
An essential piece of this evolutionary theory is scale. In punctuated equilibrium, Gould focuses on species-wide patterns of evolution, whereas Darwinian evolution draws insight from the traits, survival, and reproduction of individual organisms through generations.
Evolutionary punctuations are responsible for the cyclic nature of species: inception, diversification, extinction, repeat.
Evolutionary punctuations are not a matter of competitive advantage, as beak size is; they are existential.
Every mass extinction is a new beginning.
The evidence suggests that we are in the midst of an evolutionary punctuation: We are witnessing a mass extinction in the corporate world in the early decades of the 21st century. Since 2000, 52 percent of the Fortune 500 companies have either been acquired, merged, or have declared bankruptcy.
Mass extinction and subsequent speciation don’t just happen without reason. In the business world, I believe the causal factor is “digital transformation”.
Businesses now face their own Oxygen Revolution: the Big Data Revolution. Like oxygen, big data are an important resource with the power to both suffocate and drive revolution. During the Great Oxidation Event, species began to create new channels of information flow, use resources more efficiently, and mediate connections previously unheard of, transforming oxygen from a lethal molecule into the source of life. Big data and AI, along with cloud computing and IoT, promise to transform the technoscape to a similar degree.
Digital Transformation
What is digital transformation?
Some describe it as the power of digital technology applied to every aspect of the organization. Some refer to it as using digital technologies and advanced analytics for economic value, agility, and speed.
Industry analyst Brian Solis of Altimeter Group writes, “Investing in technology isn’t the same as digital transformation.”
The first years of the internet caused disruption in business, government, education — every aspect of our lives.
Innovative companies streamlined processes, making them faster and more robust than their analog counterparts.
In all these cases, processes were streamlined, but not revolutionized: they were the same analog processes, duplicated in digital form.
Simply investing in technology to digitize existing functions and processes is not enough to truly transform a company or industry. It’s a necessary ingredient, but not sufficient. Digital transformation demands revolutionary changes to key competitive corporate processes.
Digital transformation is a disruptive evolution into an entirely new way of working and thinking.
And it is why digital transformation can be so frightening: Companies must shift their focus from what they know works and invest instead in alternatives they view as risky and unproven.
This is Clayton Christensen’s aptly named “Innovator’s Dilemma”: Companies fail to innovate because doing so means shifting focus from what is working to something unproven and risky.
Digital transformation goes by many different names. Perhaps the most familiar is “the Fourth Industrial Revolution”.
Digital transformation will further accelerate the pace of disruption.
Leaders who focus on digital transformation understand that to survive, their companies will have to go through a fundamental change. And they are being proactive about that change.
Geoffrey Moore’s model of “context” versus “core” in business helps illustrate why transformation at the core is important. Moore’s model describes the cycle of innovation as it relates to both vital and support processes of a company. “Core” is what creates differentiation in the marketplace and wins customers. “Context” consists of everything else — things like finance, sales, and marketing.
Core gives a business bargaining power: it is what customers want and cannot get from anyone else.
Context can be outsourced; core is a company’s intellectual property and should stay in-house.
Digitalizing core is a true transformation. This digital transformation demands a complete overhaul of core processes and capabilities.
Centers of Excellence can align the organization around digital transformation efforts, unify disparate departments, and grant employees the skills necessary to be successful in this effort.
The lagging sectors are less than 15 percent as digitized as the leading sectors.
Workers in the most digitized industries enjoy wage growth that is twice the national average.
The stakes are high. Europe may either add €1.25 trillion of gross industrial value, or lose €605 billion of value by 2025, according to Roland Berger Strategy Consultants.
As McKinsey argues, “Looking at just three big areas of potential — online talent platforms, big data analytics, and the internet of things — we estimate that digitization could add up to $2.2 trillion to annual GDP by 2025, although the possibilities are much wider.”
The market for digital transformation consulting alone is worth an estimated $23 billion.
As more people move to cities (68 percent of the world will live in cities by 2050), public infrastructure and resources — water and energy in particular — are being strained.
Digital transformation requires companies to continuously monitor current trends, experiment, and adapt — and academic institutions are developing curricula to teach these new capabilities to future and current business leaders.
The many ways digital transformation will improve human life:
- In medicine, expect very early disease detection and diagnosis, and genome-specific preventative care.
- In the automotive industry, expect self-driving cars.
- In manufacturing, 3D printing and manufacturing-as-a-service will allow for mass, inexpensive customization with low or no distribution costs.
- In resource management and sustainability, resources will be matched with need, waste minimized, and constraints alleviated.
Harvard economist Lawrence Katz claims: “We never have run out of jobs. There is no long-term trend of eliminating work for people. Over the long term, employment rates are fairly stable. People have always been able to create new jobs. People come up with new things to do”.
Robotics and artificial intelligence systems will not only replace human tasks but also augment human skills.
Regulations and public policy also need to keep up, to promote entrepreneurship and encourage the founding of new companies.
The Information Age Accelerates
In the case of one Fortune 500 manufacturer, the data aggregation problem amounts to 50 petabytes fragmented across 5,000 systems: customer, dealer, claims, ordering, pricing, product design, engineering, planning, manufacturing, control systems, accounting, human resources, logistics, and supplier systems. The fragmentation is compounded by mergers and acquisitions, product lines, geographies, and customer engagement channels.
Cloud computing, big data, AI, and IoT converge to unlock business value estimated by McKinsey at up to $23 trillion annually by 2030.
Cloud computing is the first of the four technologies that drive digital transformation. Without cloud computing, digital transformation would not be possible.
The public cloud computing market is estimated to reach a staggering $162 billion by 2020.
Cisco forecasts that by 2021, 94 percent of workloads will be processed by cloud data centers and 73 percent of cloud workloads will be in public cloud data centers.
A key enabler of cloud computing’s superior economies of scale is a technology innovation known as “virtualization”.
“Containers” are another innovation enabling efficient sharing of physical resources. A container is a lightweight, stand-alone, executable software package that includes everything needed to run it — code, runtime, system tools and libraries, and settings.
PaaS offerings provide software development tools and services specifically for building, deploying, operating, and managing software applications.
SaaS offerings are complete, prebuilt software applications delivered via the internet.
CIOs now recognize the importance of operating across multiple cloud vendors to reduce reliance on any one provider (so-called “vendor lock-in”) and to take advantage of differentiation in public cloud provider services.
The second technology vector driving digital transformation is “big data”. Data, of course, have always been important. But in the era of digital transformation, their value is greater than ever before.
The term “big data” gained wide currency in fields such as astronomy and genomics in the early 2000s.
Historically, collecting data was time consuming and labor intensive. So, organizations resorted to statistics based on small samples (hundreds to thousands of data points) to make inferences about the whole population.
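To make the contrast concrete, here is a minimal sketch, with simulated data, of the classical small-sample approach: estimating a population-wide average from a sample of one thousand points rather than from the full population.

```python
import random
import statistics

random.seed(42)

# Simulated "population" of one million measurements
population = [random.gauss(100, 15) for _ in range(1_000_000)]

# Historically, analysts worked from small samples like this one
sample = random.sample(population, 1_000)

sample_mean = statistics.mean(sample)
sample_sd = statistics.stdev(sample)

# Approximate 95% confidence interval for the population mean
margin = 1.96 * sample_sd / (len(sample) ** 0.5)
print(f"estimate: {sample_mean:.1f} +/- {margin:.1f}")
print(f"true population mean: {statistics.mean(population):.1f}")
```

The point of the sketch is the trade-off: a thousand data points give a usable estimate, but only within a margin of error; storing and analyzing all the data removes the need to infer at all.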
The third major technology driving digital transformation is artificial intelligence.
The number and complexity of AI applications are rapidly expanding. For example, AI is being applied to highly complex supply chain problems, such as inventory optimization; production problems, such as optimizing the yield of manufacturing assets; fleet management problems, such as maximizing asset uptime and availability; and health care problems, such as predicting drug dependency risk, to name just a few.
In this case, while the algorithm determines the weights, human analysts determine the features. We’ll see in deep learning approaches that the algorithm can determine both the relevant features and the associated weights directly from the data.
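A minimal sketch of this division of labor: the features below (a balance-drop ratio and a late-payment count) are hypothetical, analyst-chosen signals, while a simple logistic regression trained by gradient descent learns the weights from the data.

```python
import math

# Hypothetical analyst-crafted features per account: [balance_drop, late_payments]
X = [[0.9, 3], [0.8, 2], [0.1, 0], [0.2, 1], [0.7, 3], [0.0, 0]]
y = [1, 1, 0, 0, 1, 0]  # 1 = customer churned

# Analysts chose the features above; the algorithm now learns the weights.
w = [0.0, 0.0]
b = 0.0
lr = 0.5

def predict(x):
    z = b + sum(wi * xi for wi, xi in zip(w, x))
    return 1 / (1 + math.exp(-z))  # logistic function maps score to probability

for _ in range(2000):  # plain stochastic gradient descent
    for x, target in zip(X, y):
        err = predict(x) - target
        for i in range(len(w)):
            w[i] -= lr * err * x[i]
        b -= lr * err

print("learned weights:", w)
```

After training, `predict` scores a new account’s churn risk; the learned weights encode how strongly each hand-crafted feature contributes.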
Deep learning is especially interesting because it can potentially be applied to any task — from predicting engine failure or diabetes to identifying fraud — with far less data scientist intervention compared to other machine learning methods, due to the greatly reduced or eliminated need for feature engineering.
The fourth technology driving digital transformation is the internet of things. The basic idea of IoT is to connect any device equipped with adequate processing and communication capabilities to the internet, so it can send and receive data.
The technical requirements to enable a complete, next-generation enterprise platform that brings together cloud computing, big data, AI, and IoT are extensive. They include 10 core requirements:
- Data Aggregation: Ingest, integrate, and normalize any kind of data
- Multi-Cloud Computing: Enable cost-effective, elastic, scale-out compute and storage
- Edge Computing: Enable low-latency local processing and AI predictions and inferences on edge devices
- Platform Services
- Enterprise Semantic Model: Provide a consistent object model across the business in order to simplify and speed application development
- Enterprise Microservices: Provide a comprehensive catalog of AI-based software services
- Enterprise Data Security
- System Simulation Using AI and Dynamic Optimization Algorithms: Enable full application lifecycle support
- Open Platform: Support multiple programming languages
- Common Platform for Collaborative Development: Enable software developers, data scientists, analysts and other team members to work in a common framework
These requirements are uniquely addressed through a “model-driven architecture”. Model-driven architectures define software systems by using platform-independent models.
The Elastic Cloud
The “elastic cloud” gains its name from the ability to rapidly and dynamically expand and contract to satisfy compute and storage resource needs. This elasticity has transformed software deployment models, the costs of IT, and how capital is allocated.
The evolution of the cloud began with the emergence of mainframe computers back in the 1950s. Mainframes subsequently became the bedrock of enterprise computing for several decades.
Application virtualization was first popularized in the mid-1990s with Sun Microsystems’ Java. The Java Runtime Environment (JRE) enabled applications to run on any computer that had JRE installed.
In the early 2000s, VMware transformed application virtualization by introducing software known as a “hypervisor,” which didn’t require a host operating system to run.
Commercialized by major telecom giants in the late 1990s and 2000s, VPNs enabled organizations to conduct business online securely.
In 2006, Amazon Web Services (AWS) introduced Simple Storage Service (S3), Elastic Compute Cloud (EC2), and Simple Queue Service (SQS). This marked the introduction of the public cloud.
Google entered the cloud business with the launch of the Google App Engine in 2008. Microsoft announced Azure later that year and released its first cloud products in stages from 2009 through early 2010. IBM entered the fray with its acquisition of SoftLayer, the progenitor of IBM Cloud, in 2013.
Today, cloud computing critically underpins and helps drive digital transformation.
Five core features of cloud computing make it essential to digital transformation:
- Infinite capacity
- On-demand self-service
- Broad network access
- Resource pooling
- Rapid elasticity
Beyond its core technical features, two aspects of cloud computing — the deployment model (who owns the infrastructure) and the service model (what type of services are provided) — have significant impacts on business operations.
While cost, time, and flexibility advantages are the fundamental reasons to move to the elastic public cloud, there are other important benefits:
- Near-zero maintenance
- Guaranteed availability
- Cyber and physical security
- Latency
- Reliable disaster recovery
- Easier and faster development (DevOps)
- Subscription pricing
- Future-proofing
SaaS allows software producers to rapidly and frequently upgrade products, so customers always have the latest functionality.
Above all, the cloud lets companies focus on their business, not on IT.
Big Data
What’s most different about big data, in the context of today’s digital transformation, is the fact that we can now store and analyze all the data we generate — regardless of its source, format, frequency, or whether it is structured or unstructured.
With regard to big data, incumbent organizations have a major advantage over startups and new entrants from other sectors.
Big data as a term first appeared in an October 1997 paper by NASA researchers Michael Cox and David Ellsworth, published in the Proceedings of the IEEE 8th Conference on Visualization. The authors wrote: “Visualization provides an interesting challenge for computer systems: data sets are generally quite large, taxing the capacities of main memory, local disk, and even remote disk. We call this the problem of big data.”
Five key challenges that organizations face in today’s era of big data:
- Handling a multiplicity of enterprise source systems
- Incorporating and contextualizing high-frequency data
- Working with data lakes (or data swamps)
- Ensuring data consistency, referential integrity, and continuous downstream use
- Enabling new tools and skills for new needs
Successful digital transformation hinges critically on an organization’s ability to extract value from big data.
The AI Renaissance
Logic-based algorithms represent the core of traditional computer science. For decades, computer scientists were trained to think of algorithms as a logical series of steps or processes that can be translated into machine-understandable instructions and effectively used to solve problems.
AI made a brief resurgence in the 1980s, with much of the work focused on helping machines become smarter by feeding them rules.
The concept of “expert systems” evolved and languages like LISP were used to more effectively encode logic.
The field of AI was reinvigorated in the 2000s, driven by three major forces. First was Moore’s Law in action — the rapid improvement of computational power.
Second, the growth of the internet resulted in a vastly increased amount of data that was rapidly available for analysis.
The internet also enabled the ubiquitous availability of compute resources through the emergence of cloud computing.
The most famous of these open-source code repositories is the Apache Software Foundation. At the same time, Python started to emerge as the machine learning programming language of choice — and a significant share of the source code contributions included Python libraries and tools.
AI has greatly evolved from the use of symbolic logic and expert systems (in the ’70s and ’80s), to machine learning systems in the 2000s, and to neural networks and deep learning systems in the 2010s.
There are two main categories of supervised learning techniques. The first is classification techniques. These predict outputs that are specific categories.
The second category is regression techniques. These predict values — such as a forecast of sales over the next week.
Neural networks — and deep neural networks in particular — represent a newer and rapidly growing category of machine learning algorithms. In a neural network, data inputs are fed into the input layer, and the output of the neural network is captured in the output layer. The layers in the middle are hidden “activation” layers that perform various transformations on the data to make inferences about different features of the data.
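A minimal sketch of a forward pass through such a network, in plain Python with illustrative weights (a real network learns these values during training rather than having them written by hand):

```python
def relu(v):
    # Common "activation" function: negative values become zero
    return [max(0.0, x) for x in v]

def layer(inputs, weights, biases):
    # Each output neuron is a weighted sum of all inputs plus a bias
    return [sum(w * x for w, x in zip(row, inputs)) + b
            for row, b in zip(weights, biases)]

# Toy network: 3 inputs -> 4 hidden units -> 1 output (weights are illustrative)
W1 = [[0.2, -0.5, 0.1], [0.4, 0.3, -0.2], [-0.1, 0.6, 0.5], [0.3, -0.3, 0.2]]
b1 = [0.1, 0.0, -0.1, 0.2]
W2 = [[0.5, -0.4, 0.3, 0.6]]
b2 = [0.05]

x = [1.0, 0.5, -0.3]        # input layer
h = relu(layer(x, W1, b1))  # hidden "activation" layer
out = layer(h, W2, b2)      # output layer
print(out)
```

Deep networks simply stack many such hidden layers, each transforming the previous layer’s output before the final output layer produces the prediction.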
- Data Assembly and Preparation – The first step is to identify the required and relevant data sets, and then assemble the data in a unified image that is useful for machine learning.
- Feature Engineering – The next step is feature engineering. This involves going through the data and crafting individual signals that the data scientist and domain experts think will be relevant to the problem being solved.
- Labeling the Outcomes – This step involves labeling the outcomes the model tries to predict (e.g., “engine failure”).
- Setting Up the Training Data – Now comes the process of setting up the data set for training the algorithm.
- Deploying the Algorithm into Production – The machine learning algorithm then must be deployed to operate in a production environment: It needs to receive new data, generate outputs, and have some action or decision be made based on those outputs.
- Closed-Loop Continuous Improvement – Once in production, the performance of the AI algorithm needs to be tracked and managed. Algorithms typically require frequent retraining by data science teams as market conditions change, business objectives and processes evolve, and new data sources are identified.
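The six steps above can be sketched end to end. Everything here (the field names, the threshold “model”) is a hypothetical stand-in for real data science work; the point is the shape of the pipeline, not the model itself.

```python
# 1. Data assembly and preparation: unify records from two illustrative sources
sensors = [{"id": 1, "temp": 210}, {"id": 2, "temp": 140}]
outcomes = {1: True, 2: False}  # historical failure records

# 2. Feature engineering: craft a signal thought to predict failure
features = [{"id": r["id"], "overheat": r["temp"] > 200} for r in sensors]

# 3. Labeling the outcomes the model should learn to predict
labeled = [(f["overheat"], outcomes[f["id"]]) for f in features]

# 4. Training setup: a stand-in for fitting, checking the signal against labels
accuracy = sum(pred == actual for pred, actual in labeled) / len(labeled)

# 5. Deployment: score new data as it arrives in production
def predict(record):
    return record["temp"] > 200

# 6. Closed-loop improvement: monitor live accuracy and retrain when it drifts
print("training accuracy:", accuracy)
print("predicted failure:", predict({"id": 3, "temp": 230}))
```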
In the corporate banking market, banks compete for business based on a number of factors, including product offerings, interest rates, and transaction fees. Banks generate revenue from fees charged on customers’ transactions and from interest earned by lending out the funds in customers’ accounts.
The AI application ingests and unifies data from numerous internal and external sources, including multiple years of historical data at various levels of frequency: customer transactions and account balances; changes in rates paid on cash balances; credit risk; GDP growth; short-term interest rates; money supply; and account-specific corporate action data from SEC filings and other sources. By applying multiple AI algorithms to these data in real time, the application can identify profiles of at-risk customers, predict those who are likely to reduce their balances for preventable reasons, and send prioritized alerts to account managers, enabling them to take proactive action.
There is a significant global shortage of AI talent today. Existing talent is extremely concentrated at a few technology companies like Google, Facebook, Amazon, and Microsoft.
Google acquired AI startup DeepMind Technologies, with just 75 employees, for an estimated $500 million — more than $6 million per employee.
In 2016, China’s information technology ministry estimated the country will need 5 million more AI workers to satisfy its needs.
In addition to data scientists, companies will also increasingly need individuals whom McKinsey calls “translators.” Translators can bridge the divide between AI practitioners and the business.
The Internet of Things
The internet of things (IoT) refers to the ubiquitous sensoring of value chains, so that all devices in the chains become remotely machine-addressable in real or near-real time.
IoT is a fundamental change in the form factor of computing, bringing unprecedented computational power — and the promise of real-time AI — to every manner of device.
The use of RFID tags to track objects in logistics is a well-known early example of IoT, and the technology is commonly used today to track shipments, prevent loss, monitor inventory levels, control entry access, and much more. In fact, industrial uses of IoT took hold first ahead of consumer uses.
Early machine-to-machine (M2M) applications evolved in parallel with IP-based wireless networks.
By 2010, the idea of moving these largely proprietary networks to IP-based Ethernet protocols was seen as an inevitable direction.
IoT was slower to take hold in the consumer products world. Consumer IoT was further sparked in 2011–12, when several successful products like the Nest remote thermostat and the Philips Hue smart lightbulb were introduced. In 2014, IoT hit the mainstream when Google bought Nest for $3.2 billion, the Consumer Electronics Show showcased IoT, and Apple introduced its first smart watch.
The internet of things, along with AI, creates a powerful system that was barely imaginable at the beginning of the 21st century, enabling us to solve problems previously unsolvable.
As the form factor of computing devices continues to evolve, more edge devices are expected to have bidirectional control capabilities.
An IoT platform is the connection between the enterprise and the edge. IoT platforms must be able to aggregate, federate, and normalize large volumes of disparate, real-time operational data.
Today’s state-of-the-art IoT platforms function as application development platforms for enterprises.
The electric power grid, as it existed at the end of the 20th century, was largely as originally designed more than a hundred years earlier by Thomas Edison and George Westinghouse: power generation, power transmission over long distances at high voltage (115 kilovolts or greater), distribution over medium distances at stepped-down voltage (typically 2 to 35 kilovolts), and delivery to electric meters at low voltage (typically 440 volts for commercial or residential consumption).
The smart grid is essentially the power grid transformed by IoT. Smart meters are remotely monitored and commonly read at 15-minute intervals.
When a power grid is fully sensored, we can aggregate, evaluate, and correlate the interactions and relationships of all the data from all the devices, plus weather, load, and generation capacity in near-real time. We can then apply AI machine learning algorithms to those data to optimize grid performance, reduce the cost of operation, increase resiliency and reliability, harden cybersecurity, enable bidirectional power flow, and reduce greenhouse gas emissions.
There are three primary reasons IoT will change the way business is done.
- First, the volume of data that IoT systems can generate is wholly unprecedented.
- Second, the data generated are valuable. As organizations sensor and measure areas of their business, those sensor readings help them make better, more profitable decisions.
- The third reason IoT will transform business is the power of Metcalfe’s Law — i.e., the value of a network is proportional to the square of the number of its members.
As another example, IoT has significant impact on agriculture. A potato farmer in the Netherlands now runs one of the world’s most advanced potato farms because of IoT. Multiple types of sensors on his farm — monitoring things like soil nutrients, moisture levels, sunlight, temperature, and other factors — provide large amounts of valuable data, enabling the farmer to use his land more efficiently than other farms.
IoT will change business in a major way. The question remains how. I argue it will profoundly change three fundamental aspects of business: how we make decisions, how we execute business processes, and how we differentiate products in the marketplace.
IoT changes the relationships we form with physical objects. IoT gives manufacturers unprecedented visibility into how customers use their products.
How much will all this IoT-driven change impact the economy? With the total number of connected devices projected to grow from about 20 billion today to 75 billion by 2025, analysts expect IoT will contribute up to $11.1 trillion in annual global economic value by 2025. That is a staggering amount, equivalent to approximately 11 percent of the global economy, based on the World Bank’s projection of $99.5 trillion in global GDP in 2025.
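The “approximately 11 percent” figure follows directly from those two estimates:

```python
iot_value_2025 = 11.1e12    # projected annual IoT economic value, USD
global_gdp_2025 = 99.5e12   # World Bank global GDP projection for 2025, USD

share = iot_value_2025 / global_gdp_2025
print(f"IoT share of global GDP: {share:.1%}")
```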
From a customer’s or end user’s perspective, IoT’s real value comes from services, IoT analytics, and applications, while the rest of the technology stack serves as an enabler with lower value and growth potential.
Enel, the large utility based in Rome, manages more than 40 million smart meters across Europe. These meters generate an unprecedented amount of data: more than 5 billion readings per day.
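Those figures are consistent with simple arithmetic: at 15-minute intervals, each meter produces 96 readings per day, so a single measured quantity per interval already yields nearly 4 billion readings across 40 million meters; meters reporting more than one quantity per interval push the total past 5 billion.

```python
# Readings per meter per day at 15-minute intervals
readings_per_meter_per_day = 24 * 60 // 15   # 96

meters = 40_000_000
daily_readings = meters * readings_per_meter_per_day
print(f"{daily_readings:,} readings per day")  # 3,840,000,000 readings per day
```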
Enel estimates the application of AI algorithms on all this smart grid data across its entire network will yield more than €600 million in annual economic value.
Unlike traditional ERP-based inventory solutions, the AI application applies advanced stochastic, AI-based optimization on top of the company’s large IoT-based data set.
Many players in the IoT market are positioned to deliver valuable solutions:
- Industrial companies and manufacturers like Siemens, John Deere, and Caterpillar can extend their digital capabilities to create new IoT offerings and enhance their existing products.
- Telecommunication companies like AT&T, Verizon, and Vodafone can leverage their vast networks of communication assets and rich customer data to provide IoT connectivity and value-added services such as IoT-enabled home security.
- Enterprise software giants like SAP, Microsoft, and Oracle are attempting to incorporate capabilities into their platforms to support IoT devices for end customers.
- Internet and tech giants such as Google, Amazon, and Apple — already established in the consumer IoT space with products like Google Home, Amazon Echo, and Apple HomePod — will continue to enhance and expand their IoT offerings.
For IoT technology to deliver on the promise of real-time decision-making, infrastructure developments in the area of 5G wireless communication must first come to fruition.
Regulation and public policy can also either spur development of the IoT market or significantly impede it.
Finally, organizations may have to make changes to the company culture to fully leverage IoT’s power.
Going forward, design of IoT-enabled products will become increasingly iterative, particularly for products with embedded software that requires frequent updates via the cloud.
As companies gather data through IoT devices, they gain new insights into customers and can better customize products to consumer needs.
Sales and marketing teams will need broader knowledge to effectively position offerings as part of these connected systems.
IoT will change the structure of entire industries, blurring the boundaries within industries and shifting bargaining power.
The possibilities now available require companies to answer major strategic questions such as what business they are in.
At the corporate level in multi-business companies, overlay structures are being put in place to evangelize IoT and AI opportunities:
- Stand-Alone Business Unit. This is a separate new unit, with profit and loss responsibility, in charge of executing the company’s strategy to design, launch, market, sell, and service IoT products and services.
- Center of Excellence (CoE). Supplemental to stand-alone business units, the CoE is a separate corporate unit, housing key expertise on smart, connected products. It does not have profit and loss responsibility but is a shared services cost center that other business units can tap.
- Cross-Business-Unit Steering Committee. This approach involves convening a committee of thought leaders across various business units who champion opportunities to share expertise and facilitate collaboration.
As a scientist at Xerox PARC in the 1970s, Bob Metcalfe invented Ethernet — a breakthrough that made it possible to connect previously discrete computers into interactive networks.
Metcalfe’s Law was first presented in 1980. It states that the value of a network is proportional to the square of the number of devices connected to it.
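A quick sketch of the law’s quadratic growth (the proportionality constant `k` is arbitrary, since the law speaks only of proportionality):

```python
def metcalfe_value(n, k=1.0):
    """Network value under Metcalfe's Law: proportional to n squared."""
    return k * n ** 2

# Doubling the number of connected devices quadruples the network's value
for n in (100, 200, 400):
    print(n, metcalfe_value(n))
```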
You can think of digital transformation as a result of the collision and confluence of Moore’s Law and Metcalfe’s Law.
AI in Government
China is particularly active. China’s cyber-espionage division, known as Unit 61398, over 100,000 strong, has successfully completed numerous devastating cyber-war missions directed against the U.S. and others.
The Defense Advanced Research Projects Agency (DARPA) announced plans to spend more than $2 billion on research into so-called “third wave” artificial intelligence capabilities over the next few years. This is a small fraction — less than 10 percent — of the $22 billion China intends to spend on AI in the near term.
Today the U.S. and China are engaged in a war for AI leadership. The outcome of that contest remains uncertain. China clearly is committed to an ambitious and explicitly stated national strategy to become the global AI leader. Unless the U.S. significantly steps up investment in AI across the board — in government, industry, and education — it is at risk of falling behind.
The Digital Enterprise
While failing to adapt is perilous, the future has never looked brighter for large companies embracing digital transformation. There are two primary reasons.
Large corporations stand to benefit from a similar paradigm regarding data. If properly used, the value of enterprise data also increases exponentially with scale.
If incumbent organizations can digitally transform, they will establish “data moats,” which are an asymmetric advantage that could dissuade competitors from easily entering their industries.
The second reason large companies are well positioned to exploit digital transformation is that they typically have access to substantial capital.
ENGIE, the integrated French energy company, is in many ways the archetype of a large enterprise embracing digital transformation.
In 2016, ENGIE CEO Isabelle Kocher recognized two inextricable forces shaking the core of ENGIE’s industry: digital and energy transformations. In ENGIE’s own words, the revolution in the energy industry is being driven by “decarbonization, decentralization, and digitalization.”
ENGIE created ENGIE Digital, a hub for digital transformation efforts across the company. ENGIE Digital includes its Digital Factory, a Center of Excellence (CoE) where the company’s software developers, alongside its partners, incubate and roll out innovative IT tools across the organization.
ENGIE’s Digital Factory has created and prioritized a comprehensive project roadmap. Use cases span the company’s lines of business.
- For its gas assets, ENGIE uses predictive analytics and AI algorithms to perform predictive maintenance and optimize electricity generation.
- In customer management, ENGIE is rolling out an entire suite of online services for customers, including self-service applications that allow them to manage their own energy use.
- In renewables, ENGIE has developed a digital platform of applications to optimize generation of electricity from renewable sources.
- In smart cities, ENGIE plans to develop and deploy a number of applications — including efficient district heating and cooling, traffic control, green mobility, waste management, and security — to create sustainable, energy-efficient connected cities.
Enel, the Italian utility, is the second largest power producer in the world, with over 95 gigawatts of installed capacity, more than 70 million customers globally, €75.7 billion in 2018 revenue, and 69,000 employees.
Let’s dive into two specific use cases from Enel’s digital transformation journey. The first is predictive maintenance of Enel’s 1.2-million-kilometer distribution network in Italy. The second is revenue protection: Enel transformed its approach to identifying and prioritizing electricity theft (“non-technical loss”) to drive a significant increase in the recovery of unbilled energy while improving productivity.
In 2016, Caterpillar’s then-CEO Doug Oberhelman announced, “Today, we’ve got 400,000 connected assets and growing. By this summer, every one of our machines will come off the line being able to be connected and provide some kind of feedback in operational productivity to the owner, to the dealer and to us.”
Leveraging the Enterprise Data Hub, Caterpillar is building a host of applications to enable its digital transformation. It turned first to its inventory: How do you manage a supply network that brings together over 28,000 suppliers shipping to 170 dealers, all with fluctuating demand?
Next, Caterpillar is focused on leveraging telemetry from its entire fleet of connected assets along with data related to ambient operating conditions for each asset.
Caterpillar drives all these operational changes through the establishment of a CoE: a cross-functional team that brings together outside experts and Caterpillar developers for intensive training on how to design, develop, deploy, and maintain applications using AI and predictive analytics.
John Deere is another industrial manufacturer pursuing a digital transformation strategy, starting with its supply chain. John Deere could reduce parts inventory by 25 to 35 percent.
3M sees numerous opportunities to apply AI to drive significant improvements in operational efficiency and productivity across a wide range of business processes with direct bottom-line benefits.
A New Technology Stack
Companies that failed to take advantage of each new generation of technology ceased to be competitive.
In the 1980s, when I was at Oracle Corporation, we introduced relational database management system (RDBMS) software to the market.
It proved an enabling technology for the next generation of enterprise applications that followed, including material requirements planning (MRP), enterprise resource planning (ERP), customer relationship management (CRM), manufacturing automation, and others.
But the primary competitor to Oracle, which went on to become the world’s leading RDBMS provider, was not another software vendor. In many cases it was the customer’s own CIO, attempting to build database software internally. No one succeeded.
When we introduced enterprise application software to the market, including ERP and CRM in the 1990s, the primary software competitors included Oracle, SAP, and Siebel Systems. But in reality, the primary obstacle to adoption was the CIO. Many CIOs believed they had the knowledge, the experience, and the skills to develop these complex enterprise applications internally.
The problems that must be addressed to deliver AI and IoT computing are nontrivial. Massively parallel elastic computing and storage capacity are prerequisites.
- Data Integration: This problem has haunted the computing industry for decades. Prerequisite to machine learning and AI at industrial scale is the availability of a unified, federated image of all the data contained in the multitude of (1) enterprise information systems — ERP, CRM, SCADA, HR, MRP — typically thousands of systems in each large enterprise; (2) sensor IoT networks — SIM chips, smart meters, programmable logic arrays, machine telemetry, bioinformatics; and (3) relevant extraprise data — weather, terrain, satellite imagery, social media, biometrics, trade data, pricing, market data, etc.
- Data Persistence: The data aggregated and processed in these systems includes every type of structured and unstructured data imaginable.
- Platform Services: A myriad of sophisticated platform services are necessary for any enterprise AI or IoT application.
- Analytics Processing: The volumes and velocity of data acquisition in such systems are blinding and the types of data and analytics requirements are highly divergent, requiring a range of analytics processing services.
- Machine Learning Services: The whole point of these systems is to enable data scientists to develop and deploy machine learning models.
- Data Visualization Tools: Any viable AI architecture needs to enable a rich and varied set of data visualization tools.
- Developer Tools and UI Frameworks:
- Open, Extensible, Future-Proof:
Digital transformation requires an entirely new technology stack.
Just like relational databases, just like ERP, and just like CRM, the knee-jerk reaction of many IT organizations is to try to internally develop a general-purpose AI and IoT platform using “free” open-source software with a combination of microservices from cloud providers like AWS and Google. Many have attempted this, but to my knowledge no one has succeeded. The classic case study is GE Digital, which expended eight years, 3,000 programmers, and $7 billion on the task. That effort ended with the collapse of the division and the termination of the CEO, and it contributed to the dissolution of one of the world’s iconic companies.
There are a number of problems with this approach:
- Complexity. Using structured programming, the number of software API connections that one needs to establish, harden, test, and verify for a complex system can approach the order of 10¹³.
- Brittleness. Spaghetti-code applications of this nature are highly dependent upon each and every component working properly.
- Future-Proofing. As new libraries, faster databases, and new machine learning techniques become available, you will want to make those new utilities available within your platform.
- Data Integration. An integrated, federated common object data model is absolutely necessary for this application domain.
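The combinatorial blow-up behind the Complexity point above is easy to quantify: if every pair of components in a system can require a hardened, tested integration, connections grow as n(n−1)/2. The numbers below are illustrative arithmetic, not drawn from any particular system.

```python
# If every pair of n components can require a hardened, tested API
# connection, the count grows quadratically: n * (n - 1) / 2.
def pairwise_connections(n_components: int) -> int:
    return n_components * (n_components - 1) // 2

print(pairwise_connections(10))     # 45
print(pairwise_connections(5000))   # 12497500
```

A few thousand services, systems, and sensors already implies millions of potential integration points, which is why hand-stitched platforms become so hard to establish, test, and verify.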
Structured programming is a technique developed in the mid-1960s to simplify code development, testing, and maintenance.
The essential idea of structured programming was to break the code into a relatively simple “main routine” and then use something called an application programming interface (API) to call subroutines that were designed to be modular and reusable.
While this technique is appropriate for many classes of applications, it breaks down with the complexity and scale of the requirements for a modern AI or IoT application.
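As a deliberately trivial sketch of the structured-programming style described above, consider a main routine that delegates to modular, reusable subroutines through well-defined interfaces. All function names and the stubbed sensor data are invented for illustration.

```python
# Structured programming: a simple "main routine" delegates work to
# modular, reusable subroutines through well-defined interfaces.
# Function names and the stubbed sensor data are illustrative only.

def extract_readings(source: str) -> list:
    """Subroutine: fetch raw readings from a (hypothetical) data source."""
    return [21.0, 22.5, 19.8, 250.0]  # stubbed data; 250.0 is implausible

def clean_readings(readings: list) -> list:
    """Subroutine: drop implausible values."""
    return [r for r in readings if 0.0 < r < 100.0]

def average(readings: list) -> float:
    """Subroutine: compute the mean of the cleaned readings."""
    return sum(readings) / len(readings)

def main() -> float:
    """Main routine: a thin orchestration layer over the subroutines."""
    return average(clean_readings(extract_readings("sensor-feed")))
```

Each subroutine is easy to build and test in isolation; the trouble, as noted above, comes when a system requires thousands of such interfaces wired together.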
An alternative to the open-source cluster is to attempt to assemble the various services and microservices offered by the cloud service providers into a working seamless and cohesive enterprise AI platform.
The advantage of this approach over open source is that these products are developed, tested, and quality assured by highly professional enterprise engineering organizations.
The problem with this approach is that because these systems lack a model-driven architecture like that described in the following section, your programmers still need to employ structured programming to stitch together the various services.
Developed at the beginning of the 21st century, model-driven architecture can be thought of as the knife that cuts the Gordian knot of structured programming for highly complex problems.
Using a model-driven architecture, anything can be represented as a model, even entire applications such as databases, natural language processing engines, and image recognition systems.
The optimal design for an object model to address AI and IoT applications uses abstract models as placeholders to which a programmer can link an appropriate application.
Another important feature of a model-driven architecture is that the application is entirely future-proofed.
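A minimal sketch of the placeholder idea described above, in Python. The class and method names are hypothetical, invented for illustration; the point is only that application code binds to an abstract model, so the concrete implementation can be swapped later without rewriting the application.

```python
from abc import ABC, abstractmethod

class AnomalyModel(ABC):
    """Abstract placeholder in the object model (hypothetical name)."""
    @abstractmethod
    def score(self, reading: float) -> float:
        ...

class ThresholdModel(AnomalyModel):
    """One concrete model a programmer can link to the placeholder."""
    def __init__(self, limit: float) -> None:
        self.limit = limit

    def score(self, reading: float) -> float:
        # Report how far a reading exceeds the configured limit.
        return max(0.0, reading - self.limit)

def evaluate(model: AnomalyModel, reading: float) -> float:
    # Application code depends only on the abstract model, so a newer
    # machine learning technique can replace ThresholdModel later
    # without touching this function -- the "future-proof" property.
    return model.score(reading)

print(evaluate(ThresholdModel(limit=50.0), 62.0))  # 12.0
```

Swapping in a different `AnomalyModel` subclass changes behavior without any change to the calling code, which is the essence of the future-proofing claim above.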
Polyglot cloud support is, in my opinion, an essential capability of the New AI Technology Stack. This capability not only allows application portability from one cloud vendor to another but also affords the capability to run your AI and IoT applications on multiple clouds simultaneously.
The CEO Action Plan
The technologies driving digital transformation are now robust, mature, and widely available. Digital leaders will emerge across industries, harnessing AI and IoT to achieve step-function improvements in their business processes and outcompete slower rivals.
The private equity industry manages $2.5 trillion, with nearly $900 billion in “dry powder” waiting to be deployed. This capital will be used not only in takeover bids but also to fund rapidly growing digital-native competitors.
Digital transformation is the next do-or-die imperative. How CEOs respond will determine whether their companies thrive or perish.
Today, CEOs initiate and mandate digital transformations.
Rather than a sequential, step-by-step process, think of the CEO Action Plan as a set of 10 principles — or key success factors — to guide the transformation initiative.
CEO Action Plan for Digital Transformation: The Opportunity Is Exceeded Only by the Existential Threat
- Marshal the senior CXO team as the digital transformation engine.
- Appoint a Chief Digital Officer with authority and budget.
- Work incrementally to get wins and capture business value.
- Forge a strategic vision in parallel, and get going.
- Draft a digital transformation roadmap and communicate it to stakeholders.
- Pick your partners carefully.
- Focus on economic benefit.
- Create a transformative culture of innovation.
- Reeducate your leadership team.
- Continually reeducate your workforce — invest in self-learning.
Today, CEOs have to keep up with a deluge of information about ever-changing technologies, deciding what is relevant to the business, prioritizing which new technologies to focus on, and filtering out the rest.
The senior CXO team needs to marshal the funding, resources, and relationships necessary to enable digital transformation.
Digital transformation requires adopting a long-term perspective. It requires moving beyond measuring financial performance for the next quarter to thinking about a broader picture of the future and how the enterprise will fit within it.
It also requires a certain personality type. Leaders need to be able to handle risk, be willing to speak out, and bring an experimentation mindset.
The CDO’s primary role is chief evangelist and enabler of digital transformation: the person who owns the transformation strategy and communicates action plans and results across the organization.
Best practice also requires a central organization to act as the hub of digital transformation — i.e., a Center of Excellence.
The CEO and CDO play key roles in forming, supporting, and engaging with the CoE.
The way to capture business value is to work out the use case first, identify the economic benefit, and worry about the IT later.
By adopting a phased delivery model (essentially the agile development model popular in software today), teams can achieve results faster.
Digital transformation strategy should be focused on creating and capturing economic value.
Your value chain map — and your strategy — might initially center on inventory optimization, production optimization, AI predictive maintenance, and customer churn.
Two key elements of developing your strategy are benchmarking and assessing the forces of disruption in your industry.
You will want to benchmark your enterprise’s digital capabilities against those of your peers and best-of-breed exemplars. A benchmarking process might be as follows: (1) audit current approaches to digital transformation in your industry; (2) rank your capabilities against peers; (3) identify best practices from more advanced peers; and (4) develop a roadmap for improving capabilities.
Online benchmarks for measuring an enterprise’s level of digital transformation abound.
Developing your strategy requires an assessment of your industry and the forces of disruption likely to shake it up. A few key measures help to indicate if your industry is particularly vulnerable. The first is operational efficiency. Are incumbents operating with high operating costs and facing pressure to improve efficiency?
Next, think about barriers to entry. Are regulation or capital requirements the sole reason large incumbent players are able to thrive in your industry?
Finally, think about the degree to which your industry depends on fixed assets. In the age of digital transformation, a high dependency on fixed assets can be a potential weakness rather than a strong barrier to entry.
It’s time to draft your enterprise’s own roadmap and set a game plan for communicating it to stakeholders across the organization.
As BCG writes, “This involves building a portfolio of opportunities — identifying and prioritizing functions or units that can benefit most from transformation. It also involves locating and starting to address roadblocks to transformation.”
The best digital transformation roadmaps involve concrete plans and timelines to bring advanced AI applications into production.
Organizational alignment is critical. As a business leader, you need to communicate effectively and sell your vision to stakeholders across the organization.
To fulfill your vision of digital transformation, it’s vital to pick the right partners.
There are four key areas in AI-based transformations where partners can add significant value: strategy, technology, services, and change management.
Management consulting partners can help flesh out your AI strategy: Map out your value chain, uncover strategic opportunities and threats, and identify key AI applications and services you will need to develop in order to unlock economic value.
Software partners can provide the right technology stack to power your digital transformation.
Professional services partners can help build your advanced AI applications and/or augment your staff.
Once you have developed AI applications and services, the key next step is the business transformation and business process change required to capture economic value.
As much as you would like to delegate digital transformation to an outside agency, you can’t do it. This is your job.
Face the facts. Your organization does not have the skills today to succeed at this effort. You can’t just hire consultants to change the DNA of your company.