Digital twins are multiplying as their capabilities and sophistication grow. But realizing their full promise may require integrating systems and data across entire organizational ecosystems.
IMAGINE that you had a perfect digital copy of the physical world: a digital twin. This twin would enable you to collaborate virtually, ingest sensor data and simulate conditions quickly, understand what-if scenarios clearly, predict results more accurately, and output instructions to manipulate the physical world.
Today, companies are using digital twin capabilities in a variety of ways. In the automotive1 and aircraft2 sectors, they are becoming essential tools for optimizing entire manufacturing value chains and innovating new products. In the energy sector, oil field service operators are capturing and analyzing massive amounts of in-hole data that they use to build digital models that guide drilling efforts in real time.3 In health care, cardiovascular researchers are creating highly accurate digital twins of the human heart for clinical diagnoses, education, and training.4 And in a remarkable feat of smart-city management, Singapore uses a detailed virtual model of itself in urban planning, maintenance, and disaster readiness projects.5
Digital twins can simulate any aspect of a physical object or process. They can represent a new product’s engineering drawings and dimensions, or capture all the subcomponents and their corresponding lineage across the broader supply chain, from the design table all the way to the consumer—the “as built” digital twin. They may also take an “as maintained” form—a virtual representation of physical equipment on the production floor. The simulation captures how the equipment operates, how engineers maintain it, or even how the goods this equipment manufactures relate to customers. Digital twins may take many forms, but they all capture and utilize data that represents the physical world.
Recent MarketsandMarkets research suggests that adoption is accelerating: The digital twins market—worth US$3.8 billion in 2019—is projected to reach US$35.8 billion by 2025.6
What accounts for this kind of growth? And why now? After all, digital twin capabilities are not new. Since the early 2000s, pioneering companies have explored ways to use digital models to improve their products and processes.7 While digital twins’ potential was clear even then, many other companies found that the connectivity, computing, data storage, and bandwidth required to process the massive volumes of data involved in creating digital twins were cost-prohibitive.8
The digital twins trend is gaining momentum thanks to rapidly evolving simulation and modeling capabilities, better interoperability and IoT sensors, and greater availability of tools and computing infrastructure. As a result, digital twin capabilities are more accessible to organizations large and small, across industries. IDC projects that by 2022, 40 percent of IoT platform vendors will integrate simulation platforms, systems, and capabilities to create digital twins, with 70 percent of manufacturers using the technology to conduct process simulations and scenario evaluations.9
At the same time, access to larger volumes of data is making it possible to create simulations that are more detailed and dynamic than ever.10 For longtime digital twins users, it is like moving from fuzzy, black-and-white snapshots to colorful, high-definition digital pictures. The more information they add from digital sources, the more vivid—and revealing—the pictures become.
Digital twins began as a tool of choice in the engineer’s toolbox because they can streamline the design process and eliminate many aspects of prototype testing. Using 3D simulations and human-computer interfaces such as augmented reality and virtual reality,11 engineers can determine a product’s specifications, how it will be built and with what materials, and how the design measures up against relevant policies, standards, and regulations. Digital twins also help engineers identify potential manufacturability, quality, and durability issues—all before the designs are finalized. As a result, traditional prototyping accelerates, and products move into production more efficiently and at lower cost.
Beyond design, digital twins are poised to transform the way companies perform predictive maintenance of products and machinery in the field. Sensors embedded in the machines feed performance data into a digital twin in real time, making it possible not only to identify and address malfunctions before they happen but to tailor service and maintenance plans to better meet unique customer needs. Recently, Royal Dutch Shell launched a two-year digital twin initiative to help oil and gas operators manage offshore assets more effectively, increase worker safety, and explore predictive maintenance opportunities.12
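To make that pattern concrete, consider a minimal, hypothetical sketch in which streaming sensor readings are compared against a twin's simulated baseline, and sustained deviations trigger a maintenance alert. The field names, thresholds, and simulate_expected_temp model are illustrative assumptions, not any vendor's API:

```python
from dataclasses import dataclass
from collections import deque

@dataclass
class SensorReading:
    machine_id: str
    temp_c: float  # measured bearing temperature
    rpm: float

def simulate_expected_temp(rpm: float) -> float:
    """Stand-in for the twin's physics model: expected temperature at a given load."""
    return 40.0 + 0.005 * rpm  # illustrative linear approximation

class PredictiveMaintenanceMonitor:
    """Flags a machine when measured values drift from the twin's prediction."""
    def __init__(self, window: int = 10, tolerance_c: float = 8.0):
        self.residuals = deque(maxlen=window)
        self.tolerance_c = tolerance_c

    def ingest(self, reading: SensorReading) -> bool:
        residual = reading.temp_c - simulate_expected_temp(reading.rpm)
        self.residuals.append(residual)
        # Alert only on a sustained deviation, not a single noisy sample.
        window_full = len(self.residuals) == self.residuals.maxlen
        drifting = sum(self.residuals) / len(self.residuals) > self.tolerance_c
        return window_full and drifting

monitor = PredictiveMaintenanceMonitor()
for temp in [52, 55, 58, 61, 64, 67, 70, 73, 76, 79]:
    alert = monitor.ingest(SensorReading("press-07", temp_c=temp, rpm=1800))
print("schedule maintenance:", alert)
```

In practice, the baseline would come from the twin's full simulation rather than a one-line formula, but the shape of the loop is the same: measure, compare, and act before the failure occurs.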
Digital twins can help optimize supply chains, distribution and fulfillment operations, and even the individual performance of the workers involved in each. As an example of this in action, global consumer products manufacturer Unilever has launched a digital twin project that aims to create virtual models of dozens of its factories. At each location, IoT sensors embedded in factory machines feed performance data into AI and machine learning applications for analysis. The analyzed operational information is to be fed into the digital twin simulations, which can identify opportunities for workers to perform predictive maintenance, optimize output, and limit waste from substandard products.13
Smart city initiatives are also using digital twins for applications addressing traffic congestion remediation, urban planning, and much more. Singapore’s ambitious Virtual Singapore initiative enables everything from planning for cell towers and solar cells to simulating traffic patterns and foot traffic. One potential use may be to enable emergency evacuation planning and routing during the city’s annual street closures for Formula 1 racing.14
Over the course of the last decade, deployment of digital twin capabilities has accelerated. But organizations planning deployments of their own should weigh several data and instrumentation considerations:
The AI and machine learning algorithms that power digital twins require large volumes of data, and in many cases, data from the sensors on the production floor may have been corrupted, lost, or simply not collected consistently in the first place. So teams should begin collecting data now, particularly in areas with the largest number of issues and the highest outage costs. Taking steps to develop the necessary infrastructure and data management approach now can help shorten your time to benefit.
Even in cases where digital twin simulations are being created for new processes, systems, and devices, it’s not always possible to perfectly instrument the process. For chemical and biological reactions or extreme conditions, it may not be possible to directly measure the process itself; in some cases, it may not be cost-effective or practical to instrument the physical objects. As a result, organizations need to look to proxies (for example, relying on the instrumentation and sensors in a vehicle rather than putting sensors into tires) or things that are possible to detect (for example, heat or light coming from chemical or biological reactions).
And with the cost of sensors dropping, how many sensors are enough? Weighing costs against benefits is critical. Modern aircraft engines can have thousands or tens of thousands of sensors, generating terabytes of data every second. Combined with digital twins, machine learning, and predictive models, manufacturers are providing recommendations that help pilots optimize fuel consumption, make maintenance proactive, and help fleets manage costs.15 Most use cases, however, require only a modest number of strategically placed sensors to detect key inputs, outputs, and stages within the process.
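As a back-of-the-envelope illustration of that trade-off, with every figure hypothetical, an added sensor pays for itself when the outage costs it helps avoid exceed its lifetime hardware and data costs:

```python
# Hypothetical figures for illustration only.
sensor_cost = 120.0         # hardware + installation, per sensor
data_cost_per_year = 30.0   # storage and processing, per sensor
lifetime_years = 5
outage_cost = 50_000.0      # cost of one unplanned outage
detection_gain = 0.02       # added probability of catching a failure early, per year

lifetime_cost = sensor_cost + data_cost_per_year * lifetime_years
expected_benefit = outage_cost * detection_gain * lifetime_years

print(f"lifetime cost:    ${lifetime_cost:,.0f}")     # $270
print(f"expected benefit: ${expected_benefit:,.0f}")  # $5,000
print("worth instrumenting:", expected_benefit > lifetime_cost)
```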
In the coming years, we expect to see digital twins deployed broadly across industries for multiple use cases. For logistics, manufacturing, and supply chains, digital twins combined with machine learning and advanced network connectivity such as 5G will increasingly track, monitor, route, and optimize the flow of goods throughout factories and around the world. Real-time visibility into locations and conditions (temperature, humidity, etc.) will be taken for granted. And without human intervention, the “control towers” will be able to take corrective actions by directing inventory transfers, adjusting process steps on an assembly line, or rerouting containers.
Organizations making the transition from selling products to selling bundled products and services, or selling as-a-service, are pioneering new digital twin use cases. Connecting a digital twin to embedded sensors and using it for financial analysis enables organizations to refine projections, optimize pricing, and identify upsell opportunities.
For example, companies could monitor for higher wear-and-tear usage and offer additional warranty or maintenance options. Or organizations could sell output or throughput as-a-service in industries as varied as farming, transportation, and smart buildings. As capabilities and sophistication grow, expect to see more companies seeking new monetization strategies for products and services, modeled on digital twins.
As the digital twins trend accelerates in the coming years, more organizations may explore opportunities to use digital twins to optimize processes, make data-driven decisions in real time, and design new products, services, and business models. Sectors with capital-intensive assets and processes, such as manufacturing, utilities, and energy, are already pioneering digital twin use cases. Others will follow as early adopters demonstrate first-mover advantage in their respective sectors.
Longer term, realizing digital twins’ full promise may require integrating systems and data across entire ecosystems. Creating a digital simulation of the complete customer life cycle, or of a supply chain that includes not only first-tier suppliers but also their suppliers, may provide an insight-rich macro view of operations, but it would also require incorporating external entities into internal digital ecosystems. Today, few organizations seem comfortable with external integration beyond point-to-point connections. Overcoming this hesitation could be an ongoing challenge but, ultimately, one worth the effort. In the future, expect to see companies use blockchain to break down information silos, then validate and feed that information into digital twin simulations. This could free up previously inaccessible data in volumes sufficient to make simulations more detailed, dynamic, and potentially more valuable than ever.
It’s time to transition your digital organization from black-and-white to color. Are you ready?
Airservices Australia is preparing for the aviation industry’s next evolution. As the continent’s provider of air navigation services, it expects the volume of conventional flights in its airspace to double over the next two decades. Meanwhile, the emergence of unmanned aerial vehicles in low-altitude airspace—from aerial taxis to delivery drones—is accelerating the need for new intelligent systems, compounding an already difficult job.
Airservices is addressing these challenges by launching initiatives that will enable it to shift to leveraging the value of data and providing the information management services of the future. One of these initiatives is to explore how a digital twin, combined with IoT and machine learning capabilities, could enhance Airservices’ ability to manage air traffic today and in the years to come.
The Service Strategy team, led by Mick Snell,16 kicked off its digital twin development project in early 2019 with a practical objective: determine whether a digital twin can enhance Airservices’ ability to manage its current air traffic network. For example, could it be used to enhance flight routes, optimize takeoff times, and reduce delays?
The team began by developing a digital twin of Airservices’ air traffic network using historic air traffic data. It has completed four proofs of concept that validated the original objective and is looking forward to piloting them in parallel with existing air traffic control systems. The proofs of concept were able to optimize flight routes based on real-time conditions to provide better traffic flow management.
While still in development, the digital twin project is also serving as a proving ground for enhancing Airservices’ traditional ways of working. The company’s heritage is built on safely delivering navigation services 24 hours a day, 365 days a year. Even with an unwavering focus on safe, efficient, and reliable service delivery, increasing airspace complexity is driving Airservices to explore new solutions.
The digital twin project is helping change Airservices’ view of what’s possible. The team piloted an Agile development approach to improve time to market while preserving the focus on safety. The teams are delivering working software at a faster pace—iterating, testing, and learning in short sprints—and continuing to provide safe, accurate predictions. And while Airservices people have deep aeronautical expertise, the company also needed specialized technical knowledge to build and implement advanced analytic capabilities. The team filled that gap with vendors and advisers who offer highly relevant experience and off-the-shelf technology.
Meanwhile, the team continues to uncover relevant use cases for the digital twin. For example, air traffic controllers currently work in an assigned airspace regardless of traffic volume. To optimize the controllers’ workload, the team plans to use the digital twin to assign airspace to controllers based on predicted customer demand rather than fixed geographic locations.
Optimization is an extraordinarily complex problem that requires volumes of real-time data to support what-if scenarios on the fly, helping air traffic controllers make faster, smarter decisions. The digital twin can also enable Airservices customers (pilots) to optimize flights based on what’s most important in the moment. For example, optimizing airspace and routing helps increase on-time arrivals and saves fuel, but a pilot may decide to trade fuel for additional speed to avoid passengers missing their connections.
Eventually, Airservices plans to use digital twins to develop and test strategies for dealing with disruptive innovations likely to affect its airspace. Strategists will be able to quickly test a wide range of scenarios for managing the multidimensional airspace of the future.
With the proof-of-concept phase complete, the team is moving into preproduction. Members will be running trials with current data for several more months and then move into full-scale production, planned for 2020. Snell reports, “We’ve been able to accelerate to an outcome far faster—we’ve come further in the last eight months than in the last eight years.”
Bridgestone, the world’s largest tire and rubber manufacturer, is transforming itself into a leader in mobility solutions. The company is reimagining its core business by developing digital capabilities that will enable it to add revolutionary tire management services to its portfolio of offerings for vehicle manufacturers, fleet operators, and individual drivers.
Digital twin technology is at the heart of Bridgestone’s transformational journey. The company has used digital twin simulations augmented by sensor data as an R&D tool for several years to improve tire life and performance, but that’s just the beginning. Jerome Boulet, digital strategy director, and Hans Dorfi, director of digital engineering,17 together with their teams, are developing sophisticated digital twins to eventually deliver insights across Bridgestone’s entire value chain, with the goal of enhancing profitability, sustaining competitive advantage, reducing time-to-market, and delivering leading-edge tire-as-a-service offerings.
European fleets are gradually shifting to a price-per-kilometer (PPK) subscription model, a way for fleet operators to optimize cash flow and reduce total cost of ownership. But while the business model is simple, setting the appropriate price per kilometer is anything but. A tire’s lifespan is heavily influenced by a myriad of factors, including load, speed, road conditions, and driving behavior. A digital twin can provide insight into how these interrelated conditions affect tire performance by simulating various driving conditions. But without real-world data inputs for the digital twin, setting a price that hits the sweet spot at which the PPK is competitive—and sustainably profitable—is difficult if not impossible.
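A stylized calculation, with invented figures, shows why the twin's lifespan estimates drive the price: the price per kilometer must recover tire and service costs over the kilometers the simulation predicts the tire will last, and harsher simulated duty cycles shrink that denominator:

```python
# All figures hypothetical; a digital twin would supply the lifespan estimates.
tire_cost = 450.0    # euros per tire
service_cost = 90.0  # mounting, inspections, retreading share
margin = 0.12        # target margin

# Twin-simulated lifespan (km) under different duty cycles.
simulated_lifespan_km = {
    "long-haul, mild climate": 220_000,
    "regional, mixed roads": 160_000,
    "urban, heavy stop-and-go": 110_000,
}

for scenario, km in simulated_lifespan_km.items():
    ppk = (tire_cost + service_cost) * (1 + margin) / km
    print(f"{scenario:28s} -> {ppk * 100:.3f} euro cents/km")
```

Without real-world data to calibrate those lifespan estimates, the computed price can easily be uncompetitive or unprofitable, which is exactly the gap Bridgestone set out to close.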
Bridgestone took a strategic leap by entering the PPK market with a product priced to win business from large fleets. The company used this initial install base to collect performance data that was then fed into advanced analytics algorithms.
According to Dorfi, “Some people ask, ‘Why do you need a digital twin if you have big data—why not just run analytics?’ I explain that while analytics plays a major role, it only augments the digital twin. The digital twin is able to capture the multidimensional performance envelope of tires and can also be applied to product in development, where no data is yet available.” He sees the digital twin as a key component of Bridgestone’s digital infrastructure. Incoming sensor data is augmented, cleaned, and processed; then digital simulations and analytics are applied to derive insights that inform decisions around maintenance, rotations, and other factors that can deliver more value for Bridgestone and its customers.
Bridgestone continues to enhance the digital twins. The 2019 acquisition of WebFleet Solutions18 and the development of next-generation sensors will enable Bridgestone to learn how vehicles and tires are being used in real time, enabling the company to help fleets select the appropriate tires for their specific driving conditions and provide customized insights into how they can reduce tire wear or avoid breakdowns. As the digital model becomes more and more accurate, Bridgestone will address increasingly advanced use cases for its PPK business model.
Today, Bridgestone is using digital technologies to add more value for its fleet customers. Over time, the company intends to expand its use of digital twin technology to connect its entire value chain, from drivers and fleet managers to retailers, distributors, and manufacturers. Looking ahead, leaders see opportunities to inform safety protocols in a world that includes self-driving vehicles. “We’re making sure we have the enablers in place that will take us into the future,” Dorfi says. “And that’s where digital twin technology comes in.”
Takeda Pharmaceuticals is constantly seeking scientific breakthroughs to deliver transformative therapies to patients worldwide. Christoph Pistek19 leads innovation during the company’s development life cycle, translating promising research ideas into tangible medical products. His team also develops processes for how commercial manufacturing partners will actually make the products.
Because the industry is tightly regulated with strict quality control mandates, any process innovation must be thoroughly tested in the development lab for compliance before being introduced to the manufacturing floor. It can take up to 15 years to bring a new medicine to patients, so Pistek is always looking for ways to accelerate experimentation and business processes.
Even in the digital age, pharmaceutical manufacturing processes may contain manual steps. For example, making biologics, vaccines, and other pharma products derived from living organisms involves biochemical reactions, which can be variable and difficult to measure, making automation challenging. And no one has yet perfected a method for automatically progressing from one manufacturing step to the next. True end-to-end manufacturing automation has become the industry’s “holy grail,” Pistek says.
This is where digital twins come in. They help his team accelerate experimentation, develop new manufacturing approaches, and generate data to enable more informed decisions and predictions that could help automate complex chemical and biochemical processes.
To that end, Pistek and his development team build sophisticated virtual representations of the manufacturing processes in their development labs. The team builds a digital twin for each process step and then links all parts via an overall digital twin that controls and automates the flow from one step to another, forming an end-to-end simulation of the manufacturing process.
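One way to picture this architecture of individual step twins linked by an orchestrating process twin is the simplified sketch below; the step names, completion checks, and timings are invented for illustration:

```python
from typing import Callable

class StepTwin:
    """Digital twin of one manufacturing step; decides when its step is complete."""
    def __init__(self, name: str, is_complete: Callable[[dict], bool]):
        self.name = name
        self.is_complete = is_complete

    def run(self, batch: dict) -> dict:
        while not self.is_complete(batch):
            batch["elapsed_h"] += 1  # stand-in for waiting on live sensor data
        batch["history"].append(self.name)
        return batch

class ProcessTwin:
    """Overall twin: advances the batch from one step twin to the next."""
    def __init__(self, steps: list[StepTwin]):
        self.steps = steps

    def run(self, batch: dict) -> dict:
        for step in self.steps:
            batch = step.run(batch)
        return batch

pipeline = ProcessTwin([
    StepTwin("fermentation", lambda b: b["elapsed_h"] >= 48),
    StepTwin("purification", lambda b: b["elapsed_h"] >= 60),
    StepTwin("fill-finish",  lambda b: b["elapsed_h"] >= 64),
])
result = pipeline.run({"elapsed_h": 0, "history": []})
print(result["history"])  # ['fermentation', 'purification', 'fill-finish']
```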
While modeling chemical processes is complicated, modeling biochemical reactions can be far more complex and irregular. In many cases, real-time sensors cannot monitor the desired outputs, and the output quality remains unknown for hours or days. Instead, the development team uses “soft sensors” or proxy measurements to attempt to predict the time required to complete the biochemical reaction, which is fed into a digital twin that incorporates AI and machine learning. “The important aspect is that the architecture of digital twins allows the system to evolve on its own,” Pistek says. “Every time we do an additional run and compare the soft sensor results against a true measurement that comes back from the quality control lab, we’re able to make the predictions more accurate.”
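The loop Pistek describes, predicting from proxies and then correcting against lab ground truth, is essentially online model refinement. Here is a minimal sketch assuming a simple linear soft-sensor model; the features, coefficients, and learning rate are invented:

```python
# Minimal online-learning sketch of a "soft sensor": predict the reaction's
# completion time from indirect measurements, then correct the model whenever
# the quality control lab returns a ground-truth value.
weights = [0.5, 0.5]   # illustrative coefficients for the two proxy measurements
bias = 10.0
LEARNING_RATE = 0.0005

def predict_hours(ph_drift: float, turbidity: float) -> float:
    """Soft-sensor estimate of hours until the reaction completes."""
    return weights[0] * ph_drift + weights[1] * turbidity + bias

def update(ph_drift: float, turbidity: float, lab_hours: float) -> None:
    """Nudge the model toward the lab's ground-truth measurement."""
    global bias
    error = predict_hours(ph_drift, turbidity) - lab_hours
    weights[0] -= LEARNING_RATE * error * ph_drift
    weights[1] -= LEARNING_RATE * error * turbidity
    bias -= LEARNING_RATE * error

# Each run yields proxy readings and, hours or days later, a lab-confirmed duration.
runs = [(4.0, 12.0, 30.0), (3.5, 10.0, 26.0), (4.2, 13.0, 31.5)]
for ph, turb, lab in runs:
    print(f"predicted {predict_hours(ph, turb):5.2f} h, lab measured {lab} h")
    update(ph, turb, lab)
```

Takeda's production twins are far richer than a two-feature linear model, but the architectural point survives the simplification: every lab result makes the next prediction better.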
Some pharma companies think the key to automation is a matter of better equipment, sensors, or technology. But Pistek has a different opinion: “The true enabler for pharma is the control architecture across and around the process—and the foundation of that is a sophisticated digital twin that can mature itself over time while still in development.” The end goal is a digital twin that can control and steer the automation process without human intervention.
In Takeda’s development labs, the ecosystem for this integrated approach is up and running for one modality: biologics, which is the company’s fastest-growing category and involves one of pharma’s most complex manufacturing processes. The foundational work is complete—the twins are operational, the architecture is built, and the method is in place.
Now the team is refining the process to make it more robust. Pistek expects to expand this automation approach within the development lab across all modalities in the next year. And in two to three years, he expects to see sophisticated examples of this automation approach in use on the commercial manufacturing floor.
Modeling biological and chemical reactions in a digital twin is not straightforward, and the underlying processes are difficult to re-create. Pistek’s advice to others considering building digital twins: “Don’t wait, don’t be intimidated, just do it. It’s a learning process that takes time. At Takeda, it’s a critical capability for the job we have to do—find cures for diseases and provide aid to those who suffer.”
People relate to the frustration that traffic congestion creates—and are dissatisfied that it often takes decades to build infrastructure improvements. Our mission is to plan and develop transportation systems that accommodate San Diego’s growing population and healthy economy, while meeting government requirements for improving traffic flow, air quality, and greenhouse gas emissions. And of course, we are working with our various communities to build public support for our anticipated recommendations. Anything we can do to get projects underway quickly can shave months or even years off the timeline.
These are the reasons SANDAG planners and data modelers are developing a nimble digital twin—or “sketch planning” tool—based on FutureScape™, a modeling and simulation platform that creates digital replicas of large systems, like those in a city or an entire region. We’re using FutureScape to complement our government-mandated travel demand model, a macro simulation tool we refer to as an Activity-Based Model.
Regulators require that we run our proposals through the model to certify that they meet federal and state government criteria. It’s a deliberate, arduous process that requires months of calibration and testing, with processing runs that can take weeks to complete. The new sketch tool will enable us to quickly evaluate a wider range of traditional and innovative transportation options; the Activity-Based Model will then process the most promising solutions to certify that they meet regulatory requirements.
For example, one of our goals is to relieve rush-hour congestion between San Diego’s most populated residential areas and the region’s largest employment center. Widening roads is the traditional go-to solution in our car-oriented region, but we believe the sketch tool will enable us to compare road-widening with other options, such as fast rail lines or light rail systems. Results from the tool should arrive within hours or days, not weeks.
Clearing the regulatory bar is only one factor, of course. Transportation planners must also wonder, “If we build it, will they come?” Evaluating different scenarios is key to answering that question. Using the Activity-Based Model’s historical data, which is largely based on dated commuter surveys and travel diaries, limits our ability to be dynamic and current in how we measure future utilization and demand. We are working to incorporate near-time digital data and, eventually, artificial intelligence into the sketch tool to help us better reflect behavior in response to new transportation options. We also want to consider proposals that include on-demand transportation options and new trends in mobility, such as ridesharing, electric scooters, and bikes, with an eye to incorporating driverless vehicles when they become a viable option.
We also use a digital twin to support real-time traffic management. Here, I envision that adding AI enhancements to the tool will enable proactive decision-making for reducing day-to-day traffic congestion. The current system works well for reacting to traffic backups, using a microsimulation tool that evaluates current traffic flows every three minutes. When an incident disrupts normal traffic patterns, the tool can generate a set of solutions, such as temporarily diverting traffic to another road, which is deployed through changeable highway messages. We’re developing an AI-based strategy aimed at sensing potential traffic disruptions in real time. When you’re directing tens of thousands of rush-hour commuters, minutes matter.
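A toy version of that reactive cycle, which evaluates flows on a fixed cadence and posts a diversion when speeds collapse, might look like the following; the road segments, thresholds, and responses are stand-ins rather than SANDAG's actual microsimulation logic:

```python
import random

FREE_FLOW_MPH = 65.0
INCIDENT_THRESHOLD = 0.6  # flag when speed falls below 60% of free flow

def measure_speed(segment: str) -> float:
    """Stand-in for the sensor feeds driving the traffic twin."""
    return random.uniform(20.0, 65.0)

def choose_response(segment: str, speed: float) -> str:
    """Stand-in for the microsimulation's evaluated diversion options."""
    if speed < FREE_FLOW_MPH * INCIDENT_THRESHOLD:
        return f"divert via frontage road; post message boards near {segment}"
    return "no action"

# One three-minute evaluation cycle across monitored segments.
for segment in ["I-5 N @ SR-52", "I-805 S @ I-8", "SR-163 N @ Friars Rd"]:
    speed = measure_speed(segment)
    print(f"{segment}: {speed:4.1f} mph -> {choose_response(segment, speed)}")
```

The AI enhancement we envision would move the trigger earlier in this loop: rather than waiting for speeds to drop, the twin would anticipate the disruption and stage the response in advance.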
By enabling fast, interactive feedback, our sketch planning digital twin will help us quickly develop innovative solutions to complex transportation problems. At SANDAG, we see data-based tools such as FutureScape playing key roles in helping us offer appealing—and environmentally beneficial—mass-transit options to many of our residents accustomed to our car-oriented culture.
While a complete digital twin of the human body is years or even decades away, researchers are chipping away at understanding the biological processes that transform us from DNA into human beings. Today’s research is enabled by advances in genetic sequencing and functional genomics, growing volumes of long-term health data of populations, and increasing capabilities in advanced analytics. This growing knowledge base will inform digital simulations that could eventually help medical professionals control or prevent genetic diseases and disorders.
The project is daunting. Within the human body, DNA provides the instructions for cell growth, which are “expressed” within individual cells to create hundreds of different cell types, including blood cells, nerve cells, muscle cells, and immune cells. Different types of cells combine to produce tissues, which are combined to form organs; for example, there may be more than 10 different types of cells in the tissues that comprise the liver.
As a first step toward creating better virtual models of biological systems, we are working to understand the “instructions” that influence a cell’s development into tissues and organs and, eventually, entire systems, such as the circulatory system. Our research builds on the development of single-cell genomics. Until recently, scientists were able to study only groups of cells, because they lacked the technical capability to extract enough DNA and RNA from a single cell to support genomic analyses. We’re taking single-cell genomics findings to the next level, to understand how single cells construct the gene regulatory systems that underlie the different cell types in tissues and organs.
In Professor Wong’s lab in California, we’re studying the regulation of gene expression in cells, trying to understand how different genes are expressed and how those genes affect the cells they eventually create. Using advanced mathematical models, we are studying huge volumes of data to try to better understand how cells develop into tissues.
After cells and tissues, the next level is organs. In Professor Zhang’s lab in Beijing, we are studying the heart to understand what types and subtypes of cells make up different parts of that organ. With a deeper knowledge of how the heart is constructed, we anticipate having a better understanding of how heart problems arise. By comparing what we’re seeing in the lab with the heart conditions we see in the broad population, we expect to be able to better predict what conditions lead to which health outcomes—positive or negative.
We intend to expand beyond studying specific tissues and organs to construct a digital simulation of the human circulatory system. We’re developing a framework to take in massive amounts of data generated by electronic health records and large-scale research mapping efforts, such as the Human Cell Atlas project.22 But data sets alone are not very useful, so we’re building a type of digital twin: a multilevel causal network, a complex mathematical model to represent the functioning system and the underlying linkages between the different layers. One day, we hope to be able to connect all the data from the DNA in the genome to health outcomes in the general population to better understand how cell instructions, cell types, tissues, organs, and health outcomes are all interconnected.
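As a purely illustrative data structure, a multilevel causal network can be pictured as a directed graph whose edges point from one biological layer to the next. The layer names follow the description above; the nodes, edges, and weights are invented:

```python
# Toy multilevel causal network: nodes live in ordered biological layers,
# and directed edges only point from one layer to the next.
# All nodes, edges, and weights here are invented for illustration.
LAYERS = ["genome_variant", "gene_expression", "cell_type", "tissue", "outcome"]

edges = {
    ("variant_rs123", "gene_A"): 0.7,
    ("gene_A", "cardiomyocyte"): 0.5,
    ("cardiomyocyte", "ventricle_wall"): 0.8,
    ("ventricle_wall", "arrhythmia_risk"): 0.4,
}

def path_strength(path: list[str]) -> float:
    """Multiply edge weights along a layer-by-layer path (a crude proxy
    for how strongly a genomic variant propagates to a health outcome)."""
    strength = 1.0
    for src, dst in zip(path, path[1:]):
        strength *= edges[(src, dst)]
    return strength

path = ["variant_rs123", "gene_A", "cardiomyocyte",
        "ventricle_wall", "arrhythmia_risk"]
print(f"propagated effect of {path[0]} on {path[-1]}: {path_strength(path):.3f}")
```

The real model is vastly larger and statistically inferred rather than hand-specified, but the layered-graph shape is what lets it connect DNA-level data to population-level outcomes.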
Within the next three years, our goal is to build out a set of quantitative, layer-by-layer models to help interpret the genomic system. We expect the day will come when physicians will examine a newborn’s genome sequence and understand the impact of its variants (that is, its differences from the reference genome) along with other factors, leading to insights for resolving or preventing disease or disorders. Over time, researchers may use these findings to create a digital twin of the entire human body to help us better understand and simulate how disease and other changes may manifest in the body. Meanwhile, we, along with researchers around the world, have a lot of work ahead.
While digital twin technologies that simulate the physical world have been around for years, new advances warrant taking a second look at current capabilities. The combination of cheap sensors and IoT, machine learning, and the fast, frictionless nature of cloud enables more sophisticated analyses and real-time simulations. While manufacturing scenarios have used these capabilities for years, organizations are increasingly exploring ways to deploy digital twins for operations, city planning, smart infrastructure, and more. Moreover, as companies look to shift toward as-a-service business models, digital twins’ increasingly sophisticated capabilities are worth a closer look. The challenging decisions will then be whether to make small investments in tests and experiments or larger investments to support innovation more broadly.
Digital twins offer increasing potential to affect the bottom line of organizations but aren’t consistently well understood by CFOs and their teams. To many in the finance function, traditional digital twin simulations of manufacturing processes and warehouse logistics are black boxes owned by manufacturing or engineering. However, the growing availability of high-quality simulations, machine learning, and embedded sensors is changing the art of the possible. Some organizations shifting from selling products to products-plus-services or as-a-service models are using robust digital twins to track usage with embedded sensors and to create new offerings around usage recommendations, proactive maintenance, and profitability optimization. Working with the IT function to understand digital twins’ uses today and potential uses of tomorrow is becoming increasingly important, particularly to support new product and service design and delivery.
As digital twin technology integrates with IoT and AI, its disruptive power grows. In the current business climate, any potential technology-driven disruption has material risk implications for the entire organization. Digital twin–driven process efficiencies might not increase risk significantly, at least initially. But as reliance on digital twin technology grows, companies will be aggregating massive stores of data from sensor networks and other sources, which may, in turn, increase privacy or cyber risk. Likewise, if digital twin systems enable a new business model featuring several as-a-service offerings, organizations should understand what material impact these new revenue streams may have on finance, technology, and existing business models. If the potential risks are significant, companies will likely need to develop strategies for measuring and managing them before IT and the business proceed any further with the digital twin project.
In the future, everyone and everything—people, services, global enterprises, and even cities—could have a digital twin. That scale may not happen in the next 18 to 24 months, but the digital twins trend will evolve and grow for years to come. Pilots and prototypes can help identify potential areas where companies can benefit from digital twin capabilities, but the time to embrace this next disruptive transformation phase is now.