Digital twins were once a technology of the future. Now companies are lining up to implement them so they can solve real-world problems with virtual simulations. Is it easier said than done?
Futurist Bernard Marr described a digital twin as “an exact digital replica of something in the physical world; digital twins are made possible thanks to Internet of Things sensors that gather data from the physical world and send it to machines to reconstruct.” Unstructured data gathered by IoT sensors has made digital twins possible, and these twins can solve real-world problems in virtual simulations.
SEE: Microsoft Power Platform: What you need to know about it (free PDF) (TechRepublic)
An example Marr offered is the city of Singapore, which does most of its city planning by using a virtual replica of its physical city. In another example, a supermarket in France created a digital twin of a brick-and-mortar store based on data from IoT-enabled shelves and sales systems. The result is that store managers can easily manage inventory and test the effectiveness of different store layouts in digital twin simulations.
Digital twins can be impressive, but it isn’t easy to build one. Each twin is a vast complex of data drawn from IT assets both inside and outside the enterprise. This data is then applied to an operational digital twin model developed by IT and operations specialists.
“The first challenge is knowing what type of digital twin makes sense based on the business problem you’re trying to solve, whether predictive maintenance, energy optimization or simulation,” said Andy Bane, CEO of Element, which builds and supports digital twins. “For example, is a 3D twin appropriate to the use case, or not, and what type of calculations make sense?”
SEE: Risk reduction: Digital twins, big data and their place on the IT roadmap (TechRepublic)
Because a twin is a digital representation of something in the physical world that is updated with live data, the second challenge is data engineering. This involves modeling the various physical systems involved, and defining the relationships between all of the different objects and their associated attributes within these systems. The third challenge is ingesting the data from multiple systems to feed the twin, and the final challenge is making the twin easy to visualize for users.
A good example is a factory twin that is designed to provide a virtual walkthrough of the factory.
“To do this, the digital twin must show all the equipment, how it’s connected together, and how it’s currently operating. The digital twin must also provide a path for users to drill down into the data for analytics,” Bane said. “You can’t create a digital twin without active IT/OT collaboration.
SEE: Google Cloud announces new supply chain twin offering (TechRepublic)
“From a data standpoint, the digital twin will require engineering drawings and detail, sensor data and data from many other systems, none of which were designed to be easily integrated to modern applications like digital twins.”
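The modeling and ingestion steps Bane describes can be sketched in code. The following is a minimal, hypothetical illustration, not any vendor’s actual API: physical assets become objects with static attributes, explicit relationships link them together, and live sensor readings update the twin’s state. All names here (`Asset`, `DigitalTwin`, `ingest_reading`) are illustrative assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class Asset:
    """A physical object represented in the twin."""
    asset_id: str
    attributes: dict = field(default_factory=dict)    # static engineering detail
    state: dict = field(default_factory=dict)         # latest sensor values
    connected_to: list = field(default_factory=list)  # relationships to other assets

class DigitalTwin:
    def __init__(self):
        self.assets = {}

    def add_asset(self, asset):
        self.assets[asset.asset_id] = asset

    def connect(self, a_id, b_id):
        # Record the relationship between two physical objects.
        self.assets[a_id].connected_to.append(b_id)

    def ingest_reading(self, asset_id, sensor, value):
        # Feed live data from an external system (e.g. an IoT sensor) into the twin.
        self.assets[asset_id].state[sensor] = value

# Build a tiny two-asset twin and push one sensor reading into it.
twin = DigitalTwin()
twin.add_asset(Asset("pump-1", attributes={"rated_rpm": 3000}))
twin.add_asset(Asset("tank-1", attributes={"capacity_l": 500}))
twin.connect("pump-1", "tank-1")
twin.ingest_reading("pump-1", "rpm", 2875)

print(twin.assets["pump-1"].state["rpm"])  # 2875
print(twin.assets["pump-1"].connected_to)  # ['tank-1']
```

A production twin would, as Bane notes, pull this data from many systems that were never designed to integrate, which is why the data-engineering and ingestion challenges dominate the work.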
It’s small wonder, then, that although digital twin concepts were first discussed at the University of Michigan more than 20 years ago, as Bane notes, they are only now beginning to take root in industry.
“The rate of [digital twin] innovation didn’t begin inflecting until recently,” Bane said. “Industry analysts like Gartner began shifting their coverage of digital twins from being a trend to keep your eye on to beginning to cover it.”
Several factors are bringing digital twins to the forefront of IT strategic roadmaps.
“The biggest one is the falling cost of technology across the board, whether it is cloud compute, data storage, sensors, robots, drones, etc.,” Bane said. “This drives better connectivity edge-to-cloud and in the plant. With that, there is a growing acceptance of hybrid computing and working with an ecosystem of enabling vendors. All of this makes digital twins easier to build and deploy.”