From smart buildings and jet engines to warships and even entire factories, the ability to create a real-time digital representation of an asset – a so-called digital twin – is already revolutionising many areas of technology.
Earlier this year, in partnership with the engineering giant Babcock International Group, The Engineer brought together a panel of experts to explore this concept in more detail, consider examples of the digital twin in action and discuss some of the challenges and benefits of implementing and maintaining a digital twin.
The following report examines some of the key topics explored during this discussion. You can also view a recording of the entire discussion here.
Meet the panel
- Dr Jon Hall – Chief Innovation and Technology Officer, Babcock International Group
- Steve Penver – Head of Data & Analytics, Babcock International Group
- Prof John Erkoyuncu – School of Aerospace, Transport and Manufacturing, Cranfield University
- Dr Nick Wright – Head of Manufacturing Industries, Digital Catapult
- Dr Chris Wallace – Knowledge Exchange Fellow, University of Strathclyde
- Alistair Donaldson – Transformation Executive – Head of Innovation and New Product Design, Rolls-Royce plc
Setting the scene
Opening the session, Babcock technology chief Dr Jon Hall explained that digital twins are of growing importance to Babcock, and key to optimising the performance and availability of the range of complex, expensive assets it manages. “Digital twins and digital technologies have implications across all the areas Babcock works in,” he said. “It affects how we collaborate with customers and can be used to transform the way in which big complex assets are built and supported.”
Hall added that whilst the deployment of digital processes is increasingly baked in at the start of new projects, digital twinning is also key to extracting improved performance from existing assets. “There’s so much more we can do to exploit existing capability,” he said. “It’s not just about applying these tools and techniques to shiny new assets but also legacy assets.”
For Hall, the key benefit of the approach is “clarity”: the way in which it enables all stakeholders – from members of the supply chain through to end customers – to access a shared view and make informed, dynamic decisions. “There’s a huge opportunity that comes from clarity,” he said. “Whether it’s a service delivery model (for example, looking after aircraft carriers or a fleet of jet aircraft) or in a build program, pulling together a shared view of the state of an asset and the enterprise that surrounds it is so powerful.”
Turning to some of the key challenges of introducing and deploying digital twins, Hall said that it’s really about balancing an appetite for new technology with an appreciation that if you’re going to truly tap into its benefits you need to work hard to ensure that stakeholders share a common platform and approach. Babcock supports such a broad range of assets that ensuring a common approach and implementing digital twins at scale and pace is a major challenge, he said.
Applying the digital twin
Following Hall’s scene-setting, Babcock’s Head of Data & Analytics, Steve Penver, drilled down into more detail on exactly how the organisation goes about applying digital twins.
“The starting position is to create a digital backbone and create that common technology enablement level that gives us connectivity to the assets, interoperability to the customers and the supply chain,” he said. “This is absolutely key to bringing a digital twin to life.”
The next step, he explained, is to establish a clear understanding of the performance questions that need to be answered. “That drives us to what is the data that informs those decisions, and how can we digitise the asset to ensure that we’re collecting the right data.”
Developing a digital twin of an asset like a warship is clearly a complex process, and to address this challenge Babcock takes a system-of-systems approach.
“We model the individual systems, which breaks down the complexity of trying to do it all in one go,” he said. “By integrating those different system-level twins we can get an aggregated model of the platform itself. This is how we can build up complexity and understand the interaction of the systems and how they affect performance of the overall platform.”
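To give a flavour of how that system-by-system composition might look in code, the minimal sketch below builds two toy system-level twins and rolls them up into a platform-level view. The class names, state fields and availability rule are illustrative assumptions, not Babcock’s actual models.

```python
# Minimal sketch of composing system-level twins into a platform-level twin.
# The system names, state fields and availability rule are illustrative only.

from dataclasses import dataclass, field


@dataclass
class SystemTwin:
    """A single system-level twin holding the latest sensed state."""
    name: str
    state: dict = field(default_factory=dict)

    def update(self, sensor_data: dict) -> None:
        """Fold new sensor readings into the twin's state."""
        self.state.update(sensor_data)

    def available(self) -> bool:
        """Toy health rule: the system is available unless a fault is flagged."""
        return not self.state.get("fault", False)


@dataclass
class PlatformTwin:
    """An aggregated twin built by integrating the individual system twins."""
    systems: dict = field(default_factory=dict)

    def add_system(self, twin: SystemTwin) -> None:
        self.systems[twin.name] = twin

    def availability(self) -> float:
        """Fraction of systems currently available across the whole platform."""
        if not self.systems:
            return 0.0
        up = sum(1 for s in self.systems.values() if s.available())
        return up / len(self.systems)


# Usage: integrate two system twins and query the platform-level view.
platform = PlatformTwin()
propulsion = SystemTwin("propulsion")
power = SystemTwin("power_generation")
platform.add_system(propulsion)
platform.add_system(power)

propulsion.update({"shaft_rpm": 120, "fault": False})
power.update({"load_kw": 850, "fault": True})
print(f"Platform availability: {platform.availability():.0%}")
```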
Turning to the benefits of digital twinning, Penver referenced Babcock’s work on the Type 23 frigate. “We’ve seen a massive reduction in inventory spares on board and in holding as well as implementing design changes to some of the individual systems. This is really getting us towards insight of how the asset is performing in real life and how we can model expected performance.”
The future is now
Echoing Penver’s remarks on the here-and-now benefits of digital twinning, Rolls-Royce’s head of innovation, Alistair Donaldson, began by stressing that digital twinning has become an absolute imperative for many major projects and that expertise in the area is vital if organisations are to “get a seat at the table and contract for these pieces of work.”
Turning to the core technology, Donaldson said that it’s important to recognise that, rather than an individual thing, the digital twin is a collection of technologies “born out of the maturing of our digital enterprise.” In recent years, advances in design, test, simulation and production technologies and the emergence of Industry 4.0 have, he said, “created the data supply chain enabling the creation of digital twins.”
Donaldson then outlined a model for a digital twin hierarchy designed to be applied across all sectors. This model outlines different types of twin, from component-level twins through to program-level twins, and shows how the value lies in bringing all of these twins together.
What’s needed now, he said, is a national digital twin program that will help establish a common framework to enable organisations across industry to exploit the benefits of the technology. “There will become an important need for standards, the ability to plug this whole opportunity together and really drive that prosperity agenda.”
The practicalities of digital twinning
Our next panelist, Dr Nick Wright, head of manufacturing industries at Digital Catapult, got back to basics and turned to a topic which he said is often forgotten about in the digital twin debate: the infrastructure used to create the coupling between an asset and the digital world. “As part of making these things real, that acquisition of high-quality, secure data between your physical asset and your virtual representation is absolutely key,” he said.
The process of introducing this coupling is, said Wright, becoming more straightforward thanks to the emergence of improved wireless connectivity technologies, in particular 5G. “Until recently wireless technologies have not had the maturity around security, performance, flexibility and price that makes them accessible… but some of the technologies now coming along are much more affordable, much more flexible and much more secure than they ever have been, and that is the key to us scaling some of these solutions and making the most impact,” he said.
He explained that Digital Catapult is now involved in a number of projects aimed at exploring the manufacturing potential of next-generation wireless technologies, including a project led by the AMRC and involving BAE Systems that is building an industrial-grade 5G network across three sites in north-west England.
This initiative is exploring the development of a number of digital twin demonstrators, including a real-time monitoring and adaptive closed-loop control system that’s being used to take data from manufacturing processes, compare this data to simulation models and then adapt the machining process in real time in order to improve performance.
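A heavily simplified sketch of such a closed loop is shown below: measured process data is compared against a simulation model’s prediction and a machine parameter is nudged accordingly. The feed-rate parameter, linear “model” and tolerances are invented for illustration and are not the demonstrator’s real machining models.

```python
# Minimal sketch of an adaptive closed loop: compare measured process data
# against a simulation model's prediction and nudge a machine parameter.
# The 'model', tolerances and feed-rate adjustment are illustrative only.

def simulated_surface_error(feed_rate: float) -> float:
    """Stand-in simulation model: predicted surface error grows with feed rate."""
    return 0.002 * feed_rate


def adapt_feed_rate(feed_rate: float, measured_error: float,
                    tolerance: float = 0.01) -> float:
    """Adjust the feed rate when measured error drifts away from the prediction."""
    predicted = simulated_surface_error(feed_rate)
    if measured_error > predicted + tolerance:
        return feed_rate * 0.9      # slow down to recover quality
    if measured_error < predicted - tolerance:
        return feed_rate * 1.05     # safe to speed up slightly
    return feed_rate                # within tolerance: leave unchanged


# Usage: one pass of the loop with a hypothetical in-process measurement.
feed = 200.0                        # mm/min, illustrative
measured = 0.45                     # mm, illustrative sensor reading
feed = adapt_feed_rate(feed, measured)
print(f"Adapted feed rate: {feed:.1f} mm/min")
```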
Wright said that the team is also looking at things like simple supervisory models for factory ecosystem monitoring (which could, for instance, be used to optimise energy usage in HVAC systems) and at how digital twins can be used to optimise the supply chain. “We’re looking at how goods get moved between different tiers within supply chains,” he said. “How do we measure that, understand it, model and optimise it and improve it for future resilience of those supply chains?”
Integrating data and building resilience
After Wright’s exploration of how digital twins can be applied in the short term, Cranfield’s Prof John Erkoyuncu turned to some of the fundamental research that will shape the technology in the years ahead.
He began by talking about some of the work that his team has been doing on data integration, explaining that the research has centred on efforts to build dynamic relationships between the data source and the software. To achieve this, the team has developed an ontology-based process aimed at understanding the asset, characterising the changes in the asset and then feeding that information into the database.
Erkoyuncu gave a couple of compelling experimental examples of this process in practice, demonstrating how it’s possible to automate the flow of data based on changes to an asset and then reflect those changes in the digital twin.
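The toy sketch below gives a loose sense of that idea, assuming a simple dictionary-based “ontology” rather than the team’s actual tooling: changes detected on the asset are mapped to the concepts they describe and written into a store that the twin reads from.

```python
# Loose illustration of an ontology-driven data flow: asset changes are mapped
# to the concepts that describe them and written into a store the twin reads.
# The ontology contents, asset attributes and store are all illustrative.

# Toy "ontology": maps raw asset attributes to the twin concepts they inform.
ONTOLOGY = {
    "pump_vibration_mm_s": "PumpHealth",
    "pump_flow_l_min": "PumpPerformance",
    "valve_position_pct": "ValveState",
}

twin_store = {}   # stand-in for the twin's database


def detect_changes(previous: dict, current: dict) -> dict:
    """Return only the attributes whose values have changed on the asset."""
    return {k: v for k, v in current.items() if previous.get(k) != v}


def propagate(changes: dict) -> None:
    """Route each changed attribute to its ontology concept in the twin store."""
    for attribute, value in changes.items():
        concept = ONTOLOGY.get(attribute)
        if concept is None:
            continue               # attribute not described by the ontology
        twin_store.setdefault(concept, {})[attribute] = value


# Usage: two snapshots of the asset; only the changed values reach the twin.
before = {"pump_vibration_mm_s": 2.1, "pump_flow_l_min": 310}
after = {"pump_vibration_mm_s": 4.8, "pump_flow_l_min": 310}
propagate(detect_changes(before, after))
print(twin_store)   # {'PumpHealth': {'pump_vibration_mm_s': 4.8}}
```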
Another key area for Erkoyuncu’s team is digital twin resilience, i.e. ensuring that the digital twin remains accurate over time.
Various different scenarios could, he said, cause digital twins to become less accurate over time, so his team has been exploring the use of machine learning tools to detect these anomalies and restore the twin to full effectiveness as quickly as possible. “We wanted to develop a machine learning based approach where we can learn from these disruptions and reduce the time it takes to detect these anomalies and recover to an appropriate level of accuracy,” he explained.
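Purely as an illustration of the underlying idea, a drift check might compare the twin’s predictions with live readings and flag when the residual grows. The simple statistic, threshold and data below are invented; the team’s actual approach uses machine learning models rather than this toy test.

```python
# Simple illustration of watching twin accuracy over time: compare the twin's
# predictions with live sensor readings and flag when the residual drifts.
# The threshold, window and data are made up; the real work uses ML models.

from statistics import mean, stdev


def drift_detected(predictions, measurements,
                   window: int = 5, z_threshold: float = 3.0) -> bool:
    """Flag drift when the latest residual sits far outside the recent spread."""
    residuals = [m - p for p, m in zip(predictions, measurements)]
    history, latest = residuals[-(window + 1):-1], residuals[-1]
    if len(history) < window:
        return False               # not enough history to judge yet
    spread = stdev(history) or 1e-9
    return abs(latest - mean(history)) / spread > z_threshold


# Usage: the twin tracks well until the final reading diverges.
predicted = [10.0, 10.1, 10.2, 10.1, 10.0, 10.2, 10.1]
measured = [10.1, 10.0, 10.3, 10.2, 10.1, 10.1, 12.5]
print(drift_detected(predicted, measured))   # True: twin no longer matches the asset
```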
Digital twins in civil nuclear
Our final panelist, Strathclyde’s Dr Chris Wallace, focused on a specific application area for digital twins: the civil nuclear sector, exploring some of the advantages and challenges of retrofitting digital twins to legacy equipment.
Wallace explained that the focus of his group’s digital twin activity has been around some of the monitoring, prognostics and inspection processes that are at the heart of the nuclear sector’s asset management strategy. “Using digital twin models to simulate future asset operation, and understand the potential impact of scheduled outages and maintenance procedures, has huge potential in terms of asset management,” he said. A key attraction, he added, is that digital twins can be used to provide insight into the operation of a physical asset that’s not otherwise possible. “Nobody wants to build a duplicate nuclear power station to monitor their existing nuclear power stations,” he said. “Digital twins fit the bill for that.”
Wallace explained that much of the work his team does is focused at the component level rather than the entire asset level. “A nuclear power station is a complicated asset,” he said, “with complex operation modes, degradation mechanisms that vary from component to component, and lifecycles that may be 10, 20 or even 30 years. This seems to be the logical integration point for what people are calling digital twins.”
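As a purely illustrative sketch of working at that component level, a component twin might fit a trend to a measured health indicator and project when it will cross an alarm limit. The linear degradation assumption, indicator and threshold below are invented, not the group’s actual models.

```python
# Purely illustrative component-level prognostic: fit a linear trend to a
# health indicator and project when it will cross an alarm threshold.
# The indicator, threshold and linear-degradation assumption are made up.

def hours_to_threshold(times_h, indicator, threshold):
    """Estimate remaining hours until the indicator reaches the threshold."""
    n = len(times_h)
    t_mean = sum(times_h) / n
    y_mean = sum(indicator) / n
    slope = (sum((t - t_mean) * (y - y_mean) for t, y in zip(times_h, indicator))
             / sum((t - t_mean) ** 2 for t in times_h))
    if slope <= 0:
        return None                # no measurable degradation trend
    intercept = y_mean - slope * t_mean
    crossing_time = (threshold - intercept) / slope
    return max(0.0, crossing_time - times_h[-1])


# Usage: vibration creeping upward over a year of operation (values invented).
hours = [0, 2000, 4000, 6000, 8000]
vibration = [1.0, 1.2, 1.5, 1.7, 2.0]
print(f"Estimated remaining hours: {hours_to_threshold(hours, vibration, 3.5):.0f}")
```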
Echoing a point made by other panelists, Wallace added that one of the key challenges the group has started to come up against is the sheer volume of data that it is now possible to collect. “Given the cheap nature of computing power and the availability of cheap sensors, you can collect lots of data, build useful models and do analysis, but at a certain point you start to experience challenges around scale.”
Q&A
The session concluded with a Q&A with viewers. Here are some of the key questions.
When building a digital twin of an asset how do you decide which dynamic to mimic and which to leave out?
JH: The art is in leaving all the stuff out that you can get away with leaving out. Don’t try and gather data on everything unless you think you really need it. What outcomes do you want? Do you want to save fuel in the engine, do you want greater availability of some mission systems, or power output from the reactor? That’s got to guide you towards the places where you think you need the data to be captured and the modeling to take place.
If you were to start from scratch what would be the key aspects needed to provide a robust foundation for setting up a digital twin?
NW: There’s no easy place to start. It’s not about the technology at all, it’s about focusing on the outcomes you’re looking to achieve by going for a digital twin. It’s really easy to get swallowed up in wanting to go and buy bits of tech, but for me it’s about establishing the outcomes you want to achieve and then the business case will come from those outcomes.
What is the main benefit of a digital twin?
SP: For us it’s very much around availability, maintaining availability, getting the best performance out of the assets, getting toward things like condition-based monitoring, and being able to predict potential failure.
NW: Often the thing that most people want to focus on is the financial reward, but there are other things as well. Companies investing in these technologies will naturally get transferable digital skills through working with the technologies, and the sustainability agenda is absolutely crucial as part of the business case for most of the industries we work in.
JE: Digital twins have a really important role to play in improving efficiency and effectiveness and this clearly has implications in terms of growing sustainability, minimising waste and reducing the costs that you experience across the supply chain.
Are digital twins applicable to smaller organisations and SMEs?
AD: This data needs to start within the supply chain to be able to be consumed and aggregated up into these products. There are lots of ways that can happen, and lots of the things we’ve been talking about are transferable to smaller organisations. This doesn’t need to be incredibly expensive, blue-chip, enterprise-level investment.
NW: The University of Cambridge Institute for Manufacturing is running a program called Digital Manufacturing on a Shoestring. That’s a really good program to get engaged with in terms of building up knowledge and capability. In terms of support there is Made Smarter, the UK’s program and initiative around Industry 4.0, where there are opportunities for funding and support, and a library of technology solutions that have been developed for the SME community in this space.
What role can organisations like Babcock play to encourage the supply chain to embrace digital twinning?
JH: There are two routes – a direct route, which a number of companies take, which is the supply chain excellence route of helping. What seems even more powerful is collaborative efforts. For example, there’s an SME working group which helps the interface between defence primes and SMEs, and there are multiple organisations like Team Defence Information. Those are opportunities for companies of all scales and sizes to get insight on digital technologies and digital twins. It’s a very democratic environment to come and participate and find things out.
What are the skills implications around digital twinning?
JE: We recently ran a workshop looking at what skills are needed around digital twins. What was interesting was that we started off with the ability to justify the need for the technology. The other thing that came out is that we need people who can cut across different kinds of skills and have awareness of things like programming and the importance of data. We also looked at how models and different types of modelling processes can contribute to decision making. The future digital person will cut across these different disciplines – we’re moving much more towards a person that has a system view and an awareness of all of these different challenges rather than just being an engineer in a particular application area.