The ability to independently and accurately monitor and virtually model how a process or product performs in real time, without physical intervention, is the ultimate prize for engineers and manufacturers.
Without the huge cost of interrupting operations, it allows for the assessment of all manner of scenarios which could improve performance, from equipment upgrades to the evaluation of safety issues that may arise in service.
This ‘holy grail’ is now commonly referred to as a digital twin of the product. Operating symbiotically with its real-world counterpart, the digital twin is a fully connected but virtually modelled representation of the product, continually fed with live data from the real physical system.
Creating a viable digital twin is a major undertaking. If it is to be used to predict what may happen to a product or a system as circumstances change, then the commercial consequences can be substantial one way or another. This makes a high level of trust in the outcome essential.
Data driven
Any model is only as good as the data used to develop it. A digital twin is no different: it needs data to evolve with its physical counterpart.
Capturing real-world data has historically been time-consuming, particularly in large volumes. As a result, existing models work retrospectively rather than making predictions from live data. A model that tells you what went wrong has far less value than one that tells you something will go amiss.
Industry 4.0 provides the means to capture and record data on a massive scale and can rapidly identify variations in the state of a component or system. This goes way beyond exceeding a set threshold and can provide insight into how a system may be degrading or acting out of its original design assumptions.
This amount of data means the digital twin has the potential to predict the remaining life of the system, anticipate maintenance activities or improve design for more resilience to variations. Essentially, making the product better and providing enhanced service to the customer.
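To make the idea of predicting remaining life concrete, here is a minimal sketch, not a method the blog prescribes: it assumes a monitored health signal (for example, a vibration amplitude) that degrades roughly linearly, fits a trend to recent readings, and extrapolates to a known failure threshold. All names and numbers are invented for illustration.

```python
# Hypothetical sketch: assumes a linearly degrading health signal
# and a known failure threshold. Real digital twins use far richer
# physics-based and statistical models.

def remaining_life(times, readings, failure_threshold):
    """Fit a least-squares line to readings vs time and extrapolate
    the time at which the trend crosses the failure threshold."""
    n = len(times)
    mean_t = sum(times) / n
    mean_r = sum(readings) / n
    slope = sum((t - mean_t) * (r - mean_r)
                for t, r in zip(times, readings)) \
        / sum((t - mean_t) ** 2 for t in times)
    intercept = mean_r - slope * mean_t
    if slope <= 0:
        return None  # no degradation trend detected
    crossing_time = (failure_threshold - intercept) / slope
    return crossing_time - times[-1]  # time of useful life remaining

# Hourly vibration readings drifting towards a threshold of 10.0
hours = [0, 1, 2, 3, 4]
vibration = [2.0, 2.5, 3.0, 3.5, 4.0]  # rises 0.5 per hour
print(remaining_life(hours, vibration, 10.0))  # → 12.0 (hours left)
```

The same trend-fitting idea underpins anticipating maintenance: rather than waiting for a threshold breach, the twin warns while there is still time to act.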
The role for AI
However, for some applications, the volume and velocity of the data required is impossible for a human to monitor alone. Fortunately, with the advent of techniques such as AI, correctly trained algorithms running on relatively simple computational platforms can monitor change automatically, highlighting changes to operators and advising them on potential actions. Statistics from the US Department of Energy indicate such AI systems could achieve a 35-40 per cent reduction in equipment downtime from unexpected failures, which currently cost $150 billion a year. Likewise, for large-scale manufacturing facilities, such outages can cause hundreds of thousands of pounds a day in lost productivity.
While AI systems can cope with the volume of data generated, the algorithms used also need to identify the right data in order to provide meaningful results. Much the same as the human brain uses senses to learn, the algorithms learn by observing or experiencing what is good and bad – or even out of the ordinary.
A maintenance engineer spends years gathering knowledge from manuals and textbooks as well as experience from working with peers and observing what may or may not contribute to a failure. An AI system needs to do the same. It needs to experience everything, which is often physically impractical to achieve. This is where simulation fills the gap, synthetically replicating what cannot be experienced in the real world.
It all comes back to simulation
Simulation can create the range of scenarios an AI system needs in order to learn how a product should perform or what it is likely to experience, effectively embedding a lifetime of knowledge into a relatively simple algorithm. It is essential that this knowledge is encoded accurately and that the approach is endorsed by experts. Only then will the system have value, and subject-matter knowledge that can be relied on.
Simulation can also help us understand what is happening if things go beyond the realms of human experience or as unforeseen properties emerge from the use of a system. It can do this at a far faster pace than could be achieved in physical trials which, again, means less downtime and better productivity.
It is not only maintenance that can be supported in this way. Production, quality control and operations management can all be enhanced through such AI systems. But, to develop them reliably and make them cost-effective, one needs simulation.
Through-life simulation
This goes to show that simulation is necessary throughout the life of a product. It’s something any manufacturer needs to embrace across their business if they are to remain efficient and achieve the highest levels of productivity.
As the world around us changes, engineers need to challenge what they do in a more comprehensive but economic way if we are to deliver products to meet societal needs – such as the drive for net-zero. Virtualising design and simulation through embracing digitalisation is a key route to achieving this goal and realising the power of concepts such as digital twins.
While the industry has moved beyond a situation where large-scale physical experimentation is the modus operandi, there is still a long way to go. The next step in the journey requires the industry to overcome two hurdles.
Firstly, adopting a mindset which not only cultivates a digital culture within organisations but also champions how a digital approach can bring about change. Secondly, as technology continues to evolve so must simulation, emerging from the constraints of what can be bought off-the-shelf. Industry must democratise this capability to a far greater extent than is seen today through better collaboration and open-source software.
The reward for taking these next important steps on the journey to digitalisation is clear but the journey itself is also imperative if engineers and manufacturers are to become more innovative, efficient and sustainable.
Ian Risk is Chief Technology Officer at CFMS, responsible for evolving the company’s technical vision and leading all aspects of technology development according to its strategic direction and growth objectives. Ian was previously Head of Airbus Group Innovations UK where he was responsible for the UK corporate research capability, developing technical strategy, business development and initiating industrial and academic partnerships.