For decades, engineering companies have been looking at ways of using data to drive added business value. In the late 1970s and early 1980s, they started upgrading their plant control systems from analogue to digital, enabling the capture of data from sensors located across these plants. When stored, this data enabled the development of higher-level applications like advanced process control and optimisation, helping to drive plants closer to their operating limit. Immense value has been created from these advanced solutions.
In recent times, though, the benefits data can bring to engineers have moved up another level. Today, it is easier and more cost-effective for engineers to generate large volumes of relevant data, gain access to it and apply it to deliver operational gains across every aspect of their work, from design to troubleshooting.
The price of instrumentation is falling and the cost of connecting to that instrumentation is coming down. In line with this, the advance of technologies like edge computing and fog computing is making more data available to engineers – and the increasing ubiquity of the cloud is enabling companies to consolidate multiple silos of information and prepare them for analysis.
So more data is available to engineers, and it is increasingly easy for them to access it and apply it to their business problems. All this offers engineering companies the potential to drive significant benefits across their operations, but to take advantage they also need to think differently about how they use data.
Traditionally, for example, senior operatives carrying out a rigorous engineering simulation would spend very little time ‘getting their hands dirty’ in plant data, yet that is exactly what they need to do to validate their models. Making real-world production data available to this class of engineer has a genuinely positive impact: it gives them the data they need to improve their models and, in turn, their general working processes, while also delivering significant operational, environmental and health and safety benefits.
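To make the idea of validating a model against plant data concrete, the short Python sketch below compares simulated values with measured values from the plant and computes a simple error metric. It is only an illustration of the principle: the tag, the numbers and the choice of metric are hypothetical and not taken from any specific tool.

    # Minimal sketch: validate a simulation model against plant data by comparing
    # simulated values with measurements and computing an error metric.
    measured = [101.2, 99.8, 102.5, 100.9]    # plant data, e.g. a column top temperature (degC)
    simulated = [100.4, 100.1, 101.7, 101.5]  # corresponding simulation results

    # Mean absolute percentage error between simulation and plant
    mape = sum(abs(m - s) / abs(m) for m, s in zip(measured, simulated)) / len(measured) * 100
    print(f"Mean absolute percentage error: {mape:.2f}%")  # a small value suggests the model tracks the plant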
That has been a key first step. However, what has proved to be a real game-changer in this space is the latest wave of advances in areas like artificial intelligence, machine learning and data science.
Traditionally, the most prevalent approach engineering companies have taken to asset optimisation is to build models that mimic the behaviour of each asset. They obtain real-time data, run it through the model and look for abnormal behaviour. There are some fundamental problems with this approach, not least that companies have to find enough staff with the expertise to build models of all the assets they need to protect. That is a complex undertaking that inevitably delays time to value and makes the whole approach difficult, if not impossible, to scale.
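To illustrate the model-based approach described above, the sketch below runs a single live reading through a hand-built model of a compressor and flags any deviation beyond a tolerance. The model, the tolerance and the sensor values are hypothetical placeholders rather than a real implementation.

    # Minimal sketch of the traditional, model-based approach: run live sensor
    # readings through an engineer-built model of the asset and flag readings
    # that deviate from the prediction. Model, tolerance and data are illustrative.

    def predict_discharge_pressure(suction_pressure: float, speed_rpm: float) -> float:
        """Stand-in for a first-principles model of a compressor's expected behaviour."""
        return suction_pressure * 2.8 + 0.001 * speed_rpm  # illustrative relationship only

    def is_abnormal(measured: float, predicted: float, tolerance: float = 0.05) -> bool:
        """Flag a reading that deviates from the model prediction by more than the tolerance."""
        return abs(measured - predicted) > tolerance * abs(predicted)

    # A single live reading from a (hypothetical) plant historian
    reading = {"suction_pressure": 3.1, "speed_rpm": 9800.0, "discharge_pressure": 9.4}
    expected = predict_discharge_pressure(reading["suction_pressure"], reading["speed_rpm"])

    if is_abnormal(reading["discharge_pressure"], expected):
        print(f"Abnormal behaviour: measured {reading['discharge_pressure']}, expected {expected:.2f}")

Even in this toy form, the scaling problem is visible: someone has to write and maintain a function like predict_discharge_pressure for every asset the company wants to protect.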
The latest advances in machine learning and data science, in particular, enable a new approach known as prescriptive analytics, which pinpoints signatures and patterns in the data that warn the engineering company of an impending incident - an outage or a complete plant breakdown, for example - so that it can take remedial action in advance.
Critically, the approach also helps the company determine the root cause of the problem. It can indicate not only that a compressor is going to fail, for example, but also that the impending failure is directly linked to liquid leaking into gas lines at a certain concentration, or even just a slow change in the recorded pressure.
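To give a rough sense of how a data-driven approach can deliver both the early warning and the root-cause hint, the sketch below uses scikit-learn - an assumption, since no specific toolkit is named here. A classifier is trained on historical sensor snapshots labelled by whether they preceded a failure, and its feature importances indicate which signals, such as a liquid-in-gas-line concentration or a slow pressure drift, are driving the prediction. The column names and synthetic data are purely illustrative.

    # Minimal sketch of pattern-based failure prediction with a root-cause hint.
    # Historical snapshots are labelled 1 if they preceded a failure, 0 otherwise.
    import numpy as np
    import pandas as pd
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(0)
    n = 1000
    history = pd.DataFrame({
        "liquid_in_gas_line": rng.normal(0.02, 0.01, n),  # hypothetical concentration signal
        "pressure_drift": rng.normal(0.0, 0.5, n),        # slow change in recorded pressure
        "vibration": rng.normal(1.0, 0.2, n),
    })
    # Synthetic label: in this sketch, failures are preceded by high carry-over and falling pressure
    history["precedes_failure"] = (
        (history["liquid_in_gas_line"] > 0.03) & (history["pressure_drift"] < -0.3)
    ).astype(int)

    X = history.drop(columns="precedes_failure")
    y = history["precedes_failure"]
    model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

    # Score the latest live snapshot and report which signals matter most
    latest = pd.DataFrame([{"liquid_in_gas_line": 0.045, "pressure_drift": -0.6, "vibration": 1.05}])
    risk = model.predict_proba(latest)[0, 1]
    print(f"Probability this pattern precedes a failure: {risk:.2f}")
    print("Signals ranked by importance:")
    print(pd.Series(model.feature_importances_, index=X.columns).sort_values(ascending=False))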
This is also a much faster process. Rather than taking six months to get an application into production, as may be typical with a model-based approach, the engineering company can in some cases do it in six weeks - or even six days. That makes it easier and more viable to scale the application into a sustainable live production environment in a realistic timeframe.
Italian energy provider Saras is one company that has already benefited from a machine learning-driven approach, applying machine learning to four equipment areas at its 300,000-barrel-per-day refinery in the Mediterranean: feed pumps, wash oil pumps, makeup H2 compressors and recycle compressors. It got its digital effort up and running in a matter of weeks and was soon able to accurately identify the specific failure mode for each component - without any false positives.
These capabilities enabled it to predict failures with lead times of 24-45 days, and Saras also expects to reduce unplanned shutdowns by up to 10 days, increase revenue by 1 to 3 percent and reduce refinery maintenance costs and operating expenses by 1 to 5 percent.1
Delivering Benefits across Multiple Functional Areas
Saras has achieved benefits from this kind of approach across multiple areas of its business. For engineering companies and engineers more generally, we also see multiple benefits: not just from the ability to generate more data and to access and consolidate it more easily, but also from the ability to apply the latest AI-driven machine learning and data science techniques to it.
Moving forward, more of these benefits will be cross-functional. Prescriptive maintenance, for example, not only benefits the maintenance department by reducing the cost of asset repair but also positively impacts operations by helping to eliminate unplanned downtime. Senior engineering executives are increasingly laser-focused on this kind of cross-functional improvement. They are looking at how to tear down their operational silos and put them back together in a way that delivers margin - and that may be one of the long-term benefits they take from making better use of data.