changes: What’s driving the process industry

Hunting for treasure

Data is key in the digital transformation of the process industry. But only rarely does data find its way beyond the confines of the devices, machinery and systems that generate it. Now that could all change, thanks to smart field instrumentation, digital interfaces and cloud-based analytical tools. The possibilities are endless, particularly when sensors in the physical world are linked up to artificial intelligence.

Text: Laurin Paschek, Robert Habi, Martin Raab
Illustration: Julia Praschma

Anyone who has wandered through a large industry trade fair lately may have felt like they were on a different planet. Exhibitors that until recently had stalls full of heavy steel are suddenly presenting themselves as tech companies. All eyes are on the digitalization of production. At the trade booths, the brave new world of Industry 4.0 is very much a reality. And yet, if you visit a process engineering facility – wherever in the world and whatever it manufactures – you will probably see little of this fourth industrial revolution under way. Analog signal transmission continues to dominate the process industry. Even the latest measuring devices tend to transmit their data to control systems using nothing fancier than 4–20 mA current loop technology.
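The staying power of the 4–20 mA loop is easy to appreciate: the entire measurement is encoded as a single analog current, with a "live zero" at 4 mA so that a dead wire is distinguishable from a zero reading. A minimal sketch of the scaling, assuming the fault-band limits of the common NAMUR NE 43 convention (the function name and default range are illustrative):

```python
def current_to_value(current_ma, span_min=0.0, span_max=100.0):
    """Map a 4-20 mA loop current onto the calibrated measuring range.

    Per the NAMUR NE 43 convention, currents outside roughly
    3.8-20.5 mA signal a fault rather than a measurement; here we
    simply raise an error in that case.
    """
    if not 3.8 <= current_ma <= 20.5:
        raise ValueError(f"loop current {current_ma} mA indicates a fault")
    fraction = (current_ma - 4.0) / (20.0 - 4.0)  # 4 mA -> 0 %, 20 mA -> 100 %
    return span_min + fraction * (span_max - span_min)
```

One current, one value: anything beyond the measurement itself – diagnostics, sensor health, process context – has no channel of its own, which is precisely the limitation digital interfaces remove.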

< 5%
of the data generated in production is actually utilized, according to international studies.

“Digitalizing process plants is beginning to emerge more and more from the confines of pilot installations. We are at a turning point.”


Rolf Birkhofer, Managing Director of Endress+Hauser Digital Solutions


“International studies estimate that only around five percent of the data generated in production actually gets analyzed in depth,” says Dr Rolf Birkhofer, Managing Director of Endress+Hauser Digital Solutions. “That finding closely matches our own experience. Even though Endress+Hauser’s measuring devices have had digital communication capability for years now, the vast majority of our customers have yet to exploit this option.” There are many reasons behind the reticence, including the decades-long life span of process plants and field instrumentation, the fact that those plants often contain components from numerous suppliers, and the strict safety standards and regulations in force across many industries. Given such an environment, convincing customers to adopt new technologies involves bringing some cogent arguments to the table.

 

INDUSTRY AT A TURNING POINT

And yet there are areas of industry where change is afoot, says Birkhofer. The latest generation of smart instruments can supply a wealth of supplementary data alongside their actual measurements, including information on the sensors and processes themselves. There are technologies that provide a secondary channel for rapid, secure data transfer from the field right up to corporate level that is completely distinct from process control in the plant itself. Furthermore, a host of projects have already demonstrated how this data can be turned into useful information and valuable knowledge. “Digitalizing process plants is beginning to emerge more and more from the confines of pilot installations and small-scale projects,” Birkhofer says. And, he adds with conviction, “We are at a turning point.”

For plant operators, it’s all about efficiency, security and quality in the face of competitive pressure and a general shortage of skilled workers. It follows that there is an enormous number of potential use cases. Analyzing data at the level of individual measuring points can already bring significant benefits. But the data generated from instruments and processes only reveals its true value after central aggregation, be that in a cloud application or an edge computing system. Aggregation brings scalability to data gathering and processing, with individual use cases no longer requiring their own dedicated software. A further possibility is to link data from the field with other data sources such as weather forecasts and ERP systems, all in real time.

 

VIRTUAL AND PHYSICAL WORLD

A particularly exciting prospect is to combine multiple data sources using artificial intelligence. “Big data applications can glean highly complex insights in fractions of a second, given the right data inputs,” says Florian Falger, Market Manager at the Endress+Hauser Level+Pressure Innovation Lab. One of the team’s activities is finding ways to precisely determine maintenance intervals for measuring instruments and entire plants with the help of specialized algorithms and artificial intelligence. Thus they are laying foundations for something that many companies in the process industry want: predictive maintenance. “Large chemical plants, for example, operate around the clock,” Falger explains. “Even planned maintenance is a costly undertaking. Predictive maintenance would help to minimize the plant downtime involved and avoid unscheduled outages, as well as reduce workload and costs.”
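As an illustration only – not the Innovation Lab’s actual method – the core idea behind predictive maintenance can be reduced to trend extrapolation: fit the drift of some health indicator over time and estimate when it will cross an alarm limit, so maintenance can be scheduled just before that point. A deliberately simple sketch using an ordinary least-squares line; real systems use far richer models and all names here are hypothetical:

```python
def predict_crossing(times, values, threshold):
    """Estimate when a drifting health indicator will reach a limit.

    Fits a straight line (ordinary least squares) to the observed
    indicator values and extrapolates to the threshold. Returns None
    when no upward drift is detected.
    """
    n = len(times)
    mean_t = sum(times) / n
    mean_v = sum(values) / n
    slope = sum((t - mean_t) * (v - mean_v) for t, v in zip(times, values)) \
        / sum((t - mean_t) ** 2 for t in times)
    intercept = mean_v - slope * mean_t
    if slope <= 0:
        return None  # indicator stable or improving: no crossing predicted
    return (threshold - intercept) / slope  # predicted time of crossing
```

Even this toy version captures the economic argument: a predicted crossing time turns an unscheduled outage into a planned, and much cheaper, intervention.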

“Process mining uncovers hidden potential for optimization, because we can take a deep dive into individual process flows by following their digital trails.”


Stefan Sigg, Management Board member at Software AG

Another use case is in-depth analysis of process data as a means to improve the quality of manufactured products or to increase process efficiency. “In manufacturing, deployment of process mining software still holds a lot of untapped potential,” says Dr Stefan Sigg, Management Board member and Chief Product Officer at Software AG, one of Europe’s largest software developers. Process mining uses previously acquired data to replay business and production processes virtually, then analyzes the results from various process instances to find anomalies. It can be a real eye-opener when most processes run according to plan, but some instances take a completely different path. “Those process instances may be wasting money, time or energy,” Sigg says. “Process mining uncovers hidden potential for optimization, because we can take a deep dive into individual process flows by following their digital trails.”
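The variant analysis Sigg describes can be boiled down to a few steps: group logged events into per-case traces, count how often each distinct activity sequence occurs, and surface the rare paths. A stripped-down sketch of that principle (the event-log format and function name are assumptions for illustration, not Software AG’s API):

```python
from collections import Counter, defaultdict

def find_variant_outliers(event_log, min_share=0.05):
    """Group an event log into process variants and flag rare paths.

    event_log: iterable of (case_id, timestamp, activity) tuples.
    Returns the variants (activity sequences) whose share of all
    cases falls below min_share, with their case counts.
    """
    # Rebuild each case's trace in chronological order.
    traces = defaultdict(list)
    for case_id, ts, activity in sorted(event_log, key=lambda e: (e[0], e[1])):
        traces[case_id].append(activity)
    # Count how many cases follow each distinct path.
    variants = Counter(tuple(t) for t in traces.values())
    total = sum(variants.values())
    return {v: c for v, c in variants.items() if c / total < min_share}
```

The flagged variants are exactly the “completely different paths” worth a deep dive: instances that deviated from the planned process and may be wasting money, time or energy.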

 

THE STICKING POINT: INTEROPERABILITY

The viability of solutions like those developed by Software AG hinges on data from industrial and commercial processes being available in analysis-friendly form. Straightforward and secure data interchange is another essential. Enter the Open Industry 4.0 Alliance, whose goal is to ensure exactly that. The alliance brings together some 100 providers of IT, software, factory and process automation with a shared mission: to promote interoperability among the devices and solutions used by Industry 4.0 applications.

“Operators might be using field instrumentation from multiple vendors in the same plant, or even for the same measurement technique,” says Hans-Jürgen Huber, Endress+Hauser’s representative in the Open Industry 4.0 Alliance. “And they expect all of these different devices to be easily integrated and interconnected.” The alliance is working on a reference architecture based on existing standards, consistently applied. “The ultimate goal is for plant operators to deploy our solution and, from there, to make use of data generated by any and all of their instruments.”

Huber is aware of the long road ahead to international or even global standards, but the IIoT expert is nonetheless optimistic: “Take the example of screw threads, where it took decades to come up with industry-wide standards. Developing uniform standards in the data space will be a much quicker process.” The pace of progress is due in no small part to the tremendous and ever-growing pressure on the process industry from environmental concerns and climate change, the energy transition and electrification. “It’s a question of when, not if, the valuable treasure trove of data in our customers’ use cases gets leveraged,” Birkhofer says with certainty. “Everything that can be digitalized will be digitalized.”

THE DATA PROSPECTORS’ GOLD

“We are drowning in information but starved for knowledge.” What US futurologist John Naisbitt wrote in 1982 describes a challenge that has only intensified in the age of big data: masses of information too large, too complex, too volatile or too poorly structured to be analyzed by conventional data processing methods. Making sense of it takes other means, such as smart analytical methods from the world of data science.

Data mining promises leaps in innovation across many areas of life. Online stores, for example, monitor our mouse clicks to offer us products tailored to our interests. In medicine, algorithms ease diagnosis and enable patient-customized treatments. And thanks to artificial intelligence, self-driving vehicles will be able to navigate busy intersections safely, even faced with oncoming traffic – a surprisingly complex task from an IT perspective.

The process industry also generates reams of data from measuring instruments, drives, valves and so on. These measurements, sensor signals and device parameters are used to control the processes themselves, but that is only the beginning: analyzed and linked together, they can also serve as the basis for a broad spectrum of insights into instruments, processes and plants – opening up the opportunity to transform strictly defined and mostly rigid value chains into flexible, dynamic and globally integrated networks for value creation.