Greg Blackman reports on the progress being made in Industry 4.0, as discussed during the EMVA conference
The term Industry 4.0 is 10 years old. At the opening ceremony of the 2011 Hannover fair, Professor Wolfgang Wahlster, founding director of DFKI, the German research centre for artificial intelligence, first introduced the idea of the fourth industrial revolution to an audience that included the German chancellor Angela Merkel.
Ten years on, factories are becoming more connected. In an article published by DFKI in April, Wahlster stated: ‘New factories today are always designed as flexible factories. The days of having a factory designed for only one specific product line are truly over.’
During a discussion about Industry 4.0 at the EMVA's online business conference in June, Dr Kai-Udo Modrich, head of inline inspection and metrology at Zeiss, said that Industry 4.0 is no longer just an academic term but is transforming manufacturing in many sectors.
So, what do vision firms need to be aware of as manufacturing becomes more digital? One area is pay-per-use business models – such as software-as-a-service (SaaS) – which have become widespread in IT, and which Dr Gunther Kegel, CEO at Pepperl and Fuchs and president of ZVEI, noted during the EMVA discussion will 'not stop for industry'.
‘[Pay-per-use] will become our business,’ he said, ‘especially if we talk about more complex integration, more complex sensing solutions, and image processing by definition is complex.’
Many software and IT providers no longer offer on-premise solutions, only cloud-based ones. The service model has also reached computer vision through Amazon's Lookout for Vision, a cloud-based service that gives subscribers access to neural networks they can train on their own images.
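As a rough sketch of what that service model looks like in practice – the project name, model version and image file below are invented placeholders, and the trained model is assumed to already be running in the subscriber's account – querying a Lookout for Vision model from Python with the boto3 SDK might look something like this:

```python
# Hedged sketch: send an inspection image to a cloud-hosted anomaly
# detection model (Amazon Lookout for Vision) and read back the result.
# Project name, model version and image path are hypothetical.
import boto3

client = boto3.client("lookoutvision", region_name="eu-central-1")

with open("part_0001.jpg", "rb") as image:  # hypothetical inspection image
    response = client.detect_anomalies(
        ProjectName="widget-inspection",    # hypothetical project
        ModelVersion="1",
        ContentType="image/jpeg",
        Body=image.read(),
    )

result = response["DetectAnomalyResult"]
print("Anomalous:", result["IsAnomalous"], "confidence:", result["Confidence"])
```

The model itself lives in the cloud; the factory only sends images and receives a judgement with a confidence score, which is what makes a pay-per-inference billing model practical.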
'We're the ones providing architectures and computational power in premises, whereas in the long run an automotive manufacturer will not accept this kind of hardware and software installed in their premises. They want to move this around the globe, copy it as quickly as possible, and that works much better if it's cloud-based,' Kegel continued.
'The software intensive [processes] – and image processing is software intensive – will move into the cloud and have other payment models,' he said.
Professor Dr Roman Dumitrescu, managing director of technology network It's OWL and director at Fraunhofer IEM, added that there has to be a benefit for the customer to move to a pay-per-use model, but agreed it will happen eventually. It's OWL – standing for Intelligent Technical Systems OstWestfalenLippe – is a cluster of 200 companies, research institutes and organisations developing solutions for intelligent products and production processes. It is considered one of the largest Industry 4.0 initiatives for SMEs.
Pepperl and Fuchs provides industrial sensors and has a division that deals exclusively with image processing and image solutions. The firm is working on 3D data processing using lidar scanners.
‘We are in the eye of the storm of Industry 4.0 – the sensing side,’ Kegel said. ‘Actuators, networks are already available, but the sensing side is not developed as far as we thought. It is definitely a large working package for sensor manufacturers and sensor integrators to contribute to Industry 4.0.
‘It's not just another wave of automation technologies, not just another field bus,’ he added. ‘It's something substantially different, and it leads us into new data-driven businesses.’
Kegel explained that the transition to a digital, connected shop floor in manufacturing takes longer than in other sectors of society, because in industry changes only happen when they create user benefits.
‘We can't throw away whatever we have,’ he said. ‘Nobody is willing to reinvest just because there's some fancy new technology around as long as this doesn't give any benefit.’
Companies like Pepperl and Fuchs therefore have to prove that new sensing technology is beneficial and that it brings more efficient processes, higher quality, and a more flexible business.
‘In industry, we see a lot of effort in improving existing processes and existing business models instead of looking for completely new business models,’ Kegel said.
‘We hope that the new business models will come when we have finally completely digitised our shop floors, our machines,’ he continued. ‘Then, with this enormous amount of data, these plants will generate new business models automatically. At least, that's our belief. Currently, 90 per cent of what we do is really focusing on improving what we already have as digital technology rather than trying to invent new business models. That's a difference. People that come from other areas normally look at industry and don't understand why the disruptive nature of digital technology hasn't kicked in that strong like it did in other areas.’
Dr Thomas Scheiter, head of technology for the Internet of Things at Siemens, noted that, in most architectures at the moment, IoT sensors are largely separate from the control system, but that, in the future, there will not be this separation.
He said that Siemens is now running into questions around the quality of data from sensors, and also the topic of digital calibration certificates. Other aspects are ease-of-use and how to bring in sensors without compromising security in the IT environment.
Within its IIoT research, Siemens is focusing on three areas: first, connectivity, addressing topics such as 5G and interoperability through OPC UA; second, edge computing, along with device and system management; and third, perception, which covers sensing activities, sensor data fusion, smart sensing and the quality of sensing data.
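OPC UA is the main vehicle for that interoperability, giving devices from different vendors a common way of exposing values and metadata. As a loose illustration – the endpoint address and node identifier here are invented, and would in practice come from the device's information model – reading a value from an OPC UA server in Python with the asyncua library could look like this:

```python
# Minimal OPC UA client sketch using the asyncua library; endpoint URL
# and node identifier are hypothetical placeholders.
import asyncio
from asyncua import Client

async def read_sensor() -> None:
    url = "opc.tcp://192.168.0.10:4840"  # hypothetical PLC/sensor endpoint
    async with Client(url=url) as client:
        # Hypothetical node: a temperature value exposed by a camera on line 1
        node = client.get_node("ns=2;s=Line1.Camera1.Temperature")
        value = await node.read_value()
        print("Current value:", value)

asyncio.run(read_sensor())
```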
Kegel made the distinction between a measurement value – the kind plant machinery might collect at the moment – and data. The difference is that data has context. A measurement value might not even have a timestamp, so once that number has been processed in the control circuit it is of no further use. ‘The minimum that you need to turn a measurement into data is a timestamp,’ Kegel said.
‘You need context information,’ he continued. ‘Data is not valuable if you can't understand it in the context of its environment. This context-relevant information has to accompany the data, otherwise they are pretty useless.’
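One way to picture the difference is as a record that carries the timestamp and surrounding context along with the raw number. The fields below are purely illustrative, not drawn from any particular standard:

```python
# Illustrative only: a bare measurement versus a contextualised reading.
from dataclasses import dataclass
from datetime import datetime, timezone

raw_measurement = 47.3  # a bare number: no time, unit or origin, so of little use downstream

@dataclass
class SensorReading:
    value: float
    unit: str            # e.g. degrees Celsius
    timestamp: datetime  # the minimum context Kegel mentions
    sensor_id: str       # which device produced the value
    location: str        # where in the plant it was taken

reading = SensorReading(
    value=47.3,
    unit="degC",
    timestamp=datetime.now(timezone.utc),
    sensor_id="press-line-3/temp-07",   # hypothetical identifiers
    location="hall 2, press line 3",
)
```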
Kegel said that putting context around a primary device such as an IoT sensor requires a lot of standardisation work. ‘All these different IoT partners have to talk to each other,’ he said. ‘It requires more than just another field bus protocol; it really requires a semantic work ontology on top of this that is standardised.’
The VDMA, ZVEI and Bitkom, along with 20 partner companies, have formed the Industrial Digital Twin Association to help bring about such standards. The aim is to advance Industry 4.0 through the development of an open-source digital twin, which will function as an interface between physical industrial products and the digital aspects of Industry 4.0 applications.
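Very roughly, the idea is that a digital-twin entry pairs each value with a machine-readable semantic reference that any vendor's software can resolve. The property and identifier below are illustrative placeholders rather than the association's actual vocabulary:

```python
# Illustrative sketch only: a digital-twin property carrying a semantic
# reference alongside its value, so different vendors' software can
# interpret it identically. Identifiers are placeholders.
twin_property = {
    "idShort": "MaxRotationSpeed",
    "value": 5000,
    "unit": "1/min",
    "semanticId": "0173-1#02-BAA120#008",  # ECLASS-style dictionary reference, used here as an example
}

# Software that can resolve the semanticId knows this is a maximum
# rotation speed in revolutions per minute, regardless of which vendor
# built the machine or the sensor reporting it.
```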
‘In all honesty, automation people don't know anything about semantics and ontologies, they have to learn this from the beginning,’ Kegel said. He said there’s plenty of technology available to create standards in these areas, and that only with new standards will vendors that speak different languages be able to talk to each other on the level of data. ‘That for me is the biggest challenge the industry is facing: how can we actually get our arms around this international standard and do this very abstract new way of standardisation,’ he said.
Who is best placed to do this standardisation work? Kegel said that automation experts are stepping into new territory, but that there are already standardisation bodies in these areas doing good work. ‘There's still a tremendous amount of work to do, but there’s no shortcut; we need to do this standardisation,’ he said.
In the DFKI article, Professor Wahlster said ‘AI is spearheading a second wave of digital manufacturing’, a point Dumitrescu at It’s OWL also made during the EMVA discussion. Artificial intelligence is needed to make sense of the huge amounts of data now available in manufacturing.
Dumitrescu also said that digital twins are helping to structure data according to specific information sets. Scheiter agreed, saying that a digital twin is needed to solve complex tasks when setting up Industry 4.0 processes. He said: ‘We need [sensing] methods that ensure context plays an important role. That’s good news for imaging, good news for the industry, because that's how we can bring in our [sensing] domain knowledge.’
Scheiter also said that business innovations – companies working together in a cooperative way on a joint ecosystem – might be more important than advances in technology. He said there's still some way to go on this.
Gaia-X is one project bringing representatives together, in this case to build a European data infrastructure. The goal is to create a secure, federated system that meets high standards of digital sovereignty while promoting innovation. The project hopes to establish an open, transparent digital ecosystem, where data and services can be made available, collated and shared.
Kegel concluded that there are two transformations happening: digitisation and sustainability, which go hand-in-hand, as digital technologies are required for more sustainable manufacturing practices.
‘As a sensor provider I'm confident that we need not only Industry 4.0 but also Sensors 4.0, because the next generation of sensors – better integrated with better context information and everything integrated in a digital twin approach – is a game changer,’ he said.
‘I don't think we need more disruptive technology at the moment,’ he added. ‘We have so many things to do in the coming years to digitise [manufacturing] completely and to contribute with our efforts into decarbonisation. If you really manage these two major trends in the future, we've done a good job.’