
ChatGPT and other AIs – a new language for the industry


An interview with CSEM machine learning expert Philipp Schmid

ChatGPT and other AIs have recently caused a furor in virtually all professions where people write, program or design. How much dust are these supposedly general-purpose AIs stirring up in the industry?

The general-purpose AIs are kicking up an enormous amount of dust. The situation is comparable to the big AI hype of 2016, when AlphaGo beat the South Korean Lee Sedol, one of the world’s best professional players, at the board game Go. But unlike in 2016, this time I am convinced that AI will also have a strong, direct economic impact on industry.

Go is considered the most complex board game. Suddenly, AI was on everyone’s lips, and no one wanted to miss out under any circumstances. Thanks to ChatGPT, however, this time AI is not an impressive opponent but something everyone can experience first-hand in a simple way.

What benefit does ChatGPT bring to the industry?

GPT stands for “generative pretrained transformer”. Generative means that something new is created (in the case of ChatGPT, a matching response). This property of neural networks has been used in the image domain for a few years now, also for industrial applications, be it via AI-assisted image rendering functions or as deepfakes to generate training data for improved classification. Transformers are a special neural network architecture, first introduced back in 2017. Text is comparable to sensor data from a machine: both require an understanding of the current situation, and both have a temporal dependency, similar to the thread of a story. This makes these powerful transformer algorithms suitable for typical industrial problems such as predictive maintenance, predictive quality and process optimization.
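To make the analogy concrete, here is a minimal sketch of how a transformer encoder could be applied to windows of machine sensor data instead of text, treating each time step like a word in a sentence. All names, shapes and hyperparameters are illustrative assumptions, not a production design:

```python
# Minimal sketch: a transformer encoder over machine sensor windows,
# treating each time step like a "token" (hypothetical shapes and names;
# positional encoding omitted for brevity).
import torch
import torch.nn as nn

class SensorTransformer(nn.Module):
    def __init__(self, n_channels=8, d_model=64, n_heads=4, n_layers=2, n_classes=2):
        super().__init__()
        self.embed = nn.Linear(n_channels, d_model)   # project sensor channels to model dim
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, n_classes)     # e.g. "healthy" vs. "failing"

    def forward(self, x):                             # x: (batch, time, channels)
        h = self.encoder(self.embed(x))               # contextualize each time step
        return self.head(h.mean(dim=1))               # pool over time, classify the window

model = SensorTransformer()
window = torch.randn(16, 128, 8)                      # 16 windows, 128 steps, 8 sensors
logits = model(window)                                # (16, 2) class scores
```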

What has happened in data analytics in the last five years?

These technologies are about extracting insights from data in order to identify trends and solve problems. This involves all the processes, tools and techniques used for that purpose, including collecting and storing the data. To implement data analytics solutions successfully in industry, three critical success factors are needed: annotated data (data that has been manually labeled by experts), algorithms (suitable neural networks) and (affordable) computing power. In all three areas, we have made tremendous progress in the last five years. Thanks to modern machine controllers and powerful databases, it has never been easier to store vast amounts of data. AI-powered annotation tools help human experts with the tedious labeling process. New algorithms such as transformers have made it into programmers’ standard libraries. And thanks to gamers and crypto miners, GPUs (graphics cards) have become extremely powerful and, at the same time, attractively priced. The combination of these three developments means that a large number of industrial applications that were too expensive or not performant enough five years ago are now within reach.

In your opinion, how well are Swiss companies already digitally positioned?

My daily experience in industry shows a wide spectrum: from simple assembly machines without any sensors to highly integrated, fully automatic machines that store millions of data points every day. However, this data is rarely processed and used. In general, I would give Swiss industry a rather poor report card in the area of digitization.

What is the reason for that?

I attribute it largely to the restrained pragmatism of small and medium-sized enterprises (SMEs). While this approach shields companies from unnecessary expenses, it also hinders their ability to assume a pioneering role in technology. Historically, there has been an excessive focus on ROI (return on investment) alone, neglecting the broader array of benefits that line the path to digitalization. In numerous projects, the most significant value for companies did not arise from achieving the initial goal, but rather from capitalizing on incidental outcomes. At present, however, there is a noticeable surge in companies expressing keen interest, and I firmly believe that it is not too late!

In which areas do you currently see the greatest potential?

I currently see three exciting areas that can also be directly implemented in industry: quality and process control, predictive analytics (maintenance, digital twin, forecasting) and cognitive robotics (intelligent robots, human-robot interaction, awareness of the environment). The selection of appropriate technologies depends on the market segment and the degree of automation. From a technological standpoint, all three pillars are ready for industrial applications; however, the choice of implementation should be evaluated on a case-by-case basis. Initiating the digitization journey promptly is crucial; there is no need to delay any further.

What is holding companies back from realizing this potential?

I think it is a mixture of skepticism, ignorance and risk aversion. Often, at the beginning of an AI project, the resulting performance cannot be guaranteed, because the results depend on a great many parameters. Without practical experimentation on a specific case, even experts must rely on gut instinct. Many managers prefer a clear run-through of use cases to eliminate deployment risks, but for AI solutions this has hardly been possible to date.

Companies that have relied on the hype too early have also been disappointed in the past; they need to regain confidence and understand that the technological progress of recent years has been tremendous.

It is also enormously important that there is a certain level of expertise within the company. It is like a new “language” and proven concepts from software development have to be adapted. However, more and more young people are bringing this specific know-how with them, and this can also ensure long-term implementation. Companies should dare to take small steps and not be deterred by the big mountain ahead.

There seems to be a trend towards predictive maintenance. Has this trend already arrived in the manufacturing industry?

Excluding simple rotating systems (pumps, fans, motors), I regret to say that, to this day, I have not come across a solution in industry that truly deserves the name. The algorithms needed to offer “predictive” capabilities exist today; this is proven daily by text and voice applications such as DeepL or ChatGPT. But for machine applications in industry, the data basis is usually missing. We have long been looking for an industrial partner who would provide us with 1,000 machines, all of which we would be allowed to run until they break. That is probably what would be necessary to acquire the required training data.

I see predictive maintenance as a vision and umbrella term for modern maintenance solutions. Condition-based maintenance is technically feasible today. Combined with real-time evaluation and trend analysis, it brings us very close to predictive maintenance. The local manufacturing industry still has a lot of catching up to do in this field, as the many inquiries we receive show. However, establishing a sustainable business model appears to be anything but straightforward.
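As an illustration of that combination, here is a minimal sketch of condition-based monitoring with a simple trend analysis: a vibration indicator is tracked against an alarm threshold, and a linear trend is extrapolated to estimate the remaining time. The threshold, sampling rate and data are assumptions for illustration only:

```python
# Minimal sketch: condition-based monitoring with a simple trend analysis.
# Assumes a stream of vibration RMS values sampled once per hour (hypothetical).
import numpy as np

ALARM_RMS = 4.0                      # condition threshold (assumed, machine-specific)

def hours_until_alarm(rms_history: np.ndarray) -> float:
    """Fit a linear trend and extrapolate when the alarm threshold is reached."""
    t = np.arange(len(rms_history))
    slope, intercept = np.polyfit(t, rms_history, 1)
    if slope <= 0:
        return float("inf")          # no degradation trend detected
    crossing = (ALARM_RMS - intercept) / slope
    return max(crossing - t[-1], 0.0)

rms = np.array([1.1, 1.2, 1.3, 1.5, 1.8, 2.2, 2.7])   # toy degradation data
print(f"Estimated hours until alarm: {hours_until_alarm(rms):.1f}")
```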

At CSEM, one of your research topics is predictive quality. What does that mean in concrete terms?

Modern maintenance is all about the availability of the machine. It is seldom possible to prevent component failure; the added value lies primarily in the early detection of a malfunction. Depending on the industry, this alone can hardly result in a major cost advantage. However, long before a component – such as an axle or a motor – fails, the quality gradually declines. The quality of a final product depends on the interaction of the individual production steps and the condition of the components used. In addition, product use and the subsequent field of application must be taken into account.

The concept of “predictive quality” tries to predict the quality of a product during production using data-based methods and neural networks. For example, it answers the question: at what degree of vibration of an axis does the machine produce only rejects? It also takes into account whether the product is intended for a highly critical market such as aerospace or medical technology, or whether it is “merely” supplied to the consumer goods market.
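As a toy illustration of the idea, the sketch below maps two process signals to a reject probability. A gradient-boosted classifier stands in for the neural networks mentioned above, and the features, threshold and data are entirely made up:

```python
# Minimal sketch: predicting reject probability from process signals.
# Hypothetical features and synthetic data; not a real production model.
from sklearn.ensemble import GradientBoostingClassifier
import numpy as np

rng = np.random.default_rng(0)
# Toy training data: axis vibration [mm/s], spindle temperature [deg C]
X = rng.normal(loc=[2.0, 55.0], scale=[0.8, 5.0], size=(500, 2))
y = (X[:, 0] > 3.0).astype(int)     # toy ground truth: high vibration -> reject

clf = GradientBoostingClassifier().fit(X, y)
p_reject = clf.predict_proba([[3.4, 60.0]])[0, 1]
print(f"Predicted reject probability: {p_reject:.2f}")
```

In practice, the valuable part is exactly the ground truth labels: as noted below, routine quality checks can supply them continuously.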

What are the strengths of predictive quality?

The concept is very comprehensive: it takes into account all production steps and the condition of all components, and it continuously estimates the final quality. It opens up a large variety of direct business models and price advantages. A very big plus is that you do not need to destroy 1,000 machines to get the training data; common quality checks can generate the valuable ground truth data at any time. We have already successfully validated the concept in an industrial use case in concrete production. The implementation is challenging, but it is possible to start with smaller plants first and then expand step by step to the whole production. Many companies are only now discovering the potential of predictive quality, and I can literally feel the enthusiasm. I expect great growth in this area over the next few years.

In order to fully exploit the potential of machine learning, data is essential. Is there a general need for better access to data and data sets?

I currently sense a real war in the industry over data sovereignty. Everyone thinks data is the new oil and no one wants to share it. The big added value of data-driven approaches is the fusion of different data sources. The algorithms and computing power are ready to process even large data sets efficiently. Solutions need to be developed here that encourage data sharing and not the opposite. Anyone who shares their data openly and makes it accessible via a suitable interface should also share in any success. Currently, a lot of things are done twice or the party with the greater negotiating power wins.

In the whole discussion, a distinction must be made between sensitive personal data and machine/process data. Especially with health data and personal data, I advocate a very restrictive approach. There has to be a very good balance between the individual benefit and the potential risks. Neural networks in particular offer a whole arsenal of new possibilities for deanonymizing data and other perfidious attack scenarios. Advanced language models such as DarkBERT, trained exclusively on data from the darknet, are already in circulation. As is so often the case, these new technologies are a coin with two sides. Legislators should approach this matter with caution and foresight.

What would your solution for regulated access to data look like?

Especially for the Swiss and European economy, close cooperation in the data domain, even with competitors, is absolutely essential. However, there is a significant hurdle: it is a “heavily untrusted environment” in which mutual trust is lacking, despite the essential need for one another’s data. CSEM, together with various companies, has developed a comprehensive concept called DEMON. The central idea is an encrypted data lake combined with a highly automated cryptographic key exchange, which manages both the access to the data sources and the time period of access. All companies, machines, devices and components can securely store their data in this encrypted data lake and easily request access from other users when necessary.
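The following is a minimal sketch of that central idea only, assuming per-source encryption keys and time-limited access grants. All names are hypothetical, and this is not the actual DEMON implementation:

```python
# Minimal sketch of an encrypted data lake with time-limited key grants.
# Hypothetical simplification; not the actual DEMON implementation.
from cryptography.fernet import Fernet
import time

class EncryptedDataLake:
    def __init__(self):
        self._blobs = {}     # source_id -> encrypted payload
        self._keys = {}      # source_id -> data encryption key
        self._grants = {}    # (source_id, party) -> access expiry timestamp

    def store(self, source_id: str, payload: bytes) -> None:
        key = Fernet.generate_key()              # one key per data source
        self._keys[source_id] = key
        self._blobs[source_id] = Fernet(key).encrypt(payload)

    def grant(self, source_id: str, party: str, seconds: int) -> None:
        self._grants[(source_id, party)] = time.time() + seconds

    def read(self, source_id: str, party: str) -> bytes:
        if self._grants.get((source_id, party), 0) < time.time():
            raise PermissionError("no valid grant for this source and party")
        return Fernet(self._keys[source_id]).decrypt(self._blobs[source_id])

lake = EncryptedDataLake()
lake.store("press_7", b'{"vibration_rms": 2.4}')
lake.grant("press_7", "service_partner", seconds=3600)   # temporary access
print(lake.read("press_7", "service_partner"))
```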

What opportunities arise from this?

If a manufacturing company wants to implement the concept of “predictive quality,” it needs data from all systems in its facility. The supplier of a subsystem, aiming to enhance its products, requires data only from its own system, but from all facilities worldwide. In the event of a malfunction in a facility, permissions for the service specialist can be activated immediately, providing temporary full access. It is even possible to make data available for training a neural network without effectively sharing it; the AI service provider receives the trained network without the raw data used. Technically, a lot is possible today, but the question remains: who has the market power and the finances to establish such a solution? The benefits would grow exponentially with the number of participants. For now, there seem to be only isolated solutions, and the battle over data sovereignty continues behind the scenes.
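Training a model without sharing the raw data can work along the lines of federated learning: each site trains locally, and only model weights leave the premises. The sketch below shows the pattern with a toy linear model and synthetic data; everything in it is an illustrative assumption:

```python
# Minimal sketch of federated averaging: each factory trains locally and
# shares only model weights, never raw data (hypothetical setup, toy model).
import numpy as np

def local_update(weights: np.ndarray, X: np.ndarray, y: np.ndarray,
                 lr: float = 0.1) -> np.ndarray:
    """One gradient step of linear regression on a factory's private data."""
    grad = 2 * X.T @ (X @ weights - y) / len(y)
    return weights - lr * grad

rng = np.random.default_rng(1)
true_w = np.array([0.5, -1.2])
factories = []
for _ in range(3):                                # three sites with private data
    X = rng.normal(size=(100, 2))
    factories.append((X, X @ true_w + rng.normal(scale=0.1, size=100)))

weights = np.zeros(2)
for _ in range(200):                              # federated rounds
    updates = [local_update(weights, X, y) for X, y in factories]
    weights = np.mean(updates, axis=0)            # server averages weights only

print("Recovered weights:", weights.round(2))     # close to true_w
```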

What, in your opinion, will distinguish a successful industrial company in ten years?

In addition to the consistent automation of routine tasks, I am convinced that software will play a significantly more important role. An intuitive user interface, condition-based maintenance, encrypted data interfaces and comprehensive quality concepts will become significant unique selling points. Suddenly, the question will no longer be whether I make money with “predictive maintenance”, but rather: can I sell machines at all without it? Mechanical systems and components are becoming increasingly similar, copycats from low-wage countries are catching up with us faster and faster, and patents are expiring – differentiation in the future will be determined by comprehensive software solutions. This will have an impact not only at the production and machine level, but also on business analytics. The shortage of skilled workers will force many companies to capture the experience of long-serving employees in software and thus make it available as a support system for young, inexperienced employees. Those who consistently follow this path and cooperate with the right partners will see their companies flourish in ten years.

Philipp Schmid

is Head of Industry 4.0 & Machine Learning at CSEM, where he is responsible for the scientific and economic excellence of this domain. Before joining CSEM, he worked in several industrial companies, including Roche, and spent one year as a research fellow with the CSIRO Robotics group in Australia. He has a broad technological background in automation and holds an MBA from ETH Zurich. He is a member of various expert committees, and in 2023 he was once again recognized as a “Digital Shaper”.
