
Can data science lead industrial companies out of the crisis?

How is it possible to minimize costs, respond flexibly to fluctuations in demand, and avoid production downtime due to disruptions?

by Mag. Stefanie Kritzinger, PhD

Particularly in economically difficult times, digitization and the automation that comes with it play a crucial role. Manufacturing companies face an unprecedented challenge: processes must already be digitized and at least partially automated so that production can be monitored and managed remotely. Sales markets and workforce planning are subject to external, uncontrollable influences, and production in small batches is more attractive than ever before. Depending on the industry and the level of digitalization, some companies can handle this easily, while others cannot.

The enormous importance of digital and virtual networking is currently being demonstrated to us in the fight against the pandemic. Thanks to the digitization efforts of recent years, process and production data have increasingly come to be seen as an essential part of value creation. Companies that have done their homework have for some time been collecting extensive data on their own processes automatically: on the one hand, to analyze real-time information and react to short-term changes in production; on the other hand, to derive future events from the collected data pools and forecast them as accurately as possible.

Table of contents

  • Processes: Increase quality and minimize costs
  • Detect bottlenecks early: Prescriptive Analytics
  • Avoid downtime: Fault management
  • Know-How
  • Author

Processes: Increase quality and minimize costs

In order to increase quality and minimize costs, an important success factor is extracting valuable information from the collected production data. This is not a trivial task, however. Intensive data engineering makes process and production data from quality-relevant process steps usable, and the necessary parameters are identified so that automated data analyses can be performed using data and visual analytics or modern artificial-intelligence methods. This makes it possible to detect anomalies, evaluate them correctly, and predict their effects on final product quality. Improved quality management is complemented by the traceability of production parameters and quality characteristics across the entire process. A better understanding of the production process thus becomes possible by recognizing cause-and-effect relationships based on anomalies and patterns. This also allows maintenance intervals and cycles to be optimized and, subsequently, production processes to be improved.

For example, scrap can be minimized by ensuring that machines produce at the correct operating temperature, unplanned downtimes can be reduced, and maintenance intervals can be optimized.
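As an illustration, anomaly detection on an operating-temperature series can be sketched with a simple rolling z-score. The series, window size, and threshold below are hypothetical; a production system would typically use more robust statistical or machine-learning methods:

```python
from statistics import mean, stdev

def detect_anomalies(readings, window=20, threshold=3.0):
    """Flag indices where a reading deviates more than `threshold`
    standard deviations from the mean of the preceding window."""
    anomalies = []
    for i in range(window, len(readings)):
        ref = readings[i - window:i]
        mu, sigma = mean(ref), stdev(ref)
        if sigma > 0 and abs(readings[i] - mu) > threshold * sigma:
            anomalies.append(i)
    return anomalies

# Hypothetical operating-temperature series: stable around 180 °C,
# with one runaway reading injected at index 30.
temps = [180.0 + 0.1 * (i % 5) for i in range(60)]
temps[30] = 195.0
print(detect_anomalies(temps))  # the injected outlier at index 30 is flagged
```

Flagged readings would then be evaluated against quality data to decide whether intervention is needed.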


Detect bottlenecks early: Prescriptive Analytics

Especially in times of crisis, short-term fluctuations in demand are part of daily business. Most production systems are also highly complex due to their individual structure and organization. Scarce resources, special customer requests, the resulting product variety, and deadline pressure overload existing capacities and lead to cost-intensive bottlenecks, for example through additional staff or delayed deliveries. Good preparation for the early detection of bottlenecks rests on intelligent, forecast-based planning.

Production figures can be predicted on the basis of historical production figures and other influencing parameters, with the help of modern methods from statistics and artificial intelligence. Based on these forecasts, adequate measures can be derived and the expected development influenced positively; this is subsumed under the term prescriptive analytics. Predictive-analytics functions create greater transparency in upcoming production. Targeted calculations and visualizations show where bottlenecks may arise and where delays are expected under the current plan. This provides real insight and makes it possible to intervene before a problem ever reaches customers.
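A minimal sketch of such forecast-based bottleneck detection, using single exponential smoothing as one of many possible forecasting methods; the order volumes and capacity figure below are invented for illustration:

```python
def forecast_next(history, alpha=0.5):
    """Single exponential smoothing: recent observations weigh more."""
    level = history[0]
    for x in history[1:]:
        level = alpha * x + (1 - alpha) * level
    return level

# Hypothetical weekly order volumes per product line (units).
orders = {
    "line_a": [100, 110, 130, 160],  # rising demand
    "line_b": [80, 75, 70, 60],      # falling demand
}
capacity = 200  # assumed total weekly capacity in units

total = sum(forecast_next(h) for h in orders.values())
if total > capacity:
    print(f"bottleneck expected: forecast {total:.0f} > capacity {capacity}")
```

The prescriptive step would then suggest measures, such as shifting work between lines or scheduling extra shifts, before the forecast shortfall materializes.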


Avoid downtime: Fault management

A lack of material, a lack of personnel, and changing requirements as a result of a crisis are well-known causes of production stoppages. Disruptions often result in lost sales and large financial losses, and they negatively impact the bottom line. Yet little attention is usually paid to managing a disruption from its discovery to full recovery. Especially in the case of unforeseen events, efficient disruption management can absorb disruptions responsively. When resources are limited by supply bottlenecks or short-time work, the production process must be rescheduled so that delivery reliability is maintained as far as possible while the recommended measures are still observed.
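The rescheduling step can be illustrated with a simple earliest-due-date heuristic. Real disruption management uses far richer models; the order book and capacity values below are assumptions:

```python
def reschedule(orders, daily_capacity):
    """Greedy earliest-due-date schedule: process orders in due-date
    order and report which ones finish late at the given capacity."""
    day, late = 0.0, []
    for name, hours, due_day in sorted(orders, key=lambda o: o[2]):
        day += hours / daily_capacity  # days consumed by this order
        if day > due_day:
            late.append(name)
    return late

# Hypothetical order book: (name, processing hours, due day).
orders = [("A", 16, 2), ("B", 24, 4), ("C", 8, 5)]
print(reschedule(orders, daily_capacity=16))  # normal operation
print(reschedule(orders, daily_capacity=8))   # short-time work halves capacity
```

Such a what-if calculation shows immediately which deliveries are at risk under reduced capacity, so customers can be informed or work reprioritized early.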

In the current turbulent and dynamic environment, it is more necessary than ever to increase the degree of digitization in order to meet the targets of increasing quality, minimizing costs, identifying bottlenecks, and managing faults efficiently. The two key levers for this are process transparency and responsiveness.



Know-How

The data engineers and data scientists at RISC Software GmbH possess extensive expertise and many years of experience across a wide range of areas of data management and data analytics. By using modern methods from data analytics, visual analytics, and machine learning for smart data analysis and forecasting, the challenge of Big Data can be turned into an important opportunity for process and revenue optimization.



Author

    Mag. Stefanie Kritzinger-Griebler, PhD

    Head of Unit Logistics Informatics