Automating production processes with the aim of improving the daily lives of the people involved: this is the decisive shift in mindset needed to achieve effective and sustainable industrial automation.
In a new series of articles, we will explore why the most innovative companies choose to start from a human perspective, beginning with a topic that is often underestimated: the allocation of functions and activities between humans and machines.
One of the most frequent simplifications in the world of industrial automation concerns the distribution of production activities between humans and machines. The spectrum of industrial automation ranges from the entirely manual scenario (zero automation) to the fully automated scenario (maximum automation), with various intermediate stages that envisage human-machine interaction at different levels of complexity.
To date, the dominant view of an automated industrial system is one in which the machine takes over activities previously carried out by humans. Among these, the activities most prone to automation are those that are high-risk, physically wearing, repetitive, and stressful. In this scenario, the human being is asked to do less work, or to pay less attention to each individual activity, thanks to the support of the automated system.
This entails rethinking the role of the industrial operator, who, relieved of the hardest manual activities, can now carry out work centered on creative and decision-making skills. The scientific literature abounds with research showing that available technology cannot permanently replace human beings in the performance of these critical activities.
The topic of expanding the skills of worker 4.0, given its relevance and complexity, is beyond the scope of this article and will be addressed separately. Here we will analyze why this ideal scenario is not achievable, due to a mix of psychological factors that we too often see overlooked in the world of automation engineering.
The irony of automation
Introduced by cognitive psychologist Lisanne Bainbridge in a famous paper published in 1983, the concept of the "irony of automation" identifies a paradox of automated systems, especially industrial ones. The irony lies in the fact that, in an automated system characterized by greater reliability, the human being bears fewer responsibilities and pays less attention to system errors, precisely when detecting those rare errors becomes the operator's main task.
In other words, while in a completely manual system an error can be identified as a deviation from the normal course of the production activity, in a partially automated system an error, however rare, risks passing unnoticed as normal, because the operator both pays less attention and encounters errors less frequently.
The irony of automation is of fundamental importance in the design of the industrial models of the future, as it highlights a cardinal point: human beings, when their work is automated, not only do less work but profoundly change the way in which they carry out the activities for which they remain responsible.
This effect also occurs outside the industrial context, as in the case of semi-autonomous vehicles. The reduction in the activities required of the driver should be compensated by greater attention to road and traffic conditions. On the contrary, research shows that human beings are inclined to distraction and to engage in activities that dangerously lower their attention threshold, with substantial risks for road safety.
In the industrial context, similar effects can occur at various levels of responsibility, with potentially harmful consequences both for the safety of the production process and for the quality of the output. To remedy these risks, it is necessary to design automation systems around an optimal allocation of activities between humans and machines.
The allocation of functions
The optimal allocation of functions and resources within a production process must take into account the deeply asymmetrical relationship between automation and human behavior. If the worker of the second industrial revolution was the victim of alienation, due to the precarious working conditions of a production system in profound transformation, the worker of the fourth may fall victim to a suboptimal transition in the redistribution of roles.
Although difficult to standardize into a framework that applies generically to all contexts, it is possible to identify four best practices for the design of industrial automation, capable of reducing the impact of these psychological effects:
mapping the elements that impact the operator’s working life
First of all, the starting point consists in mapping the elements that impact the operator's working life from three points of view: procedural (activities carried out), relational (human-human and human-machine), and functional (role in the ecosystem). Automation interventions can then be directed at maintaining a balance between the three components. After the system has been implemented, the evolution of this structure must be monitored so that any corrections can be made.
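As an illustration, the three-dimensional mapping described above could be sketched as a simple data structure. The `OperatorProfile` class and its fields are hypothetical, not part of any described system; the idea is only that the same mapping can be re-run after automation to check whether the balance between dimensions has shifted.

```python
from dataclasses import dataclass, field

@dataclass
class OperatorProfile:
    """Hypothetical map of the elements shaping an operator's working life."""
    procedural: list[str] = field(default_factory=list)  # activities carried out
    relational: list[str] = field(default_factory=list)  # human-human and human-machine exchanges
    functional: list[str] = field(default_factory=list)  # roles in the production ecosystem

    def balance(self) -> dict[str, float]:
        """Share of each dimension over all mapped elements."""
        total = len(self.procedural) + len(self.relational) + len(self.functional)
        if total == 0:
            return {"procedural": 0.0, "relational": 0.0, "functional": 0.0}
        return {
            "procedural": len(self.procedural) / total,
            "relational": len(self.relational) / total,
            "functional": len(self.functional) / total,
        }

# Map the operator's work before automation; repeat after implementation
before = OperatorProfile(
    procedural=["load press", "inspect parts", "log defects"],
    relational=["shift handover", "maintenance requests"],
    functional=["line quality owner"],
)
print(before.balance())
```

Comparing the `balance()` output before and after an automation intervention makes visible, for instance, a procedural dimension that has shrunk without any compensating growth in the relational one.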
measuring the operator’s work after automation
A second measure consists in quantifying the operator's work after automation, evaluating its diversity, its weight, and its connection with activities carried out by other human beings. An adequate balance of these factors leads to an ideal scenario in which the operator does not suffer the psychological effects described above. At TeamDev, we measure these factors through a series of variables that can be collected via the very automated systems already present in the production plant.
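The three factors named above can be sketched as simple indicators over a task log. The field names (`kind`, `effort`, `with_humans`) and the formulas are illustrative assumptions, not the actual variables used by TeamDev:

```python
from statistics import mean

def workload_indicators(tasks: list[dict]) -> dict[str, float]:
    """Compute three hypothetical indicators of the operator's post-automation work.

    Each task is a dict with: 'kind' (task category), 'effort' (0-1 burden),
    'with_humans' (True if the task involves other people).
    """
    diversity = len({t["kind"] for t in tasks}) / len(tasks)        # variety of the work
    weight = mean(t["effort"] for t in tasks)                       # average burden
    connection = sum(t["with_humans"] for t in tasks) / len(tasks)  # share of tasks tied to others
    return {"diversity": diversity, "weight": weight, "connection": connection}

# Example task log for one shift after an automation intervention
after_automation = [
    {"kind": "supervision", "effort": 0.2, "with_humans": False},
    {"kind": "supervision", "effort": 0.2, "with_humans": False},
    {"kind": "quality_review", "effort": 0.4, "with_humans": True},
    {"kind": "planning", "effort": 0.5, "with_humans": True},
]
print(workload_indicators(after_automation))
```

A log dominated by identical low-effort supervision tasks would show up here as low diversity and low weight, exactly the profile in which the attention effects described earlier are most likely to appear.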
increasing the relational component of the work
The third area of intervention consists in increasing the relational component of the work, with richer exchanges of information between operators and simpler human-machine communication.
creating engagement mechanisms
Finally, creating engagement mechanisms favors operator concentration, thereby reducing risks. Although engagement cuts across all the components described above, specific measures can be put in place to intervene at the most critical moments, when attention risks falling below a threshold level.
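The threshold-based intervention described above can be sketched as a trivial monitoring loop. Both the attention proxy and the threshold value are hypothetical; in practice the proxy would come from plant telemetry (response times, interaction rates), not from a literal 0-1 score:

```python
def engagement_triggers(attention_samples: list[float], threshold: float = 0.4) -> list[int]:
    """Return the indices of samples where a hypothetical attention proxy
    drops below the threshold, i.e. the moments that call for an engagement action."""
    return [i for i, a in enumerate(attention_samples) if a < threshold]

# Attention proxy sampled over a shift; values below 0.4 trigger an intervention
samples = [0.9, 0.7, 0.35, 0.8, 0.2]
for i in engagement_triggers(samples):
    print(f"t={i}: attention {samples[i]:.2f} below threshold, prompt operator check-in")
```

The point of the sketch is the design choice: the engagement mechanism is targeted at the critical moments rather than running continuously, consistent with engagement being transversal to the other three practices.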
In summary, when designing automation systems, especially industrial ones, it is essential to keep in mind that human operators may behave in unexpected ways that could jeopardize the benefits brought by the automation itself. To maintain a high level of engagement and attention, it is therefore necessary to intelligently allocate functions and activities between humans and automated systems.
Bainbridge, L. (1983). Ironies of automation. In Analysis, design and evaluation of man–machine systems (pp. 129-135).
Baxter, G., Rooksby, J., Wang, Y., & Khajeh-Hosseini, A. (2012). The ironies of automation: still going strong at 30?. In Proceedings of the 30th European Conference on Cognitive Ergonomics (pp. 65-71).