Projects


motionEAP

A system to boost efficiency and provide assistance in production processes

Brief description

The aim of the motionEAP project is to use motion detection and prediction as a basis for developing a system that increases efficiency and provides assistance in production processes at companies. Using cameras and distance sensors, the system detects the worker's activities and informs him of problems and of potential for improvement. In addition to the technical development, the project will focus on the psychological and ethical issues that arise from these new forms of interaction.

The challenge

Continuous quality control and IT-supported assistance for individual work steps have long been familiar to industrial suppliers. However, existing assistance systems tend to intervene too late, for example only when the finished weld seam is inspected and defects are discovered. It would be better to inform the worker, while he is performing the task, that he is about to make a mistake. This would avoid expensive rework and rejects and improve worker motivation, because the work delivered is reliably good.

Aim

Researchers in the motionEAP project are working on just such a new assistance system. The assistance to be developed is context-aware and process-integrated: the worker's work steps are analysed with the help of sensors and video so that problems or errors can be identified immediately, whether an incorrect assembly step or a non-ergonomic posture or hand position. The system then displays a corresponding message in the worker's field of vision. motionEAP is conceived not only for the fast introduction of new production workflows; it also aims to support older or impaired workers in line with their performance capabilities. The integration of game elements will also be tested in order to boost motivation and work satisfaction at the same time.

Technologies

Implementing this goal poses a number of challenges. First of all, established methods of optical motion and object detection using video cameras and 3D sensors must be adapted to the needs of production environments. In addition to body movements, facial expressions are to be analysed so that the worker's emotions can be recognised; this will in particular allow error-prone stress situations to be recognised and avoided. At the same time, the cameras and sensors must not hinder either the work or the material flow. A further challenge concerns the display: for safety reasons, no laser projectors are to be used in the working area; instead, lower-intensity LED devices are to be used, which will have to be optimised accordingly.
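The project description leaves the implementation open. Purely as an illustration, the following minimal Python sketch shows one conceivable way to flag hand activity inside a defined work zone using depth frames from a 3D sensor; the zone, the thresholds and the frame format are all assumptions made for this example and are not details of motionEAP.

import numpy as np

# Illustrative sketch only -- not the motionEAP implementation.
# Frame format (depth values in millimetres), the work-zone rectangle and the
# thresholds below are assumptions made for this example.

WORK_ZONE = (slice(200, 320), slice(260, 420))   # rows/columns of the assembly area
DEPTH_CHANGE_MM = 40        # minimum depth change counted as movement
MIN_CHANGED_PIXELS = 500    # how many pixels must change before activity is reported

def hand_activity(previous_frame, current_frame):
    """Return True if enough depth change occurred inside the work zone."""
    diff = np.abs(current_frame[WORK_ZONE].astype(np.int32)
                  - previous_frame[WORK_ZONE].astype(np.int32))
    return int((diff > DEPTH_CHANGE_MM).sum()) >= MIN_CHANGED_PIXELS

# Synthetic example instead of real sensor data:
prev = np.full((480, 640), 1200, dtype=np.uint16)   # empty workstation
curr = prev.copy()
curr[240:300, 300:360] = 900                        # a hand reaching into the zone
print(hand_activity(prev, curr))                    # True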

Psychological aspects related to work and motivation are just as important as these technical matters. One central requirement of the project is that the assistance system should neither overburden nor underburden individual workers. During the assembly and commissioning activities supported in this way, the workers' technical and cognitive strengths and weaknesses are to be detected by analysing motion and facial expression, so that the messages from the system can be geared to the current situation. Unnecessary errors are to be avoided, but workers should not be patronised with too many messages. This work is based on established models of work psychology. The project also aims to examine for the first time how gaming elements and methods can be integrated into industrial assistance systems (gamification) and how they influence work satisfaction and motivation.
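Again purely for illustration, the following toy function shows one way the amount of feedback could be geared to the current situation, so that workers are neither overburdened nor patronised with too many messages. The error-rate thresholds and the three feedback levels are invented for this sketch and are not taken from the project.

def assistance_level(recent_errors, recent_steps):
    """Map the recent error rate to a coarse level of system feedback (toy example)."""
    if recent_steps == 0:
        return "normal"
    error_rate = recent_errors / recent_steps
    if error_rate > 0.20:
        return "detailed"   # step-by-step hints projected into the field of vision
    if error_rate < 0.05:
        return "minimal"    # intervene only on actual errors
    return "normal"

print(assistance_level(recent_errors=1, recent_steps=30))   # "minimal"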

Independent ethical support will ensure that aspects such as data protection, personal autonomy and privacy at the workplace are considered during the project.

Use case

The motionEAP approach will be tried and tested in several application scenarios in the automotive industry and in social enterprises. The first scenario is a training and education system for assembly workplaces which uses cameras and infrared 3D sensors to capture the user's individual performance data and to gear the system to this data. In the subsequent scenarios, a projection informs the user of any errors directly at the workplace (in situ). In the second scenario, the system is integrated into a production workplace at which frequent product changes are permitted; here, complex hand and arm movement sequences are to be detected, and different styles of movement, for instance fast hand movements, are also to be accommodated. In the third scenario, components will be put together in pre-assembly. Unlike today's pick-by-light systems, the system not only draws the employee's attention to the correct pick-up spot with light signals but also reports incorrect pick-ups. In the final scenario, several workers in an assembly cell, i.e. several assembly tables together, will be supported.
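To make the third scenario more concrete, the sketch below shows, again only as an assumed illustration in Python, how a detected hand position could be checked against the expected bin so that an incorrect pick-up can be reported. The bin layout, the coordinates and the tolerance are invented for this example.

from dataclasses import dataclass

# Illustrative sketch only -- bin positions, coordinates (cm) and tolerance are assumptions.

@dataclass
class Bin:
    name: str
    x: float    # centre of the bin opening in workstation coordinates
    y: float
    radius: float = 6.0   # tolerance around the centre

BINS = [Bin("screws_m4", 10.0, 5.0), Bin("washers", 25.0, 5.0), Bin("bolts", 40.0, 5.0)]

def check_pick(hand_x, hand_y, expected):
    """Return 'ok', 'wrong bin: <name>' or 'no bin detected' for one pick-up."""
    for b in BINS:
        if (hand_x - b.x) ** 2 + (hand_y - b.y) ** 2 <= b.radius ** 2:
            return "ok" if b.name == expected else "wrong bin: " + b.name
    return "no bin detected"

print(check_pick(24.0, 6.0, expected="screws_m4"))   # "wrong bin: washers"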

Consortium members: Audi AG (consortium leader), BESSEY Tool GmbH & Co. KG, GWW GmbH, Hochschule Esslingen, KORION GmbH, Schnaithmann Maschinenbau GmbH, Universität Stuttgart

More information

Contact person

Deutsches Zentrum für Luft- und Raumfahrt e.V. (DLR)

Project Management Agency "Deutsches Zentrum für Luft- und Raumfahrt e.V. (DLR)" for the Federal Ministry for Economic Affairs and Energy (Technical Innovations in Business)

Dipl.-Phys. Gerd Hembach

Contact person

Audi AG

Klaus Klein

Homepage: MotionEAP