Relevant for Research Area

C - Applications


As robots enter human environments, there is an increasing need for robots to be human-compliant. Specifically, humans in the vicinity of the robot should influence both what the robot does and how its actions are performed. Hence, in this project we address human-compliant brain state-informed robot adaptation (COBRA). We focus on decoding human feedback and judgment in the context of (autonomous) driving and robotic service assistants.

Over the course of the project, we have developed a novel paradigm for decoding human scene judgments. We instantiated it by classifying potentially hazardous events in driving scenes from observers' brain signals. Additionally, we have devised an approach to interface-free object selection in a robotic environment based on highlighting gestures executed by the robot, which enables successful online decoding from in-the-scene stimuli.
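The hazard-classification setting described above can be illustrated with a minimal sketch. The following is not the project's actual pipeline; it assumes a simplified single-channel setup in which an event-related deflection in the brain signal distinguishes hazardous from non-hazardous scene events, and classifies epochs by a nearest-class-mean rule on a windowed mean-amplitude feature. All parameters (sampling rate, window, deflection size) are illustrative assumptions.

```python
import random

random.seed(0)

FS = 100          # sampling rate in Hz (assumption)
EPOCH_LEN = 100   # samples per post-event epoch (1 s at FS)
WINDOW = (30, 60) # feature window, roughly 300-600 ms post-event (assumption)

def simulate_epoch(hazard: bool) -> list:
    """Synthetic single-channel epoch: Gaussian noise, plus a
    deflection in the feature window for hazardous events."""
    epoch = [random.gauss(0.0, 1.0) for _ in range(EPOCH_LEN)]
    if hazard:
        for i in range(*WINDOW):
            epoch[i] += 2.0  # simulated event-related response
    return epoch

def feature(epoch: list) -> float:
    """Mean amplitude inside the post-event window."""
    lo, hi = WINDOW
    return sum(epoch[lo:hi]) / (hi - lo)

# "Training": estimate the class-mean feature for each label.
train = [(feature(simulate_epoch(h)), h) for h in [True, False] * 50]
mu_hazard = sum(f for f, h in train if h) / 50
mu_safe = sum(f for f, h in train if not h) / 50

def classify(epoch: list) -> bool:
    """Nearest-class-mean decision on the scalar feature."""
    f = feature(epoch)
    return abs(f - mu_hazard) < abs(f - mu_safe)

# Evaluate on fresh simulated epochs.
test = [(simulate_epoch(h), h) for h in [True, False] * 25]
acc = sum(classify(e) == h for e, h in test) / len(test)
print(f"held-out accuracy: {acc:.2f}")
```

In a real brain-signal setting, the scalar feature would be replaced by multi-channel spatiotemporal features and the nearest-mean rule by a properly regularized classifier, but the epoch-feature-decision structure is the same.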