Published: 9 November 2021
Proceedings of the 9th International Conference on Human-Agent Interaction; https://doi.org/10.1145/3472307.3484672
In this work, we propose a computational robot trust model based on the predictability of the human partner as the main factor that impacts robot trust. Using this model, and based on its knowledge about the task, the robot switches between conservative and normal behavior and adjusts the implemented safety mechanism as a function of its current trust level. We illustrate the impact of the proposed model on the outcome of the collaboration using a simple scenario. The results show a significant improvement in the safety conditions of the human collaborating with the robot, at the cost of a justifiable reduction in team performance.
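The mechanism described in the abstract can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the trust update rule, the threshold, and all names (`update_trust`, `select_mode`, the rate of 0.3) are assumptions chosen for the example.

```python
# Illustrative sketch: a robot that updates its trust in the human from
# prediction errors and switches between a "normal" and a "conservative"
# safety mode at a fixed threshold. All parameters are hypothetical.

def update_trust(trust, prediction_error, rate=0.3):
    """Move trust toward 1 when the human acts predictably (low error)
    and toward 0 otherwise, via an exponential moving average."""
    predictability = max(0.0, 1.0 - prediction_error)
    return (1 - rate) * trust + rate * predictability

def select_mode(trust, threshold=0.5):
    """Pick the safety mechanism as a function of the current trust level."""
    return "normal" if trust >= threshold else "conservative"

trust = 0.8
for error in [0.1, 0.2, 0.9, 0.95, 1.0]:  # human becomes unpredictable
    trust = update_trust(trust, error)
    print(f"trust={trust:.2f} mode={select_mode(trust)}")
```

As the prediction errors grow, trust decays and the robot crosses the threshold into the conservative mode, mirroring the behavior switch described above.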
Frontiers in Artificial Intelligence, Volume 4; https://doi.org/10.3389/frai.2021.703504
Trust is the foundation of successful human collaboration. This has also been found to be true for human-robot collaboration, where trust also influences over- and under-reliance. Correspondingly, the study of trust in robots is usually concerned with detecting the current trust level of the human collaborator and keeping it within certain limits to avoid undesired consequences, a process known as trust calibration. However, while there is intensive research on human-robot trust, little is known about the factors that affect it in synchronous, co-located teamwork, and hardly anything is known about how these factors shape the dynamics of trust during the collaboration. These factors, along with the characteristics of trust evolution, are prerequisites for a computational model that allows robots to adapt their behavior dynamically to the current human trust level, which in turn is needed to enable dynamic and spontaneous cooperation. To address this, we conducted a two-phase lab experiment in a mixed-reality environment, in which thirty-two participants collaborated with a virtual CoBot on disassembling traction batteries in a recycling context. In the first phase, we explored the (dynamics of) relevant trust factors during physical human-robot collaboration. In the second phase, we investigated the impact of the robot's reliability and feedback on human trust in robots. The results show stronger trust dynamics while trust dissipates than while it accumulates, and highlight different relevant factors as more interactions occur. Moreover, the factors that are relevant as trust accumulates differ from those that appear as it dissipates: we detected four factors while trust accumulates (perceived reliability, perceived dependability, perceived predictability, and faith) that do not appear while it dissipates.
This points to the interesting conclusion that, depending on the stage of the collaboration and the direction of trust evolution, different factors might shape trust. Further, the accuracy of the robot's feedback has a conditional effect on trust that depends on the robot's reliability level: it preserves human trust when a failure is expected but does not affect it when the robot works reliably. This gives designers a hint as to when assurances are necessary and when they are redundant.
tm - Technisches Messen, Volume 88, pp 473-480; https://doi.org/10.1515/teme-2021-0030
Recent research indicates a direct correlation between brain activity and oscillations of the pupil. A publication by Park and Whang reports measurements of excitations in the frequency range below 1 Hz; a similar correlation for frequencies between 1 Hz and 40 Hz has not yet been established. To evaluate such small oscillations, a pupillometer with a spatial resolution of 1 µm is required, exceeding the specifications of existing systems. In this paper, we present a setup able to measure at this resolution. We consider noise sources and identify the quantisation noise due to the finite pixel size as the fundamental one. We present a model of the quantisation noise and show that our algorithm for measuring the pupil diameter achieves a sub-pixel resolution of about half a pixel of the image, or 12 µm. We further consider the processing gain obtained by transforming the diameter time series into frequency space, and show that we can thereby achieve sub-micron resolution when measuring pupil oscillations, surpassing established pupillometry systems. This setup could allow for the development of a functional, fully remote optical electroencephalograph (EEG). Such a device could be a valuable sensor in many areas of AI-based human-machine interaction.
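The sub-pixel claim rests on a standard averaging argument: each boundary pixel of the pupil contributes a measurement quantised to the pixel grid, and averaging N roughly independent samples shrinks the effective quantisation error by about a factor of sqrt(N). A minimal NumPy sketch of that effect, with entirely assumed numbers (this is not the paper's algorithm or setup):

```python
# Illustrative sketch: averaging many quantised boundary samples of a
# circle recovers its radius to well below the one-pixel quantisation
# limit. The radius, noise level, and sample count are assumptions.
import numpy as np

rng = np.random.default_rng(0)
true_radius = 100.37            # pixels; a non-integer "pupil" radius
n_samples = 2000                # boundary points sampled around the circle

# Each boundary point is detected only to the nearest whole pixel;
# small sensor noise dithers the signal across the quantisation step.
quantised_r = np.round(true_radius + rng.normal(0.0, 0.5, n_samples))

# Averaging N samples reduces the error roughly by a factor of sqrt(N).
estimate = quantised_r.mean()
print(abs(estimate - true_radius))  # a small fraction of a pixel
```

The same logic extends to the frequency-domain processing gain mentioned in the abstract: a narrowband oscillation concentrates its energy in a few FFT bins while the quantisation noise spreads over all of them.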