
Conference Paper/Proceeding/Abstract

Designing Bias Suppressing Robots for 'Fair' Robot Moderated Human-Human Interactions

Peter Daish, Takayuki Kanda, Matt Roach, Muneeb Ahmad

Proceedings of the 12th International Conference on Human-Agent Interaction, Pages: 347–349

Swansea University Authors: Peter Daish, Matt Roach, Muneeb Ahmad

Full text not available from this repository.

DOI (Published version): 10.1145/3687272.3690877


Published in: Proceedings of the 12th International Conference on Human-Agent Interaction
ISBN: 979-8-4007-1178-7
Published: New York, NY, USA: ACM, 2024
URI: https://cronfa.swan.ac.uk/Record/cronfa68340
Abstract: Research has shown that data-driven robots deployed in social settings are likely to unconsciously perpetuate systemic social biases. Despite this, robots can also be deployed to promote fair behaviour in humans. These phenomena have led to the development of two broad sub-disciplines in HRI concerning ‘fairness’: a data-centric approach to ensuring robots operate fairly, and a human-centric approach which aims to use robots as interventions to promote fairness in society. To date, these two fields have developed independently; thus, it is unknown how data-driven robots can be used to suppress biases in human-human interactions. In this paper, we present a conceptual framework and a hypothetical example of how robots might deploy data-driven fairness interventions to actively suppress social biases in human-human interactions.
Item Description: Poster
College: Faculty of Science and Engineering
Start Page: 347
End Page: 349