Intelligent Human-Robot Interaction with Explainable AI

There is currently a communication barrier between robots and their operators: operators often cannot see the robot's perceived view of the world or the reasoning behind its actions and plan failures.

This is particularly problematic in remote, highly challenging and hazardous environments involving multiple vehicles and/or platforms. This lack of transparency can reduce trust and situation awareness, hindering true human-machine teaming and resulting in unnecessary aborts and/or laborious manual manipulation of the assets.

Led by Professor Helen Hastie from Heriot-Watt University, the Intelligent Human-Robot Interaction team are developing interaction techniques, following a user-centred design approach, to explain four important aspects of Robotics and Artificial Intelligence (RAI) transparency:

  1. Communication in natural language with a 'robotic assistant', who can control remote robots and provide situation awareness, including explanations of behaviour.
  2. Visualisation of plans to aid operator decision making and replanning.
  3. Automatically induced ‘activity models’, including descriptions of causal behaviour.
  4. Monitoring and adapting to user cognitive state.
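To make the first of these aspects concrete, a robotic assistant that explains its own behaviour might map structured mission telemetry to a natural-language sentence. The sketch below is purely illustrative and not drawn from ORCA's codebase; all field names (`aborted_step`, `cause`, `battery_pct`) are hypothetical.

```python
# Minimal illustrative sketch of natural-language behaviour explanation.
# The telemetry schema below is an assumption made for this example.

def explain_failure(telemetry: dict) -> str:
    """Turn structured mission telemetry into a one-sentence explanation."""
    step = telemetry["aborted_step"]
    cause = telemetry["cause"]
    if cause == "low_battery":
        return (f"I aborted '{step}' because battery fell to "
                f"{telemetry['battery_pct']}%, below the safe return threshold.")
    if cause == "lost_comms":
        return f"I paused '{step}' after losing the communications link; retrying."
    return f"I stopped '{step}' for an unrecognised reason: {cause}."

print(explain_failure(
    {"aborted_step": "inspect valve 3", "cause": "low_battery", "battery_pct": 18}
))
# → I aborted 'inspect valve 3' because battery fell to 18%, below the safe return threshold.
```

A real system would generate such explanations from the planner's own state rather than a hand-written rule per failure cause, but the input/output contract is the same: structured system state in, operator-readable sentence out.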
[Image: MIRIAM chatbot in action, alongside a demonstration of remote operation of robots and autonomous systems through natural language]

This work theme draws on ORCA's combined academic expertise in:

  • Human-Robot Interaction
  • Intelligent, multimodal interfaces for autonomous systems across multiple domains
  • Automated planning
  • Explainable Artificial Intelligence



Introducing MIRIAM:

Although ORCA is working to reduce the need for personnel in hazardous environments, it is essential that onshore operators maintain situation awareness to monitor the mission and handle unforeseen circumstances.

MIRIAM (Multimodal Intelligent inteRaction for Autonomous systeMs) combines visual indicators of status with a conversational agent component and offers a fluid and natural way for operators to gain information on vehicle status, explanations of system behaviour and mission progress.
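As a rough sketch of what such a conversational component does (this is not MIRIAM's actual implementation), an operator's question can be matched to an intent and answered from the vehicle's current status. The intent keywords and the status dictionary below are assumptions made for the example.

```python
# Illustrative sketch: routing an operator's natural-language query about a
# remote vehicle to a status response. Not based on MIRIAM's real code.
import re

# Hypothetical live status for one autonomous underwater vehicle.
STATUS = {"auv1": {"depth_m": 42.5, "battery_pct": 76, "mission": "pipeline survey"}}

# Keyword patterns standing in for a real natural-language-understanding model.
INTENTS = {
    "battery": r"\b(battery|charge|power)\b",
    "depth": r"\b(depth|deep)\b",
    "mission": r"\b(mission|doing|progress)\b",
}

def answer(vehicle: str, utterance: str) -> str:
    s = STATUS[vehicle]
    for intent, pattern in INTENTS.items():
        if re.search(pattern, utterance, re.IGNORECASE):
            if intent == "battery":
                return f"{vehicle} battery is at {s['battery_pct']}%."
            if intent == "depth":
                return f"{vehicle} is at {s['depth_m']} m depth."
            return f"{vehicle} is carrying out the {s['mission']} mission."
    return "Sorry, could you rephrase that?"

print(answer("auv1", "How much battery does it have left?"))
# → auv1 battery is at 76%.
```

The point of the sketch is the interaction pattern: the operator asks in free text, and the agent grounds its reply in live vehicle state, which a full system like MIRIAM extends with explanations of behaviour and mission progress.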
