Mixed-Reality Enhanced Telepresence
For Remote Inspection and Monitoring with Multiple Aerial Robots
Remote inspection using aerial robots to monitor complex environments is extremely challenging because users require high-fidelity situational awareness. While aerial robots have the potential to access hard-to-reach areas, current teleoperation interfaces make it difficult for users to unlock the full capabilities of these highly agile platforms: they rely on relatively coarse visual feedback that limits the user's understanding of the remote environment and makes it hard to relate complex robot commands to mission objectives.
This project will focus on developing an intuitive human-robot interface that allows the operator to become immersed in the remote environment. Guided by industry expertise from RACE, we will leverage visual, acoustic and potentially haptic feedback to investigate mixed-reality interface designs that enhance the user's sense of presence within the virtual context in which the mission is planned and executed.
Moving beyond control of an individual robot, we will also test the usability of the system for coordinating a team of aerial vehicles, adapting the presented information to the user's cognitive load. Supporting future work, the end goal is to demonstrate a safe, "information-optimised" mixed-reality telepresence platform capable of controlling multiple aerial robots in an in-the-field remote inspection scenario.
Lead Investigator: Dr. Hai-Nguyen (Hann) Nguyen, Imperial College London
For information about the project Mixed-Reality Enhanced Telepresence for Remote Inspection and Monitoring with Multiple Aerial Robots, please contact Dr. Hai-Nguyen (Hann) Nguyen.