DC-26

Cross-Reality in Interactive Robotics using VR/AR

Taha Lamine
ENIB and UniSA

Research Areas

Robotics, Cross-Reality, Human-Robot Interaction, Digital Twin, Augmented Reality, Virtual Reality, Mixed Reality

Project brief

Currently, robots are developed in two separate realities: in the physical world using the real robot, and in a simulated world using a virtual robot. Working in the real world allows interaction with users, but it can be time-consuming and unsafe when testing complex algorithms (navigation, grasping, …). Working in a virtual world speeds up development and is safe, but it breaks natural interaction with the user. In addition, the virtual world may fail to simulate the robot's sensors and effectors faithfully and therefore diverge from reality.
Mixed realities have the potential to bridge the real and virtual worlds, taking advantage of both. It would be advantageous to develop robot capabilities simultaneously across the real world, augmented reality (AR), virtual reality (VR), and robotic simulation. Using an immersive system (HMD, CAVE), a (virtual) user should be able not only to see the world but also to interact with the robot (real or virtual). Synchronization between both realities has to be maintained: e.g., when the real robot moves in the real environment, the motion should be replicated in the virtual environment, and vice versa.
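As a minimal illustration of the synchronization idea, the sketch below mirrors a real robot's pose into a virtual twin using a simple publish/subscribe pattern. All names here (`Pose`, `TwinSync`, `VirtualRobot`) are hypothetical and not part of any existing framework; a real system would transport poses over a middleware such as ROS topics rather than in-process callbacks.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """A simplified 2D pose (position plus heading)."""
    x: float
    y: float
    theta: float

class TwinSync:
    """Hypothetical sync channel between the real and virtual realities."""
    def __init__(self):
        self._listeners = []

    def subscribe(self, callback):
        # Register a reality that wants to receive pose updates.
        self._listeners.append(callback)

    def publish(self, pose: Pose):
        # Broadcast a pose update to every subscribed reality.
        for callback in self._listeners:
            callback(pose)

class VirtualRobot:
    """Digital twin whose pose mirrors the real robot's."""
    def __init__(self):
        self.pose = Pose(0.0, 0.0, 0.0)

    def on_real_pose(self, pose: Pose):
        # Replicate the real robot's motion in the virtual environment.
        self.pose = pose

sync = TwinSync()
twin = VirtualRobot()
sync.subscribe(twin.on_real_pose)

# The real robot moves; the motion is replicated in the virtual world.
sync.publish(Pose(1.0, 2.0, 0.5))
print(twin.pose)
```

In a full cross-reality system the channel would be bidirectional (the user's interactions in VR would also be pushed back to the real robot), and updates would carry sensor data as well as poses.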
Our current research direction investigates how frame of reference alignment impacts cross-reality human-robot interaction. Specifically, we examine how robot embodiment (real versus virtual) affects the shared spatial understanding and interaction quality in cross-reality environments.