Mixed Reality Human Robot Interaction Interface using Hand Tracking and Stereo Vision for Unknown and Hazardous Environments

dc.contributor.advisor: Jadidi, Mardkheh Amaneh
dc.contributor.author: Tennakoon, Damith Deshan
dc.date.accessioned: 2025-11-11T20:10:16Z
dc.date.available: 2025-11-11T20:10:16Z
dc.date.copyright: 2025-08-19
dc.date.issued: 2025-11-11
dc.date.updated: 2025-11-11T20:10:16Z
dc.degree.discipline: Earth & Space Science
dc.degree.level: Master's
dc.degree.name: MSc - Master of Science
dc.description.abstract: Many industries employ human workers to perform hazardous and unsafe tasks that result in injuries and fatalities. These include on-site monitoring of radioactive environments, collisions with heavy machinery on construction sites, and repetitive, physically demanding work. Robotic solutions have been shown to improve task efficiency, product quality, and workplace safety in many industries, though they have proven challenging to integrate into dynamic and unstructured environments. This research aims to develop a generalized immersive robot-arm teleoperation interface designed to accelerate the deployment of robotics in real-world applications through an intuitive Human-Robot Interaction (HRI) system. The methodology involves mounting a ZEDM stereo camera, attached to a 2-axis servo-actuated gimbal, on the end effector of a 6-axis robot arm. Stereo images are streamed to a Meta Quest 2 virtual reality (VR) head-mounted display (HMD), providing a first-person view (FPV) with human-like perception of the end effector's surroundings. Hand tracking is used to map the operator's hands into the FPV, enabling omnidirectional motion control and gripper operation via gesture recognition. In-situ data is spatially mapped onto the stereoscopic view, creating a mixed-reality (MR) HRI interface. Results show improved depth perception and higher control stability compared to traditional monocular video systems paired with keyboard controls. Specifically, the MR interface achieves a 39% reduction in task completion time during target-tracking experiments and demonstrates five times greater stability in trajectory control tests. This MR HRI interface promotes the safe use of robotics in unknown or hazardous environments, allowing operators to perform high-risk tasks remotely using only HMDs and natural hand gestures.
dc.identifier.uri: https://hdl.handle.net/10315/43347
dc.language: en
dc.rights: Author owns copyright, except where explicitly noted. Please contact the author directly with licensing requests.
dc.subject: Robotics
dc.subject.keywords: Human-robot interaction
dc.subject.keywords: Mixed reality
dc.subject.keywords: Teleoperation
dc.subject.keywords: Robotic arm control
dc.subject.keywords: Stereo vision
dc.subject.keywords: Hand tracking
dc.subject.keywords: Depth perception
dc.subject.keywords: Construction robotics
dc.subject.keywords: Gesture recognition
dc.subject.keywords: Hazardous environments
dc.title: Mixed Reality Human Robot Interaction Interface using Hand Tracking and Stereo Vision for Unknown and Hazardous Environments
dc.type: Electronic Thesis or Dissertation

Files

Original bundle

Name: Tennakoon_Damith_Deshan_2025_MSc.pdf
Size: 47.76 MB
Format: Adobe Portable Document Format

License bundle

Name: license.txt
Size: 1.87 KB
Format: Plain Text

Name: YorkU_ETDlicense.txt
Size: 3.39 KB
Format: Plain Text