RAIN Hub Year 3 Report


HUMAN ROBOT INTERACTION RAIN PROGRESS //

At this stage we have designed and built the custom hardware rig, which features four Vicon cameras, four custom haptic devices with 3-DOF force feedback, and an Oculus Rift S HMD. We have successfully integrated the three software frameworks used, allowing the HMD, haptic devices and Vicon tracking to operate in the same framework. To bring these systems into a shared coordinate frame, we designed a specialised calibration procedure that uses custom device attachments tracked by Vicon to calibrate the different devices into a shared coordinate system. Calibration of the four haptic devices is complete, while calibration of the HMD via Vicon is still in progress.

Additionally, we have programmed and piloted a simple object-slotting task, in which the observer picks up an object such as a box and slots it into a hole, with various distortions applied. This was initially programmed on simple haptic devices, using the built-in tracking of the Oculus HMD, before being ported to the bimanual haptic rig once lab access became available. The same was done for a basic Jenga demo, which was likewise written on a simple haptic device setup and then ported to the bimanual haptic rig.

FUTURE ASPIRATIONS //

The next steps are to finalise the HMD calibration with the Vicon tracking, before collecting data on the object-placing experiment and analysing which distortions are the most disruptive. We also aim to work with The Christie hospital in Manchester on generalising the interaction tasks and developing a shared codebase of manipulation tasks. With this collaboration we will create demos for tumour detection and delineation, looking at traversal of 3D medical scans using a combination of standard slice-by-slice exploration of image stacks and voxel objects generated from medical scans, where we can either embed 'tumours' directly into the image stacks or add them as separate voxel objects in the rendered tissue.
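The calibration step described above amounts to estimating a rigid transform between two coordinate frames from corresponding tracked points. One standard way to do this is the Kabsch (orthogonal Procrustes) algorithm over marker positions; the sketch below is our own NumPy illustration of that technique, not code from the project, and the function name and point sets are hypothetical:

```python
import numpy as np

def rigid_transform(src, dst):
    """Estimate the rigid transform (R, t) with R @ p + t mapping `src` onto `dst`.

    `src` and `dst` are N x 3 arrays of corresponding points, e.g. positions
    of markers on a tracked device attachment expressed in the device's own
    frame (`src`) and in the Vicon frame (`dst`).
    """
    # Centre both point sets on their centroids.
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    # Kabsch: SVD of the cross-covariance gives the optimal rotation.
    H = src_c.T @ dst_c
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t
```

With one such transform per device, every pose can be re-expressed in the shared Vicon frame before rendering.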
After this we aim to explore varying levels of tissue density, such as a 'hard', impermeable density for bone and a 'softer' density for tissue such as muscle, based on the contrasts in the original source images. There is also scope for us to collaborate with other working groups in RAIN to integrate our user-end teleoperation setup with their remote-end glove-box interaction robots; we would replace the simulated experimental world with the physical remote world, allowing live manipulation at a distance with force feedback and augmented-reality visuals.
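The tissue-density idea above could, at its simplest, be a per-voxel lookup from normalised scan intensity to haptic stiffness. The sketch below is a minimal illustration under our own assumptions; the threshold and stiffness values are placeholders, not figures from the project:

```python
import numpy as np

# Hypothetical mapping from normalised scan intensity to haptic stiffness.
# A real mapping would be derived from the contrasts in the source scans.
BONE_THRESHOLD = 0.8        # normalised intensity at/above which a voxel is 'bone'
BONE_STIFFNESS = 2000.0     # N/m -- rendered as effectively impermeable
SOFT_STIFFNESS_MAX = 400.0  # N/m -- upper bound for soft tissue such as muscle

def voxel_stiffness(intensity):
    """Map a normalised image intensity in [0, 1] to a haptic stiffness.

    Voxels at or above BONE_THRESHOLD get the full 'hard' stiffness; softer
    tissue scales linearly with intensity up to SOFT_STIFFNESS_MAX.
    """
    intensity = np.clip(np.asarray(intensity, dtype=float), 0.0, 1.0)
    return np.where(intensity >= BONE_THRESHOLD,
                    BONE_STIFFNESS,
                    intensity / BONE_THRESHOLD * SOFT_STIFFNESS_MAX)
```

The per-voxel stiffness would then drive the force-feedback response of the haptic devices as the probe traverses the rendered volume.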
