4TU.H&T Hectic Haptics Hackathon on telepresence, teleoperation, embodiment and haptics

4TU Delft
4TU Eindhoven
4TU Twente
4TU Wageningen

Hectic Haptics Hackathon

On Wednesday, the 21st of March, we held our first 4TU Hackathon on telepresence, teleoperation, embodiment and haptics. Fifteen researchers and students with backgrounds ranging from control engineering to philosophy met up in the DesignLab at the University of Twente. Besides researchers with a 4TU H&T affiliation, there was also a group of researchers from TNO who collaborate with the University of Twente on this topic in the context of the I-Botics project (www.i-botics.com).

The idea was to facilitate exchange of experience with equipment, software and methods used in research on the topic, mediated through the hands-on setting of a Hackathon.

Participants were provided with (and brought themselves) a wide range of equipment to work with: haptic devices and displays, sensors for tracking and imaging, actuators such as robotic grippers as well as VR systems.

Two major groups formed. One looked specifically at integrating an Elitac vibrotactile glove with the Omega 7 haptic device. While wearing the glove, users could control a robotic arm in a Robot Operating System (ROS) simulation through the haptic device. Collisions of the robotic arm with virtual objects in the simulation environment were translated into localized vibrotactile feedback on the glove. Setups such as this one could be used to investigate the trade-offs between full haptic, vibrotactile and visual feedback in teleoperation settings.
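The core of this pipeline is a mapping from simulated collision events to vibration commands for the glove. A minimal sketch of that mapping is shown below; the tactor layout, the force range and the function names are assumptions for illustration, not the actual Elitac or ROS interfaces used at the event.

```python
# Illustrative sketch: translating a simulated collision into a vibrotactile
# command. The tactor indices and 10 N force ceiling are assumed values.

def force_to_intensity(force_n, max_force_n=10.0):
    """Clamp a contact force (in newtons) to a vibration intensity in [0, 255]."""
    ratio = max(0.0, min(force_n / max_force_n, 1.0))
    return int(round(ratio * 255))

# Hypothetical mapping from a contact region on the end effector
# to a tactor index on the glove.
TACTOR_FOR_REGION = {"palm": 0, "thumb": 1, "index": 2, "back": 3}

def collision_to_command(region, force_n):
    """Translate a collision event into a (tactor index, intensity) command."""
    return TACTOR_FOR_REGION[region], force_to_intensity(force_n)
```

In a real setup, a ROS node would subscribe to the simulator's contact topic and forward each resulting command to the glove driver.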

This setup was then extended to work in a dislocated configuration. The vibrotactile glove was tracked externally using an OptiTrack system, and its position was shared with ROS, kinematically driving a rigid body in the simulation environment set up for the haptic device. Now, collisions between the glove and the haptic device would be felt on both ends, enabling a dislocated, bi-directional haptic experience.
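Sharing the tracked glove pose with ROS requires converting between coordinate conventions: OptiTrack typically streams y-up positions, while ROS uses a z-up convention (REP 103). The sketch below shows that axis remapping; the exact calibration between the two frames was specific to the hackathon setup and is assumed here.

```python
# Hedged sketch: remapping an OptiTrack rigid-body position (y-up, metres)
# into the ROS z-up convention. The axis assignment is an assumption; the
# actual frame calibration depended on how the OptiTrack volume was set up.

def optitrack_to_ros(p):
    """Map an (x, y, z) position from a y-up frame to a z-up frame."""
    x, y, z = p
    # x stays forward, OptiTrack -z becomes ROS y, OptiTrack y becomes ROS z.
    return (x, -z, y)
```

A ROS node would apply this conversion to each incoming tracking sample before using it to drive the rigid body in the simulation.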

A second group looked at body transfer illusions. They started by reproducing the classic Rubber Hand Illusion: a fake left hand is placed in front of a participant, positioned such that it could plausibly be their own, while the actual left hand is hidden behind a visual barrier. The experimenter then strokes both the real and the fake hand in a synchronized fashion, which causes participants to experience the fake hand as their own.

As no rubber hand was available, a leather glove stood in at first, but this did not produce convincing results. A better alternative was found by using the actual arm of a third participant instead, which successfully created the illusion.

Next, the group worked towards a setup to experiment with full body illusions. The Open Impress system developed at HMI, consisting of an HTC Vive VR setup combined with a Kinect V2 RGBD sensor, was used to render a virtual representation of a different body in front of the participant (i.e. the participant would virtually stand behind the other body, looking at its back). The experimenters then applied haptic stimuli by touching the participant to induce the illusion that they were actually located at the site of the virtual body in front of them. Inducing this illusion required precise timing of haptic and visual feedback, and it was not experienced as strongly as anticipated, possibly because the virtual body differed from the participant's own.

Next, the system was adapted to allow making recordings. A participant now viewed their own body in front of them and both saw and felt their body being touched. This produced a moderately strong illusion that the participant was actually standing at the location of their virtual body. At some point, the live feed of the participant's body was replaced with a recording in which the touch could only be seen, which resulted in a strong reduction in the experience of the illusion. These experiments illustrated the importance of congruent visual and haptic feedback for creating bodily illusions in VR/AR.
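The congruence constraint the group ran into can be stated very simply: the illusion only holds when the seen and felt touch fall within a short temporal window. The sketch below makes that check explicit; the ~300 ms window is a rough assumption inspired by the body-illusion literature, not a value measured at the event.

```python
# Illustrative sketch of the visuo-tactile congruence check implied above.
# The 0.3 s window is an assumed, approximate threshold.

def is_congruent(visual_t, tactile_t, window_s=0.3):
    """Return True if a visual and a tactile event are close enough in time
    to be plausibly experienced as one touch."""
    return abs(visual_t - tactile_t) <= window_s
```

Replacing the live feed with a recording effectively drove this asynchrony far outside the window, which matches the observed collapse of the illusion.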

Other side activities included an integration between Unity3D and ROS, as well as work towards integrating the UltraHaptics device with a VR setup.
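One common route for connecting Unity3D to ROS is the rosbridge websocket protocol, which exchanges JSON messages between the two sides. Whether the hackathon group used rosbridge specifically is an assumption; the sketch below only illustrates the message format, with a made-up topic name and payload.

```python
import json

# Hedged sketch: building a rosbridge-style 'publish' message. Topic name
# and message contents are illustrative, not from the actual integration.

def make_publish_message(topic, position):
    """Serialize a rosbridge 'publish' op carrying a simple 3D point."""
    x, y, z = position
    return json.dumps({
        "op": "publish",
        "topic": topic,
        "msg": {"x": x, "y": y, "z": z},
    })
```

On the Unity side, a websocket client would send such strings to the rosbridge server, which republishes them as native ROS messages.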

With multiple tangible, working results, the Hackathon turned out to be a great success. The format of intellectual exchange through a hands-on activity was well received. Given the nature of the topic, being able to demonstrate ideas by creating an experience rather than merely describing them in words was a clear added value.
