Physical Telepresence: shape capture and display for embodied, computer-mediated remote collaboration.
We propose a new approach to physical telepresence based on shared workspaces with the ability
to capture and remotely render physical shapes.
We explore physical telepresence through three different shape display systems: asymmetric teleoperation with shape capture and rendering, bidirectional interaction through shape capture and rendering, and bidirectional interaction through deformation.
Our teleoperation system enables capturing and rendering hand shapes for presence and manipulation.
Our system is based on a depth camera connected to a remote shape display.
Video of a shape display is shown on the horizontal screen while a Kinect captures the user's hands.
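The capture-to-display pipeline can be sketched as converting each depth frame into a grid of pin heights. This is a minimal illustration only: the grid size, pin travel, and function names below are assumptions, not the actual system's code.

```python
import numpy as np

# Assumed parameters: a 24x24 pin grid driven from a depth image
# captured by a depth camera looking down at the workspace.
GRID = 24          # pins per side (assumption)
PIN_MAX = 100.0    # maximum pin extension in mm (assumption)

def depth_to_pin_heights(depth_mm, surface_mm):
    """Convert a depth image (mm from camera) into pin heights.

    Pixels closer to the camera than the empty surface correspond to
    hands or objects; their height above the surface drives the pins.
    """
    height_mm = np.clip(surface_mm - depth_mm, 0.0, PIN_MAX)
    # Downsample to the pin grid by block-averaging.
    h, w = height_mm.shape
    bh, bw = h // GRID, w // GRID
    blocks = height_mm[:bh * GRID, :bw * GRID].reshape(GRID, bh, GRID, bw)
    return blocks.mean(axis=(1, 3))
```

Block-averaging is one plausible downsampling choice; taking the per-block maximum would instead preserve thin features such as fingertips.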
Remote experts can, for example, provide assistance and instruction to a local user in different contexts.
They can also dexterously manipulate various physical objects.
By altering the rendered shape, the remote operator's capabilities can be increased.
To avoid obstacles in cluttered spaces, hands can be rendered without arms, or shape output can be switched on and off to move around objects. This also enables pointing on top of objects.
Users can rotate the scene and scale their representations.
These transformations can, for example, enable users to handle larger objects.
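Scaling a user's rendered representation can be sketched as resampling the captured pin height map about its centre, so an enlarged hand spans a bigger object. The grid size, nearest-neighbour resampling, and names here are illustrative assumptions.

```python
import numpy as np

def scale_heightmap(heights, factor):
    """Nearest-neighbour rescale of a square pin height map about its centre.

    factor > 1 enlarges the rendered shape; factor < 1 shrinks it.
    """
    n = heights.shape[0]
    c = (n - 1) / 2.0
    idx = np.arange(n)
    # Map each output pin back to a source pin at 1/factor the
    # distance from the centre, clamped to the grid.
    src = np.clip(np.round(c + (idx - c) / factor).astype(int), 0, n - 1)
    return heights[np.ix_(src, src)]
```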
Physical shapes can be replicated and mirrored.
By leaving a copy behind, the user can create static shape features.
The hand shape can be replaced with different tools to manipulate objects more effectively.
These tools include a claw to pick up objects, hooks for reaching, and ramps for sliding objects.
Our second prototype enables bi-directional shape capture and output.
It renders captured shapes remotely.
Remote users can collaborate through connected objects for tangible interaction.
Here, two tokens are linked and the remote user's hands are rendered physically.
Alternatively, hands can be projected on the surface while the objects remain connected.
Here, a remote object is rendered physically by the shape display,
and a user can manipulate it through a separate gesture zone above the object.
Alternatively, a brush tool allows the user to translate the remote object
by pushing into the side of its local rendering.
Our third prototype allows for collaboration through bi-directional deformation and output on shape displays.
Two shape displays can render the same connected 3D model, allowing both sides to modify the underlying geometry.
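One way to sketch the shared-geometry idea: each site keeps a height field, applies local pushes, and merges in the remote copy after an exchange. The merge rule, grid size, and class names are assumptions for illustration, not the system's actual protocol.

```python
import numpy as np

class SharedSurface:
    """Toy model of a height field edited from both ends (names assumed)."""

    def __init__(self, n=24, max_h=100.0):
        self.max_h = max_h
        self.heights = np.full((n, n), max_h)  # start as a solid block

    def push(self, row, col, depth):
        # A local press carves the geometry under the finger.
        self.heights[row, col] = np.clip(
            self.heights[row, col] - depth, 0.0, self.max_h)

    def merge_remote(self, remote_heights):
        # Keep the deeper edit at each pin so both copies converge
        # on the same carved shape after an exchange.
        self.heights = np.minimum(self.heights, remote_heights)
```

Taking the per-pin minimum is a deliberately simple convergence rule; a real system would more likely exchange timestamped edits.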
In another application for conveying remote presence, each shape display renders the inverse of the other's shape.
Pushing into a pin will raise its connected counterpart.
An elastic mode conveys whether someone on the other end is pushing at that very moment.
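The inverse and elastic behaviours can be sketched per frame as follows; the pin travel, spring constant, and function names are assumptions.

```python
import numpy as np

MAX_H = 100.0  # assumed maximum pin extension in mm

def inverse_render(local_heights):
    """Render the complement: pushing a pin down by d on one side
    raises its counterpart on the other side by d."""
    return MAX_H - local_heights

def elastic_step(heights, rest, pressed, k=0.2):
    """Unpressed pins relax toward their rest height each frame, so a
    sustained offset signals that someone is pushing right now."""
    relaxed = heights + k * (rest - heights)
    return np.where(pressed, heights, relaxed)
```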
This system can be combined with vertical screens for an enhanced sense of presence.
We conducted a preliminary evaluation to get a more nuanced picture of remote object manipulation strategies and interaction techniques.
Participants were asked to move a remote sphere to a target location using a variety of interaction techniques.
