Touchless Tactile Displays for Digital Signage: Mid-air Haptics meets Large Screens

This demo presents the concept of Interactive Digital Signage with Haptics, where users interact with public digital screens using their bare hands, via hand-tracking technology and ultrasonic mid-air haptic feedback. Combining three main components (a digital screen, a tracking device, and Ultrahaptics technology for mid-air tactile feedback), the system offers users a multi-sensory experience that has the potential to dramatically improve the advertising experience, increasing brand engagement, dwell time, and brand recall. To this end, we present an example of a movie poster that transforms into an interactive mini-game.


Introduction
Digital signage has many applications, such as the dissemination of information or emergency announcements; however, by far the most prominent use case is advertising. As large digital displays proliferate in public spaces, there is an increased demand for our attention and engagement [1]. Because of the broadcast nature of advertising content, however, the audience may become uninterested and exhibit "display blindness" [2]. The key challenge is therefore to develop systems that enable richer interactions with advertising content, so as to provide memorable experiences and hence motivate further engagement.
This demo illustrates how mid-air haptic feedback can enhance interactive interfaces. Through this example, we want to promote reflection on guidelines and best practices for achieving this goal while also taking intrinsic constraints into account.

Background
Touchscreens are prevalent in public spaces and have radically changed the face of today's digital signage. There are, however, many issues with touchscreens: significant cost, responsiveness, hygiene and cleaning requirements, robustness against extended use and the everyday knocks that may cause damage, securing access to the display control panel, and, finally, the fact that they must be reachable, which compromises viewability and placement. From an HCI perspective [3], there is a large body of work on requirements and design principles to help evoke the interest of different types of viewers, motivate ergonomic interactivity in a public space, and effectively deliver engaging content.
Gesture-controlled public displays make interactions possible from a distance, thereby addressing some of the above-mentioned touchscreen issues. They also offer opportunities for expressive gestural interactions. Indeed, individuals facing digital signage are often part of a group. The gestures employed by the performer to interact with the system influence the reactions of the other members of the group even before they interact with it themselves. Highly visible gestures may discourage use or, on the contrary, build curiosity and interest by turning the interaction into a "public performance" [4]. In our case, the interaction space is limited to a volume of approximately 30 cm (W) x 50 cm (H) x 20 cm (D), raising the question of how to design expressive gestures within a restricted interaction area. Of course, there are many other challenges with gesture input as well. For instance, there is no established gesture set for selection, and text entry is often inaccurate. One reason for this is the lack of physicality and haptic feedback, resulting in a kind of affective disconnect towards touchless interactions [5].

Demo Contribution
The main contribution of this extended abstract is the addition of haptic feedback to gesture-controlled interactive public displays. Our demo uses state-of-the-art software and hardware to manipulate focused ultrasound and remotely stimulate sensory receptor structures in various parts of the hand [6]. Coupled with hand tracking and large displays, our demo presents new opportunities for designing interactive touchless tactile interfaces for digital signage that are up to 50% more accurate when reaching out to select a widget [7] and that can harness the persuasive influence of touch to create more memorable and engaging experiences with positive marketing implications [5]. Our demo also challenges how a spectator can experience a user's interaction with a computer [8].
Going beyond the expressiveness of gesture input, invisible haptic feedback enables interactive digital signage to become a magical interface, providing tactile cues to the active user without spectators being able to see or notice them. This can also enhance the honeypot effect, attracting more spectators and passers-by.

Demonstration Setup
Our demonstration setup is composed of a large 50" LCD display, a laptop PC, an Ultrahaptics TouchBase device (UHEV2), and a Leap Motion controller. In this particular instance, the display is in portrait orientation, as the context is movie posters (in cinemas). The UHEV2 is ergonomically located below the display, mounted on a bracket and angled at 40 degrees to enable easy access for touchless interactions (see Figures 1 & 2).
The screen displays a static advertisement poster with overlaid text hinting at the interactive content. The interaction is initiated by a passer-by who responds to the on-screen cues and reaches out with her hand to launch an interactive game. The game contains purposeful audio, visual, and haptic cues that correlate with one another and particularly with the content being advertised. While a plethora of design guidelines exists on how the audio-visual components should be implemented, there are, to the best of our knowledge, none for mid-air haptics. This demo is our first attempt at exploring such interaction designs for digital signage.

Interaction and Haptic Designs
In our demo, users can experience a mini-game, i.e., a game lasting less than a minute, while interacting with the digital poster advertisement. The game is divided into three phases: a welcoming screen, a gaming phase, and finally a closing screen. The gaming phase is divided into two sub-games: a "shoot 'em up" (STG) phase and a "dodge" phase in which users have to avoid asteroids. A final score prompts replayability, increasing engagement and dwell time and thus event recall.
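The linear three-phase flow described above, looping back for replay, can be sketched as a minimal state machine. This is a hypothetical illustration in Python; the names (`Phase`, `advance`) are ours and do not reflect the demo's actual codebase:

```python
from enum import Enum, auto

class Phase(Enum):
    WELCOME = auto()      # welcoming screen
    SHOOT_EM_UP = auto()  # STG sub-game
    DODGE = auto()        # avoid-the-asteroids sub-game
    CLOSING = auto()      # final score screen

# The mini-game progresses linearly through its phases.
ORDER = [Phase.WELCOME, Phase.SHOOT_EM_UP, Phase.DODGE, Phase.CLOSING]

def advance(current: Phase) -> Phase:
    """Move to the next phase; from CLOSING, loop back to WELCOME,
    modeling the replayability prompted by the final score."""
    i = ORDER.index(current)
    return ORDER[(i + 1) % len(ORDER)]
```

Each audio-visual and haptic cue would then be conditioned on the current phase.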
The goal was to create an interactive digital poster that provides the user with a novel, engaging, and memorable experience. During the STG sub-game, users fight enemy spaceships and maneuver their own spaceship to avoid enemy fire. An effort was made to keep the maneuvering gestures in the STG phase intuitive (up/down/left/right), following the plane and position of the user's hand. These gestures are reliable with current hand-tracking hardware. Moreover, they all expose the palm to the ultrasonic haptic feedback emanating from the UHEV2 device below.
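A palm-position mapping of this kind can be sketched as follows. This is an assumed, simplified version of the control scheme, not the demo's actual code: it clamps the tracked palm position to the 30 cm x 50 cm face of the interaction volume described earlier (depth unused) and maps it linearly to portrait-screen pixel coordinates; the function name and default resolution are our own choices.

```python
def palm_to_ship(palm_x_cm: float, palm_y_cm: float,
                 screen_w_px: int = 1080, screen_h_px: int = 1920):
    """Map a tracked palm position (cm; x centered on the volume,
    y measured up from its base) to on-screen ship coordinates (px)."""
    # Clamp to the interaction volume: x in [-15, 15] cm, y in [0, 50] cm.
    x = max(-15.0, min(15.0, palm_x_cm))
    y = max(0.0, min(50.0, palm_y_cm))
    # Normalize to [0, 1], then scale to pixels (hand up -> ship up,
    # so the y axis is flipped for screen coordinates).
    sx = (x + 15.0) / 30.0 * screen_w_px
    sy = (1.0 - y / 50.0) * screen_h_px
    return sx, sy
```

Clamping keeps the spaceship controllable even when the hand drifts to the edge of the tracked volume, which also keeps the palm above the haptic device.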
The interaction novelty resides in the addition of mid-air haptic feedback to complement the audio-visual stimuli. We have therefore designed multiple discrete haptic effects that correlate with specific interactions within the game, reinforcing specific game events by adding distinct new "haptic sensations". We detail these for our "Beyond Terra" interactive demo (Figures 2 & 3):

Open and close: a circle composed of 4 rotating haptic focus points is centered on the palm. The radius of the circle increases (decreases), giving the sensation of an opening (closing) shape. This feedback is provided at the warp jump phases, when the spacecraft enters (leaves) the scene.

Moving point: a single haptic focus point moves from the bottom of the palm to the tip of the middle finger. This feedback is associated with the spaceship firing lasers.

Random points: 4 haptic focus points are projected at random positions on the user's palm, associated with damage received by the user's spaceship.

Line scan: a line composed of 4 haptic focus points scans the user's hand, i.e., going from palm heel to fingertips. This feedback is provided when users activate the demo and the spacecraft enters the scene. The same sensation is also used as a warning during the dodge phase of avoiding the asteroids. Here, the line scans in the direction opposite to the incoming asteroid (e.g., left to right if the asteroid approaches from the left). Importantly, and unlike all the other haptic sensations, which are triggered simultaneously with an audio-visual event (e.g., taking damage or firing the laser blaster), the asteroid warning sensation is timed 1-2 seconds before the asteroid becomes visible, giving the user a window of opportunity to move the spacecraft out of harm's way. Hence, the active user can sense an upcoming event, while spectators cannot.
From a game perspective, the haptic feedback is therefore not just complementary but indispensable for successfully interacting with the poster interface.
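Sensations like "open and close" reduce to computing per-frame focal-point coordinates. The sketch below is a hypothetical illustration of that geometry only, not the demo's implementation or the Ultrahaptics SDK API: it returns the palm-relative (x, y) positions of the 4 rotating focus points at time t, with the rotation rate as an assumed parameter; animating radius_cm upward or downward across frames yields the opening or closing sensation.

```python
import math

def circle_focus_points(t: float, radius_cm: float,
                        n_points: int = 4, rev_per_s: float = 2.0):
    """(x, y) positions in cm, relative to the palm center, of n haptic
    focus points evenly spaced on a circle rotating at rev_per_s."""
    base = 2.0 * math.pi * rev_per_s * t  # rotation angle at time t (s)
    pts = []
    for k in range(n_points):
        a = base + 2.0 * math.pi * k / n_points  # even angular spacing
        pts.append((radius_cm * math.cos(a), radius_cm * math.sin(a)))
    return pts
```

The "moving point" and "line scan" effects can be expressed the same way, as focal-point trajectories parameterized by time along the palm-to-fingertip axis.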

Discussion
The above-mentioned haptic sensations could have been designed differently while still maintaining their spatial and temporal correlations with their audio-visual counterparts in the game, and while still respecting the various intrinsic constraints (interaction space, hand-tracking accuracy, etc.). It is therefore interesting to consider their generality: which elements of an audio-visual poster should one look to haptically enhance and/or augment, and what set of haptic sensations should be used to achieve expressive tactile interactions.

Conclusion
We have demonstrated how mid-air haptic feedback can add a new dimension to interactive public displays, and how this has the potential to increase passer-by engagement and improve advertising effectiveness. Many open questions remain, which we aim to understand better through user studies and by working with display owners, content designers, academia, and the HCI community in general to further develop this promising new interactive mode of digital signage.