Mobile Negotiated Interaction

Example scenario: Exploring a digitally enriched environment

An example of how this approach could be used is location-aware information acquisition while walking in a town centre. You might feel a 'tick' on your phone's vibration motor, making you aware that there is information available about something in your environment. Your rich context-understanding abilities would tell you how likely this 'tick' is to be of interest. If you ignore the cue and walk on, the negotiation ends there and then. If you are curious, you might gesture with the phone at likely targets in your surroundings and get a response from several of them. If you are further intrigued, you may continue to interact with these potential targets, perhaps moving from the vibro-tactile to an audio display, gaining information by actively exploring the environment, something we have evolved to do naturally.

The user explores the possibilities in the situation by engaging directly with it (probing or playing with it), moving at will through the space of possibilities and gaining more and more insight during the interaction. The multimodal feedback encodes both the system's current interpretation of the user's intention (e.g. moving towards a target) and the probability of the target meeting the user's needs. After working through combinations of vibration and audio, if the joint dynamics of information source and user continue to intertwine, the display of the mobile device might be used for full details. This example shows a 'schedule' of modalities and illustrates the negotiation process in a practical and commercially interesting setting.
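The escalation described above could be sketched as a simple schedule that the device walks through while the user keeps engaging. This is a minimal illustration only; the names (Modality, negotiate) and the 0.5 engagement threshold are our own assumptions, not part of the project:

```python
from enum import Enum

class Modality(Enum):
    VIBRATION = 1   # initial low-attention 'tick'
    AUDIO = 2       # richer cue once the user probes the target
    SCREEN = 3      # full details when engagement is sustained

def negotiate(engagement_signals):
    """Walk the modality schedule while the user keeps engaging.

    engagement_signals: iterable of floats in [0, 1], e.g. derived from
    gesture or orientation sensing; a value below the (assumed) 0.5
    threshold means the user has walked on and the negotiation ends.
    Returns the sequence of modalities actually presented.
    """
    schedule = list(Modality)
    level = 0
    trace = [schedule[level]]          # start with the vibro-tactile tick
    for engagement in engagement_signals:
        if engagement < 0.5:           # user ignores the cue: stop here
            break
        level = min(level + 1, len(schedule) - 1)
        trace.append(schedule[level])  # escalate toward richer feedback
    return trace
```

For example, sustained engagement (`negotiate([0.9, 0.8])`) steps through vibration, audio, then screen, while an ignored tick (`negotiate([0.1])`) ends the negotiation after the initial vibration.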

Introduction

We're working with Computer Science at Glasgow University to explore the future of mobile interaction with a fused digital-physical world. Our 'grand challenge' is to find the mobile browser for this fused digital-physical future.

What is Negotiated Interaction?

Our proposal is to investigate an alternative means of allowing users to interact with content and services in their environment, such that the actions they make (movements, gestures, etc.) and the feedback they receive are continuous, with user and system negotiating their interactions in a fluid, dynamic way. We believe the appropriate comparison is dancing, rather than the current command-and-control metaphor. When someone dances with a partner there is a soft ebb and flow of control: sometimes one person leads, sometimes the other, and this changes fluidly as they dance. We are proposing a similar interaction between a user and a computer, where sometimes the user leads and at other times the computer, according to the context of the interaction. This contrasts with most current approaches, where one agent, be it the human or the computer, pre-empts the other, and where most interaction is driven by events and proceeds, to varying degrees, in rigid, over-specified ways.

At Swansea

In this joint project, our role at the FIT Lab is to:

  1. Examine potential mobile scenarios to evaluate and extend the negotiated interaction paradigm.
  2. Develop large-scale gestures for effective physical-digital interaction.
  3. Develop a toolkit of evaluation methods for ambitious digital-physical systems.