
The Negotiated Interaction project develops a novel approach to interaction design based on closed-loop system design and probabilistic reasoning. The approach treats interaction as a negotiation process, and is especially relevant for systems instrumented with sensors. It includes a dynamic-simulation approach to gestural interaction that improves learnability and robustness to user variability, while allowing users to become 'masters' with a fluent, skilled ability to navigate information spaces in a continuous fashion.

The project addresses the problem of how to build natural, easy-to-use interfaces for computers which can sense continuous motion, especially for small, mobile computers such as mobile telephones, Pocket PCs/PDAs and music players. We will investigate how an approach based on dynamic systems theory, probabilistic reasoning and control theory can use multimodal feedback to create a new way of solving these problems, through ‘Negotiated Interaction’.

This work has wide-ranging implications for human-computer interaction (HCI) in general, but is especially important for the rapidly growing area of mobile computing. The work investigates the extent to which the approach may enable users to sense and make sense of digital information related to phenomena in their environment, such as places, routes, events and people. A useful and usable system needs to be able to model and accommodate the uncertainty in the relationships between the user and these phenomena, relationships that can vary dynamically and involve complex interactions of location, time and value.

Richer sensors are already entering consumer products: Nokia have introduced accelerometers in two of their phones, the 3220 and 5500; Samsung released the first gesture recognition phone (the SCH-310) in 2006; and Nintendo and Sony have both released gesture-based game controllers. The framework for developing interaction around such systems is, however, still a discrete, event-based one, which is ill-suited to the continuous interaction and rich feedback these systems potentially allow. The lack of compelling software development for this mass-market potential emphasises the need for a new approach to interaction design away from the desktop, especially if continuous-interaction systems are to be rewarding to use.

Our probabilistic, dynamic framework lets us make use of the continuous control actions humans have evolved to perform on the physical world, and apply them to rich abstractions such as multidimensional, structured probability density functions as if they were tangible objects. For example, the likelihood of a route being interesting to a user can be represented probabilistically with respect to their proximity to it, the route's attributes and the way the user reacts to cues the system gives to suggest how they might explore it.

This is of theoretical interest to interaction design research in general, as we are going to the heart of interaction design and proposing a fundamentally new approach to it. The approach supports designers in the intricate, low-level detail of the look and feel of interaction, analysing both human actions and feedback to human perception, and it can also incorporate probabilistic information from high-level context detection, such as Bayesian network context estimates.
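The route example above can be sketched in code. The model below is a minimal illustrative assumption, not the project's actual implementation: it treats the user's interest in a route as a prior (derived from the route's attributes) scaled by a Gaussian falloff over the user's distance to the route, so that nearby routes are weighted more heavily. The function name, parameters and length scale are all hypothetical.

```python
import math

def route_interest_likelihood(distance_m: float,
                              attribute_prior: float,
                              length_scale_m: float = 50.0) -> float:
    """Unnormalised likelihood that a route interests the user.

    Sketch model: P(interest | distance) is proportional to the
    attribute-based prior times a Gaussian proximity term,
    exp(-d^2 / (2 * sigma^2)). All parameters are illustrative.
    """
    proximity = math.exp(-distance_m ** 2 / (2.0 * length_scale_m ** 2))
    return attribute_prior * proximity

# A nearby route scores higher than a distant one with the same prior.
near = route_interest_likelihood(distance_m=20.0, attribute_prior=0.8)
far = route_interest_likelihood(distance_m=200.0, attribute_prior=0.8)
```

In a fuller treatment, the prior itself could be updated as the user responds to the system's cues, which is where the negotiation between user and system enters.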