Body Space

Mobile telephones, Personal Digital Assistants and handheld computers are currently one of the fastest-growing areas of computing, and this growth is extending into fully wearable systems. Existing devices have limited input and output capabilities, making them cumbersome and hard to use on the move. Consequently, a pressing requirement in this field is the development of new interaction techniques designed specifically for mobile scenarios. One important aspect of interaction with a mobile or wearable device is that it has the potential to be continuous, with the user in constant, tightly coupled interaction with the system. In these scenarios, interaction need no longer consist of an exchange of discrete messages but can form a rich and continuous dialogue. The development of interfaces featuring such tight coupling, in continuous time and real space, requires insight from control theory, pattern recognition, and advanced statistics.

The Body Space project explores a new mobile interaction technique that allows users to store and retrieve information and computational functionality on different parts of their bodies. It addresses three problem areas in mobile computing: the high levels of attention required to use the devices, the impersonal nature of their interfaces, and the socially exclusive modes of interaction they support. Technologically, Body Space is based on detecting the location of a handheld device relative to the body. To provide a system that requires no additional equipment (such as worn tags or markers) to identify different locations, Body Space relies on inertial sensing.
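As a concrete illustration of tag-free location sensing, the sketch below classifies a short 3-axis accelerometer trace against stored per-location templates using a nearest-neighbour match. The body locations, features and templates here are illustrative assumptions, not the project's actual design.

```python
import numpy as np

# Hypothetical sketch: decide which body location a handheld device has
# been moved to, using only its 3-axis accelerometer trace. The location
# names, feature set and matching rule are assumptions for illustration.

BODY_LOCATIONS = ["hip pocket", "chest pocket", "ear", "wrist"]

def features(trace):
    """Summarise an (N, 3) accelerometer trace as a small feature vector:
    per-axis mean and standard deviation, plus overall signal energy."""
    trace = np.asarray(trace, dtype=float)
    return np.concatenate([trace.mean(axis=0),
                           trace.std(axis=0),
                           [np.sum(trace ** 2) / len(trace)]])

def classify(trace, templates):
    """Nearest-neighbour match of a movement trace against per-location
    template feature vectors (one per entry in BODY_LOCATIONS)."""
    f = features(trace)
    dists = [np.linalg.norm(f - t) for t in templates]
    return BODY_LOCATIONS[int(np.argmin(dists))]

# Usage: record one example movement per body location, then
#   templates = [features(t) for t in recorded_traces]
#   classify(new_trace, templates)  # -> e.g. "hip pocket"
```

A template-matching scheme like this needs no worn tags or markers: the only sensing hardware is the inertial unit already inside the handheld device.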

Inertial sensing is a new paradigm for interacting with mobile computers. It enables the investigation of novel interaction techniques, such as gesture- or location-based interaction, to create interfaces that are powerful, usable and natural. Inertial input is a good example of continuous input: the device gathers information about user behaviour whenever it is held or carried. A significant part of this project is the development of a robust, reliable gesture recognition system for use with inertial sensors embedded in mobile devices. The gesture system will be an innovative combination of psychological insight, dynamic systems models and nonparametric statistical models. An important, and traditionally neglected, feature of this gesture system will be the development of novel audio and haptic cues to inform users about the progress of their gestural interaction.
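To make the recognition idea concrete, here is a minimal sketch of one standard nonparametric approach: a 1-nearest-neighbour classifier over dynamic time warping (DTW) distances between accelerometer traces. It stands in for, rather than reproduces, the project's combination of dynamic systems and statistical models.

```python
import numpy as np

# Hypothetical sketch of a nonparametric gesture recogniser:
# 1-nearest-neighbour over DTW distances between 3-axis accelerometer
# traces. DTW tolerates the speed variations typical of human gestures.

def dtw(a, b):
    """DTW distance between an (N, 3) and an (M, 3) accelerometer trace."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def recognise(trace, training_set):
    """Return the label of the closest recorded example.
    training_set: list of (label, trace) pairs gathered beforehand."""
    return min(training_set, key=lambda ex: dtw(trace, ex[1]))[0]

# A running DTW distance over the partial trace could, in principle,
# drive continuous audio or haptic cues about recognition progress,
# in the spirit of the feedback described above.
```

Because the classifier is nonparametric, adding a new gesture only requires recording a few examples of it; no model retraining step is needed.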
