Strachan, Steven (2007) Multimodal, Embodied and Location-Aware Interaction. PhD thesis, National University of Ireland Maynooth.
PDF: steve_strachan_phdthesis.pdf (4MB)
Official URL: http://www.hamilton.ie/publications/steve_strachan...
Abstract
This work demonstrates the development of mobile, location-aware, eyes-free applications which utilise multiple sensors to provide a continuous, rich and embodied interaction. We bring together ideas from the fields of gesture recognition, continuous multimodal interaction, probability theory and audio interfaces to design and develop location-aware applications and embodied interaction in both a small-scale, egocentric, body-based case and a large-scale, exocentric, 'world-based' case.
BodySpace is a gesture-based application which utilises multiple sensors and pattern recognition to enable the human body to be used as the interface to an application. As an example, we describe the development of a gesture-controlled music player, which functions by placing the device at different parts of the body. We describe a new approach to the segmentation and recognition of gestures for this kind of application, and show how simulated physical model-based interaction techniques and the use of real-world constraints can shape the gestural interaction.
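As a rough illustration of the BodySpace idea (a sketch only, not the segmentation and recognition approach developed in the thesis), device placement at a body location could be classified by comparing the sensed gravity direction from the accelerometer against stored per-location templates. The template names and values below are invented for the example.

```python
import numpy as np

# Hypothetical gravity-direction templates recorded with the device held at
# each body location. Names and values are invented for illustration.
TEMPLATES = {
    "hip":      np.array([0.0, -0.9, 0.44]),
    "ear":      np.array([0.7,  0.1, 0.71]),
    "shoulder": np.array([-0.3, 0.6, 0.74]),
}

def classify_placement(accel_sample, threshold=0.85):
    """Return the body location whose template best matches the sensed
    gravity direction (cosine similarity), or None if no match is close."""
    v = np.asarray(accel_sample, dtype=float)
    v = v / np.linalg.norm(v)
    best_name, best_score = None, -1.0
    for name, template in TEMPLATES.items():
        t = template / np.linalg.norm(template)
        score = float(np.dot(v, t))
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= threshold else None

# A sample roughly aligned with the "hip" template.
print(classify_placement([0.05, -0.95, 0.40]))  # -> hip
```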
GpsTunes is a mobile, multimodal navigation system equipped with inertial control that enables users to actively explore and navigate through an area in an augmented physical space, incorporating and displaying the uncertainty that results from inaccurate sensing and unknown user intention. The system propagates uncertainty appropriately via Monte Carlo sampling, and output is displayed both visually and in audio, with the audio rendered via granular synthesis. We demonstrate the use of uncertain prediction in the real world and show that an appropriate display of the full distribution of potential future user positions with respect to sites-of-interest can improve the quality of interaction over a simplistic interpretation of the sensed data. We show that this system enables eyes-free navigation around set trajectories or paths unfamiliar to the user, for varying trajectory width and context. We demonstrate the possibility of creating a simulated model of user behaviour, which may be used to gain insight into the user behaviour observed in our field trials. The extension of this application to provide a general mechanism for highly interactive, context-aware applications via density exploration is also presented.

AirMessages is an example application enabling users to take an embodied approach to scanning a local area to find messages left in their virtual environment.
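To make the Monte Carlo treatment of uncertainty concrete, here is a minimal, hypothetical sketch (not taken from the thesis): future user positions are sampled under an assumed Gaussian noise model on heading and speed, and the fraction of samples falling within a site-of-interest estimates the probability that the user will reach it. All function and parameter names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def probability_of_reaching(pos, heading, speed, target, radius,
                            heading_sd=0.3, speed_sd=0.2,
                            horizon=10.0, n_samples=1000):
    """Monte Carlo estimate of the probability that the user's position at
    time `horizon` lies within `radius` metres of `target`, given noisy
    heading (radians) and speed (m/s) estimates. The noise model and
    parameters are assumptions for illustration only."""
    headings = rng.normal(heading, heading_sd, n_samples)
    speeds = rng.normal(speed, speed_sd, n_samples)
    dx = speeds * horizon * np.cos(headings)
    dy = speeds * horizon * np.sin(headings)
    futures = np.column_stack([pos[0] + dx, pos[1] + dy])
    dists = np.linalg.norm(futures - np.asarray(target), axis=1)
    return float(np.mean(dists <= radius))

# Example: walking roughly east at 1.4 m/s toward a site 14 m away.
p = probability_of_reaching(pos=(0.0, 0.0), heading=0.0, speed=1.4,
                            target=(14.0, 0.0), radius=5.0)
print(f"P(within site) ~ {p:.2f}")
```

In the same spirit, such a probability estimate could be mapped to the amplitude or grain density of the granular-synthesis audio so that the soundscape thickens as the predicted distribution concentrates on a target; this mapping is an assumption for illustration, not a description of the GpsTunes implementation.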
| Item Type: | Thesis (PhD) |
|---|---|
| Keywords: | Gesture recognition; Continuous multimodal interaction; Probability theory; Audio interfaces; BodySpace; GpsTunes; Hamilton Institute. |
| Academic Unit: | Faculty of Science and Engineering > Computer Science; Faculty of Science and Engineering > Research Institutes > Hamilton Institute |
| Item ID: | 1703 |
| Depositing User: | Hamilton Editor |
| Date Deposited: | 01 Dec 2009 15:52 |
| URI: | https://mu.eprints-hosting.org/id/eprint/1703 |
| Use Licence: | This item is available under a Creative Commons Attribution Non Commercial Share Alike Licence (CC BY-NC-SA). |