25.5.2010 Prof. Brewster to give a talk on mobile multimodal interaction

Mon, 24.05.2010

Please join us for an exciting session on the future of mobile human-computer interaction!

Professor Stephen Brewster, University of Glasgow

"Mobile Multimodal Interaction: How can we use human capabilities to improve mobile interfaces"

Tuesday 25th May at 10–11 am

Auditorium T1, Computer Science Building, Konemiehentie 2

Mobile user interfaces are commonly based on techniques developed for desktop computers in the 1970s, often including buttons, sliders, windows and progress bars. These can be hard to use on the move, which limits the way we use our devices and the applications on them. This talk will look at the possibility of moving away from these kinds of interactions to ones more suited to mobile devices and their dynamic contexts of use, where users need to be able to look where they are going, carry shopping bags and hold on to children. Multimodal (gestural, audio and haptic) interactions give us new ways to use our devices that can be eyes- and hands-free, allowing users to interact in a 'heads-up' way. These new interactions will facilitate new services, applications and devices that fit better into our daily lives and allow us to do a whole host of new things.

I will discuss some of the work we are doing on input using gestures made with the fingers, wrist and head, along with work on output using non-speech audio, 3D sound and tactile displays in mobile applications such as text entry, camera-phone user interfaces and navigation. I will also discuss some of the issues of social acceptability of these new interfaces; we have to be careful that the new ways we want people to use devices are socially appropriate and don't make us feel embarrassed or awkward.

Stephen Brewster has been a professor of human-computer interaction in the Department of Computing Science at the University of Glasgow, UK, since 2001. Brewster's research focuses on multimodal human-computer interaction, using multiple sensory modalities (particularly hearing, touch and smell) to create richer interactions between human and computer. His work has a strong experimental focus, applying perceptual research to practical situations. He has shown that novel use of multimodality can significantly improve usability in a wide range of situations: for mobile users, visually impaired people, older users and in medical applications.

More information is available at www.dcs.gla.ac.uk/~stephen

Helsinki Institute for Information Technology HIIT
Network Society and the HIIT Interactive Computing Seminar

Contact for more information:
Joanna Bergström-Lehtovirta, email: joanna.bergstrom@hiit.fi

Last updated on 24 May 2010 by Antti Oulasvirta - Page created on 24 May 2010 by Antti Oulasvirta