The European project OpenInterface builds an Open Source platform that assists interaction designers in developing multimodal applications. A graphical user interface lets the designer select, configure and combine various I/O devices for different sensory channels (‘modalities’). Keyboard and mouse, speech and gesture sensors (e.g. Wii controllers) or position sensors (e.g. the internal GPS of the Nokia N95) are examples of the modalities that can be made available in the platform. In OpenInterface, FIT is in charge of user-centered design, focusing both on the interaction designers working with the OpenInterface platform and on the end-users of the final multimodal applications. As a central part of this role, FIT provides and implements the methodological framework for designing and evaluating multimodal applications.
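The component-based idea behind the platform — wrapping each I/O device as a reusable modality component and wiring several of them into one interaction pipeline — can be sketched as follows. This is a hypothetical illustration only; the class and method names (`ModalityComponent`, `Pipeline`, `connect`, `emit`) are not the actual OpenInterface API.

```python
# Hypothetical sketch of a component-based modality pipeline.
# Names are illustrative, not the real OpenInterface API.
from typing import Callable, Dict, List, Tuple

class ModalityComponent:
    """Wraps one I/O device ('modality') as a reusable component."""
    def __init__(self, name: str):
        self.name = name
        self._listeners: List[Callable[[str, Dict], None]] = []

    def connect(self, listener: Callable[[str, Dict], None]) -> None:
        """Register a callback that receives this modality's events."""
        self._listeners.append(listener)

    def emit(self, event: Dict) -> None:
        """Forward a device event to all connected listeners."""
        for listener in self._listeners:
            listener(self.name, event)

class Pipeline:
    """Combines several modality components behind one event handler."""
    def __init__(self, handler: Callable[[str, Dict], None]):
        self.handler = handler

    def add(self, component: ModalityComponent) -> "Pipeline":
        component.connect(self.handler)
        return self  # allow chaining

# A designer would combine modalities roughly like this:
events: List[Tuple[str, Dict]] = []
pipeline = Pipeline(lambda name, ev: events.append((name, ev)))
pipeline.add(ModalityComponent("speech")).add(ModalityComponent("gps"))
```

The point of the sketch is the design choice: because every modality exposes the same small interface, the designer can swap, say, a Wii controller for a GPS sensor without touching the application logic behind the pipeline.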

Test Bed Applications

To drive the design process, two prototypical applications, using mobile as well as stationary I/O devices, serve as test beds for the platform:
The Large Information Space application leverages the advantages of multimodal forms of interaction to support tourists in planning their trips on the go.
A pervasive mobile phone game uses both a virtual 3D environment and the real world as its playing field, allowing the players to move freely between both environments.

Interaction Design

The main challenge in the design of multimodal applications is to optimize the user experience in the given context. Many questions here remain open, such as: Which modalities do users prefer in specific situations? How should conflicting parallel inputs from different modalities be handled? Questions like these are taken into account in designing the OpenInterface framework.
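One common way to think about the conflicting-input question is a fusion window: inputs from different modalities that arrive within a short interval are treated as one interaction, and a policy decides which one wins. The sketch below illustrates a simple priority-based policy; the window length and the priority values are invented for illustration and do not come from the project.

```python
# Illustrative sketch: resolving conflicting parallel multimodal inputs
# with a fusion window and per-modality priorities (values are assumptions).
from typing import List, Tuple

FUSION_WINDOW_MS = 200                        # assumed window length
PRIORITY = {"keyboard": 3, "speech": 2, "gesture": 1}  # higher wins

def resolve(events: List[Tuple[int, str, str]]) -> List[str]:
    """events: (timestamp_ms, modality, command) tuples.
    Groups events into fusion windows and keeps, per window,
    the command from the highest-priority modality."""
    resolved: List[str] = []
    window: List[Tuple[int, str, str]] = []
    for ts, modality, command in sorted(events):
        # Close the current window once the gap exceeds the fusion window.
        if window and ts - window[0][0] > FUSION_WINDOW_MS:
            best = max(window, key=lambda e: PRIORITY.get(e[1], 0))
            resolved.append(best[2])
            window = []
        window.append((ts, modality, command))
    if window:
        best = max(window, key=lambda e: PRIORITY.get(e[1], 0))
        resolved.append(best[2])
    return resolved
```

For example, a gesture and a spoken command arriving 50 ms apart would fuse into one window, with the spoken command winning under these priorities, while an input 400 ms later starts a new window. A real platform would need richer policies (complementary rather than competing inputs, user preference, context), which is exactly the design space the questions above describe.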