Today’s working, educational, and social life is characterized by intense collaboration in teams that jointly work on and solve given tasks. To support such teamwork, meeting rooms are equipped with analog whiteboards, flipcharts, sketching tables, etc. that help structure and document lively discussions. Within such discussions, some information is made explicit using these tools, while other information remains implicit, such as body language or the position of artifacts in the meeting room. Consequently, such meetings cause accessibility problems for blind users. Using ICT for such meetings already eases access to the explicit information of artifacts, e.g. via digital whiteboards or interactive tables. Moreover, digitization technologies (e.g. OCR, also for hand-written documents) and tools like Anoto support access to the explicit information layer. What remains to be addressed as basic research questions are
- the nonverbal communication layer and
- information inherent in the spatial distribution of information in a meeting room (e.g. grouping, hierarchies, relations), both of which are important for problem solving.
As these elements are perceived “at a glance” by the visual sense, an additional question is
- how to efficiently integrate these information layers into existing or new views and interaction modes for blind users.
Finally, the increased density and complexity of information and interaction call for
- a usable and seamlessly integrated approach for blind users to manage active participation.
Even when tracked and made explicit, conveying this amount of information via Braille displays or speech synthesizers alone is difficult; the sketch below illustrates one possible mitigation. Even more challenging is active manipulation and contribution, which calls for new modes of interaction (e.g. gestures performed by blind users).
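To make this bottleneck concrete, the following minimal sketch in Python shows one conceivable way to rate-limit a dense stream of tracked meeting events for a serial output channel such as a speech synthesizer, keeping only the most urgent notifications. All class names, priorities, and timing parameters are illustrative assumptions, not part of the project:

```python
import heapq
import time
from dataclasses import dataclass, field

@dataclass(order=True)
class Notification:
    priority: int                           # lower value = more urgent
    timestamp: float = field(compare=False)
    message: str = field(compare=False)

class SerialOutputQueue:
    """Rate-limits tracked meeting events for a serial channel
    (speech or Braille) so the blind user is not flooded."""

    def __init__(self, min_gap_s: float = 2.0, max_pending: int = 3):
        self.min_gap_s = min_gap_s       # minimum pause between utterances
        self.max_pending = max_pending   # cap on queued notifications
        self._heap = []                  # min-heap ordered by priority
        self._last_emit = 0.0

    def push(self, priority: int, message: str) -> None:
        heapq.heappush(self._heap, Notification(priority, time.time(), message))
        # Discard the least-urgent backlog instead of reading out everything.
        while len(self._heap) > self.max_pending:
            self._heap.remove(max(self._heap))
            heapq.heapify(self._heap)

    def poll(self):
        """Next message to speak, or None while still rate-limited."""
        now = time.time()
        if self._heap and now - self._last_emit >= self.min_gap_s:
            self._last_emit = now
            return heapq.heappop(self._heap).message
        return None

# Hypothetical events: a deictic gesture outranks a minor card movement.
q = SerialOutputQueue()
q.push(1, "Anna points at the cluster 'budget' on the left pinboard")
q.push(3, "Card 'timeline' was moved slightly")
print(q.poll())  # the pointing gesture first; further polls are rate-limited
```

Whether such filtering should be driven by priority, user role, or meeting context is precisely what the interpretation and reasoning questions below investigate.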
This research project addresses these issues and intends to study the feasibility of making modern, ICT-equipped meeting rooms and team sessions accessible to blind users. Team meetings, and more specifically ideation processes using techniques such as Metaplan, are typical scenarios. Participation goes beyond access at the artifact level and extends to the communication and coordination among participants, which rely heavily on nonverbal communication (NVC). While tabletop interaction, as addressed in the previous project, mainly involves deictic gestures, information distributed across a room extends the list of relevant NVC elements to, e.g., body position or facing direction used to address artifacts or clusters of artifacts. The spatial distribution of information requires intense synchronization and coordination among participants, e.g. for directing attention. Unlike on a table, where information is displayed and used solely on a 2D plane, the room-wide spatial distribution increases the amount of information, relations, clusters, and other structures displayed visually in parallel through location, distance, density, colors, lines, and other visual elements. This implies the following basic research questions for the accessibility of such meetings to blind users:
- Tracking of relevant NVC elements
- Interpretation & semantic analysis of NVC elements to avoid cognitive overload
- Sensor fusion and reasoning to refine data and reduce false notifications to blind users (see the first sketch after this list)
- Accessible synchronized representation of the spatially distributed information and interaction structures using team-meeting tools (e.g. Metaplan)
- New concepts for browsing spatially structured information, enriched with information on NVC (see the second sketch after this list)
- New concepts for intuitive manipulation in a spatial environment with direct feedback
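To make the intent of the sensor-fusion question more concrete, here is a first, deliberately simple sketch: confidences from two hypothetical trackers for the same NVC hypothesis are combined as a weighted average, and a notification is only issued above a threshold. Sensor names, weights, and the threshold are assumptions for illustration; realistic fusion and reasoning remain open research questions of the project:

```python
def fuse_confidence(estimates: dict, weights: dict) -> float:
    """Weighted average of per-sensor confidences for one
    NVC hypothesis (e.g. 'participant faces the left pinboard')."""
    total = sum(weights[s] for s in estimates)
    return sum(weights[s] * c for s, c in estimates.items()) / total

NOTIFY_THRESHOLD = 0.8  # assumed cut-off; tuning it is itself a research issue

# Hypothetical trackers reporting on the same facing-direction hypothesis.
estimates = {"camera_gaze": 0.7, "depth_skeleton": 0.9}
weights   = {"camera_gaze": 0.4, "depth_skeleton": 0.6}

if fuse_confidence(estimates, weights) >= NOTIFY_THRESHOLD:
    print("notify: participant is facing the left pinboard")  # 0.82 >= 0.8
```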
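The second sketch illustrates one conceivable starting point for browsing spatially structured information: cards on a pinboard are modeled with 2D positions and cluster labels, and a blind user steps outward from a focus card to its spatial neighbours. All labels, coordinates, and the neighbour-based browsing strategy are hypothetical illustrations, not the project's method:

```python
import math
from dataclasses import dataclass

@dataclass
class Card:
    """A Metaplan-style card pinned to a wall
    (positions in metres on a wall-relative 2D plane)."""
    label: str
    x: float
    y: float
    cluster: str  # label from an assumed upstream grouping step

def nearest_cards(cards, x, y, k=3):
    """The k cards closest to a focus point, supporting step-wise
    browsing that preserves the board's visual structure."""
    return sorted(cards, key=lambda c: math.hypot(c.x - x, c.y - y))[:k]

board = [
    Card("budget", 0.2, 1.5, "resources"),
    Card("timeline", 0.4, 1.4, "planning"),
    Card("staffing", 0.3, 1.6, "resources"),
]

# Browse outward from the focus card "budget": neighbours are read
# out ordered by spatial distance, including cluster membership.
for card in nearest_cards(board, 0.2, 1.5):
    print(f"{card.label} (cluster: {card.cluster})")
```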
In summary, the goal of this project is to research the basic accessibility layers that allow blind users to fully participate in team meetings involving room-wide spatially distributed and NVC-related information.
To do so, it draws on the competences and the proven quality of cooperation of three labs in Switzerland (ETH Zurich), Germany (TU Darmstadt), and Austria (University of Linz).