Authors:
Jutta Fortmann, Heiko Müller, Wilko Heuten, Susanne Boll, Jari Kangas, Deepak Akkil, Jussi Rantala, Poika Isokoski, Päivi Majaranta, Roope Raisamo, Pavel Orlov, Roman Bednarik, Héctor Caltenco, Henrik Larsen, Per-Olof Hedvall
1. Presenting Information on Wrist-Worn Point-Light Displays
Wearable devices with small form factors create a need for simple displays that present information through single light spots. What should these displays look like, and how should they present information in daily life? To explore this question, we built a light bracelet. It consists of an LED strip controlled by Arduino components sewn onto an elastic armlet. In a preliminary user study, we investigated how participants experienced self-designed light patterns in their daily lives. From the results we derived implications for the design of light patterns on wrist-worn displays.
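As a rough illustration of how such a bracelet can be driven, the sketch below pulses a single light spot on a NeoPixel-compatible LED strip from an Arduino. The pin number, pixel count, and the Adafruit_NeoPixel library are assumptions for illustration; the prototype's actual hardware and patterns may differ.

```cpp
// Hypothetical Arduino sketch: pulse a single light spot on a wrist-worn
// LED strip. Pin, pixel count, and the Adafruit_NeoPixel library are
// assumptions, not details taken from the demo.
#include <Adafruit_NeoPixel.h>

const int STRIP_PIN = 6;   // data pin the strip is wired to (assumed)
const int NUM_PIXELS = 8;  // number of LEDs on the armlet (assumed)

Adafruit_NeoPixel strip(NUM_PIXELS, STRIP_PIN, NEO_GRB + NEO_KHZ800);

void setup() {
  strip.begin();
  strip.show();  // initialize all pixels to 'off'
}

void loop() {
  // Slowly brighten and dim one pixel: a calm, ambient point-light pattern.
  for (int level = 0; level <= 255; level += 5) {
    strip.setPixelColor(0, strip.Color(0, 0, level));  // blue point light
    strip.show();
    delay(20);
  }
  for (int level = 255; level >= 0; level -= 5) {
    strip.setPixelColor(0, strip.Color(0, 0, level));
    strip.show();
    delay(20);
  }
}
```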
Fortmann, J., Müller, H., Heuten, W., and Boll, S. How to present information on wrist-worn point-light displays. Proc. of the 8th Nordic Conference on Human-Computer Interaction. ACM, New York, 2014.
Jutta Fortmann, University of Oldenburg
[email protected]
Heiko Müller, OFFIS – Institute for Information Technology
[email protected]
Wilko Heuten, OFFIS – Institute for Information Technology
[email protected]
Susanne Boll, University of Oldenburg
[email protected]
A participant wearing the prototype.
2. Gaze Gestures and Haptic Feedback on Wearable Devices
Wearable computing devices such as smartwatches and smart glasses are becoming more widely available. These devices present new interaction challenges: they are usually small, and the context of use limits the available interaction modalities. We are exploring the use of gaze as input and haptics as output for wearable devices. This demonstration lets users experience the combination of gaze input and haptic output in a bus schedule application. Links to such applications could be embedded in the environment, much like paper schedules have been used in the past.
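As a rough sketch of the input side, the C++ fragment below shows one simple way to recognize a deliberate horizontal gaze stroke in a stream of gaze samples. The thresholds and the gesture itself are illustrative assumptions, not the project's actual recognizer.

```cpp
// Minimal sketch: detect a fast rightward gaze stroke in a sample stream.
// Thresholds are made up for illustration.
#include <cstdio>
#include <vector>

struct GazeSample { double x; double t; };  // screen x (pixels), time (s)

// True if the samples contain a rightward stroke long and fast enough to
// count as a deliberate gesture rather than an ordinary saccade.
bool detectRightStroke(const std::vector<GazeSample>& s,
                       double minDistance = 300.0,   // pixels (assumed)
                       double maxDuration = 0.4) {   // seconds (assumed)
  for (size_t i = 0; i < s.size(); ++i) {
    for (size_t j = i + 1; j < s.size(); ++j) {
      double dt = s[j].t - s[i].t;
      if (dt > maxDuration) break;  // samples are time-ordered
      if (s[j].x - s[i].x >= minDistance) return true;
    }
  }
  return false;
}

int main() {
  // Synthetic gaze trace: a fixation, then a fast rightward stroke.
  std::vector<GazeSample> trace = {
      {100, 0.00}, {105, 0.05}, {102, 0.10},
      {250, 0.15}, {420, 0.20}, {460, 0.25}};
  if (detectRightStroke(trace)) {
    // In the demo, recognition would trigger a haptic pulse on the glasses.
    std::printf("gesture recognized -> fire haptic feedback\n");
  }
  return 0;
}
```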
http://www.uta.fi/sis/tauchi/projects/hagi.html
Kangas, J., Akkil, D., Rantala, J., Isokoski, P., Majaranta, P., and Raisamo, R. Gaze gestures and haptic feedback in mobile devices. Proc. of the SIGCHI Conference on Human Factors in Computing Systems. ACM, New York, 2014, 435–438.
Kangas, J., Akkil, D., Rantala, J., Isokoski, P., Majaranta, P., and Raisamo, R. Using gaze gestures with haptic feedback on glasses. Proc. of the 8th Nordic Conference on Human-Computer Interaction. ACM, New York, 2014.
Jari Kangas, University of Tampere
[email protected]
Deepak Akkil, University of Tampere
[email protected]
Jussi Rantala, University of Tampere
[email protected]
Poika Isokoski, University of Tampere
[email protected]
Päivi Majaranta, University of Tampere
[email protected]
Roope Raisamo, University of Tampere
[email protected]
Close-up of the haptic actuator attached to the temples of the glasses.
Gaze gestures performed on top of the simulated info board are tracked by a gaze tracker at the bottom of the tablet computer.
3. Low-Cost Latency Measurement System for Eye-Mouse Software
An important characteristic of window- and gaze-contingent tools is how quickly they react to pointer or eye movements: the update delay, or latency, of the contingent response. Here we demonstrate a handy way to measure the latency of gaze-contingent or mouse-based software. We present a low-cost latency-measurement system that is useful for studies involving eye-movement tracking tools. In our demonstration, we use the system to measure the latency of a gaze-contingent tool.
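The arithmetic behind this kind of camera-based measurement is simple: film both the physical input and the screen, find the frame in which each event appears, and divide the frame distance by the frame rate. A minimal sketch, with made-up frame numbers and frame rate:

```cpp
// Frame-counting arithmetic for camera-based latency measurement: if the
// physical input appears in frame f_input and the corresponding screen
// update in frame f_screen, latency = (f_screen - f_input) / fps.
// All numbers below are invented for illustration.
#include <cstdio>

// Latency in milliseconds from two frame indices and the camera frame rate.
double latencyMs(int inputFrame, int screenFrame, double fps) {
  return (screenFrame - inputFrame) * 1000.0 / fps;
}

int main() {
  int inputFrame = 120;   // frame where the physical input is visible (assumed)
  int screenFrame = 138;  // frame where the screen update is visible (assumed)
  double fps = 240.0;     // high-speed camera frame rate (assumed)

  // (138 - 120) / 240 s = 75 ms end-to-end latency in this example.
  std::printf("latency: %.1f ms\n", latencyMs(inputFrame, screenFrame, fps));
  return 0;
}
```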
http://www.youtube.com/watch?v=eGKSeMrALcY&feature=youtu.be
Orlov, P. and Bednarik, R. Low-cost latency measurement system for eye-mouse software. Proc. of the 8th Nordic Conference on Human-Computer Interaction. ACM, New York, 2014, 1085–1088.
Pavel A. Orlov, University of Eastern Finland and St. Petersburg State Polytechnic University
[email protected]
Roman Bednarik, University of Eastern Finland
[email protected]
A ball and a mirror are used to measure the delay between user input and screen update.
4. SID—Sensuousness, Interaction, and Participation
The SID project is about Snoezelen, a method based on the use of a multisensory environment (room) designed to awaken children's interests. It offers them the opportunity to discover, explore, and experience at their own pace. It arouses curiosity and urges them to act, but it also offers a haven for relaxation. The purpose of the SID project is to further develop the Snoezelen concept, creating new opportunities for children with developmental disabilities to utilize today's interactive possibilities. For example, by adding microphones, speakers, subwoofers and a computer to a waterbed, any movement in the waterbed becomes a bodily dialogue with its inner "wavescape" of evolving sound layers, vibrations, and even infrasonic kicks in the water. SID aims to develop new possibilities for sensuousness, interaction, and participation through an interactive multisensory environment.
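As a rough sketch of this kind of movement-to-sound mapping, the fragment below lets the energy of a (here simulated) microphone signal drive the loudness of a sound layer. The smoothing constant and the mapping are assumptions; the actual installation is far richer.

```cpp
// Illustrative mapping: microphone energy (movement in the waterbed)
// smoothly drives the gain of a sound layer. Values are made up.
#include <cmath>
#include <cstdio>
#include <vector>

// Root-mean-square energy of one block of audio samples.
double blockRms(const std::vector<double>& block) {
  double sum = 0.0;
  for (double s : block) sum += s * s;
  return block.empty() ? 0.0 : std::sqrt(sum / block.size());
}

int main() {
  // Fake microphone blocks: quiet, then a movement, then quiet again.
  std::vector<std::vector<double>> blocks = {
      {0.01, -0.02, 0.01}, {0.40, -0.50, 0.45}, {0.02, -0.01, 0.01}};

  double gain = 0.0;          // current loudness of the sound layer
  const double smooth = 0.5;  // smoothing factor, 0..1 (assumed)

  for (const auto& block : blocks) {
    // Follow the envelope: movement raises the gain, stillness lets it decay.
    gain = smooth * gain + (1.0 - smooth) * blockRms(block);
    std::printf("layer gain: %.3f\n", gain);
  }
  return 0;
}
```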
http://vimeo.com/channels/193431
Larsen, H.S. and Hedvall, P.-O. Ideation and ability: When actions speak louder than words. Proc. of the 12th Participatory Design Conference: Exploratory Papers, Workshop Descriptions, Industry Cases. ACM, New York, 2012, 37–40.
Héctor A. Caltenco, Lund University
[email protected]
Henrik Svarrer Larsen, Lund University
[email protected]
Per-Olof Hedvall, Lund University
[email protected]
An interactive multisensory room helps awaken the senses of children.
©2015 ACM 1072-5520/15/01 $15.00
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.