Sonification of Haptic Interaction in a Virtual Scene

Emma Frid, Roberto Bresin
Sound and Music Computing, CSC, KTH Royal Institute of Technology
Stockholm, Sweden
emmafrid@kth.se, roberto@kth.se

Jonas Moll, Eva-Lotta Sällnäs Pysander
Interaction Design, CSC, KTH Royal Institute of Technology
Stockholm, Sweden
jomol@csc.kth.se, evalotta@csc.kth.se

ABSTRACT

This paper presents a brief overview of work in progress on a study of correlations between visual and haptic spatial attention in a multimodal single-user application comparing different modalities. The aim is to gain insight into how auditory and haptic versus visual representations of temporal events may affect task performance and spatial attention. For this purpose, a 3D application involving one haptic model and two different sound models for interactive sonification has been developed.

Keywords: interactive sonification, haptic feedback, spatial attention

1. BACKGROUND

Integration of haptic feedback in computer music applications, especially in the context of Digital Musical Instruments (DMIs), is a growing research field (see e.g. [1, 2]). Numerous studies have focused on how force feedback devices, i.e. controllers that read position information and provide continuous force feedback in response to user movements, can be used in applications involving both sound and haptics [3, 4, 5, 6].

Audio-tactile and audio-proprioceptive interaction has been found to play an important role in spatial orientation in virtual scenes [7]. Moreover, it has been suggested that auditory and tactile signals are more effective than visual signals when it comes to drawing cross-modal attention to particular positions [8]. The current study is motivated by the fact that few previous investigations have focused on cross-modal links in spatial attention for sonified 3D haptic interfaces.

2. AIM

The purpose of this study is to investigate how visual spatial attention and haptic spatial attention correlate in a single-user application comparing combinations of different modalities. We aim to investigate how different representations of temporal events affect task performance by triggering a shift of attention. The following hypotheses will be tested: 1) providing auditory and/or haptic feedback will trigger a visual attention shift, and 2) auditory feedback can elicit an increased sense of effort; a user's gestures can be affected by ecological knowledge of sound-producing events related to the implemented sound model.

Copyright: © 2014 Emma Frid et al. This is an open-access article distributed under the terms of the Creative Commons Attribution 3.0 Unported License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

3. METHOD

A SensAble™ Phantom® Desktop haptic device¹ is used together with eye-tracking technology to analyze how focus of attention is affected by combinations of different modalities. The haptic device has a pen-like stylus, attached to a robotic arm, which is used to haptically interact with objects in virtual environments. A 3D application based on a simple task, in which the user is supposed to throw a ball into a goal (see Figure 1), has been developed. The application provides haptic, visual and auditory feedback.
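The paper does not detail the force rendering, but a grasp interaction of this kind is commonly rendered as a spring-damper coupling between stylus and ball inside the high-rate haptic servo loop. The C++ sketch below is a hypothetical outline of one such iteration, not the Chai3D API that the application actually uses; the device I/O stubs readStylusPosition and sendForce, and all gain values, are assumptions for illustration.

struct Vec3 { double x, y, z; };

// Hypothetical device I/O stubs; in the real application, Chai3D's
// haptic thread handles communication with the Phantom Desktop.
static Vec3 readStylusPosition() { return {0.0, 0.0, 0.0}; }
static void sendForce(const Vec3& /*f*/) { /* command force (N) */ }

// One iteration of a simplified ~1 kHz haptic servo loop: while the
// ball is grasped, render a spring-damper coupling between the stylus
// tip and the ball so the user feels resistance when throwing.
void hapticStep(const Vec3& ballPos, const Vec3& stylusVel, bool grasped)
{
    const double k = 400.0;  // spring stiffness (N/m), assumed value
    const double b = 2.0;    // damping (N*s/m), assumed value
    Vec3 p = readStylusPosition();
    Vec3 f = {0.0, 0.0, 0.0};
    if (grasped) {
        f.x = k * (ballPos.x - p.x) - b * stylusVel.x;
        f.y = k * (ballPos.y - p.y) - b * stylusVel.y;
        f.z = k * (ballPos.z - p.z) - b * stylusVel.z;
    }
    sendForce(f);
}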

Eye-tracking data will be correlated with haptic tracking data in order to investigate hypothesis 1), i.e. whether focus shifts from the ball to the goal depending on the provided feedback. Hypothesis 2) will be tested through a comparison between the haptic and non-haptic conditions.

Figure 1: Experimental setup with the SensAble Phantom Desktop, Tobii X2-60 eye-tracker and 3D application.

Experiments with first-year students from the Computer Science program at KTH Royal Institute of Technology will be carried out. Initially, pilot experiments involving vocal sketching [9] will be conducted. The pilot tests will provide ideas for the design of two different sound models, but also serve as a first evaluation of the entire setup.

¹ http://www.dentsable.com/haptic-phantom-desktop.htm


The subsequent experiments will comprise auditory-haptic, auditory-visual, auditory-visual-haptic, haptic-visual and visual-only conditions. A between-group design will be adopted, where each group will solve the task in one of the conditions.

Subjects will be given a period of 5 minutes of practice before the actual experiment starts. After the practice trial, they will be instructed to try to throw the ball into the goal 40 times. Task performance, defined as the ratio between the number of hits and the total number of trials (40), will be computed for each subject. We define visual attention in terms of the time that a user spends focusing on a specific area of the screen.
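Stated as code, the two measures might be computed as in the following C++ sketch; the data layout, the area-of-interest rectangle, and the fixed 60 Hz gaze sampling period are assumptions (the actual recording and analysis will be done with the Tobii eye-tracker and the Morae software).

#include <vector>

// Task performance: number of goals divided by the total number of
// throws (40 in the experiment).
double taskPerformance(int hits, int totalTrials = 40)
{
    return static_cast<double>(hits) / totalTrials;
}

struct GazeSample { double x, y; };       // gaze point in screen pixels
struct Rect { double x0, y0, x1, y1; };   // area of interest (AOI)

// Visual attention on an AOI: total time that gaze samples fall inside
// the AOI rectangle, assuming a constant sample period (60 Hz here,
// matching the Tobii X2-60).
double dwellTimeSeconds(const std::vector<GazeSample>& gaze, const Rect& aoi,
                        double samplePeriod = 1.0 / 60.0)
{
    int inside = 0;
    for (const GazeSample& g : gaze)
        if (g.x >= aoi.x0 && g.x <= aoi.x1 && g.y >= aoi.y0 && g.y <= aoi.y1)
            ++inside;
    return inside * samplePeriod;
}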

3.1 Apparatus

The 3D application, based on the haptic software library Chai3D [10], is written in C++. As previously mentioned, a SensAble Phantom Desktop device will be used to provide force feedback. The sound models for providing auditory feedback have been developed in Max, and sound synthesis is done on a separate computer. Communication between Max and the 3D application is done via Open Sound Control (OSC) [11]. A pair of Sennheiser HD 433 headphones will be used for auditory feedback. Eye-tracking data will be recorded using a commercial X2-60 eye-tracker from Tobii Technology². The Morae software³ for usability testing will be used to set up, record and analyze study data.
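The paper does not name an OSC implementation; the sketch below uses the oscpack C++ library as one plausible choice, with the host address, port number and OSC address pattern chosen arbitrarily for illustration.

#include "osc/OscOutboundPacketStream.h"  // oscpack
#include "ip/UdpSocket.h"                 // oscpack

// Send one interaction event to the Max patch over UDP/OSC. The
// address pattern "/event/goal", host 127.0.0.1 and port 7400 are
// assumptions; the paper does not specify them.
void sendGoalEvent(float pitchStep)
{
    char buffer[256];
    osc::OutboundPacketStream p(buffer, sizeof(buffer));
    p << osc::BeginMessage("/event/goal") << pitchStep << osc::EndMessage;
    UdpTransmitSocket socket(IpEndpointName("127.0.0.1", 7400));
    socket.Send(p.Data(), p.Size());
}

A call such as sendGoalEvent(1.0f) would then trigger the corresponding earcon (see Table 1) in the Max patch running on the sound-synthesis computer.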

3.2 Sound Design

A summary of different interaction events and the suggested corresponding auditory feedback can be seen in Table 1. Most interaction sounds were designed as earcons [12], since many of the sound-triggering events in the 3D application have no intuitive mapping to an auditory event. For the sonification of the interaction with the haptic ball, i.e. the gesture where the user is aiming at the target, we compare two sound models: a simple model based on filtered white noise (simulating a whooshing sound), and a sound model designed using the friction preset from the Sound Design Toolkit (SDT) [13].
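Purely to illustrate the filtered-noise idea (the actual sound models are Max patches running on a separate computer), the following C++ sketch passes white noise through a one-pole low-pass filter whose cutoff rises with stylus speed; all parameter ranges are assumptions.

#include <algorithm>
#include <cmath>
#include <cstdlib>
#include <vector>

// Conceptual "whoosh": white noise through a one-pole low-pass filter,
// with cutoff frequency following stylus speed. Parameter ranges
// (200 Hz - 4 kHz, speeds up to 1 m/s) are assumed for illustration.
std::vector<float> whoosh(double speedMs, int numSamples, double sr = 44100.0)
{
    const double kTwoPi = 6.283185307179586;
    double cutoff = 200.0 + std::min(speedMs, 1.0) * 3800.0;  // Hz
    double a = std::exp(-kTwoPi * cutoff / sr);               // filter pole
    std::vector<float> out(numSamples);
    float y = 0.0f;
    for (int n = 0; n < numSamples; ++n) {
        float white = 2.0f * std::rand() / float(RAND_MAX) - 1.0f;
        y = float((1.0 - a) * white + a * y);  // one-pole low-pass
        out[n] = y;
    }
    return out;
}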

The models are designed in such a manner that the sound changes in terms of stereo panning and frequency depending on movements in the x and y directions, respectively. Velocity is mapped to volume, and a specific mapping for movement along the z-axis is adopted for each sound model (see Table 1).
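The shared part of this mapping might be expressed as in the C++ sketch below; the workspace extents, the frequency range, and the equal-power pan law are illustrative assumptions, and only the z-axis mapping differs between the two models.

#include <algorithm>
#include <cmath>

double clamp01(double v) { return std::min(1.0, std::max(0.0, v)); }

// x position (assumed workspace -0.1..0.1 m) -> equal-power stereo pan.
void panGains(double x, double& left, double& right)
{
    const double kHalfPi = 1.5707963267948966;
    double p = clamp01((x + 0.1) / 0.2);   // 0 = hard left, 1 = hard right
    left  = std::cos(p * kHalfPi);
    right = std::sin(p * kHalfPi);
}

// y position (same assumed range) -> frequency on an exponential scale,
// from 200 Hz at the bottom of the workspace to 2 kHz at the top.
double yToFrequency(double y)
{
    double t = clamp01((y + 0.1) / 0.2);
    return 200.0 * std::pow(10.0, t);
}

// Stylus speed (m/s) -> linear volume, saturating at an assumed 1 m/s.
double velocityToVolume(double speedMs) { return clamp01(speedMs); }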

4. PRELIMINARY RESULTS

Pilot tests involving vocal sketching are being performed at the time of writing. Initial findings have led to conclusions regarding adjustments that are required to ensure robust behaviour and reliable interaction in the virtual 3D environment. Improvements to the application as well as to the sound models will be made in an iterative manner as the pilot tests proceed, until the setup is stable enough for the actual experiments to be carried out.

² http://www.tobii.com/en/eye-tracking-research/global/products/hardware/tobii-x2-60-eye-tracker/

³ http://www.techsmith.com/morae.html

Table 1: Auditory feedback and mapping.

EVENT            AUDIO MESSAGE
goal             MIDI sequence, increasing pitch
miss             MIDI sequence, decreasing pitch
hit wall         impact sound model: dissonant bell (SDT)
grasp ball       filtered noise, increasing frequency + click
ball bouncing    impact sound model: wood (SDT)
aim at target    velocity mapped to volume;
                 movement in x, y, z mapped to:
                 panning, frequency, comb-filter characteristics (a)
                 panning, frequency, rubbing force (b)

(a) = filtered noise model, (b) = friction model (SDT)


5. FUTURE WORK

As a continuation of this study, future investigations could involve assessing how visual spatial attention is affected by auditory and haptic feedback in a multi-user setting.

Acknowledgments

This work was supported by the Swedish Research Council (Grant No. D0511301).

6. REFERENCES

[1] L. L. Chu, "Haptic feedback in computer music performance," in Proceedings of the International Computer Music Conference. ICMA, 1996, pp. 57–58.

[2] S. M. O'Modhrain and C. Chafe, "Incorporating haptic feedback into interfaces for music applications," in Proceedings of the International Symposium on Robotics with Applications, World Automation Conference, 2000.

[3] M. Giordano, S. Sinclair, and M. M. Wanderley, "Bowing a vibration-enhanced force feedback device," in Proceedings of the New Interfaces for Musical Expression Conference (NIME), 2012, pp. 445–448.

[4] P. Moss and B. Cunitz, "Haptic theremin: Developing a haptic musical controller using the Sensable Phantom Omni," in Proceedings of the International Computer Music Conference, 2005, pp. 275–277.

[5] A. Kontogeorgakopoulos and G. Kouroupetroglou, "Low cost force-feedback interaction with haptic digital audio effects," in Gesture and Sign Language in Human-Computer Interaction and Embodied Communication, ser. Lecture Notes in Computer Science, E. Efthimiou, G. Kouroupetroglou, and S.-E. Fotinea, Eds. Springer Berlin Heidelberg, 2012, vol. 7206, pp. 48–56.

[6] K. Crommentuijn and F. Winberg, "Designing auditory displays to facilitate object localization in virtual haptic 3D environments," in Proceedings of the 8th International ACM SIGACCESS Conference on Computers and Accessibility. ACM, 2006, pp. 255–256.


[7] M. E. Altinsoy and M. Stamm, "Touch the sound: The role of audio-tactile and audio-proprioceptive interaction on the spatial orientation in virtual scenes," in Proceedings of Meetings on Acoustics, vol. 19, no. 1. Acoustical Society of America, 2013, p. 3458.

[8] C. Spence and J. Driver, "Cross-modal links in attention between audition, vision, and touch: Implications for interface design," International Journal of Cognitive Ergonomics, vol. 4, pp. 351–373, 1997.

[9] I. Ekman and M. Rinott, “Using vocal sketching for designing sonic interactions,” in Proceedings of the 8th ACM Conference on Designing Interactive Systems, ser. DIS ’10. New York, USA: ACM, 2010, pp. 123–131.

[10] F. Conti, F. Barbagli, D. Morris, and C. Sewell, “CHAI: An open-source library for the rapid development of haptic scenes,” in Proceedings of the IEEE World Haptics Conference, Pisa, Italy, March 2005, available: http://www.chai3d.org/.

[11] M. Wright, “Open Sound Control: an enabling tech-nology for musical networking,” Organised Sound, vol. 10, no. 3, pp. 193–200, 2005.

[12] M. M. Blattner, D. A. Sumikawa, and R. M. Greenberg, "Earcons and icons: Their structure and common design principles," Human–Computer Interaction, vol. 4, no. 1, pp. 11–44, 1989.

[13] S. Delle Monache, P. Polotti, and D. Rocchesso, "A toolkit for explorations in sonic interaction design," in Proceedings of the 5th Audio Mostly Conference: A Conference on Interaction with Sound, New York, NY, USA, 2010, pp. 7–13.
