
http://www.diva-portal.org

Postprint

This is the accepted version of a paper presented at SweCog 2017, Uppsala, Sweden, October 26–27, 2017.

Citation for the original published paper:

Moll, J., Frid, E. (2017). Using eye-tracking to study the effect of haptic feedback on visual focus during collaborative object managing in a multimodal virtual interface. In: Proceedings of the 13th SweCog conference (pp. 49–51). Högskolan i Skövde.

N.B. When citing this work, cite the original published paper.

Permanent link to this version:


Using Eye-Tracking to Study the Effect of Haptic Feedback on Visual Focus During Collaborative Object Managing in a Multimodal Virtual Interface

Jonas Moll1, & Emma Frid2

1Department of Information Technology, Uppsala University

2Department of Media Technology and Interaction Design, KTH Royal Institute of Technology

jonas.moll@it.uu.se

Haptic feedback is feedback provided to our sense of touch. The effects of haptic feedback on task performance and dual-task capacity in multimodal virtual environments have been extensively studied (Burke et al., 2006). Moreover, the effects of this type of feedback on collaboration and communication in collaborative virtual environments are gaining more and more interest (Moll, 2013; Sallnäs, 2004). However, to our knowledge, none of the previous studies on this topic has investigated how haptic feedback affects visual attention during collaborative object handling. The pilot study described in this abstract extends the work on collaborative haptic interfaces to also involve visual attention, by employing eye-tracking methodologies. Eye tracking is a technique in which an individual's eye movements are measured in order to detect where the person is looking at a specific point in time (Poole and Ball, 2006). Common eye-tracking metrics include measures of fixations, such as number of fixations and total fixation duration. Fixations are moments when the eyes are relatively stationary due to information processing.
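To make these metrics concrete, the sketch below shows one common way to derive them from raw gaze samples: a dispersion-threshold (I-DT) fixation filter. It is only an illustration under assumed thresholds; the analysis software bundled with commercial eye trackers typically applies its own, more elaborate filters, and the thresholds used in the pilot are not reported in this abstract.

```python
# Minimal dispersion-threshold (I-DT) fixation detection sketch.
# Input: gaze samples as (timestamp_s, x_px, y_px) tuples.
# The thresholds below are illustrative assumptions, not the study's settings.

def dispersion(window):
    xs = [x for _, x, _ in window]
    ys = [y for _, _, y in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def detect_fixations(samples, max_dispersion_px=35.0, min_duration_s=0.10):
    fixations = []  # list of (start_time, end_time)
    i, n = 0, len(samples)
    while i < n:
        # Grow an initial window that spans at least min_duration_s.
        j = i
        while j < n and samples[j][0] - samples[i][0] < min_duration_s:
            j += 1
        if j >= n:
            break
        if dispersion(samples[i:j + 1]) <= max_dispersion_px:
            # Extend the window while the gaze stays within the dispersion threshold.
            while j + 1 < n and dispersion(samples[i:j + 2]) <= max_dispersion_px:
                j += 1
            fixations.append((samples[i][0], samples[j][0]))
            i = j + 1  # continue after this fixation
        else:
            i += 1  # slide the window forward

    return fixations

def fixation_metrics(samples):
    fixations = detect_fixations(samples)
    return {
        "fixation_count": len(fixations),
        "total_fixation_duration_s": sum(end - start for start, end in fixations),
    }
```

Applied per trial and per area of interest, this yields exactly the kinds of measures discussed below: fixation count and total fixation duration.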

In our previous work (Frid et al., 2017), we presented an exploratory study on the effect of auditory feedback on gaze behavior in conditions with versus without haptic feedback. Analysis of eye-tracking metrics indicated large inter-subject variability; the differences between subjects were greater than the differences between feedback conditions. No significant effect of feedback type was observed, but clusters of similar behaviors were identified, and certain participants appeared to be affected by the presented auditory feedback.

Following up on this study, we developed a collaborative interface in which two users have to work together in order to move an object to a defined destination. The graphical interface is similar to the one used in Frid et al. (2017), but the task is different: the two users move an object by pushing it from opposite sides; the object can then be lifted and placed on top of one of two pillars (the interface and setup are shown in Figure 1). The two users performed the experiment in the same room, each with a separate display. They could talk to each other, but a white screen separated them.
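As a rough illustration of the coupling this task implies, the sketch below models, in one dimension, how the object moves only under the combined pushes of both users and how each user feels the resulting reaction force. This is a simplification introduced here for exposition; the constants and names are our assumptions, and the actual interface's haptic rendering loop is not described in this abstract.

```python
# Illustrative 1-D penalty model of two users jointly pushing an object.
# All names and constants are assumptions made for this sketch,
# not taken from the actual interface implementation.

STIFFNESS = 400.0   # N/m, penalty stiffness for proxy-object contact (assumed)
OBJECT_MASS = 0.5   # kg (assumed)
DAMPING = 2.0       # N*s/m, light damping on the object (assumed)
DT = 0.001          # s, typical haptic-loop time step (~1 kHz)

def step(object_pos, object_vel, left_proxy, right_proxy):
    """Advance the object one time step; return (pos, vel, force_left_user, force_right_user)."""
    half_width = 0.05  # m, half the object's width (assumed)

    # Penalty force from the left user's proxy penetrating the object's left face.
    left_depth = left_proxy - (object_pos - half_width)
    f_left = STIFFNESS * left_depth if left_depth > 0 else 0.0    # pushes object to the right

    # Penalty force from the right user's proxy penetrating the object's right face.
    right_depth = (object_pos + half_width) - right_proxy
    f_right = -STIFFNESS * right_depth if right_depth > 0 else 0.0  # pushes object to the left

    # Integrate object motion (semi-implicit Euler).
    net = f_left + f_right - DAMPING * object_vel
    object_vel += (net / OBJECT_MASS) * DT
    object_pos += object_vel * DT

    # Each user feels the reaction to the force they exert on the object, so a
    # partner pushing back from the other side is perceived as resistance.
    return object_pos, object_vel, -f_left, -f_right
```

Calling such a step once per loop iteration with each user's proxy position is one simple way to realize the "forces from the other user" that the haptic pairs in the pilot could feel.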


A pilot experiment was conducted using the above-described collaborative interface. The experiment had a between-group design. Six pairs participated: three pairs had access to haptic feedback (they could feel all parts of the workspace, including the object, as well as forces from the other user when moving the object), and three pairs did not receive any haptic feedback. The following hypotheses were tested in the study:

H1: Users will focus significantly longer on the target areas (i.e. the pillars) in the haptic condition than in the non-haptic condition.

H2: Users will have significantly more fixations and visits (glances) in the haptic condition than in the non-haptic condition.

Although we found no significant differences between the haptic and non-haptic groups for the investigated eye-tracking metrics (fixation count, total fixation duration, visit count, and total visit duration), interesting tendencies related to the effect of haptic feedback on visual focus emerged.

Figure 2. Box plots of fixation counts for the left and right target (i.e. pillar) (n = 4 for the non-haptic case, n = 6 for the haptic case).

Box plots of fixation counts for the left and right target (pillar) can be seen in Figure 2, showing lower medians and smaller interquartile ranges for the non-haptic case. A heat-map analysis of total fixation duration also indicated that participants focused more on the target areas in the haptic case. Even though the null hypotheses could not be rejected, the results indicated that a larger sample size might yield significant findings. The pilot experiment results suggest that haptic feedback can indeed affect gaze behavior during joint object manipulation in virtual environments. These tendencies encourage us to move forward with future experiments using a similar setup. The next step of the study will involve adding movement sonification to the task in order to evaluate the effect of auditory feedback (and of combined auditory and haptic feedback) on gaze behavior, similarly to the previous study by Frid et al. (2017).
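For this kind of small-sample, between-group comparison, the sketch below shows one reasonable way to compare fixation counts between the groups and produce box plots like those in Figure 2. The abstract does not state which statistical test was used, so the non-parametric Mann-Whitney U test here is our assumption, and the input values are random placeholders rather than the pilot data.

```python
# Group comparison of fixation counts, haptic vs. non-haptic.
# The numbers generated below are random placeholders for illustration only;
# they are NOT the data collected in the pilot study.
import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(0)
haptic_counts = rng.integers(30, 80, size=6)      # n = 6 recordings (placeholder values)
nonhaptic_counts = rng.integers(20, 60, size=4)   # n = 4 recordings (placeholder values)

# Non-parametric test, an assumption suited to small samples;
# not necessarily the analysis used in the original abstract.
stat, p = mannwhitneyu(haptic_counts, nonhaptic_counts, alternative="two-sided")
print(f"Mann-Whitney U = {stat:.1f}, p = {p:.3f}")

plt.boxplot([nonhaptic_counts, haptic_counts], labels=["non-haptic", "haptic"])
plt.ylabel("Fixation count (right target)")
plt.title("Fixation counts per group (placeholder data)")
plt.show()
```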

References

Burke, J., Prewett, M. S., Gray, A. A., Yang, L., Stilson, F. R. B., Coovert, M. D., Elliot, L. R., & Redden, E. (2006). Comparing the effects of visual-auditory and visual-tactile feedback on user performance: a meta-analysis. In Proceedings of the 8th International Conference on Multimodal Interfaces (Banff, Canada, November 2006), pp. 108–117.

Frid, E., Bresin, R., Pysander, E. L. S., & Moll, J. (2017). An Exploratory Study on the Effect of Auditory Feedback on Gaze Behavior in a Virtual Throwing Task With and Without Haptic Feedback. In Sound and Music Computing (SMC) 2017, pp. 242–249.


Moll, J. (2013). The Influence of Modality Combinations on Communication in Collaborative Virtual Environments. Doctoral thesis, School of Computer Science and Communication, Royal Institute of Technology, Stockholm.

Poole, A., & Ball, L. J. (2006). Eye Tracking in HCI and Usability Research. In Encyclopedia of Human Computer Interaction, vol. 1, pp. 211–219.

Sallnäs, E-L. (2004). The effect of modality on social presence, presence and performance in collaborative virtual environments. Doctoral thesis, School of Computer Science and Communication, Royal Institute of Technology, Stockholm.
