

Humanized technology

A study of how to implement human

characteristics in eye tracking interactions with Tobii EyeX

PAOLA ALONSO CHAPEL BJANKA COLIC

Master of Science Thesis Stockholm, Sweden 2016


Humanized technology

A study of how to implement human characteristics in eye tracking interactions with Tobii EyeX

Bjanka Colic

Paola Alonso Chapel

Master of Science Thesis MMK 2016:112 IDE 184 KTH Industrial Engineering and Management

Machine Design SE-100 44 STOCKHOLM


Abstract

This thesis was carried out in collaboration with the Swedish eye tracking company Tobii AB, at the department Tobii Tech. The purpose was to investigate what humanized technology means and how Tobii's product EyeX can enable a more natural and humanized interaction with the computer. The work followed an explorative approach, using secondary sources from relevant literature and qualitative studies, and was divided into a theoretical and a practical part.

Firstly, the theoretical part was conducted to gain general knowledge about eye tracking and its different areas of implementation. Currently, eye tracking is found mainly in research and in communication tools for people with neurological disabilities. Contrary to those application areas, EyeX is aimed at the consumer and gaming market, and the work is therefore limited to able-bodied users. This was followed by research concerning human interactions and humanization in human-computer interaction. The theoretical part of this project created a foundation for how relationships are created and how emotional aspects of human characteristics could be related to computer interaction. It also described how a meaningful and lasting relationship with computers and their various features could be created by means of habits. The work then continued with the practical part: three workshops, performed to further assess the relevance of the results from the related work, and qualitative data collection through interviews and observations with EyeX users. By the end of the user study it could be concluded that emotional aspects of human characteristics were essential for the development of interfaces and should therefore be implemented in human-computer interactions.

Finally, a supplementary workshop was conducted to clarify how emotional and abstract factors could more easily be related to practical eye tracking and computer features. In addition, two methods were created for evaluating humanization and emotional aspects in features and products.

Keywords: Eye tracking, humanized technology, UX design, stickiness, human interaction

Humanized technology

A study of how to implement human characteristics in eye tracking interactions with Tobii EyeX

Bjanka Colic Paola Alonso Chapel

Approved

2016-06-20

Examiner

Claes Tisell

Supervisor

Sara Ilstedt

Commissioner

Tobii AB

Contact person

Ida Nilsson


Master of Science Thesis MMK 2016:112 IDE 184

Humanized technology

A study of how human characteristics can be implemented in eye tracking interactions with Tobii EyeX

Bjanka Colic Paola Alonso Chapel

Approved

2016-06-20

Examiner

Claes Tisell

Supervisor

Sara Ilstedt

Commissioner

Tobii AB

Contact person

Ida Nilsson

Summary

This master thesis was carried out in collaboration with the Swedish eye tracking company Tobii AB, at the department Tobii Tech. The purpose of the work was to explore the term humanized technology and to investigate how their product EyeX can, to a higher degree, create more natural and intuitive interactions with the computer. To accomplish this, an explorative approach was used, divided into a theoretical and a practical part with secondary, credible sources and qualitative data collection.

The thesis began with the theoretical part, with the aim of gaining general knowledge about eye tracking and its current application areas. Today, eye tracking is mainly used for research purposes and as a communication and interaction tool for people with disabilities.

Contrary to its current application areas, the goal of EyeX is to reach the consumer and gaming market, and the thesis was therefore limited to able-bodied users. The theoretical part continued with human factors and more humanized human-computer interactions.

This formed a basis for understanding how human relationships are created and how human emotional factors could be linked to computer interaction.

The work then moved on to the practical part: three workshops, to investigate the relevance of the results from the literature study, and a qualitative data collection in the form of interviews and observations of EyeX users. The user study showed that emotional factors are important in interface design and should therefore be used in the development of human-computer interactions.

Finally, to investigate to what degree human aspects could be used, an additional workshop was held, where findings from the literature study were linked to practical applications of eye tracking and computer interactions. The conclusions of the work resulted in the development of two evaluation methods for analyzing humanized technology, including a design guide built on questions for developing more humanized computer interactions.

Keywords: Eye tracking, humanized technology, UX design, 'stickiness', human interaction


Acknowledgments

This master thesis work was performed in collaboration with Tobii AB, at the Tobii Tech department, from February to June 2016. The report completes our studies at The Royal Institute of Technology, KTH, in the field of Industrial Design Engineering, a track within the Master's program in Integrated Product Design.

We would like to thank all members of the UX Team at Tobii Tech who have helped guide us through this process. Thanks to Rebecka Lannsjö, Erland George-Svahn and Dennis Rådell for continued feedback during the project. We are deeply thankful to Martin Dechant, of the UX team, who provided considerable help and guidance on the structure of this thesis. Furthermore, we would like to give special thanks to our supervisor Ida Nilsson, also a member of the UX team, who has been a great sounding board and contributed relevant discussions and direction.

Other contributors to this project have been Björn Thuresson, who helped steer the project in the right direction, and all user study participants, who provided valuable insights without which this project could not have been accomplished.

Lastly, we would like to express our greatest gratitude to our supervisor at KTH, Sara Ilstedt, who gave continuous encouragement and guidance throughout the project process.


Contents

1. Introduction
2. Related work to eye tracking
   2.1 Technology
       2.1.1 Eye tracking applications
3. Related work to humanized technology
   3.1 Human interactions
       3.1.1 Emotions
       3.1.2 Human senses
       3.1.3 Human relationships
   3.2 Humanized technology in HCI and UX
       3.2.1 Recognizing human emotions in HCI
       3.2.2 Evoking human emotions in HCI
       3.2.3 Human relationships in HCI
       3.2.4 Habits and technology
       3.2.5 Suggested definition for humanized technology
4. Enhancing humanized technology with EyeX
   4.1 Tobii EyeX eye tracker
   4.2 Methodology
       4.2.1 Workshops and brainstorming methods
       4.2.2 User study I - Beta users
       4.2.3 User study II - Expert users
       4.2.4 Methods for analyzing the user findings
       4.2.5 Concept development
       4.2.6 Method discussion
5. Results
   5.1 Workshop results
       5.1.1 Trust and security
       5.1.2 Collaboration
       5.1.3 Feedback
       5.1.4 Adaption
       5.1.5 Curiosity
       5.1.6 Enjoyment
   5.2 Customer Journey Map
       5.2.1 Expectations
       5.2.2 Initial bonding
       5.2.3 Learning
       5.2.4 Continuous use
       5.2.5 Establishment of relationship
   5.3 Results from the Concept Workshop
6. Methods for creating humanized technology
   6.1 Emotional scale for users
   6.2 Checklist for UX designers
       6.2.1 Relationship bonding
       6.2.2 Emotional aspects
       6.2.3 Habits
7. Conclusions
8. Discussion
   8.1 Emotional relationship to EyeX
       8.1.1 Trust and security
       8.1.2 Collaboration
       8.1.3 Feedback
       8.1.4 Adaption
       8.1.5 Enjoyment and curiosity
   8.2 Implementation of humanization
9. Future possibilities
   9.1 Differentiating and humanizing the product for general computer use
   9.2 New market - Ergonomic purposes
References
Appendix A. Mind map of important human factors
Appendix B. Current UX in connection to human factors
Appendix C. Role play scenarios
Appendix D. Important factors for a long-lasting relationship
Appendix E. Triggers
Appendix F. Beta study diary
Appendix G. Interview guide for Beta users
Appendix H. Interview guide for EyeX expert users
Appendix I. Exercise for the concept development workshop
Appendix J. Customer Journey Map for gamers
Appendix K. Customer Journey Map for non-gamers
Appendix L. Humanization checklist for UX designers


1. Introduction

Human aspects are increasingly significant for all sorts of product development projects, and creating humanized products and technologies has become a trend (Extend Limits 2016). The term humanized technology implies a technology that adapts to human characteristics and modes of communication. Previously, human beings have adapted to technology and learned to use it, despite the lack of intuitive and natural interactions. The mouse and keyboard are two examples of how interactions have developed to facilitate computer use. To continue this course of development, it has been suggested that eye tracking is a natural extension of human interaction (Porta & Ravelli 2009) and could enable more humanized and fluent interactions with the computer (Liebling & Dumais 2014). The first eye trackers were built in the 19th century; they were technically challenging and uncomfortable for the participants (Holmqvist et al. 2011). Today eye tracking has become a more common and convenient technique, relied on in various areas of research, and users can choose between a large variety of systems and manufacturers (Holmqvist et al. 2011). Despite the potential of eye tracking and its enormous growth in recent years (Holmqvist et al. 2011), it is mostly used as an aid for people with neurological disabilities (Porta & Ravelli 2009; Kumar, Paepcke, et al. 2007), or more recently in market research, e.g. usability studies of websites (Duchowski 2007). In December 2015 Tobii Tech launched their eye tracker device EyeX (Tobii Tech 2016), created mainly for gamers to enhance their gaming experience through features such as natural targeting and infinite screen (UK 2015). One of the leading concepts for Tobii Tech is humanized technology, and Tobii Tech has tried to implement various human aspects into the EyeX product (Tobii 2016b). The aim is to create features that are more natural to humans in both interaction and behavior, in games as well as in general computer use (EyeX 2016). To extend eye tracking to the consumer market, which is the goal of Tobii Tech (Tobii 2016a), one approach is to investigate whether more human characteristics could be included to enhance user satisfaction. UX design considers human experiences and translates them to interfaces or products (Kim 2015). An interesting starting point was therefore to first explore the term humanized technology by investigating humans, understand how these factors are applied in UX and product development, and later apply these insights to new product design concepts for the eye tracking device EyeX.

An important aspect of this master thesis project was to apply another approach to eye tracking. Currently, research on eye tracking techniques is focused on efficiency and on finding solutions for improving the low accuracy (Yeoh et al. 2015; Jalaliniya 2016); on the market, the eye tracker is mainly used as an input device for disabled people (Porta et al. 2010; Kumar, Building, et al. 2007). Hence, this master thesis project investigates how eye tracking could be used to create more humanized interactions with computers for able-bodied users, and in the end two evaluation methods are presented as an aid to Tobii for their future development of EyeX.

The main research question to be answered during this project was: How could EyeX enable a more natural and humanized interaction with the computer for general use?

To answer this question, several subtopics had to be addressed, stated as the following questions:


• What is 'humanized' technology and how can it be applied in UX?

• Which aspects are important for a long-lasting relationship between products and humans?

• What is the relationship between the EyeX device and its users like?

• How are habits created, and how can the stickiness of the product be improved?

• How could Tobii use these insights to create better future features for EyeX, and how can the insights be presented as evaluation methods for Tobii to use in future development processes?

To be able to answer these questions, the work is divided into three parts, described in Figure 1:

Figure 1. The three large parts of the master thesis project and their content.

Finally, as both time and knowledge within programming were limited, it was decided that the results presented should not contain prototypes or mockups, and should rather be left at a conceptual stage; improvements were suggested and static visualizations of some ideas were made, but these were not developed into functioning proposals or prototypes.

Another limitation of this project was the chosen direction in the HCI field. Today HCI is divided into affective computing, where the aim is to have computers that understand the users and recognize their faces and eye movements, and UX design, where the aim is to evoke emotions and create positive experiences for the users. The latter was chosen in this project as it was more suitable for the purpose of the work and more in line with the objective of Tobii.


2. Related work to eye tracking

Today eye trackers are relatively easy to use and user-friendly, applied in various areas including research, market studies and, most recently, games (Duchowski 2007; Lankes et al. 2014). Below is a short introduction to the technology, its applications, current issues and possibilities.

2.1 Technology

The most crucial fact about the eye for understanding eye tracking is the difference between foveal and peripheral vision, i.e. the detailed image and the overall picture. The perception of details is obtained through the fovea in the center of the retina (Duchowski 2007), see Figure 2. Fixations are eye movements that stabilize the object of interest on the highest-resolution area, the fovea (Cantoni & Porta 2014). The term fixation can be misleading, as the eye is never completely still but performs small micro-movements (tremor, microsaccades and drifts); during a fixation, however, the eye remains relatively still over a period of time to gather information (Holmqvist et al. 2011). The eyes move in a particular way to bring various ranges of the visible field of view into high resolution, so they can be seen in fine detail (Cantoni & Porta 2014). Often, our attention is diverted to objects in regions of interest. Approximately 90 % of human vision is spent on fixations, during which information is gathered (Duchowski 2007); longer fixations indicate increased cognitive activity. In between fixations, when attention is switching to another area, the eyes perform rapid movements called saccades (Duchowski 2007). During saccades vision is suppressed and the eye is unable to focus (Duchowski 2007).
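The fixation/saccade distinction described above is commonly operationalized in software with a dispersion-threshold algorithm (often called I-DT): a window of gaze samples counts as a fixation while its spatial spread stays under a threshold. The sketch below is illustrative only, not taken from the thesis or from Tobii's software, and the threshold values are assumptions.

```python
def _dispersion(window):
    """Spread of a gaze-point window: x-range plus y-range."""
    xs = [p[0] for p in window]
    ys = [p[1] for p in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def detect_fixations(samples, max_dispersion=1.0, min_duration=5):
    """Dispersion-threshold (I-DT style) fixation detection.

    samples: (x, y) gaze points recorded at a fixed sample rate.
    max_dispersion: largest allowed spread within one fixation
        (same unit as the samples, e.g. degrees of visual angle).
    min_duration: minimum number of samples that make a fixation.
    Returns (start, end) index pairs; samples outside these windows
    are treated as saccadic movement.
    """
    fixations = []
    i, n = 0, len(samples)
    while i <= n - min_duration:
        j = i + min_duration
        if _dispersion(samples[i:j]) <= max_dispersion:
            # Grow the window while the points stay close together.
            while j < n and _dispersion(samples[i:j + 1]) <= max_dispersion:
                j += 1
            fixations.append((i, j - 1))
            i = j
        else:
            i += 1
    return fixations
```

For example, a trace that dwells on one point and then jumps to another yields two fixation windows separated by a saccade.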

Another important concept in eye tracking is gaze. Gaze can be defined as looking directly at any object, person or direction, while eye contact occurs when gaze is directed at another's eyes (Bohannon et al. 2013).

Figure 2. Illustration of the eye (Duchowski 2007, p.19)

The eye tracker provides the possibility to follow someone's eye movements, as the gaze is being detected (Cantoni & Porta 2014). Furthermore, by tracking someone's gaze you are automatically investigating the observer's path of attention, which in turn provides knowledge of how the environment is perceived by the user or observer.

Eye tracking can be used in two types of applications (Cantoni & Porta 2014). In the first, it is used as a passive sensor that monitors the eyes to determine what the user is watching. In the second, eye tracking has an active role in the interaction between the computer and the user, which is what is further investigated in this report.

According to Duchowski (2007) there are four main classes of eye movement measurement methodologies: Electro-OculoGraphy (EOG), which records electric potential differences on the skin around the eyes; scleral contact lenses (search coils), which use a lens on the eye to measure precise eye movements; Photo-OculoGraphy (POG) or Video-OculoGraphy (VOG), which measure distinguishable features during eye rotations; and video-based combined pupil and corneal reflection. Nowadays video-based combined pupil-corneal reflection (remote eye tracking) is most widely used (Duchowski 2007). The technology uses infrared light to create reflection patterns on the user's eyes; a high-rate camera captures images of the eyes and algorithms determine the positions of the eyes and the gaze points on the device screen (Purits & Söderback 2013). Compared to earlier generations of eye tracking this method is automatic, i.e. the focus does not have to be set manually, which leads to easier and faster calibration that users can perform by themselves (Holmqvist et al. 2011). Figure 3 below shows how the technology works in relation to optics. PR stands for Purkinje reflections, the corneal reflections on the eye. Measuring the distance between the first Purkinje image and the center of the pupil makes it possible to separate eye movements from head movements (Duchowski 2007).

Figure 3. Purkinje images (Duchowski 2007, p.57)

To calculate where someone is looking with the video-based eye tracking method, there are three steps: image acquisition, image analysis and gaze estimation (Holmqvist et al. 2011). Briefly described, the image of the eye is captured by the camera and sent for analysis. Secondly, the positions of the face and the eyes are detected in the image, after which algorithms extract the pupil and corneal reflection from the image. Finally, geometric models in combination with the calibration data map the position (x, y) of the pupil and corneal reflection to a gaze point. The system has its weaknesses: sensitivity to pupil dilation, extreme gaze angles, and difficulty in calculating the pupil center due to descending eyelids and disturbances from eyelashes (Holmqvist et al. 2011).
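The gaze estimation step, mapping the pupil-minus-corneal-reflection vector to screen coordinates via calibration data, is often done with a polynomial regression fitted by least squares over the calibration targets. The sketch below is a generic textbook-style illustration, not Tobii's actual algorithm; the second-order feature set is an assumption.

```python
import numpy as np

def _features(v):
    # Second-order polynomial features of the pupil-CR vector (vx, vy).
    vx, vy = v
    return [1.0, vx, vy, vx * vy, vx * vx, vy * vy]

def fit_gaze_mapping(vectors, screen_points):
    """Fit a pupil-CR vector -> screen (x, y) mapping from calibration data.

    vectors: (vx, vy) pupil-minus-corneal-reflection vectors recorded
        while the user looked at known calibration targets.
    screen_points: the (x, y) positions of those targets on screen.
    Returns a coefficient matrix for use with estimate_gaze().
    """
    A = np.array([_features(v) for v in vectors])
    B = np.array(screen_points)
    coeffs, *_ = np.linalg.lstsq(A, B, rcond=None)
    return coeffs

def estimate_gaze(coeffs, v):
    """Map one pupil-CR vector to an estimated on-screen gaze point."""
    return np.array(_features(v)) @ coeffs
```

With a 3 x 3 grid of calibration targets the six polynomial coefficients per axis are well determined, which is one reason calibration routines typically show at least nine points.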

2.1.1 Eye tracking applications

It is increasingly common that eye tracking is used as a means of studying and observing human behavior (Purits & Söderback 2013), to obtain knowledge about what the user is experiencing on the screen in order to increase usability (Duchowski 2007), and to find aesthetically important locations in images (Santella et al. 2006). The main focus has been on limited interaction for disabled users (Kumar, Paepcke, et al. 2007), resulting in specialized applications that apply dwell time and blinks. It is less known that eye tracker technology has begun to expand to the consumer market, mostly in gaming and mobile applications (Lankes et al. 2014), and that it could also be used as an input device in general computing for able-bodied users, similar to a mouse (Yeoh et al. 2015).

Zhai (1999) was one of the first to combine hand movements and gaze, in his MAGIC pointing, to include able-bodied users in the development. Today, the eye tracking technique holds promise in a variety of situations: in environments away from the desk, e.g. working with laptops (Porta et al. 2010), in allowing users to keep their hands on the keyboard (Kumar, Building, et al. 2007), and in terms of more direct and fluent interactions (Yeoh et al. 2015). From a human perspective, eye movements are initiated about 70 ms before the hand movements that follow them, motivating that the use of eye tracking could be faster (Liebling & Dumais 2014) and could minimize mouse movements, mitigating the problems of repetitive strain injury (RSI) related to computer mouse use (Sibert & Jacob 2000). More importantly, gaze interaction is believed to be a natural extension of human interaction (Porta & Ravelli 2009), as it uses the natural activity of the human eyes and does not demand that users learn a new technique (Sibert & Jacob 2000). Knowing what the user is looking at could help create a more natural interface; saccades, for example, are of significant interest as they provide a measure of changing focus or attention, which is of high importance for a better HCI experience (Liebling & Dumais 2014). Ultimately, gaze interaction should be considered as a future design interface (Sibert & Jacob 2000), as it builds on natural human practice, requires little conscious effort and frees the hands for other activities (Yeoh et al. 2015). Also, the eyes contain information about current tasks and the state of the individual, and implicitly indicate the area of the user's attention, which suggests that eye gaze interaction is a good candidate as a future input method (Sibert & Jacob 2000).

Despite the opportunities and interest in eye tracking and gaze-based interaction, there are crucial problems to be solved before the eye tracker can become well established on the consumer market. The most significant challenge for the algorithms is to improve accuracy: making gaze estimation more robust, establishing better common grounds for understanding fixations, and minimizing the noise from the pupil/corneal reflection location (Holmqvist et al. 2011).

Other issues connected to the technical aspects are: challenges of calibration, changing lighting conditions, the optical properties of visual aids such as glasses and contact lenses, and the lack of analogous functions for single-clicking and double-clicking (Zhang et al. 2008; Yeoh et al. 2015; Jalaliniya 2016; Ashmore & Duchowski 2005). The physiological limitations of the human eye also pose challenges, e.g. inherently jittery, involuntary movements (tremors, drifts, microsaccades) as well as the limited size of our fovea (Zhang et al. 2008; Yeoh et al. 2015; Ashmore & Duchowski 2005; Jalaliniya 2016). Various ideas have been presented in the hope of minimizing these issues. Kumar and Winograd (2007) proposed a gaze-enhanced UI design with three practical applications for improved pointing, application switching and scrolling. A practical gaze-based solution, EyePoint, performs selection through a two-step interaction, look-press-look-release (Kumar, Paepcke, et al. 2007): holding a hotkey brings up a magnified area, and releasing it selects. Similarly, Yeoh et al. (2015) presented three gaze + click alternatives: letter assignment (assigning letters to hyperlinks), offset menu (the use of a hotkey) and ray selection (a radial menu). Others focused on optimizing the scrolling function (Porta & Ravelli 2009; Ishii & Kobayashi 1992), e.g. EyeGrip, which used the natural optokinetic nystagmus eye movements (eye movements that occur when following objects in motion while the head remains stationary) to detect when automatic scrolling should occur (Jalaliniya 2016).

It has been discussed that to improve accuracy, the targets should be made larger, either through zoom (Bates & Istance 2002), target expansion (Miniotas et al. 2004) or enlargement of specific areas (Ashmore & Duchowski 2005). Ashmore and Duchowski (2005) also proposed the alternative solution of using a fisheye lens to locally magnify the display. Furthermore, Zhang et al. (2008) tried to increase the stability of the eye cursor by introducing three methods: force fields (a magnetic field attracting the cursor towards targets), speed reduction and warping onto the target. Other research papers have also considered new designs or concepts for the cursor to compensate for inaccuracy and facilitate use (Kumar, Paepcke, et al. 2007; Porta et al. 2010; Blanch & Ortega 2009). An alternative solution has been to introduce EyeGestures, similar to the mouse gesture plugins for the Firefox web browser (Drewes & Schmidt 2007).
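The speed-reduction idea for stabilizing a jittery eye cursor is, in essence, low-pass filtering of the raw gaze signal. A minimal sketch, assuming an exponential moving average with a jump threshold so that genuine saccades move the cursor immediately; the parameter values are illustrative, not taken from Zhang et al.

```python
class GazeSmoother:
    """Stabilize a jittery gaze cursor with an exponential moving average.

    alpha close to 0 gives heavy smoothing (slow cursor, little jitter);
    alpha close to 1 follows the raw signal. A distance threshold lets
    real saccades through immediately, so large gaze jumps are not lagged.
    """
    def __init__(self, alpha=0.2, jump_threshold=100.0):
        self.alpha = alpha
        self.jump_threshold = jump_threshold
        self.pos = None

    def update(self, raw):
        """Feed one raw gaze sample; return the smoothed cursor position."""
        if self.pos is None:
            self.pos = raw
        else:
            dx = raw[0] - self.pos[0]
            dy = raw[1] - self.pos[1]
            if (dx * dx + dy * dy) ** 0.5 > self.jump_threshold:
                self.pos = raw  # saccade: warp the cursor, do not smooth
            else:
                self.pos = (self.pos[0] + self.alpha * dx,
                            self.pos[1] + self.alpha * dy)
        return self.pos
```

The design trade-off is the one the cited work wrestles with: heavier smoothing hides tremor and microsaccades but makes the cursor feel sluggish, so the jump threshold is needed to keep deliberate gaze shifts responsive.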

Gaze-based interaction techniques are proposed to be the next type of computer input going mainstream (Yeoh et al. 2015). However, there are still various issues and challenges to solve, in particular in creating value for users in everyday computing. The previous work has shown that eye tracking devices are considered more natural (Sibert & Jacob 2000; Yeoh et al. 2015; Kumar, Building, et al. 2007); however, it has focused on solving the problems of accuracy and instability in eye movements and on improving efficiency, not on making the technique attractive through naturalness, ease and humanization of the technology, which is the future challenge for HCI.


3. Related work to humanized technology

This section presents a review of how human beings interact and how emotional aspects have previously been related to HCI and UX. An investigation of important aspects for continuous relationships is related to computer interaction, in order to finally define the term humanized technology.

3.1 Human interactions

This section describes the emotional and sensory aspects that are fundamental for human relationships and interactions. In Appendix A, a mind map summarizing the findings on human beings can be found.

3.1.1 Emotions

Emotions play an essential role in our everyday life, and they are also crucial for rational and cognitive functions, being a core part of decision making, perception, learning, planning, action selection and memory (Milanova et al. 2012; Hudlicka 2003; Dolan 2016). Not only are emotions important for cognitive processes, they also guide human control of behavior and are an aid in valuing various events and environments in terms of desirability (Hudlicka 2003; Dolan 2016). Humans interact with each other as well as with society, which imposes expectations upon those who interact and creates cultural differences, attitudes and values (De Mooij 2014).

On a neuroscientific level, emotions are the outcome of a complex system. The sensory organs process external environmental signals that pass through dedicated parts of the brain, where they are interpreted into meaning, finally creating emotions (Maiocchi 2015). Humans store various interpretations of the same situation in the form of experience, feelings and emotions, and later apply the most appropriate outcome to future situations (Maiocchi 2015). Hence, emotions are essential in social interactions, as other people's emotional states can be perceived thanks to prior experience with emotions, which also helps us engage in other people's lives (Adolphs 2003). Furthermore, emotional experiences can be gained from various sources, e.g. the beauty of nature. Artifacts produced by humans, such as art and music, can also stimulate emotions and feelings (Blood & Zatorre 2001; Adolphs 2003; Taura et al. 2011).

3.1.2 Human senses

To understand human emotions and interactions it is necessary to understand what type of external signals the sensory organs process and interpret (Maiocchi 2015). Humans perceive the world through the senses: vision, hearing, touch, smell and taste; using all the senses simultaneously and multimodally increases the perception of information and helps to perform tasks in an intuitive way (Bordegoni 2011). Table 1 shows the five human senses in relation to human social behavior and computer interaction.


Table 1. The five human senses and their relation to human social behavior with regard to implementation in HCI

Vision. Social behavior: eye contact, facial expressions, gestures, body language. Relevance for HCI: visual feedback is the main communication mode between the user and the computer.

Hearing. Social behavior: silence, language; intonation and tone of speech convey emotions and meaning (Dolan 2016). Relevance for HCI: an increasingly important mode of feedback, e.g. Siri and voice signals.

Touch. Social behavior: physical touches convey social messages, e.g. hugs; tactility (Haans & IJsselsteijn 2006; MacLean 2008). Relevance for HCI: mostly implemented in mobile technology as vibration.

Smell. Social behavior: no relation between smell and social human behavior has been found (Adolphs 2003). Relevance for HCI: perceived as irrelevant.

Taste. Social behavior: has not been investigated. Relevance for HCI: perceived as irrelevant.

The human visual system is important in communication, especially through non-verbal cues; emotions are normally manifested in recognizable behavioral patterns, communicated in the form of non-verbal cues (Dolan 2016; Knapp & Hall 2002). Non-verbal communication includes body language, facial expressions, eye contact, gestures, movements, posture, touching and vocal behavior, and additionally cues such as icons, indices and symbols (Knapp & Hall 2002).

Eye contact is one of the most important and fundamental non-verbal cues in social interactions (Bohannon et al. 2013; Farroni et al. 2002). It plays an essential role in perceived trust, formation of impressions of the other person, emotion and compliance (Bohannon et al. 2013). Studies have shown (Bohannon et al. 2013) that being able to make eye contact with your conversational partner increases the information input and reduces misunderstanding and inaccuracy. A crucial aspect of eye contact is gaze; eye contact arises when gaze is mutual. Gaze directed at another's eyes tells whether the person is present or absent and what emotions can be perceived. This interpretation of eye gaze can be stored as experiential information and re-applied in similar situations, which is essential for developing an understanding of others' emotional states (Farroni et al. 2002). Eye contact is important during social interactions, and people look each other in the eyes for several reasons: to get feedback, to ensure that the communication channel is open, and to avoid social rudeness (since it is not socially accepted to avoid eye contact while speaking) (Argyle & Dean 1965). Gaze is a powerful director of attention; our attention tends to follow the gaze of another person. Additionally, it can also influence the perception of attractiveness and likeability of another person, as liking has been shown to correlate with the amount of gaze and the total percentage of time spent looking (Soo et al. 2014).

Human face-to-face communication is the richest medium and provides immediate feedback: a direct response through a nod, a word, a new question, etc. will encourage a continued interaction (Ichimura & Mera 2013; Bohannon et al. 2013) through vision or speech. It is used to exchange information, reduce uncertainty in decision-making processes and produce meaningful messages (De Mooij 2014). Facial expressions are movements of the facial muscles in response to a person's emotional state or social communication (Chang et al. 2014). They are the most intensively monitored non-verbal cues in everyday life and are decoded faster than words (Scholl 2013); therefore they play an essential role in human face-to-face interaction (Chang et al. 2014). People are extremely skilled at decoding others' emotional states, such as fear and happiness, through facial expressions, and also make fast detections of changing configurations of faces, e.g. eye movements, the lift of an eyebrow and mouth movements. Facial cues can help us detect gender, identity, emotions and personality traits (Adolphs 2003). Even blushing can be seen as a hidden sign of communication and may be interpreted as a sign of shyness, anger, lying or attraction (Petrilli 2015).

Gestures are typically cultural signs as well as a means of enhancing verbal language. They can be categorized into four groups: greeting gestures, beckoning gestures, insulting gestures and touching gestures. Many hand gestures are widely recognized and convey specific information to others. Furthermore, people mimic each other's gestures and movements when they show interest, attraction or likeability towards another person (Chartrand & Bargh 1999). Some researchers also observe that mimicry and analogous postures and movements correlate with liking and empathy in humans (Soo et al. 2014).

The physical distance to each other could also be a sign of non-verbal communication and could be interpreted to give privacy or to isolate, or on the contrary as an insulting act to intrude your private zone (De Mooij 2014).

3.1.3 Human relationships

Belonging is a fundamental need of human beings: maintaining social bonds, enhancing close relationships and establishing friendships and partnerships. While impaired social relationships reduce the positive feelings of meaning and belongingness, close friendships are linked to the feeling of meaning in life (Lambert et al. 2013). Furthermore, according to Baumeister and Leary (1995), two aspects are highlighted in terms of human relationships:

1. People need frequent personal contacts; interactions that are experienced as affectively positive or pleasant.

2. People need interpersonal bonds with others that are marked with stability, affective concern and continuation in the foreseeable future (security).


When people feel that they are accepted, included and welcomed, it leads to positive emotions, which is a requirement for a healthy and enduring relationship. Signals of lastingness and continuity are also a crucial part of the longevity of a relationship and of inclusion in a group or society (Lambert et al. 2013). Positive relationships are also signaled through various physical stimuli: touching each other, close interpersonal distance, a forward lean, mutual eye contact, head-nodding, smiling, an elated vocal expression, etc. (Scholl 2013).

Philippot and Feldman (2004) discuss the necessity of positive emotion during the formation, development and maintenance of social bonds. These aspects of creating interpersonal relationships can be combined with the fundamental values of the need to belong (Baumeister & Leary 1995) to describe the process of creating a relationship (Philippot & Feldman 2004), which in this case has been divided into three phases, as in Figure 4:

Figure 4. The 3 phases of relationship bonding and the actions they constitute.

Philippot and Feldman (2004) also discuss the importance of collective agency in relationship forming, which concerns a shared understanding of how to react in situations. Two important social rituals for showing such an understanding, and for creating lasting relationships, are greetings and farewells. They assure others of a continuous relationship: one that has remained since last time and will last into the foreseeable future (Baumeister & Leary 1995).

3.2 Humanized technology in HCI and UX

HCI development has long focused on usability and on making interactions more functional and connected to user needs, while neglecting knowledge about how experiences and emotions are shaped and evoked by interactions (Benyon 2014). In recent years, HCI researchers have turned their attention to users' emotions, designing user interfaces in connection to two primary areas: recognizing or evoking human feelings (Jung & Love 2008). Research has shown that people interact with computers as if they were real social actors (Reeves & Nass 1996; Kim & Moon 1998), which means that emotions and the social rules of human interactions need to be better incorporated in mediated environments. Designers need to account for human behavior, emotions and social interactions, such as feelings of frustration and anger as well as joy and contentment.


3.2.1 Recognizing human emotions in HCI

Affective computing is the field of HCI where computers and technology are developed to recognize and express human emotion (Martin et al. 2013). Human emotions include the feelings a person experiences, which can be shown through various physical features such as facial expressions, heart rate, gestures, pupil size, prosody and posture. Affective computing is inspired by human social interactions, focusing on making the computer have and recognize human emotions (Jung & Love 2008).

Researchers in emotion recognition have found various physiological indicators for understanding human behavior, some of which are connected to the face and eyes, such as pupil size, variations in eye movements and facial expressions (Lanata et al. 2011). One method for emotion recognition is the Facial Action Coding System (FACS), which is used for understanding emotions related to facial expressions (Ekman & Friesen 1978). What the methods for affective computing have in common is the focus on cognition, recognition and understanding of human reactions, and on relating them to emotions. One challenge of affective computing is to create algorithms that can detect human behavior, as this requires advanced pattern recognition (Soo et al. 2014).

3.2.2 Evoking human emotions in HCI

In contrast to the affective computing approach, user experience (UX) focuses on evoking positive emotions (Hassenzahl & Tractinsky 2006), meaning that using products or interfaces should involve memorable experiences (Overbeeke et al. 2004). The user should always be at the center of the experience, with focus on aspects that bring the user motivation, feelings or desire (Kim 2015). Some emotional aspects of human interactions can be related to the current UX aspects described below; see Appendix B for a complete table of connected factors.

Diverse experience permits the imagination to thrive, gives humans knowledge and affects personal development (Wright et al. 2006; Kim 2015). Human imagination is a great tool when designing the context, because it lets users complete experiences by themselves (Wright et al. 2006). This can also be exploited by creating simple interactions that give complex reactions (Senger 2004), since humans react with emotions and feelings to all kinds of situations (Benyon 2014). Through engaging experiences, the sense of time and space changes (McCarthy 2007). Designing for emotions such as absorption and enchantment, which are ways of altering the spatio-temporal sense, can keep users focused on specific experiences (Benyon 2014; McCarthy 2004). Spatio-temporality can also contribute to technology being seen as an extension of the self (Benyon 2014); like wearing glasses gives the perception that they are part of the physical body, it can change the perceived space where the interaction occurs.

Another way of altering the spatial sense is using virtual environments, which can also add to the enjoyment of the experience. From an experience perspective, perceived beauty can also affect the user's emotional engagement (Overbeeke et al. 2004). Norman (2004) describes aesthetics and design as having three levels that can be used when designing an interface: visceral, behavioral and reflective. The visceral reaction makes people feel they "really need to buy this" without knowing why, while design that works on the behavioral level is focused on usability and function and provides a subconscious sense of pleasure and satisfaction. On the reflective level, conscious reactions, a person's self-image (Kim 2015; Brandtzæg et al. 2004), memories and experiences are significant. These are feelings evoked after using something for a long time, or from knowledge that connects a product or interface to pleasure and beauty.

Furthermore, providing feedback lets users know that they are in the environment and interacting with the interface, much like situations in physical spaces (Benyon 2014). It is a way of connecting the interface to fundamental human interactions, since humans give feedback continuously during an interaction. Norman (2004) states that the behavioral level in humans depends on feedback, especially touch and feel, and that addressing several senses can increase the level of engagement (Kim 2015; Hummels 2000; Norman 2004). Giving feedback on the user's presence, and thereby evoking emotions, can also be accomplished with dynamic interfaces (Kim & Moon 1998). Showing that users change and leave traces in the environment, and that other users do the same, is a way of visualizing interactions much like in a real environment (Benyon 2014). Social relationships and predictable environments constitute safe areas in people's lives; creating technology more consistent with social and physical rules would therefore result in more enjoyable technology for users.

3.2.3 Human relationships in HCI

The core of interaction is people having an exchange of opinions, thoughts, emotions, words, physical objects and gestures (Fleming & Koman 1998). Interactions on and with the computer are as complex as interactions off the computer, and it is beneficial to understand both (Fleming & Koman 1998). Interactions between individuals and computers (or new media) are fundamentally social and natural, just like our interactions in real life (Reeves & Nass 1996). Ultimately, humans put effort into relationships for the same reason they invest time and energy in products and services (Eyal 2014). Relationships and friendships are important in life, and people therefore make an effort to gain and preserve them (De Mooij 2014). The relationship with products is similar to a good friendship: the more engaged people are, or the more effort they invest, the more both parties benefit (Eyal 2014).

The term humanized technology implies a technology that adapts to human characteristics instead of forcing the human to learn to use it, corresponding to Norman's (2013) core idea: a world in which we do not struggle to understand objects because they are designed to understand us. Since people evaluate a product or service based on experience, not just interactions, a good user experience should be inspired by human experiences of good relationships (Kim 2015) to truly enable long-lasting human-product relationships. Communication is essential in human lives (De Mooij 2014), and practicing design should primarily be about creating interfaces that communicate in a way that humans understand (Fleming & Koman 1998).

Research about user experience so far has revealed a clear connection to human experience. Observation of the areas of human relationships shows that various factors are applicable in HCI, as described in Table 2.


Table 2. Illustrating the link between human relationships and HCI.

Unique stories and rituals: Various situations with different people are stored with associated emotions; hence human relationships are created by stories and the affective value that is remembered. By promoting the creation of rituals, people obtain emotional memories of an experience they want to remember, similar to mediated experiences (Mugge et al. 2005).

Engagement and absorption: People remember activities that are engaging and fun. Absorption in activities changes the perception of time and provides a deeper experience; similarly, being immersed in technology can lead to an extension of the self (Benyon 2014).

Adaptation: People adapt their real environment to personal needs. Adapting means that the user changes a product (using cases for a smartphone) or an environment (decorating a home) to fit their personal needs and self-image.

Continuous curiosity and development: Obtaining knowledge of how to navigate an environment in different manners can be seen as a skill; e.g. learning to use new software makes users feel good and accomplished, and it keeps them wanting to learn more because it is challenging and developing (Overbeeke et al. 2004).

Trust and security: Trust and security are connected to the need to belong, which is necessary for a deep and long-lasting relationship (Baumeister & Leary 1995). With a product or service, trust and security can mean that it will not break and that the information shared with it will not be transferred to other parties.

Sharing knowledge: People share knowledge, impressions and emotions through conversations and social meetings (De Mooij 2014). In HCI, sharing knowledge can be translated into providing relevant feedback and understanding the user's commands and needs.

Collaboration: Teamwork is a basic human behavior that leads to feelings of inclusiveness and mutual cooperation. In HCI, social behavioral changes can be made if the users feel they are working as a team and helping each other (Reeves & Nass 1996).

Matching personality: It is very common that people mimic each other as a sign of interest, attraction and likeability (Chartrand & Bargh 1999). Accordingly, applications and products that fit our self-image and in some way reflect our identity will be more appreciated (Norman 2004).

Significant influence in daily life: In human-human interaction this is about creating a routine and sharing a story with a loved one, while in HCI it is about creating habits around an important part of life, which can take different forms, e.g. applications that people cannot live without.

According to Reeves and Nass (1996), the characteristics of humans and computers have a direct link except in the concept of control: the most significant difference is that in a mediated context the user has control over the environment, whereas this factor is absent in a physical context (ibid). Furthermore, Reeves and Nass (1996) explain how a direct design connection can be made between social relationships and human-media relationships by following the rules of social science and the physical world. An example of this is politeness. Politeness is essential in creating a positive interaction for humans, and it is something people practice automatically, no matter if the respondent is a computer or a human. Reeves and Nass (1996) showed that when media violate social norms, such as being impolite, the violation is viewed as social incompetence and is offensive. To create a more "polite" computer, a set of politeness principles that apply to humans should also work for computers, as seen in Table 3.

These same factors are relevant for web design, showing a clear link between HCI and human-human interactions (Fleming & Koman 1998):

Table 3. The similarities in human behavior and computer actions in terms of politeness.

Quality
Human behavior: Speakers should say things that are true.
Computer actions: Provide relevant information according to what is searched for.

Quantity
Human behavior: Say neither too much nor too little.
Computer actions: Design menu systems that present a single word, or at most a few words, for each option. Provide users with error messages that allow them to set an understandable level of sophistication.

Relevance
Human behavior: Contribute solely with what the conversation demands.
Computer actions: Highlight icons that represent possible actions and dim the others. One aspect of relevance that is ignored in interfaces is the response to user goals.

Clarity
Human behavior: Present ideas and statements clearly and well.
Computer actions: Avoid ambiguity (only one meaning) and use icons that are generally understood. Construct navigation paths on web pages to be clear, e.g. by having simple, understandable icons.

3.2.4 Habits and technology

Especially important for creating lasting relationships is everyday use, with artifacts becoming a regular part of daily life; in short, habits. Habits are an important part of our lives, as they are programmed behaviors that guide people in their everyday actions (Duhigg 2012). Moreover, the structure of everyday activities creates a sense of stability and safety (Lauridsen et al. 2015). Habits are defined as "behaviors done with little or no conscious thought without mindful awareness" (Eyal 2014, p.16) and are formed through repetition (Duhigg 2012). The potential for a product to become habit-forming is determined by two factors: frequency (how often the behavior occurs) and perceived utility (how useful and rewarding the behavior is compared to alternative solutions) (Eyal 2014). An example is personal technology (mobiles and computers), which has made its way into people's everyday lives, creating new habits for billions of people mainly thanks to frequent use and utility (Lauridsen et al. 2015).
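Eyal's two habit-potential factors can be illustrated with a small scoring sketch. The `habit_potential` function and its thresholds are illustrative assumptions for this thesis context, not part of Eyal's model:

```python
def habit_potential(uses_per_week: float, utility: float) -> str:
    """Classify habit-forming potential from Eyal's two factors.

    uses_per_week: how often the behavior occurs (frequency).
    utility: perceived usefulness versus alternatives, on a 0-1 scale.
    The thresholds below are illustrative assumptions for this sketch.
    """
    frequent = uses_per_week >= 7        # roughly daily use or more
    rewarding = utility >= 0.5           # clearly better than the alternatives
    if frequent and rewarding:
        return "habit zone"              # likely to become a habit
    if frequent or rewarding:
        return "potential habit"
    return "unlikely habit"

# A messaging app opened many times a day with high perceived utility:
print(habit_potential(uses_per_week=50, utility=0.9))  # habit zone
```

The point of the sketch is that neither factor alone suffices: a rarely used but useful tool, or a frequently used but unrewarding one, is less likely to become habitual.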

To create habit-forming products, Eyal (2014) presents the Hook Model. The model consists of four phases, each of which in various ways pulls the user back to the product. Duhigg (2012) presents a similar habit loop with three steps: cue (the trigger), routine (physical, emotional and mental) and reward. Despite the similarities between the two habit models, Eyal's model from 2014 was selected as the focus for this master thesis work, as the fourth step was believed to be essential for a deeper understanding of how habits are created, see Figure 5.


Figure 5. The habit loop in the hooked model with description of the 4 phases.

Triggers

Triggers are cues that influence daily behavior and can be external or internal. External triggers are cues embedded with information that directs the user to the next step towards the desired user action, e.g. a Coca-Cola vending machine with the text 'Thirsty' followed by a description of how to purchase a bottle. External triggers are divided into earned triggers (favorable press mentions), relationship triggers (one person telling another about the product or service) and owned triggers (users receiving continuous notifications).

Internal triggers are, contrary to external triggers, coupled with a thought, emotion or routine. These triggers have a high influence over the user's daily life, as they automatically direct the user to the connected behavior. An example is a person using Instagram every time she or he fears that a special moment will not be preserved and therefore be lost forever (Eyal 2014).

Action

This phase describes how to get the user to perform the desired action, which requires three factors to initiate any behavior (Eyal 2014). Firstly, the user needs sufficient motivation to initiate the behavior. Secondly, the user has to have the ability to complete the desired action. Finally, a trigger has to be present to activate the behavior. To increase the likelihood of the action, the required effort, both physical and mental, has to be minimized. Below are the six elements of simplicity, factors that minimize the difficulty of performing an action (Eyal 2014).

1. Time: how long it takes to complete the action
2. Money: the cost of performing the action
3. Physical effort: how much work or labor is invested in the action
4. Brain cycles: how much mental effort is required to complete the action
5. Social deviance: how accepted the behavior is by others
6. Non-routine: how much the action disrupts existing routines
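The three requirements for action, motivation, ability and a trigger, can be sketched as a simple predicate. The boolean form below is a loose reading of Eyal's description; the names and the threshold value are assumptions introduced for illustration:

```python
from dataclasses import dataclass

@dataclass
class Behavior:
    motivation: float  # desire to act, on a 0-1 scale
    ability: float     # ease of acting, 0-1; reducing the six elements of
                       # simplicity (time, money, effort, ...) raises this
    triggered: bool    # has a cue been presented?

def action_occurs(b: Behavior, threshold: float = 0.5) -> bool:
    """The action fires only when all three factors coincide."""
    return b.triggered and b.motivation * b.ability >= threshold

print(action_occurs(Behavior(motivation=0.9, ability=0.8, triggered=True)))   # True
print(action_occurs(Behavior(motivation=0.9, ability=0.8, triggered=False)))  # False
```

Note that without a trigger the action never fires, however motivated and able the user is, which is why the trigger phase opens the Hook Model.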


Variable reward

To retain users' engagement and hold their attention, products need to deliver continuous novelty and perform as expected. The variability factor is needed to keep users excited about the experience. Rewards are essential for making users return to the product and for making the effort of learning new habits worthwhile; simply put, rewards are powerful because they satisfy human cravings (Duhigg 2012). Variable rewards come in three forms: rewards of the tribe (rewards that make us feel accepted, attractive, important and included, such as social media), rewards of the hunt (the need for and reward of acquiring resources, such as checking off all your emails) and rewards of the self (the reward of gaining competence and mastering various skills).

To successfully implement an alteration in behavior, users have to be presented with a choice between their old routine for performing a task and a more convenient way to fulfill existing needs. This means that users cannot be forced to use new techniques or products; they have to want to use them.

Investment

Before users can completely change their behavior, they have to invest in the product, much as in a human relationship: the more time or effort that is invested in the product or service, the more the users value it. The essence of changing behavior is changing attitudes, and thereby the perception of a product, through three mechanisms: cognitive dissonance, storing value and creating content. Firstly, products or services that are socially accepted and work well with our previous experiences make us value them more. Secondly, if users store value in the product, e.g. software that adapts to their needs, it automatically becomes an investment. Thirdly, collecting and saving memories and experiences, such as pictures, increases the personal investment, which makes the user less keen to change to another program or product.

From the investment phase, the next trigger brings the user back to the service or product. To create a sustainable habit, the product or service has to be used through the Hook Model in multiple cycles, each creating a stronger bond between the product or service and the user, making the user rely on the product to solve their problems until new habits are formed.
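The repeated pass through the four phases can be summarized in a minimal loop sketch. The numeric `bond` variable and its increment are illustrative assumptions for how repeated cycles strengthen the user-product bond; only the phase structure comes from the model:

```python
def hook_cycle(bond: float) -> tuple:
    """One pass through trigger -> action -> variable reward -> investment."""
    # As habits form, external triggers give way to internal ones (Eyal 2014).
    trigger = "internal" if bond > 0.5 else "external"
    # Investment stores value in the product and strengthens the bond.
    bond = min(1.0, bond + 0.2)
    return trigger, bond

bond = 0.0
for _ in range(5):                 # each cycle deepens the habit
    trigger, bond = hook_cycle(bond)
print(trigger, round(bond, 1))     # internal 1.0
```

The sketch captures the model's key dynamic: early cycles depend on external cues, while later cycles are driven by internal triggers once enough value has been invested.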

Habits and stickiness

Habits describe the human behavior of frequent and unconscious use, while stickiness specifically concerns the relationship between customers and users (Croll & Yoskovitz 2013). Stickiness is measured by the amount of value people get from a product and how engaged they are with it (Croll & Yoskovitz 2013), which relates to the creation of habit-forming products. The attachment people feel for various products is an emotional bond between the user and an object with special meaning. It is accompanied by more protective behaviors towards the product and often develops into a long-lasting relationship (Mugge et al. 2005), as proof that the product or service is important in the user's life (Croll & Yoskovitz 2013). Creating habit-forming products can lead to increased stickiness, as both signify that users get hooked and return. Social media is an example where a large amount of stickiness has been built in, making it widespread by rewarding people in various ways through the hooked-loop aspects: information flow, social interaction needs, entertainment purposes, etc. (Lauridsen et al. 2015).
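In analytics practice, this kind of stickiness is often operationalized as the ratio of daily to monthly active users. The DAU/MAU sketch below follows that common convention and is offered as an illustration, not as a metric taken from the thesis sources:

```python
def stickiness(daily_active: int, monthly_active: int) -> float:
    """DAU/MAU ratio: the fraction of monthly users who return daily.

    A higher ratio suggests habitual, "hooked" use of the product.
    """
    if monthly_active == 0:
        return 0.0
    return daily_active / monthly_active

# 200,000 of 1,000,000 monthly users come back every day:
print(stickiness(200_000, 1_000_000))  # 0.2
```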

3.2.5 Suggested definition for humanized technology

The term humanized technology can be described as technology that is adapted to human characteristics, rather than the opposite (Extend Limits 2016). Good examples of products forcing humans to learn and adapt to them are the computer mouse and keyboard (Harper et al. 2008).

The philosopher Erich Fromm (2011) explained that to create a better society, technology has to include humanistic values and be accompanied by emotional experiences.

Introducing human factors into technical and social development would allow the well-being of humans to prosper (Fromm 2011). Technology has developed rapidly and the computer has obtained an increasingly central position in our lives, being embedded in all our daily activities from buying food to paying bills (Lauridsen et al. 2015; Harper et al. 2008). This has allowed the creation of new experiences, letting us inhabit virtual worlds with people from various parts of the globe (Harper et al. 2008). Despite the technological development and the changes it has brought, some aspects will remain the same: the characteristics that make us essentially human, the need to belong (Baumeister & Leary 1995), the wish to be part of a group, and the need for safety and comfort (Harper et al. 2008).

The results from the literature study point in a similar direction. Whether humanized technology means that we need to bring human values and emotions forward or understand them better, they need to be central to how we comprehend and design for a changing and more intuitive world. To conclude from the literature study: humanized technology is the development of products and features that keep the user engaged by addressing emotions and experiences that are important in human relationships. Like UX, it should be informed by the human-technology experience and designed to make the interaction more natural and memorable and to highlight the emotional rewards, similar to human relationships.


4. Enhancing humanized technology with EyeX

The main question to be answered during this project was: How could EyeX enhance a more natural and humanized interaction with the computer for general use? To answer it, several sub-questions had to be resolved, stated below. Answers to the first two questions emerged in the related work connected to this master thesis and were summarized for further investigation through practical research. The last questions could only be answered by users of the EyeX and were therefore examined through user studies.

• What is 'humanized' technology and how can it be applied in UX?

• Which aspects are important for a long-lasting relationship between products and humans?

• How is the relationship between the EyeX device and its users?

• How could habits be created and the stickiness of the EyeX improved?

• How could Tobii use these insights to better create future features for EyeX? Present these insights as evaluation methods for Tobii to use in future development processes.

This section presents the EyeX product, followed by the methodology and practical methods needed to answer the research questions.

4.1 Tobii EyeX eye tracker

The EyeX device is a video-based pupil/corneal-reflection eye tracker and is used together with the mouse, touchpad and/or keyboard to give the user control over the computer. The Tobii Tech eye tracker EyeX, seen in Figure 6, is today mostly used in gaming to target with the eyes, i.e. providing control in two directions (Tobii Developer Zone 2016), such as driving a car and shooting at the same time; it is also used in areas like character recognition, where characters come to life as the integration in games improves and situations can be handled simultaneously (UK 2015). The benefits in games are the opportunity of having another pointer and of knowing what the player is looking at and therefore interested in. The EyeX device is relatively new, launched on the market in December 2015 with gamers as the target group (Central 2016). In January 2016, MSI released the first integrated eye tracker in their gaming computer, the first gaze-enabled laptop on the market (ibid).

Figure 6. The Tobii EyeX controller (Tobii 2016b)


For general users, EyeX is not used to the same extent. Ordinary usage of EyeX can be divided into three steps:

1. Out of the box (installation and mounting): downloading the software from Tobii's website, installing it on the computer and mounting the hardware onto the computer.

2. Calibration: collecting variations in data from the user to create a correct user profile and optimize the precision and accuracy of the device.

3. Usage: enabling everyday computer interactions to go smoothly and enhancing a more natural and intuitive experience both in games and in general computer use, e.g. navigation (Tobii 2016b).

One of the guiding concepts for Tobii Tech is humanized technology, and the department has tried to implement various human aspects into the EyeX product. The aim is to create features that are more natural to humans in both interaction and behavior (EyeX 2016). The current features and their connection to human manners are presented in Table 4.

Table 4. Current EyeX Interaction features and how they are humanized.

Presence features: Windows Hello, dim screen, auto lock and stay awake (Tobii 2016b)
Description: EyeX detects whether a person is present, and dims or locks the screen when the user is absent. Windows Hello provides users with a more natural login function, where passwords are substituted with face recognition to log in automatically.
Human aspect: People notice others as they walk into a room or sit in front of them.

Application Switcher: Select app @ gaze (EyeX 2016)
Description: EyeX provides visual feedback on the gaze point, with the aim to simplify switching between applications and windows.
Human aspect: People switch focus and tasks regularly.

General features (EyeX 2016)
Description: With EyeX, users make direct interactions by zooming on maps where they look, scrolling in the observed window, and moving the mouse cursor to the gaze position during simple navigation on the computer screen.
Human aspect: People interact directly and naturally with other people, e.g. addressing a person when she or he speaks to you.
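The presence features in Table 4 reduce to a simple rule: act on how long the user has been absent. The sketch below uses a hypothetical `presence_action` decision function with invented timeout values; it does not use the actual Tobii EyeX SDK, whose API is not covered here:

```python
from typing import Optional

ABSENCE_DIM = 5.0    # seconds of absence before dimming (invented value)
ABSENCE_LOCK = 30.0  # seconds of absence before locking (invented value)

def presence_action(absent_since: Optional[float], now: float) -> str:
    """Choose a screen action from how long the eye tracker has lost the user.

    absent_since: timestamp when the user was last seen leaving,
    or None while the user is present (which keeps the screen awake).
    """
    if absent_since is None:
        return "stay awake"
    away = now - absent_since
    if away >= ABSENCE_LOCK:
        return "auto lock"
    if away >= ABSENCE_DIM:
        return "dim screen"
    return "stay awake"

print(presence_action(None, 100.0))   # stay awake
print(presence_action(60.0, 100.0))   # auto lock (away for 40 s)
```

In a real implementation, `absent_since` would be updated from the eye tracker's presence or gaze-data stream, and the thresholds would be user-configurable.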


4.2 Methodology

The thesis is based on explorative research (Håkansson 2013), where the aim was to define the term humanized technology and investigate how applicable human characteristics are in HCI and for humanizing the EyeX device. Explorative research is conducted when the problems are not clearly defined and when the aim is to investigate, explore and develop familiarity with a new concept, e.g. humanized technology (ibid). This method usually relies on secondary research, such as available literature, and qualitative approaches, e.g. interviews, which were considered appropriate for this work. The process is divided into two parts: a theoretical section containing related work in the areas of eye tracking and humanized technology, which has already been presented, followed by a practical part containing user studies, see Figure 7. No particular method was used for the theoretical part, so no methods are presented for it; the methods used for the practical part are presented below. Finally, the outcome of the work was summarized as methods for evaluating the humanization of products and features.

Figure 7. Project process steps.

4.2.1 Workshops and Brainstorming methods

The goal of the initial brainstorming sessions was to collect information, thoughts and ideas around the initial research and to understand users' current relationship with the computer, in order to develop new concept features. This was done by exploring, through different creativity-promoting brainstorming exercises, how people connected important bond-creating factors to current products, their computers and the eye tracker. The total brainstorming time was 1-1.5 hours, following the optimal session length for creativity suggested by Kelley and Littman (2001). In total, three workshops with various exercises were conducted; see further descriptions in Table 5 and below.

Table 5. The description and goals of the two types of workshop.

Creative people workshop: The first workshop was held for ordinary computer users in creative design fields; it served as a test workshop to evaluate the exercises and the time plan. The goal was to find new inspiration and concept ideas by focusing mainly on the human-computer relationship.

Company workshop: Participants came from various areas of the Tobii Tech department, such as UX designers, developers and quality assurance (QA). The two workshops at Tobii focused on how the human factors could be implemented in UX design, most importantly as new features for the eye tracker EyeX.

Roleplay/Bodystorming

The role-playing (or bodystorming) exercise is good for creating an atmosphere of creativity and engagement, where participants interact as they would in a defined situation. The exercise has been used at IDEO with good results (Kelley & Littman 2001), and in this case it was used to study how the interaction between humans and computers can be changed to enhance the user experience. The goal was to gain insights into how the computer should react and give feedback, to improve the experience in situations of frustration or to enhance it in situations of accomplishment. Participants were divided into groups of two and given a scenario to role-play, with one playing the character of the user and the other playing the computer; all three scenarios can be seen in Appendix C. This exercise was used solely in the creative people workshop.

Focus group discussion about the relationship with computers

This exercise was performed to obtain deeper qualitative information about users’ relationships and emotions towards the computer. Similar to focus groups, the method allowed the participants to freely address the topic of creating relationships with computers (Stickdorn & Schneider 2011).

The participants were asked the question “What does your computer mean to you?” and individually wrote down important words and feelings concerning their relationship with the computer. These written thoughts, which each participant then presented, served as a foundation for a deeper discussion about human-computer relationships in the whole group.

Rating of relationship factors

As various important relationship-bonding factors had been identified in the initial research, these were presented on a sheet, see Appendix D, for the brainstorming participants to connect to a product they felt they had a good relationship with. Participants were instructed to quickly rate their three most important factors from 1 to 3, and were then asked to explain and motivate their rating. In the creative workshop this was done individually.

In the company workshops it started individually, in the same manner as the test workshop, but continued by letting the participants, in pairs or groups of three, connect the factors perceived as most important to the use of eye tracking.
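Ranked ratings of this kind can be aggregated across participants to see which factors dominate a session. The sketch below illustrates one possible tallying scheme (not part of the original study): each participant's top-three list is converted into weighted points, with rank 1 earning the most. The factor names and point weights are hypothetical assumptions for illustration only.

```python
from collections import Counter

# Hypothetical ratings from one workshop session: each inner list is a
# participant's top-three factors, ordered from rank 1 (most important)
# to rank 3. Factor names are invented for this example.
participant_ratings = [
    ["reliability", "familiarity", "personalization"],
    ["familiarity", "reliability", "feedback"],
    ["reliability", "feedback", "familiarity"],
]

# Assumed weighting: rank 1 -> 3 points, rank 2 -> 2, rank 3 -> 1.
POINTS = {1: 3, 2: 2, 3: 1}

def tally(ratings):
    """Convert ranked lists into weighted scores, highest score first."""
    scores = Counter()
    for ranking in ratings:
        for rank, factor in enumerate(ranking, start=1):
            scores[factor] += POINTS[rank]
    return scores.most_common()

print(tally(participant_ratings))
# → [('reliability', 8), ('familiarity', 6), ('feedback', 3), ('personalization', 1)]
```

Weighting by rank rather than simply counting mentions preserves the ordering information the participants provided, which matters when group sizes are small.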
