
Tangible User Interfaces in the Smart Home Environment

Exploring the User Experience of Instant Smart Lighting System Control

Iris Bataille

Interaction Design One-year Master 15 ECTS

Spring semester 2020


Abstract

Smart technologies in home environments are becoming ubiquitous and more complex, which challenges interaction designers to ensure a pleasant smart home user experience. Therefore, this thesis explores how different types of tangible user interfaces influence the user experience of smart lighting system control. After conducting research in this context, two different tangible user interfaces were designed: a counter device using tokens and a hand-held device using embodied metaphors. These devices were validated by testing their engagement, ease of task performance, meaningfulness of representation and controls, and richness of interaction and human skills. It was found that the counter device creates the best user experience when performing more complicated tasks in a frequent, long-term use scenario, while the hand-held device creates the most pleasant experience when performing less complicated tasks in an infrequent use scenario.


Table of Contents

1. Introduction
2. Background
   2.1 Theory
   2.2 Canonical Examples
3. Methodology
   3.1 Project Plan
   3.2 Methods
4. Design Process
   4.1 Analysis
   4.2 Ideation
   4.3 Conceptualization
   4.4 Realization
   4.5 Validation
5. Results
   5.1 Tangible User Interfaces for Instant Smart Home Control
   5.2 Results from Validating the Tangible User Interfaces
6. Discussion
   6.1 Tangible User Interfaces
   6.2 User Validation
   6.3 Other Results from the User Validation
   6.4 Future Work
7. Conclusion
8. Acknowledgments
9. References
Appendix
   1. Final Digital Context Prototype Code
   2. User Validation Study Set-up


1. Introduction

Smart homes are domestic architectural spaces in which devices and systems can be controlled automatically by using smart technologies (Allen et al., 2001). These homes are becoming bigger and more complex as the number of smart devices, such as smart lamps, increases. This brings an interesting challenge for interaction designers: making sure that the user still has a pleasant experience while operating smart home systems (Luria, Hoffman & Zuckerman, 2017).

There is a trend toward increasing smart home system autonomy, which can be defined as the ability to automatically and remotely control basic home functions and features (Brennan, McCullagh, Galway & Lightbody, 2015). The downside of this, however, is that users can lose the sense of control over the state of their home (Koskela & Väänänen-Vainio-Mattila, 2004). This is why this thesis project is not focused on home automation, but on tangible interfaces that enable users to control their smart homes.

Tangible interfaces can facilitate a richer way of interaction, which increases the understandability and trustability of smart home devices (Angelini, Mugellini, Abou Khaled & Couture, 2018). However, in a comparative study, it was found that the usability of tangible interfaces is still a significant design challenge (Luria et al., 2017). Therefore, the usability and user experience of different kinds of tangible user interfaces in the smart home environment will be further explored in this thesis project.

There are different ways to control smart homes. A distinction can be made between two types of smart home control. The first type is pattern control, which is a way of programming smart home automation (Koskela & Väänänen-Vainio-Mattila, 2004). Several interfaces have been designed with this goal in mind, such as the Smart Home Cards (Tada, Takahashi & Shizuki, 2016). The second type is controlling and adapting the smart home instantly. Examples of such interfaces are the SensePod (Khan & Zualkernan, 2018) and Ikea's Shortcut Buttons (Ricker, 2019). Further descriptions of these examples can be found in chapter 2.2. In this project, the focus will be on the latter: instant smart home control.

This thesis therefore aims to explore the user experience of tangible interfaces for instantly controlling smart homes. To narrow the research down, the focus is limited to one sub-area of smart home devices: the use scenario of controlling a smart lighting system.

This results in the research question: How can different types of tangible user interfaces influence the user experience of instant smart lighting system control? To answer this research question, two different types of tangible user interfaces will be analyzed by testing (1) their engagement, (2) ease of task performance, (3) meaningfulness of representation and controls, and (4) richness of interaction and human skills.

By answering this research question, this thesis aims to contribute to the field of interaction design by generating knowledge about the differences in user experience between different types of tangible user interfaces. This knowledge can be used for the future development of new tangible smart home user interfaces. Besides the goal of contributing to the field of interaction design, the topic for this thesis was also chosen out of personal interest: tangible interaction, design for everyday life, user-centered design and IoT are topics that this thesis project combines.

In this thesis project report, the background is explained first by introducing theory about user control and experience in the smart home and about tangible interaction. In addition, several canonical examples are discussed. After that, the project plan and the methods used during this project are introduced. Then, the process of this thesis is described, followed by the results produced during the process. Lastly, the project process and results are discussed and a conclusion is drawn.


2. Background

In this section, an overview of the background of this thesis is given by discussing theory related to user control in smart homes, the user experience of smart homes and tangible interaction. In addition, several canonical examples related to this thesis are reviewed.

2.1 Theory

2.1.1 User Control in the Smart Home

Smart homes are domestic architectural spaces in which devices and systems can be controlled automatically by using smart technologies (Allen et al., 2001). This is an application of ubiquitous computing, the integration of multiple technical devices into the everyday physical world (Weiser, 1993). These devices are often interconnected, which makes smart homes an application of the Internet of Things (IoT) as well (Xia, Yang, Wang & Vinel, 2012). Smart homes offer a better quality of life by optimizing user comfort at home (Alam, Reaz & Ali, 2012). However, these homes are becoming bigger and more complex because of the increase in smart devices in the home environment. This brings an interesting challenge for interaction designers to make sure that the user still has a pleasant experience while operating smart home systems (Luria et al., 2017).

There are different ways of smart home control. Firstly, a trend can be seen toward increasing smart home system autonomy. Since smart homes are becoming bigger and more complex, automation is a suitable solution to avoid overwhelming the user. However, the downside is that users can lose the sense of control over the state of their home (Koskela & Väänänen-Vainio-Mattila, 2004). Therefore, there is another way of smart home control in which the user is given full authority, so that they regain a feeling of control. Another argument for this way of control is that life should be mentally and physically challenging and that human effort should therefore be required for operating smart homes (Intille, 2002).

These different ways of smart home control can be viewed as two extremes of a bigger spectrum. The interaction-attention continuum (see figure 1) visualizes this spectrum very well (Bakker & Niemantsverdriet, 2016). An example of a way of smart home control that is in between these extremes is calm computing. With calm computing, the focus is on informing the user in the periphery of attention and only demanding focus when necessary (Weiser & Brown, 1997). This way, the user is still in control and the smart home supports the user when needed.

Figure 1. The interaction-attention continuum (Bakker & Niemantsverdriet, 2016).

Lastly, a division can be made between two different kinds of user control for smart homes. One way of control is pattern control, which focuses on controlling functions in advance. The other way is instant control, which focuses on managing functions instantly. It was found that these different ways of control call for different kinds of interfaces (Koskela & Väänänen-Vainio-Mattila, 2004). In this thesis, the focus will be on instant smart home control.

2.1.2 The User Experience of Smart Homes

As mentioned before, there is a challenge for interaction designers to improve the user experience of smart homes (Luria et al., 2017). The user experience can be defined as the individual experience of someone using a system, which is influenced by prior experiences and related to its social and cultural context (Roto, Law, Vermeeren & Hoonhout, 2011). For a good user experience, the focus should not only be on the functionality and usability of a user interface, but also on autonomy, competence, relatedness and self-actualization (Eggen, van den Hoven & Terken, 2016).

Several studies have been conducted to compare the user experience of different kinds of user interfaces for smart home control. In 2004, researchers found that user interfaces with diverse input methods are most suitable for pattern control, while portable central interfaces fit instant control best (Koskela & Väänänen-Vainio-Mattila, 2004). More recently, a study was conducted to compare social robot, screen and voice interfaces. It was found that the social robot interface could provide high situation awareness, but that its usability is still a big design challenge (Luria et al., 2017). This social robot interface relied on physical human action by sharing objects with the user and can therefore be seen as a tangible interface. More information about this social robot can be found in section 2.2.2.

2.1.3 Tangible Interaction

In 1997, Ishii and Ullmer introduced the term "Tangible Bits" for tangible interactive surfaces that allow for grasping and manipulating digital information (Ishii & Ullmer, 1997). This has further developed into the field of Tangible Interaction. Tangible Interaction (TI) can be defined as a subfield of interaction design that focuses on tangibility and full-body interaction and that materializes data and computational resources (Hornecker & Buur, 2006). Tangible interaction can be applied to smart homes by introducing tangible user interfaces (TUIs): interfaces that interlink the digital and physical world (Shaer & Hornecker, 2010). These interfaces facilitate a richer way of interaction, which increases the understandability and trustability of smart home devices (Angelini et al., 2018).

There are two other design areas related to tangible interaction and smart homes that are worth mentioning as well. The first is the field of tangible computing, which includes, amongst other things, tangible user interfaces and ubiquitous computing. The characteristics of tangible computing are multiple input possibilities, no enforced order of action and the use of affordances to guide the user in their interactions (Dourish, 2001). This is closely related to the field of embodied interaction, which can be defined as interacting with technology by involving people's physical body, for example through gestures and movements (Dourish, 2001). Another field related to tangible interaction and smart homes is the Internet of Tangible Things (IoTT), in which tangible interaction is applied to the Internet of Things. By doing so, rich and meaningful interactions that use human skills can be created (Angelini et al., 2018).

Several different types of tangible user interfaces can be distinguished. Firstly, a distinction can be made between counter devices and hand-held devices. Within these, four different ways of interacting with tangible user interfaces were identified.

The first type is a counter device that is operated through tokens. These tangible user interfaces often consist of two parts: tokens and a display that provides the constraints for the tokens. The tokens are physical objects that represent digital information (Ullmer, Ishii & Jacob, 2005). By changing the placement of the tokens within the constraints of the display, the output of the connected smart devices changes as well. The second type is interacting with a counter device through physical manipulation (Manches & O'Malley, 2012). This can be done, for example, by pressing a button or moving a slider to change the output. The third type is interacting with a hand-held device through embodied metaphors. Embodied metaphors are metaphorical extensions of embodied structures, that is, metaphorical meanings added to movements (Bakker, Antle & Van Den Hoven, 2012). By moving while holding the hand-held device, changes are made to the output of the smart home system. Lastly, the fourth type is interacting through gestures. This interaction relies on moving, holding and touching hand-held devices to make changes (Angelini, Lalanne, Van den Hoven, Khaled & Mugellini, 2015).

In the following chapter, examples of these different types of TUIs that are related to the smart home environment are discussed.

2.2 Canonical Examples

The examples discussed in this chapter have been selected because they show different types of tangible user interfaces and are all inspirational for the thesis project. The first interface is designed for pattern control, the others for instant control. Examples 2.2.3 and 2.2.5 are not related to the smart home environment, but are still included since they are good examples of different types of tangible interaction.

2.2.1 Smart Home Cards

Figure 2. The Smart Home Cards. a) Paper cards, b) drawing parameters, c) putting cards, d) the result (Tada, Takahashi & Shizuki, 2016)

The Smart Home Cards are paper cards (see figure 2, a) that are used to control a smart home programming environment. They are designed by Tada, Takahashi and Shizuki. To change the settings, users can draw on the cards (figure 2, b). Then, the cards are placed on a rule board (figure 2, c). This way, a more expressive end-user tangible programming environment for controlling smart devices is created (Tada et al., 2016).

These cards are a good example of making data physical and of interaction through tokens. One of its strengths is the possibility for users to change the settings by drawing on the cards. This way, the user has a lot of freedom in controlling their smart home. However, these cards are meant for pattern control of smart homes and would therefore not be as useful and meaningful for instant control. Nevertheless, they are still inspirational for designing an interface that enables the user to adapt aspects of the interface to their needs.

2.2.2 Vyo

Figure 3. Vyo, the embodied social robot interface for smart-home control (Luria et al., 2017)

This social robot interface was part of a comparative study of different user interfaces for smart home control (see 2.1.2). The interface, designed by Luria, Hoffman and Zuckerman, combines tangible user interfaces with expressive robotics. Users control their smart home by placing icons that represent the connected smart devices onto the bottom of the interface and moving them like physical sliders. The output of the interface is the state of each connected smart device, expressed in physical gestures and in icons that appear on a screen (Luria et al., 2017).

This social robot interface demonstrates the type of tangible user interfaces that uses tokens to enable users to give input to the device. It would be interesting to further explore the possibilities for using tokens and the experience that is created. The use of expressive robotics is not relevant for this thesis project, but still an interesting addition.

2.2.3 Marble Answering Machine

Figure 4. The Marble Answering Machine. Left: new messages have arrived. Middle: the user plays back a message. Right: the user stores a message. (Shaer & Hornecker, 2010)

The Marble Answering Machine was designed by Durrell Bishop in 1992 (Abrams, 1999) and is an iconic example of a tangible user interface. It uses marbles (tokens) that represent messages. By placing a marble on a play-back area (figure 4, middle) or on a dish (figure 4, right), the user can listen to the message and store it. To delete the message, the marble is put back into the device.

This interface is not related to smart homes, but is still relevant to this thesis project as a well-known example of using tokens in a tangible user interface. This is an interesting type of tangible interaction that could be further explored.

2.2.4 Shortcut Buttons

Figure 5. The Shortcut Buttons with different icons (Ricker, 2019)

At the end of 2019, Ikea introduced their new Shortcut Buttons for smart home control. These buttons can be attached to the wall and have a changeable cover. By pressing the button, the user can change their smart devices to a pre-defined setting. These settings are connected to an event, such as leaving home (Ricker, 2019).

The Shortcut Buttons are an example of interacting through physical manipulation. They are also a good example of tangible user interfaces that are not only used for research, but are also designed as products sold to customers. The advantages of these buttons over controlling the smart home via an app are accessibility and situatedness. The buttons are easily accessible to guests and to children who do not have access to a smartphone. Additionally, the buttons can be placed at convenient places that correspond to the related event; for example, a 'dinner time' button could be placed in the kitchen. This makes their situatedness an advantage as well.

2.2.5 Moving Sounds

Figure 6. Three out of eight artifacts designed for Moving Sounds. These three artifacts embody manipulating the tempo of sound (Bakker, Antle & Van Den Hoven, 2012)

Moving Sounds is a tangible system for teaching children about abstract sound concepts, which is designed by Bakker, Antle and Van Den Hoven. Different artifacts were designed to embody different abstract sound concepts (see figure 6). This was done by using embodied metaphors in interactive objects. The Moving Sounds have been used to develop a design approach to designing tangible systems with embodied metaphor-based mappings (Bakker et al., 2012).

Even though the design context of Moving Sounds is completely different from the design context for this thesis project, the Moving Sounds project is still relevant for this project. It demonstrates a different type of interacting with tangible user interfaces, namely through embodied metaphors. It would be interesting to see this type of interaction being applied to smart homes as well.

2.2.6 SensePod

Figure 7. A prototype of the SensePod (Khan & Zualkernan, 2018)

The SensePod is a hand-held wireless device, designed by Khan and Zualkernan. This device can be used to control a smart home using gestures such as rubbing, tapping or rolling the device on a surface. It has been designed to complement voice or smartphone-based interfaces (Khan & Zualkernan, 2018).

The SensePod is a good example of interacting with a hand-held device through gestures. The use of gesture recognition creates an interesting new smart home user experience. However, the designers’ focus was on the functionality of the device. Therefore, it would be interesting to further look into the user experience of a gesture-based smart home interface.

To conclude, several insights have been gained by analyzing these examples. Firstly, the examples showed different ways of making data physical, which are inspirational for the designs created during this thesis project. In addition, several interesting examples of types of tangible user interfaces, such as interfaces using tokens and embodied metaphors, were found. These examples are very useful for defining the different types of tangible user interfaces.


3. Methodology

3.1 Project Plan

Figure 8. An overview of the different phases of this thesis, placed on a timeline from the start to the end of the thesis project (10 weeks)

This thesis project has been divided into five different phases (see figure 8). Each phase is discussed below.

3.1.1 Analysis

During this phase, the focus is on analyzing the research domain by doing literature research and analyzing related work. By doing this, the research question will be refined and different types of tangible user interfaces will be defined.

3.1.2 Ideation

The ideation phase starts with defining a design scenario and the different types of tangible user interfaces that will be used for this thesis project. After that, design ideas are developed using the rapid ideation method. The goal of this phase is to come up with a variety of proposals for each type of tangible user interface.

3.1.3 Conceptualization

In this phase, one design idea per type of tangible user interface will be selected and further developed by creating customer journeys. Next to that, low-fidelity prototypes will be created.

3.1.4 Realization

During the realization phase, experience prototypes will be produced of the design concepts. The focus will not be on making the prototypes completely functional, but on demonstrating the experience of the concepts using the Wizard of Oz method.

3.1.5 Validation

In this phase, the different types of tangible interfaces will be validated by conducting user tests. For these user tests, several design requirements will be defined using the literature from the analysis phase. The goal of these tests is to compare the different user interfaces and to draw conclusions about the user experience of these different interfaces.

(13)

13

3.2 Methods

3.2.1 Literature Research

The literature research for this project consisted of several different phases. Firstly, possible relevant sources were searched for on several platforms and databases using specific keywords.

After having found the sources, each source was numbered, to keep a clear overview, and then read. Extra sources were found by looking at the references in the papers that were selected. Notes were made for each source, which were later used to divide the sources into different categories. Lastly, schematics were made of different fields and the sources belonging to these fields.

This method was chosen to get a better understanding of the different subfields of interaction design that are relevant for this thesis project. Next to that, the knowledge gained from doing the literature research was used for defining a realistic sketch of use (see 3.2.2) and different types of tangible user interfaces that could be designed (see 4.2.2).

3.2.2 Scenario-Based Design

The term Scenario-Based Design covers several techniques that describe the use scenario of a future system early in the design process (Rosson & Carroll, 2009). The focus of this design practice is on the use of a system to accomplish tasks. The scenario of this user interaction is a sketch of use (Rosson & Carroll, 2009).

This method was used during the ideation phase to make the design context more specific. It ensured that both devices could be used in the same scenario, with no differences in detail between the concept directions. This way, the concepts can be compared without the influence of external factors.

3.2.3 Rapid Ideation

The goal of rapid ideation is to generate, evaluate and refine a wide range of designs in a short time period (Clark & Reinertsen, 1998). It can be defined as an active, high-speed idea generation session, during which notes, drawings and photographs are made. This ideation method was chosen to make sure that quality ideas were generated within the tight time constraints of this thesis project.

3.2.4 User Journey Mapping

A user journey map, also known as a customer journey map, visualizes the user’s journey of performing a certain task (Marquez, Downey & Clement, 2015). It is used to get a better understanding of the steps required to perform this task. The visualization can be done in several ways, one of which is making a storyboard.

This method was used during the conceptualization phase to further develop the selected tangible user interfaces. It was decided to make a customer journey storyboard of a certain use scenario, to further define the concepts.

3.2.5 Experience Prototyping

An experience prototype is a prototype that allows for first-hand experiencing existing or future conditions through interacting with it (Buchenau & Fulton Suri, 2000). This means that the prototype does not have to be completely functional, as long as it still can convey the experience that would be created by using a design concept. An experience prototype can be used for understanding existing experiences, exploring design ideas and communicating design concepts (Buchenau & Fulton Suri, 2000).

This method was used to develop prototypes that communicate the design concepts of the two tangible user interfaces to potential users during the validation phase. These prototypes are not fully functional, since they aim to envision and explore the user experience.

3.2.6 Wizard of Oz

The Wizard of Oz method is a way of creating the experience of using a design concept, without the prototype being completely functional (Dow, MacIntyre, Lee, Oezbek, Bolter & Gandy, 2005). In studies that use this method, which is inspired by the movie with the same name, a wizard operator plays a role in the system that would otherwise be performed by the system itself.

This method was used during the user validation to create the user experience of the different tangible user interfaces without making the prototypes fully functional. The role of wizard operator was taken by me, by operating the digital context prototype: instead of using technology to change the light settings when the user performed certain actions with the tangible user interfaces, I changed the settings manually. This way, I was able to test the devices without having to make them fully functional.

3.2.7 Design Requirements

Design requirements state the important characteristics that a design has to meet to be successful (Van Boeijen, Daalhuizen, van der Schoor & Zijlstra, 2014). In several resources found during the analysis phase, various design requirements related to the user experience were discussed (Luria et al., 2017; Koskela & Väänänen-Vainio-Mattila, 2004; Angelini et al., 2018).

The design requirements were used during the validation phase to find categories for comparing the two tangible user interfaces. Relevant design requirements were selected from the resources mentioned above and used to formulate interview questions. This way, well-founded questions were formulated that could be used for answering the research question.

3.2.8 Semi-structured Interviewing

A semi-structured interview is a conversation between an interviewer and another person during which the person is asked questions, but is also able to discuss topics and issues that they think are important (Longhurst, 2003). So even though the interviewer has prepared interview questions, the interviewee still has the opportunity to deviate from these pre-determined questions.

This method was chosen for the user validation to get a better understanding of the experience of the participants. Even though I prepared specific questions related to the design requirements (see 3.2.7), I also wanted to know about the general experience of the participants. Therefore, I chose to use the semi-structured interviewing method instead of a fully structured interview.

3.2.9 Affinity Diagram

An affinity diagram is a structured way of presenting information in groups based on their natural relationships. It is often used to analyze and organize data and ideas (Naylor, 2019).

This method was used during the validation phase to process the data gathered during the user validation sessions. By using this method, the qualitative data were organized in a clear and structured way.


4. Design Process

Figure 9. An overview of the design process of this thesis project. For each phase, the corresponding chapters in this thesis report can be found underneath the graph.

The design process of this thesis project has been divided into five phases (see 3.1). Each of these phases is again divided into several sub-phases. A graph was made to show the process (see figure 9). An ascending line shows the process of broadening the project, while a descending line shows the process of narrowing it down. Each peak and valley is explained below:

1. Start of the project. I knew that I wanted to do something with tangible interaction in the smart home environment, but that was all I knew.

2. End of the analysis phase. Information was gathered about several ways of smart home control and several types of tangible interfaces.

3. Mid-ideation phase. I defined the sketch of use and the two tangible user interfaces that I was going to design for.

4. End of the ideation phase. Several ideas for the two tangible user interface concepts were developed.

5. Mid-conceptualization phase. One idea per concept was selected and further developed by making customer journey storyboards and low-fidelity prototypes.

6. Second iteration for the hand-held device. I ideated again about implementing a signifier for changing the color temperature into the hand-held device.

7. End of the conceptualization phase. For each tangible user interface type, a well-documented design concept was defined.

8. Mid-realization phase. A tangible experience prototype for each design concept was made. After this, the development of the digital context prototype started.

9. Mid-realization of the digital context prototype phase. Two extra iterations were conducted to develop an understandable way of showing the change of the color temperature of each lamp.

10. End of the realization phase. A final digital context prototype was made.

11. Mid-validation phase. New insights were gained by testing the prototypes with several possible users.

12. End of the validation phase. I formulated several insights gained and conclusions drawn from testing the prototypes.



4.1 Analysis

Figure 10. The schematic of the literature read during the analysis phase. Each circled number refers to a source that was read.

This thesis project started with exploring the context by doing literature research (see 3.2.1). First, a collection of papers was gathered using the keywords smart home, tangible interaction, interaction design, user experience design, Internet of Things, design for everyday life, and combinations of these words. The platforms and databases used for finding these sources were the ACM Digital Library, Google Scholar and the Malmö University library.

After that, these papers were read while taking notes. To not become too overwhelmed with all the different kinds of literature, I divided the literature into different categories: smart home theory, user experience theory, tangible interaction theory and canonical examples (see figure 10). Since some theory sources belonged to multiple categories, a Venn diagram was made.

Figure 11. The schematic of the literature about tangible interaction and the subfields I encountered.

To get a better understanding of the subfields of tangible interaction that I encountered, another Venn diagram was made (see figure 11). All overviews were then used to write a coherent text about the background theory and canonical examples using the relevant literature. By going through this phase, I got a better understanding of the context that I am designing for, which helped me define a realistic sketch of use. Next to that, I was able to define different types of tangible user interfaces by looking at the canonical examples.


4.2 Ideation

4.2.1 Sketch of Use

To get a better understanding of the design context I was designing for, I made a sketch of use (see 3.2.2):

The context that I am designing for is a smart home, which has one room with two smart light bulbs, such as Philips Hue (Philips, 2020) or Ikea Trådfri (Ikea, 2020). There are no other smart devices in this room. For each lamp, the user can change:

- the brightness of the lamp
- the color temperature of the light

Both variables can be changed over a range and are therefore not binary (either on or off). These changes are also made instantly, which means that the user immediately controls and adapts the system.

The smart home will be operated by just one user, who has not had a smart home before and is therefore operating one for the first time. However, the user has prior experience with using technology and is therefore not technologically illiterate. The tools provided to the user to operate their smart home are the different tangible user interfaces that I will design. These artifacts will be used one at a time.

By doing this, I felt more confident in starting to ideate about different types of tangible user interfaces. It helped me to state exactly which devices and which settings should be controlled with the tangible user interfaces.

4.2.2 Types of Tangible User Interfaces

The next thing I did was define the different types of tangible user interfaces that I could design for. First, I defined the type of TUI for each of the canonical examples explored during the analysis phase:

⋅ Smart Home Cards: counter device, interaction through tokens
⋅ SensePod: hand-held device, interaction through gestures
⋅ Vyo: counter device, interaction through tokens
⋅ Marble Answering Machine: counter device, interaction through tokens
⋅ Moving Sounds: hand-held device, interaction through embodied metaphors
⋅ Shortcut Buttons: counter device, interaction through physical manipulation (pushing)

From this, I noticed a clear difference between hand-held and counter devices. This led me to further investigate these two different types:

(18)

18

Counter device:
⋅ Interaction through tokens: interacting with a device by using movable physical parts that embody certain functions or parts of the smart home.
⋅ Interaction through physical manipulation (pushing): interacting with a device by physically manipulating, in this case pushing, the interface.

Hand-held device:
⋅ Interaction through gestures: interacting with a device by gesturing over or on the surface of the device.
⋅ Interaction through embodied metaphors: interacting with a device by moving the body while holding the device.

So each type of TUI can be operated through two different types of TI.

The aim of this thesis project is to explore the user experience of different types of tangible user interfaces. To be able to make a clear comparison and to make this thesis feasible considering the time constraints and the scope of the project, two specific types of TUIs were chosen to be further developed. These two were chosen because of their interesting aspects and diversity from each other.

The first TUI is the counter device using tokens. This TUI uses the concept of situatedness: the interaction possibilities with this device are limited to a specific field, and the user mostly uses their hands to operate it. Tokens were chosen as the type of interaction with the counter device, since they allow for rich interactions and can be applied in different ways.

The second TUI is the hand-held device using embodied metaphors. The use of this TUI is more expressive and enables users to use their whole body. Next to that, the device is not bound to a specific place. Embodied metaphors were chosen as the type of interaction with the hand-held device, since they give meaning to movements and therefore aim to help the user understand the device.

To summarize, these are the two types of TUIs that I am going to design for:

Counter Device: interaction through tokens, i.e. interacting with a device by using movable physical parts that embody certain functions or parts of the smart home. Examples: Marble Answering Machine, Vyo.

Hand-Held Device: interaction through embodied metaphors, i.e. interacting with a device by moving the body while holding the device, adding metaphorical meanings to movements. Example: Moving Sounds.

4.2.3 Rapid Ideation

After having defined the sketch of use and the two types of TUIs that I am designing for, it was time to start ideating. I did a rapid ideation session (see 3.2.3) for each TUI type.

Counter Device using Tokens

For the first type of tangible interface, I first defined the different things that the tokens can embody. Three categories were defined: a connected smart device, a setting or a pre-set for a specific situation. After that, I explored the possibilities for different interfaces with different kinds of tokens by making sketches (see figure 12).

Figure 12. A few of the sketches made while ideating about the counter device using tokens. Left: the device that uses tokens that embody a connected smart device. Middle: the device that uses tokens that embody a setting. Right: the device that uses tokens that embody a pre-set for a specific situation.

Hand-Held Device using Embodied Metaphors

For this TUI, I first ideated about the possible metaphors that could be used to embody the brightness and color temperature settings (see figure 13). I chose the metaphor small vs big to embody the brightness of the light, since this can be associated with the lighted area around the lamp (small if the brightness is low and big if the brightness is high). The metaphor low vs high was chosen for the color temperature, since this can be associated with the number of degrees of the temperature (a low number means a cold temperature and a high number means a warm temperature).

Figure 13. Ideating about the possible embodied metaphors for the brightness and color temperature settings.

After that, some new ideas were developed for implementing these embodied metaphors in a hand-held device (see figure 14).


4.3 Conceptualization

4.3.1 Idea Selection

For the counter device using tokens, the idea was selected in which the tokens embody a connected smart device (see figure 15). This concept makes the best use of the spatiality of tokens and best supports connecting the device to multiple smart devices. In this concept, the base is a display that is placed at an angle. By moving a token of one of the lamps vertically (and therefore higher or lower), the brightness is changed. The color temperature can be changed by moving the token horizontally.

This device differs from a screen-based solution in its movement possibilities: while graphical user interfaces only allow for 2D interaction, this device enables users to move the tokens not only horizontally and vertically, but also to stack them on top of each other.

Figure 15. A sketch of the selected idea for the counter device using tokens.

For the hand-held device using embodied metaphors, a harmonica-like structure was chosen to embody the brightness of the lamps (see figure 16). The color temperature can be changed by moving the device vertically. To select which lamp is being controlled, a twistable ring was chosen, since this best supports connecting the device to multiple devices and offers a more interesting interaction than pushing a button. To confirm the settings, the handles should be squeezed; this was chosen since it is another way of interacting without adding tangible parts.

Figure 16. A sketch of the selected idea for the hand-held device using embodied metaphors.

These two types of TUIs are interesting to compare, since the interactions with them are very different. The counter device is situated in one specific place, while the hand-held device can easily be moved somewhere else. Next to that, the amount of bodily involvement differs between the two interfaces: interaction with the counter device is largely limited to using a hand to reposition tokens, while the hand-held device relies more on bodily movement, such as moving your arms up and down.

Regarding the artefact behavior of the devices, both devices function as a means for giving input to the smart lighting system, since both were designed for instant control. The output is the change of the light settings. For the input, it was decided that adjustment of the settings is a linear process. This means that the settings can be adjusted on a scale from 0% to 100% with equal interim steps. Changing this linear process to a more irregular one would over-complicate the use of the TUIs and therefore worsen the user experience.
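As an illustration of this linear mapping, a token's measured position can be converted to a percentage and quantized to equal interim steps. The sketch below is written in Processing (the environment later used for the digital context prototype); the position, bounds and step size are hypothetical values, not numbers taken from the prototype.

    // Illustrative only: mapping a token's vertical position on the display
    // to a brightness level with equal interim steps (here steps of 10%).
    float tokenY = 120;           // hypothetical token position in pixels
    float yMin = 0, yMax = 300;   // hypothetical bounds of the display area

    float raw = map(tokenY, yMax, yMin, 0, 100);  // higher token = brighter
    int brightness = round(raw / 10) * 10;        // quantize to 10% steps
    println(brightness + "%");                    // prints "60%" for these values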

4.3.2 User Journey Storyboard

To get a better understanding of what a use scenario of the TUIs would look like, and to be able to easily explain both concepts, a customer journey map (see 3.2.4) was made for each TUI (see figures 17 and 18).

Figure 18. The storyboard of the customer journey for the hand-held device using embodied metaphors.

4.3.3 Low-fidelity Prototyping

For each of the TUIs, a low-fidelity prototype was created to see whether any adjustments to the concepts were needed before making the prototypes that would be used during the validation phase. These prototypes can be seen in figures 19 and 20.

Figure 19. The low-fidelity prototype of the counter device using tokens.

By doing this, the need for several adjustments was found. Firstly, since the display of the counter device was placed at an angle, I noticed that the tokens would slide down after being placed on the display. Therefore, a solution had to be found to make sure that the tokens would stay in place. This could have been done by implementing magnets into the tokens and using metal for the display, but this would make the device appear very cold and mechanical. Since it was preferred to give the device a warmer and more inviting look, it was chosen to use MDF as the material for the prototypes used during the validation phase. Therefore, another solution had to be found for the sliding tokens, which resulted in attaching double-sided tape to the bottom of the tokens.

Next to that, it was found that the interaction of squeezing the handles of the hand-held device did not feel smooth. The handles were made of a very stiff material and would therefore not react to the squeezing movement. Therefore, it was decided to add a thin layer of foam to the handles, which changes its shape slightly when being squeezed.

4.3.4 Second Iteration Hand-Held Device

After making the low-fidelity prototype of the hand-held device, it was found that the device did not indicate well enough how it should be used. The counter device had very clear signifiers, but the signifier for the color temperature in the hand-held device was missing (see figure 21). Signifiers are signs on a device that communicate where a certain action should take place (Norman, 2013).

Figure 21. Analyzing the signifiers in both devices.

After finding out that a signifier for the color temperature was missing, I ideated about possible signifiers. The signifier in the counter device makes use of a color spectrum from blue to red, which is associated with cold to warm. Examples of where this signifier is also used are showers, water faucets and thermostats. It was decided to apply this color spectrum to the hand-held device as well, as the color temperature signifier. Several possibilities for this were found (see figure 22).

It was chosen to apply the color spectrum vertically to the harmonica structure. This corresponds best to the movement that should be made with the device, which is also vertical. By making a low-fidelity prototype and acting out the movements with it (see figure 23), it was predicted that this would be a good addition to improve the usability of the device. To make sure that the user would use the device with the colors in the right direction, the colors were placed so that the red side would be on the same side as the mark for changing which lamp is adjusted (see figure 24). This way, when the user looks at the icon of the lamp that is being adjusted, they automatically hold the device with the red side up.

Figure 23. Testing out the color temperature signifier for the hand-held device.


4.4 Realization

4.4.1 Tangible Prototypes

After making the last adjustments to the tangible user interface concepts, improved prototypes were made. These prototypes can be defined as experience prototypes (see 3.2.5). These prototypes were made using MDF for the main parts and paper and foam for the detailing. First, all MDF parts needed were drawn in an Adobe Illustrator file (see figure 25). These parts were then cut out using the laser cutter.

Figure 25. The drawings made in Adobe Illustrator used for laser cutting the MDF parts.

After that, the paper parts were designed and printed (see figure 26). Lastly, everything was assembled. This resulted in the final prototypes, which can be seen in figures 27 and 28.

Figure 26. The paper parts for the prototypes.

Figure 28. The final hand-held device prototype.

4.4.2 Digital Context Prototype for Validation

In the ideal situation for the user validation, I would have built a test living room with two smart lamps and invited participants to come over to test the tangible user interface prototypes in a real-life situation. But because of the pandemic and lack of access to smart lighting systems, I was unable to test the tangible prototypes with users in the same room that would be equipped with smart lights. Therefore, I had to think of a way to create the use context online.

The solution was to make a digital context prototype in Processing, an open-source graphical programming environment based on the Java programming language (Processing, 2020). The program shows an image of a living room with two lamps. For each lamp, the brightness and color temperature can be controlled by pressing keys on the keyboard of the laptop. To show the brightness of a lamp, a white circle is drawn around each lamp. The opacity of this circle changes, which represents the changing brightness (see figure 29).

Figure 29. Changing the brightness of the top lamp in the first digital context prototype. The background picture is retrieved from Unsplash, a website that provides free stock photos (Pal, 2019).
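The code of the final prototype can be found in appendix 1; as a simplified illustration of the mechanism described above, a Processing sketch along the following lines draws a white circle whose opacity is raised or lowered by key presses. The file name, lamp position and key bindings are assumptions for illustration, not the actual values used in the prototype.

    // Simplified sketch of the brightness overlay, not the thesis's actual code.
    PImage room;              // background photo of the living room
    float topBrightness = 0;  // 0..255, used as the alpha of the white circle

    void setup() {
      size(800, 600);
      room = loadImage("livingroom.jpg");  // hypothetical file name
    }

    void draw() {
      image(room, 0, 0, width, height);
      noStroke();
      fill(255, topBrightness);     // white circle; its opacity is the brightness
      ellipse(250, 180, 200, 200);  // hypothetical position of the top lamp
    }

    void keyPressed() {
      // one increase/decrease key pair per setting, pressed by the wizard operator
      if (key == 'q') topBrightness = min(topBrightness + 25, 255);
      if (key == 'a') topBrightness = max(topBrightness - 25, 0);
    }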

The color temperature was harder to show, since this setting is quite subtle and should be experienced in real life. It was chosen to draw a little ellipse over each lamp to show the color temperature, since this would be a subtle adjustment (see figure 30).

However, I was not satisfied with this way of showing the color temperature of the lamps. It was not very realistic and could also be understood as changing the color of the light. Therefore, a second iteration was conducted. In this iteration, the color temperature was shown by changing the color temperature of the area around the lamps (see figure 31). Next to that, the background image was edited so that the lamps were further away from each other. However, since this prototype used a lot of images, the program stopped working because it quickly ran out of memory. Thus, a third iteration of the prototype was conducted.
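The thesis does not state the exact cause of the memory problem, but a plausible explanation (an assumption on my part) is a well-known Processing pitfall: repeatedly loading full-resolution images instead of loading each one once. A minimal sketch of the pitfall and the usual remedy:

    // Hypothetical reconstruction of the failure mode, not the thesis code.
    // Loading an image inside draw() allocates a new full-size PImage up to
    // 60 times per second, which can quickly exhaust the sketch's memory:
    //
    //   void draw() {
    //     image(loadImage("room_warm.png"), 0, 0);  // anti-pattern
    //   }
    //
    // Remedy: load each image exactly once and reuse it.
    PImage bg;

    void setup() {
      size(800, 600);
      bg = loadImage("room_warm.png");  // hypothetical file name
      bg.resize(width, height);         // downscale once to reduce memory use
    }

    void draw() {
      image(bg, 0, 0);
    }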

Figure 31. Changing the color temperature of both lamps in the second digital context prototype.

In the third iteration of the digital context prototype, the color temperature was indicated by showing a colored ring around the lamps (see figure 32). This was a suitable solution, since it still showed that the surroundings change without using too much memory. The code for this final digital context prototype can be found in appendix 1. A sketch was made to show how the prototype should be operated, which can be seen in figure 33.

Figure 32. The final digital context prototype in six steps: 1) both lamps are turned off, 2) increase brightness of top lamp, 3) increase color temperature of top lamp, 4) increase brightness of bottom lamp, 5) decrease color temperature of bottom lamp, 6) decrease color temperature even more.
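The colored ring can be sketched in the same way as the brightness circle, using Processing's lerpColor() to interpolate between a cold and a warm color, mirroring the blue-to-red spectrum used as a signifier on the devices. The endpoint colors, coordinates and key bindings are again assumptions for illustration.

    // Sketch of the color temperature ring from the third iteration.
    float colorTemp = 0.5;  // 0 = coldest (blue), 1 = warmest (red)

    void setup() {
      size(800, 600);
    }

    void draw() {
      background(40);                    // stand-in for the living room image
      color cold = color(90, 150, 255);  // assumed cold endpoint of the spectrum
      color warm = color(255, 140, 60);  // assumed warm endpoint of the spectrum
      noFill();
      stroke(lerpColor(cold, warm, colorTemp));
      strokeWeight(8);
      ellipse(250, 180, 220, 220);       // ring just outside the brightness circle
    }

    void keyPressed() {
      if (key == 'w') colorTemp = min(colorTemp + 0.1, 1);
      if (key == 's') colorTemp = max(colorTemp - 0.1, 0);
    }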

Figure 33. A sketch to show how the final digital context prototype should be operated using the keyboard of my laptop.


4.5 Validation

4.5.1 Design Requirements

In the literature read during the analysis phase, several design requirements and goals were described. For each paper, an overview of the requirements was made, and the requirements relevant for validating the tangible user interfaces I designed were selected:

1. Comparing Social Robot, Screen and Voice Interfaces for Smart-Home Control (Luria et al., 2017)

For their social robot interface design, Luria, Hoffman and Zuckerman defined five design goals:

⋅ Engaging: evoke engagement and bring back the "excitement of interaction"
⋅ Unobtrusive: not disturb the user and stay in the periphery of attention
⋅ Device-like: resemble a device and not a human or pet
⋅ Respectful: be polite and aware of social situations
⋅ Reassuring: express reliability and reassure the user during use

The last four design goals are focused on designing a social robot and are therefore not relevant for this thesis project, but the engaging goal is an interesting requirement to use for the user validation. Therefore, the following question was formulated based on this design requirement:

Which device was most exciting/enjoyable to use? Why?

2. Evolution towards smart home environments: empirical evaluation of three user interfaces (Koskela & Väänänen-Vainio-Mattila, 2004)

In their paper, Koskela and Väänänen-Vainio-Mattila discuss the differences between pattern control and instant control. For each of them, they formulated several UI requirements. Since the focus of this thesis is on instant control, the requirements for instant control are stated here:

⋅ Simple task performance: only a few action steps to get something done
⋅ Centralized control device: one centralized means to control all different devices

The second design requirement is already fulfilled in both concepts designed for this thesis. However, the simple task performance requirement is interesting to test. Therefore, the following question was formulated:

Do you feel like it was easy to perform a task by using the device? Why?

3. Internet of Tangible Things (IoTT): Challenges and Opportunities for Tangible Interaction with IoT (Angelini et al., 2018)

As one of the results of their systematic IoTT review, Angelini, Mugellini, Abou Khaled and Couture have formulated eight tangible properties to reflect on:

⋅ Meaningful representations and controls: the function of the object can be understood from its form
⋅ Rich interactions and human skills: natural human skills and senses are exploited through rich interactions
⋅ Persistency: ability to control the system during a power or connectivity outage
⋅ Spatial interaction and collaboration: support collaborative setups with multiple IoT objects
⋅ Immediacy and intuitiveness: users need minimal learning time to understand and control the device
⋅ Peripheral interaction: interactions that are integrated in daily routines and do not disrupt attention
⋅ Reflection and memories: support for reflection and for associating and sharing memories
⋅ Long-lasting interactions and emotional bonding: durable designs that avoid electronic waste due to the technology becoming outdated

From this list, the properties meaningful representations and controls, rich interactions and human skills, and immediacy and intuitiveness can be used as relevant design requirements for this thesis project as well. However, the requirement immediacy and intuitiveness is closely related to the simple task performance requirement gained from the second paper. The question used for that requirement can also be used for the immediacy and intuitiveness requirement. Therefore, the following extra questions were formulated:

Did you easily understand how to use the device? Why (not)?
How have you used your body while using the devices?

The analysis of the design requirements mentioned in the literature above resulted in the following list of design requirements relevant for this thesis project with the corresponding interview questions:

⋅ Engaging: Which device was most exciting/enjoyable to use? Why?
⋅ Simple task performance / Immediacy and intuitiveness: Which device was easiest to use to perform a task? Why?
⋅ Meaningful representation and controls: Did you easily understand how to use the device? Why (not)?
⋅ Rich interaction and human skills: How have you used your body while using the devices?

4.5.2 User Experience Questionnaire

In order to gain more insights into the user experience of the two devices, I decided to use the User Experience Questionnaire. The original User Experience Questionnaire was developed by Laugwitz, Held and Schrepp to measure the user experience in a simple and immediate way (Laugwitz, Held & Schrepp, 2008). For the user validation, the short version developed by Schrepp, Hinderks and Thomaschewski will be used (see figure 34) (Schrepp, Hinderks & Thomaschewski, 2017). The short version was used to gain the desired insights without taking up too much of the participants' time.

Figure 34. The short version of the User Experience Questionnaire (Schrepp, Hinderks & Thomaschewski, 2017).

4.5.3 User Validation Plan

As mentioned before, the user validation had to be executed online because of the social distancing restrictions. Therefore, the prototypes and consent forms were brought to the participants' homes, after which the user validation was conducted through Zoom. After the interview, the prototypes and filled-in consent forms were picked up again.

Goal of the User Validation

The goal of this user validation is to gain insights about the use of each TUI separately and about the comparison of both TUIs. The topics of the insights include, but are not limited to, the design requirements and user experience categories mentioned in 4.5.1 and 4.5.2.

Participants

For this user validation, five people who do not currently own a smart home participated. This number of participants was chosen since research has shown that it is the ideal number for gaining enough insights (Nielsen, 2000). People who do not own smart homes were chosen since they have no experience with smart home control and therefore also have few preconceptions about it.

Study Set-up

Below, I will briefly explain each phase of the study. The complete study set-up can be found in appendix 2.

1. Introduction

I started by explaining the goal and set-up of this study. I also asked the participants to fill in the consent form.

2. Using the Tangible Prototypes

After that, I introduced the digital context prototype and the tangible prototypes. I explained which functions both devices have (adjusting the brightness and color temperature of the two lamps). However, I did not explain how to adjust these settings, since I wanted to test how easy and intuitive (Norman, 2013) the devices were to use without instructions. Then, I gave the participants a scenario of coming home and wanting to turn on the lights, which they had to act upon. I also asked them to think aloud, to make it easier for me to operate the digital context prototype as the Wizard of Oz.

3. Interview questions about each device

After using each device, I asked the specific questions about the experience of using that device, derived from the design requirements (see 4.5.1). Next to that, the questions from the short User Experience Questionnaire (see 4.5.2) were asked.


4. Interview questions to compare the devices

Lastly, after both devices were used, some final questions based on the design requirements (see 4.5.1) were asked to compare both devices.

Informed Consent Form

An informed consent form was made, which can be found in appendix 3.

4.5.4 Execution

The user validation sessions were conducted on May 14 and 15 over Zoom. My roles during these sessions were Wizard of Oz operator, interviewer and observer. The sessions were video-recorded through Zoom for data processing purposes. Next to that, notes were made of remarkable actions and quotes. To make the role of Wizard of Oz operator easier for myself, I added stickers to the keyboard of my laptop (see figure 35).

Figure 35. The keyboard of my laptop with stickers to make operating the digital context prototype easier.

4.5.5 Data Processing

During the user validation sessions, both qualitative and quantitative data were gathered. The qualitative data was processed by making an affinity diagram (see 3.2.9). First, all quotes, observations and other insights were written down on post-it notes. After that, I categorized the post-it notes into the requirements mentioned in chapter 4.5.1 (see figure 36). Then, sub-categories were made within each requirement category. These sub-categories were then named, formulating some preliminary insights gained from the user validation sessions. The final affinity diagram can be found in figure 37.

Figure 36. The first step of categorizing the qualitative data gathered during the user validation sessions.

Figure 37. The affinity diagram of the qualitative data gathered during the user validation sessions.

The quantitative data was processed by making overviews in Microsoft Excel. First, two tables were made showing the ratings of the counter device and the hand-held device by each participant, together with the averages (see figure 38).

Figure 38. An overview of the ratings of the counter device and the hand-held device by the participants.

After that, another table was made to compare the counter device and the hand-held device (see figure 39).

Figure 39. An overview of the average ratings of the counter device and the hand-held device.

By making these overviews, I was able to draw conclusions from the quantitative data.
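As an illustration of this step, the sketch below computes the same kind of per-device overview from the UEQ-S ratings, done here in Python rather than Excel. The participant ratings are invented placeholder numbers, not the actual study data.

# Illustrative recreation of the Excel overviews in Python. The ratings
# below are made-up placeholders (scale -3..+3 per UEQ-S item), not the
# actual data gathered in the study.
ITEMS = ["obstructive/supportive", "complicated/easy", "inefficient/efficient",
         "confusing/clear", "boring/exciting", "not interesting/interesting",
         "conventional/inventive", "usual/leading edge"]

# One row of eight item scores per participant, per device.
ratings = {
    "counter":   [[2, 2, 1, 2, 1, 1, 0, 1], [3, 2, 2, 2, 1, 2, 1, 1]],
    "hand-held": [[1, 0, 0, -1, 2, 2, 2, 2], [0, 1, -1, 0, 2, 2, 3, 2]],
}

for device, rows in ratings.items():
    averages = [sum(col) / len(col) for col in zip(*rows)]  # per-item mean
    print(device)
    for item, avg in zip(ITEMS, averages):
        print(f"  {item}: {avg:+.2f}")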

4.5.6 Drawing Conclusions

Lastly, I wrote down all preliminary insights gained while making the affinity diagram and the conclusions that could be drawn from analyzing the qualitative data. It was found that some of the quantitative categories could be connected to the requirements formulated in 4.5.1:

⋅ Engaging – Boring/Exciting & Not interesting/Interesting
⋅ Simple Task Performance – Obstructive/Supportive & Inefficient/Efficient
⋅ Meaningful Representation and Controls – Complicated/Easy & Confusing/Clear
⋅ Rich Interaction and Human Skills – no corresponding questionnaire categories
⋅ Innovation – the categories Conventional/Inventive and Usual/Leading edge do not belong to any of the requirements above, but represent their own new category: innovation.
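This connection can also be expressed directly in code. The short sketch below groups the per-item averages from the previous sketch into requirement-level scores; the grouping mirrors the list above and the numbers remain illustrative.

# Grouping the UEQ-S item pairs under the requirements listed above.
# Item indices refer to the ITEMS list from the previous sketch.
REQUIREMENT_ITEMS = {
    "engaging": [4, 5],                 # boring/exciting, not interesting/interesting
    "simple task performance": [0, 2],  # obstructive/supportive, inefficient/efficient
    "meaningful representation and controls": [1, 3],  # complicated/easy, confusing/clear
    "innovation": [6, 7],               # conventional/inventive, usual/leading edge
}

def requirement_scores(item_averages):
    """Average the item means that belong to each requirement."""
    return {req: sum(item_averages[i] for i in idx) / len(idx)
            for req, idx in REQUIREMENT_ITEMS.items()}

item_avgs = [1.5, 1.0, 1.5, 0.5, 1.5, 2.0, 1.5, 1.5]  # e.g. from the sketch above
print(requirement_scores(item_avgs))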

A summary of the results (see figure 40) was written, which can be found in chapter 5.2. The conclusions that have been drawn from these results can be found in chapter 7.

Figure 40. The overview of results from the qualitative and quantitative analyses. The results have been divided vertically according to the different requirements. Next to that, a division was made between the counter device (left) and the hand-held device (right).


5. Results

This thesis project has produced two different types of results. Firstly, two new tangible user interfaces for instant smart home control were designed. Next to that, several insights were gained from exploring the user experience of these tangible user interfaces by validating them with users. These insights can be used for future development of tangible smart home user interfaces.

5.1 Tangible User Interfaces for Instant Smart Home Control

Two different types of tangible user interfaces for instant control of a smart lighting system were designed. With these devices, the user can adjust the brightness and color temperature of two smart lamps. The devices show new applications of tangible user interfaces in the smart home and were designed as a tool for answering the research question of this thesis.

The first tangible user interface is the counter device using tokens (see figure 41) (Ullmer et al., 2005). This device allows for interaction through movable physical parts that embody the smart lamps. The lamp to adjust can be selected by picking up the corresponding token. By moving the token vertically over the display, the brightness is adjusted. The color temperature can be adjusted by moving the token horizontally.

Figure 41. The experience prototype of the counter device using tokens.
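To make this interaction model concrete, the sketch below shows one plausible way to translate a token’s position on the display surface into lamp settings: the vertical axis maps to brightness and the horizontal axis to color temperature. The surface dimensions, the orientation (top of the surface taken as brightest) and the Kelvin range are assumptions for illustration, not specifications of the actual prototype.

# Plausible mapping from a token's position on the counter device's
# display surface to lamp settings. The surface size, orientation and
# the color temperature range are illustrative assumptions.
SURFACE_W, SURFACE_H = 300, 200        # display surface in millimeters
TEMP_MIN_K, TEMP_MAX_K = 2200, 6500    # warm to cool white

def token_to_settings(x_mm, y_mm):
    """Vertical position -> brightness, horizontal position -> color temperature."""
    brightness = max(0.0, min(1.0, 1.0 - y_mm / SURFACE_H))  # top of surface = brightest
    t = max(0.0, min(1.0, x_mm / SURFACE_W))
    temperature_k = TEMP_MIN_K + t * (TEMP_MAX_K - TEMP_MIN_K)
    return brightness, round(temperature_k)

print(token_to_settings(150, 50))  # (0.75, 4350)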

The second tangible user interface is the hand-held device using embodied metaphors (see figure 42) (Bakker et al., 2012). This device allows for interaction by moving the body while holding the device, which adds metaphorical meaning to the movements. The lamp to adjust can be selected by turning a ring to the corresponding image of the lamp. By opening up the device, that is, moving the handles away from each other, the brightness is adjusted. The color temperature can be adjusted by moving the device vertically. To confirm the settings, the handles have to be squeezed.

Figure 42. The experience prototype of the hand-held device using embodied metaphors.
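Analogously, the sketch below illustrates how the hand-held device’s gestures could be read from its sensors: the distance between the handles maps to brightness, the height at which the device is held maps to color temperature, and a squeeze confirms the settings. The sensor ranges and the confirmation threshold are assumptions for illustration, not measurements of the actual prototype.

# Plausible sensor mapping for the hand-held device: handle distance ->
# brightness, holding height -> color temperature, squeeze -> confirm.
# All ranges are illustrative assumptions.
HANDLES_MIN_MM, HANDLES_MAX_MM = 60, 220   # closed .. fully opened
HEIGHT_MIN_MM, HEIGHT_MAX_MM = 600, 1800   # roughly hip .. above head
SQUEEZE_THRESHOLD = 0.8                    # normalized grip pressure

def read_gesture(handle_distance_mm, height_mm, grip_pressure):
    span = HANDLES_MAX_MM - HANDLES_MIN_MM
    brightness = max(0.0, min(1.0, (handle_distance_mm - HANDLES_MIN_MM) / span))
    rise = HEIGHT_MAX_MM - HEIGHT_MIN_MM
    temperature = max(0.0, min(1.0, (height_mm - HEIGHT_MIN_MM) / rise))
    confirmed = grip_pressure >= SQUEEZE_THRESHOLD  # squeeze locks in the settings
    return brightness, temperature, confirmed

print(read_gesture(140, 1200, 0.9))  # (0.5, 0.5, True)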

5.2 Results from Validating the Tangible User Interfaces

The experience prototypes shown in figures 41 and 42 were used to test the user experience of the two different types of tangible user interfaces (see chapter 4.5). The insights gained from these tests are divided into six sections. First, insights about engagement are discussed. After that, several results related to simple task performance are mentioned. Next, insights about meaningful representation and controls are presented. Then, results regarding rich interaction and human skills are presented. After that, the innovativeness of both devices is discussed. Lastly, insights regarding the best use scenarios are presented.

5.2.1 Engagement

By evoking engagement, excitement about the interaction is created (Luria et al., 2017). Therefore, to create a better user experience, the tangible user interfaces should engage the user. Both interfaces were perceived as quite exciting and interesting, but for different reasons. Four out of five participants mentioned that the counter device was easy to understand and logical, which made this interface very inviting to use. One participant stated: “I was more excited to use the device with the tokens because I immediately knew what I could do with it.” The hand-held device was exciting to use because of the need to move the body to change the settings. Next to that, the use of this device was described as playful, since the participants felt it was a challenge to figure out how to use the hand-held device. As one participant mentioned: “It had this playing element. You could grab it and move it around.” However, participants sometimes felt demotivated when it was too hard to understand how the device should be used.

5.2.2 Simple Task Performance

A task is simple to perform when only a few action steps have to be taken to get something done (Koskela & Väänänen-Vainio-Mattila, 2004). This makes the device efficient to use. When it is easy to perform a task with a device, the user will have a better experience. Therefore, the devices were tested on ease of task performance, efficiency and supportiveness.

The counter device was perceived as quite efficient. As two participants mentioned, the use of the device would get easier over time. One participant explained: “You can visualize a pattern of where you would put the token. That way, it is very easy to remember what setting you like. So the next time I would use this device, I would know exactly where to put it.” However, it was also found that users constantly had to switch between looking at the device and looking at the lamps to see the effect of their changes, which is not very efficient. Next to that, the counter device was perceived as quite supportive: users only needed one hand to operate it.

The hand-held device was perceived as less efficient and supportive. Even though the participants predicted that the use of this device would become easier over time, they also remarked that it would be harder to remember settings they liked. One participant stated: “It is harder to remember what height for the color temperature I like most. I would have to refer to my body parts, like my nose or shoulders, to remember the exact height.” Next to that, the actions that have to be performed with the hand-held device are more time-consuming than those of the counter device. An advantage of the hand-held device is that it can be moved around so that the device and the lamps are in the same field of view. This way, users get immediate feedback on their actions.
