
Linköping University | Department of Computer Science
Bachelor thesis, 18 ECTS | Cognitive Science
Spring term 2020 | LIU-IDA/KOGVET-G--20/030--SE

Finding the problem

– Improvements to increase efficiency and usability when troubleshooting

Nina Eliasson

Supervisor: Rachel Ellis
Examiner: Erik Marsja
Client: Spotify

Linköping University
SE-581 83 Linköping
+46 13 28 10 00, www.liu.se/en


Copyright

The publishers will keep this document online on the Internet – or its possible replacement – for a period of 25 years starting from the date of publication barring exceptional circumstances. The online availability of the document implies permanent permission for anyone to read, to download, or to print out single copies for his/her own use and to use it unchanged for non-commercial research and educational purpose. Subsequent transfers of copyright cannot revoke this permission. All other uses of the document are conditional upon the consent of the copyright owner. The publisher has taken technical and administrative measures to assure authenticity, security and accessibility. According to intellectual property law the author has the right to be mentioned when his/her work is accessed as described above and to be protected against infringement. For additional information about the Linköping University Electronic Press and its procedures for publication and for assurance of document integrity, please refer to its www home page: http://www.ep.liu.se/.


Abstract

In a time when competition between software services is fierce and time to market is crucial, speed and productivity are key to competing with other organizations. One of these competitors is Spotify, which provides an audio streaming platform to 286 million monthly active users around the world. Because of the number of users, a disturbance in the service has a great impact. In order to avoid disturbances, the back-end developers have to locate and solve issues quickly.

To be able to identify user problems and frustrations with a troubleshooting tool, fifteen interviews and five observations were conducted. The resulting data, combined with the five-step Design Thinking model, resulted in two defined problems: finding specific information and narrowing the problem space. Furthermore, a search feature and a feature to customize the view were tested on a mid-fidelity prototype to investigate their impact on troubleshooting and the usability of the tool.

Keywords: Design Thinking, User-Centered Design, Usability Test, Usability, Efficiency, Qualitative Methods


Acknowledgments

I would like to acknowledge and thank my tutor Rachel Ellis for her comments and support. To my team at Spotify for their continuous feedback and help with everything from answering questions to participating in interviews. To everyone at Spotify for making me feel welcome. To my test participants. To my classmates and the other thesis students at Spotify for their feedback and advice. And I would especially like to acknowledge Ekaterina Garbaruk Monnot, product manager in my team, for her support, help and valuable feedback during the whole project.

Stockholm, May 2019 Nina Eliasson


Contents

1. Introduction
 1.1 The thesis project
  1.1.1 Spotify
  1.1.2 The tool
 1.2 Aim
 1.3 Research questions
 1.4 Delimitations
 1.5 Thesis outline
2. Theoretical Framework
 2.1 User-centered design
 2.2 Usability
 2.3 Design Thinking
3. Preparatory Field Work
 3.1 Method theory
  3.1.1 Interviews
  3.1.2 Observations
  3.1.3 Thematic analysis
  3.1.4 Ethical aspects
 3.2 Method
  3.2.1 Participants
  3.2.2 Interviews and observations
 3.3 Results
  3.3.1 Usage of the tool
  3.3.2 Users’ knowledge and experience of the tool
  3.3.3 Limitations
  3.3.4 Frustrations
  3.3.5 Discoverability
 3.4 Discussion
  3.4.1 Method discussion
4. Concept Ideation
 4.1 Method theory
  4.1.1 Complexity-value matrix
  4.2.1 Concept ideation and evaluation
 4.3 Results
 4.4 Discussion
5. Prototype Development
 5.1 Method theory
 5.2 Results
  5.2.1 Customized view
  5.2.2 Search
6. Validation
 6.1 Method theory
  6.1.1 Usability testing
  6.1.2 Think-aloud protocol
 6.2 Method
  6.2.1 Participants
  6.2.2 Usability study
 6.3 Results
  6.3.1 Customized view
  6.3.2 Search
 6.4 Discussion
7. Overall Discussion
8. Conclusion
References
Appendices
 Appendix A - Interview questions
 Appendix B - Original quotes in Swedish from the interviews
 Appendix C - Value vs complexity matrix
 Appendix D - Outline of the usability study


1. Introduction

In a time when competition between software services is fierce and time to market is crucial, speed and productivity are key to competing with other organizations. One of these software services is Spotify, which is known for providing the best streaming service in the world, competing with Apple Music, Amazon Music and YouTube Music, to mention a few (Looper, 2020). Spotify has 286 million monthly active users, over 50 million tracks and 3 billion playlists (Spotify, 2020). As an outcome of the competition, the need to provide the end-user with a robust and trustworthy product is of great importance. In other words, a disruption in the audio streaming platform could impact many music listeners around the world. Furthermore, technical development has not only created increased competition between software services, but has also made the software more complex by introducing new client services such as Sonos, Bose and Alexa. Due to the increased complexity and number of services, and the fact that Spotify works at a very high pace of development, a consequence is a higher probability of problems. In order to manage the complexity and deliver a smooth and seamless listening experience to the user, errors have to be managed quickly. As a result, providing the back-end developers with a troubleshooting tool that is both efficient and usable would have a great impact on the end-users of Spotify's audio streaming platform.

1.1 The thesis project

The thesis project was initialized by Spotify with the purpose of investigating how the efficiency and usability of an internal developer tool could be improved. The tool is named Remote Admin and enables back-end developers to troubleshoot Spotify's back-end services on connected devices. Furthermore, the tool has 88 monthly active users, with an increasing number of new users. Due to the value of the tool, the project investigated how it could be improved in terms of efficiency and usability.

1.1.1 Spotify

Spotify consists of autonomous cross-functional teams across the organization, with developers, product managers and engineering managers in the back-end teams. As Spotify gives the teams high autonomy, the teams themselves are responsible for the quality of the software they release. The teams need to be able to troubleshoot issues by themselves and also to view how their service works with other services in the ecosystem of Spotify's back-end system. In other words, each team is responsible for a specific area or set of services in the back-end and works together with other teams to solve issues and to develop new features.

1.1.2 The tool

The tool is web-based and has four different functionalities, of which the Traffic Visualizer is used in around 80 % of the cases. The tool itself is called Remote Admin, but as the Traffic Visualizer is the most used part, the thesis has been limited to cover only the Visualizer. Therefore, the thesis refers to the “tool” instead of the Traffic Visualizer.

The Traffic Visualizer is used by back-end developers to troubleshoot back-end services on connected devices. The tool allows the user to view how different calls are sent between the services and in which order they occur. In short, they are able to view how the different services communicate with each other and what information is passed along. This enables the users to investigate if, where and when a call failed, in order to address the root cause of the problem. The calls are visualized in the image below; the information sent between the services has been censored for privacy reasons.

Figure 1. Traffic Visualizer.

As mentioned before, the tool allows developers to view the message for each specific call, and the message contains all relevant information about the call made from one service to another. The message is visualized below and placed to the left of the flow-diagram.


Figure 2. Message of a call in Traffic Visualizer.
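To make the description of the Visualizer more concrete, the following Python sketch models, in a very simplified form, the kind of data the tool displays: an ordered list of calls between back-end services, each carrying a message payload. All service names, fields and values below are invented for illustration only and are not Spotify's actual data model.

from dataclasses import dataclass, field

@dataclass
class Call:
    source: str        # service that sends the request
    destination: str   # service that receives it
    status: str        # e.g. "ok" or "error"
    message: dict = field(default_factory=dict)  # payload shown next to the flow diagram

@dataclass
class Trace:
    device: str                                  # connected device the session was captured from
    calls: list = field(default_factory=list)    # calls in the order they occurred

# A toy trace: the client gateway calls a playback service, which calls a metadata service.
trace = Trace(
    device="example-speaker",
    calls=[
        Call("client-gateway", "playback-service", "ok",
             {"track_id": "example-track", "position_ms": 0}),
        Call("playback-service", "metadata-service", "error",
             {"track_id": "example-track", "error": "timeout"}),
    ],
)

# Troubleshooting then amounts to scanning the ordered calls for the first failure.
first_error = next((c for c in trace.calls if c.status == "error"), None)
print(first_error.source, "->", first_error.destination, first_error.message)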

1.2 Aim

The aim of the thesis was to identify current user problems and evaluate possible solutions in order to increase the usability of the tool.

1.3 Research questions

● What problems and frustrations do the users have?

● Which features and user experience factors should be considered when designing for increased efficiency and usability?

1.4 Delimitations

First of all, the thesis work is limited to only one part of the tool, the Visualizer, as it is the most used part and is used in almost 80 % of the cases. Secondly, the thesis project has been limited to investigating the functionality of the tool in order to enhance the usability for experienced users. In other words, novices or users that are not familiar with the tool are not included in the project. Finally, the thesis work does not attempt to implement the features in the tool. Instead, the intended result is to identify suggested improvements to the tool in order to increase its usability.


1.5 Thesis outline

The next chapter presents the overall theoretical framework for the thesis and is followed by the three phases of the project, presented in chronological order from a Design Thinking perspective. The thesis begins with Preparatory field work in Chapter 3, followed by Concept ideation in Chapter 4, Prototype development in Chapter 5 and Evaluation in Chapter 6. Each phase is presented with separate sections for method theory, method, results and discussion. Finally, the thesis ends with an overall discussion in Chapter 7, and the conclusion is presented in Chapter 8.


2. Theoretical Framework

__________________________________________________________________________

This chapter presents the overall theoretical framework and begins with a description of user-centered design, followed by a definition of usability, and ends with a presentation of Design Thinking, the overall method of the thesis. Notably, user-centered design is included in the Design Thinking model in order to answer the research questions of the thesis.

__________________________________________________________________________________

2.1 User-centered design

Humans do not only contribute to the technological development and innovation of products but are also the end-users, which means that in order to create meaningful products, design decisions must be made with the users and their needs in mind (Preece, Sharp & Rogers, 2015). Similar to Preece et al.'s (2015) definition, Williams (2009) defines user-centered design (UCD) as characterized by placing the user first when making design decisions. In other words, during the design process the users are always in focus (Arvola, 2014; Preece et al., 2015). However, Williams (2009) highlights that this does not imply that the users create the final design, but rather that they have an active role in design activities such as design research. The benefit of involving the end-users is that it ensures that the product is based on the users' needs and is therefore valuable to them (Arvola, 2014; Norman, 2013). However, Williams (2009) points out that it is important to avoid asking the participants about their needs and instead to try to understand the users' behavior and preferences, and to base design decisions on that. To summarize, UCD is a practice that allows collaboration between designers and users to create valuable products (Williams, 2009).

In order to create a meaningful improvement of the tool, the thesis work used a user-centered approach to ensure that the improvements are based on the users' needs. In other words, the users of the tool were considered during the whole process. To adapt the UCD practice, Norman (2013) presents four main phases that should be involved in the process of creating a user-centered product: 1) Observation, 2) Generating ideas, 3) Creating a prototype and 4) Testing. Similar to Norman (2013), the International Organization for Standardization (ISO 9241-210, 2010) has created guidelines for how UCD practice should be conducted when designing any interactive system and proposes the following design activities: 1) Understanding and specifying the context of use, 2) Specifying the user requirements, 3) Producing design solutions, 4) Evaluating the design. Moreover, ISO (9241-210, 2010) presents six principles that should be applied in human-centered design:


1) The design is based upon an explicit understanding of users, tasks and environments
2) Users are involved throughout design and development
3) The design is driven and refined by user-centered evaluation
4) The process is iterative
5) The design addresses the whole user experience
6) The design team includes multidisciplinary skills and perspectives.

On the whole, both approaches include activities to gain an understanding of the user, requirements that correspond to the users' needs, implementation and evaluation. The thesis work used the six principles suggested by ISO (9241-210, 2010) due to the iterative process and continual refinement from a user-centered perspective. However, not all principles were included, as the thesis work did not focus on the whole user experience.

2.2 Usability

Usability is defined by ISO 9241-11:2018 as "the extent to which a system, product or service can be used by specified users to achieve specific goals with effectiveness, efficiency and satisfaction in a specified context of use". Furthermore, it is described by Nielsen (2012) as a quality attribute of the user interface and is characterized by five components. 1) Learnability, how easy it is for the user to accomplish a task the first time they encounter the design. 2) Efficiency, the amount of time it takes for a user to accomplish a task in the design. 3) Memorability, to what extent the user can reestablish proficiency when they have not used the design for some time. 4) Errors, the number of errors the user makes and how easy it is for the user to recover from them. Lastly, 5) Satisfaction, the users' thoughts about the design. All five components were used to evaluate the usability of the tool in order to address which features and user experience factors should be considered when designing for increased efficiency and usability.

However, the International Design Foundation (n.d.) also highlights that a product is not simply usable or unusable; rather, it is the combination of the users' needs and the context that determines the level of usability. Furthermore, the International Design Foundation (n.d.) describes three outcomes of a usable interface, similar to Nielsen's (2012) components: the user should feel comfortable and perceive the interface as easy to use the first time they encounter the design, the user should be able to perform actions quickly, and the user should be able to recall the interface in order to perform actions as quickly and easily as the first time of usage.


2.3 Design thinking

Design thinking is a user-centered method drawn from disciplines such as software development, engineering, anthropology, psychology, the arts and business (Luchs, Swan & Griffin, 2015). Luchs et al. (2015) present the method as consisting of two phases: identifying problems and solving them. Specifically, they recommend that the method be applied in cases where the problem is not well defined, or in the context of creating a new innovative idea or solution that has an impact on the business by, for instance, increasing revenue growth. According to Sims (2013), Design Thinking is an approach that consists of many low-risk actions that he defines as "small bets". In other words, he means that many small, rather than big, iterations are preferable when designing a new product. In addition, Gibbons (2016) highlights that design thinking is a problem-solving approach that increases the probability of creating a successful product or a breakthrough innovation. Furthermore, Luchs et al. (2015) express that the method should be applied in situations where the goal is to create potential solutions quickly while accepting incomplete knowledge, with the possible outcome of flawed solutions. On the other hand, Luchs et al. (2015) argue that potentially flawed solutions can be used to increase learning about the user and the problem, or to refine the solution.

Furthermore, the Design Thinking model has been criticized by Pietro, Wilner, Bhatti, Mura and Beverland (2019) as a model that lacks a clear definition. They claim that the model is “fuzzy” in the sense that it is hard to grasp because it is drawn from a range of disciplines. Furthermore, Pietro et al. (2019) argue that the model lacks empirical evidence of its impact and that its theoretical and methodological relevance should therefore be questioned. On the contrary, the model provides high practical relevance (Pietro et al., 2019). Despite the critique, and because of the model's practical relevance, its iterative nature and the increased probability of creating a meaningful product, this model, together with a user-centered approach, was adopted with the purpose of answering the research questions and increasing the usability of the tool. To point out, there are different kinds of Design Thinking models, but the model used in the thesis is based on the five-stage Design Thinking model proposed by the Hasso-Plattner Institute of Design at Stanford, presented below. The five-step model consists of the following five phases: Empathize, Define, Ideate, Prototype and Test (d.school, 2010).

Furthermore, the decision to adopt the five-stage model was based on the fact that the model consists of an iterative process that generally converts broad concepts into distinct details after each iteration, which d.school (2010) highlights as fundamental for good design. The model is also used to adopt a user-centered approach and to answer the research questions. With the help of the model, the first research question – what problems and frustrations do the users have? – is answered in the next chapter, which covers the empathize and define steps of the model. The second research question – which features and user experience factors should be considered when designing for increased efficiency and usability? – is answered in the last chapter, Evaluation, and is the result of the five-step Design Thinking model. In addition, the model was also used as the outline of the thesis. The five steps and the iterations are illustrated in the figure and further described in the sections below.

Figure 3. The Design-thinking phases.

Empathize

The first step in the design thinking process is to empathize with the user. In other words, the phase attempts to understand the user's behavior and needs in the context of the design. This can be attained by observing the users in their context, interviewing to gain a deeper understanding, or a combination of the two, for example by asking the user to present the steps they do to accomplish a task. The outcome of this phase is to gain insights about the user to be able to define the problem space (d.school, 2010).

Define

The second step of the design thinking process is to define the design space based on insights drawn from the user and their context. In other words, the phase attempts to synthesize the information to identify patterns and connections in the gathered user data. D.school (2010) highlights that this phase is important as it determines which problems should be addressed in the ideate phase.

Ideate

Based on the problems defined in the previous step, the ideate phase seeks to generate ideas for potential solutions to the identified problems. A common approach in this phase is to brainstorm to get new and broad ideas, with the outcome of having a wide set of possible solutions evaluated by their impact. Other common ideation techniques include bodystorming, mind mapping and sketching.


Prototype

In this phase, the goal is to create low-fidelity prototypes to evaluate the ideated solutions at a low cost and with a minimum of time. However, d.school (2010) highlights that before developing a prototype, the design team has to consider how the prototype will be evaluated.

Test

The final step in the Design Thinking process is the test phase which seeks to evaluate the prototype. The outcome of this phase is to establish more profound empathy with the user, improve the prototype or to refine the solution (d.school, 2010).

As mentioned before, the Design Thinking model was used as the outline of the thesis, meaning each step will be covered in the following chapters beginning with empathize in the next chapter and ending with evaluation in Chapter 6.


3. Preparatory Field Work

__________________________________________________________________________

This chapter describes the user research for the thesis project and covers the empathize and define step in the Design Thinking model. The chapter consists of four parts beginning with an explanation of the method theory, followed by a description of the method, presentation of the thematic analysis and ending with a discussion reflecting on the results and the method.

__________________________________________________________________________________

3.1. Method theory

The method of the thesis is based on the five-step Design Thinking model, and this chapter covers the empathize and define phases. As mentioned before, the first step, to empathize, can be attained by observing the users in their context, interviewing them to gain a deeper understanding, or a combination of the two, for example by asking the user to present the steps they take to accomplish a task (d.school, 2010). The outcome of this phase is to gain insights about the user to be able to define the problem space (d.school, 2010). The second step, define, seeks to define the design space based on insights drawn from the user and their context (d.school, 2010). In other words, the phase attempts to synthesize the information to identify patterns and connections in the gathered user data.

3.1.1. Interviews

Frequently used methods to gather qualitative data include interviews, group interviews and observations, where interviews are the most used method for conducting user research (Bryman, 2018). Furthermore, interviews are distinguished by Howitt (2013) as structured or unstructured, where the first is characterized by having a prepared script with questions that need to be followed accordingly. By contrast, unstructured interviews do not have this requirement, which allows the researcher to interact more naturally with the participant and makes the interview more flexible (Howitt, 2013). However, the researcher can adopt a combination of the two, a semi-structured interview, which is characterized by having a prepared script of questions but at the same time allows the researcher to deviate from the script to make the interview more flexible (Bryman, 2018). According to Martin and Hannington (2012), semi-structured interviews are preferred as they create a less formal situation, which makes the participants more comfortable asking questions or addressing issues. Due to this flexibility, and with the purpose of uncovering as much information as possible, a semi-structured approach was applied to empathize with the user and create a comfortable situation for the participant.


Furthermore, when conducting user research, Goodwin (2009) states that an interview is on average between 45 and 60 minutes long and is usually conducted with four to eight participants. Generally, when recruiting participants, a diversity of users should be considered to make the result applicable to a wider population and to avoid focusing on one type of user (Baxter, 2015). On the other hand, Baxter (2015) highlights that this depends on the purpose of the research. For this study, the focus was on recruiting participants who are active users of the tool; in other words, the purpose was not to make the results applicable to a wider population. Therefore, the interviews focused on empathizing with users who use the Visualizer on a weekly basis and are dependent on the tool.

Group interviews

A group interview is conducted in pairs or groups and has the advantage of creating rich and natural material by making the participants feel more relaxed than they would in a one-on-one interview (Martin & Hannington, 2012). Furthermore, Martin and Hannington (2012) highlight that group interviews allow the participants to help each other remember, which generates information that would otherwise not be discovered. On the contrary, Martin and Hannington (2012) state that a downside is that the risk of participants influencing each other increases, which could lead to one participant dominating the interview. For this thesis, the advantage of a group interview generating more information than individual interviews was assessed as high for two participants, which resulted in the choice to conduct one pair interview, mainly because the two participants were from the same team and used the tool in an equivalent way. However, no additional group interviews were conducted, as the purpose was to gain as many individual reflections on the tool as possible.

3.1.2 Observations

As mentioned previously, observations are another common method for gathering qualitative data and are described by Howitt (2013) as an ethnographic method which requires the researcher to view the environment or task from the perspective of the participant. Furthermore, Martin and Hannington (2012) highlight that interviews and observations do not need to be separated when collecting qualitative data; instead, they argue that observations are a great complement to interviews for gathering information about user behavior that the users themselves are not aware of. As a result, they state that observations can be used to gain a better understanding of the user and their habits. On the other hand, although observations have their advantages, Goodwin (2009) highlights that observations are both time consuming and could cause the participant to feel uncomfortable and therefore affect their behavior. However, Goodwin's (2009) conclusion is that observations are overall a good research method to minimize the risk of self-reporting errors in qualitative research. With the advantages of observations providing a better understanding of the behavior of the user, combined with the purpose of empathizing with the user and their context, the choice was made to include observations in the interviews. Furthermore, as mentioned previously, d.school (2010) highlights that interviews and observations can be combined in order to gain a deeper understanding when empathizing with the users, for instance by asking the participant to show the steps needed to accomplish a common task with the tool. However, as observations are time consuming, the choice was made not to include observations in all of the interviews.

3.1.3 Thematic Analysis

To be able to define the data, that is, to synthesize the information to identify patterns and connections in the gathered user data (d.school, 2010), the choice was made to adopt a thematic analysis. Furthermore, the analysis aims to answer the first research question: What problems and frustrations do the users have?

Theme identification is described by Ryan and Bernard (2003) as the most fundamental part of qualitative research. Furthermore, Javadi and Zarea (2016) highlight that thematic analysis is the most used method in different fields within qualitative research, and describe the method as an approach to examine data and record patterns and themes. Braun and Clarke (2006) mention that thematic analysis has the advantage of being flexible and easy to use for novices. Moreover, Javadi and Zarea (2016) highlight that a good thematic analysis can reflect and clarify the reality behind the data. Similar to Braun and Clarke (2006), Javadi and Zarea (2016) argue that the method is both easy and flexible, but add that the method can also make analyzing the data amusing for the researcher. What is meant by the flexibility of the method, according to Braun and Clarke (2006), is that it does not have to be linked to a theoretical framework, which makes it applicable to many different contexts and research fields (Braun & Clarke, 2006). Moreover, Braun and Clarke (2006) state that the method is flexible in the sense that it can be adapted to any analysis, as it is used to identify, analyze and report different patterns and themes in a data set. They also highlight that before conducting the analysis, the researcher has to consider what type of analysis should be applied to the data set. In general, Braun and Clarke (2006) distinguish between two types of thematic analysis, an inductive and a theoretical, where the first is defined as a form of analysis which is data-driven, meaning that the themes are strongly linked to the data. On the contrary, a theoretical thematic analysis is explicitly analyst-driven, which means that the analysis tends to give a detailed account of a specific and narrow part of the data and less description of the data as a whole (Braun & Clarke, 2006). Furthermore, Braun and Clarke (2006) state that the researcher needs to reflect on and decide what the purpose of the analysis is: whether it is to cover a rich description of the whole data set or detailed information about a specific aspect.


In short, as mentioned before, the method seeks to identify, analyze and report various themes in the data set. Braun and Clarke (2006) describe a theme as something that illustrates the gathered data in relation to the research question. Furthermore, they point out that the themes are not something that arises from the data by itself; instead, it is the judgment of the researcher that decides what should be counted as a theme (Braun & Clarke, 2006). Braun and Clarke (2006) add that even though the researcher has an active role in finding patterns, this does not need to be negative as long as the researcher is observant of their assumptions and decisions when choosing themes. However, Javadi and Zarea (2016) highlight that a big issue with the method is that it increases the chance of bias as a consequence of the researcher's own, possibly simplistic, assumptions when analyzing. They especially highlight that interview questions can be shaped by the researchers' presumptions and therefore could impact the presentation of the data. Despite the concerns stated by Javadi and Zarea (2016), the choice to use a thematic analysis was made based on it being flexible and easy to use. The choice was also based on Ryan and Bernard's (2003) argument that themes can arise from what the participants do not verbally state, for example through observations. With this in mind, the observations made during the interviews were included in the thematic analysis and its themes. In order to conduct a thematic analysis, Braun and Clarke (2006) present six steps that should be used as guidelines when analyzing data. The steps are presented below and were used when analyzing the data from the interviews and observations.

Step 1) Familiarize with the data - The first phase requires the researcher to familiarize themselves with the data gathered from the user study, and is usually done by listening to the recorded interviews and transcribing the material. Thereafter the researcher should repeatedly read through the material to search for patterns. Braun and Clarke (2006) highly recommend taking notes and beginning to think about possible codes. They also point out that in cases where the researcher did not attend the interviews, the familiarization step is even more crucial.

Step 2) Generating initial codes - Based on the understanding of the data, the researcher should begin to create initial codes. The codes are created by annotating every line of the transcription with a descriptive note to distinguish the properties of the data that are of special interest and relevance to the research question.

Step 3) Searching for themes - The third phase is based on the initial codes created in the previous step and seeks to find patterns between them. The patterns can be found by analyzing the codes and grouping them into categories to create themes and sub-themes.


Step 4) Reviewing themes - In this step, the created themes are reviewed by comparing them to the original data and fine-tuned by changing, adding or removing themes so that they correspond to the raw data set.

Step 5) Defining and naming themes - Based on the reviewed themes this step seeks to name and define each theme to describe what the theme captures.

Step 6) Producing the report - The final step is to create a finalized report with a presentation of the result from the analysis and the method used.

3.1.4. Ethical Aspects

In order to protect the integrity of the participants, good research practice should always be applied in a research study (Vetenskapsrådet, 2017). The participants should, prior to the study, receive information about the purpose of the research, their right to withdraw and how the material will be used (Vetenskapsrådet, 2017). During this study, all of the participants received information about the purpose of the study and their rights, and informed consent was obtained. Furthermore, due to the GDPR (Datainspektionen, n.d.), all of the participants were assigned a randomized number in order to be anonymized.

3.2. Method

In order to gather data both group and individual semi-structured interviews were used, and five of the fifteen interviews included an observation.

3.2.1. Participants

Prior to the study, a list of potential participants was provided by Spotify. The list contained users from different teams with previous and current experience of the tool. The inclusion criteria for participating in the study were that the users had previous experience of the tool and that the tool was of importance for their work tasks. In addition to the list, participants were recruited through team Slack channels where the study and its purpose were advertised. As a result, two participants volunteered from the open advertisement. Furthermore, a total of twenty people from the list were contacted directly, with the aim of including participants from different teams; out of these twenty, five were excluded due to the inclusion criteria and two declined to participate in the study. As a result, the research was conducted with fifteen participants from seven different teams.


3.2.2. Interviews and observations

The main purpose of the interviews was to gain insights about the user's thoughts, experience and attitudes towards the tool and also to understand how they use it. Prior to the study, an interview guide was created and can be found in Appendix A. The guide was used in the interviews and a semi-structured method was chosen to gain a sufficient amount of information by welcoming discussions during the interview.

As mentioned previously, the interviews were conducted with fifteen participants, but as two participants worked in the same team and used the tool in a similar way, the decision was made to include these two participants in a group interview. In other words, the research study contained thirteen individual interviews and one group interview. In addition, five of the interviews included an observation and required the participants to demonstrate how they use the tool and which features they use. The purpose of this was to gain a better understanding of how the participants navigate within the tool and which features are the most crucial when troubleshooting. The reason for including observations for only five of the participants was the limited amount of time, as observations are more time-consuming than interviews.

Furthermore, all of the interviews were conducted in a meeting room isolated from disturbing noises and with the possibility for the participants to connect their computer to a screen in order to demonstrate how they use the tool. The average time for the interviews was 30 minutes, and the interviews that included an observation had an average time of 45 minutes. Furthermore, the interviews were audio recorded and notes were taken on a laptop. No visual recording was used, in order to avoid making the participants uncomfortable. Instead, detailed notes were taken during the observations.

3.3 Results

The results from the thematic analysis are based on an inductive approach, which means that the themes aim to capture all material from the interviews, including the observations. However, the interview guide was designed to understand the users' problems and needs, which influenced the created themes. In order to answer the research question “What problems and frustrations do the users have?”, Braun and Clarke's (2006) six steps were applied to analyze the data. The whole process was iterative, which required continual adjustments of the themes. The analysis began with reading through the notes gathered from the interviews and assigning initial codes, in other words, giving a specific part of the data a label. Secondly, the codes were written on post-its together with a unique number for each participant. The purpose of the assigned numbers was to be able to identify each participant's statements in order to adjust the themes to the raw material. Next, the post-it notes were re-organized and grouped on a whiteboard to find common categories and themes. The process required continual adjustments by comparing the categories with each other and with the raw material. The finalized categories were then named after what they capture and resulted in the five themes presented below. To respect the participants' integrity, all participants have been assigned a randomized number. Furthermore, the majority of the quotes were stated in Swedish and have been translated to English in the thematic analysis below; the original quotes can be found in Appendix B.

Table 1. Themes and sub-themes from the thematic analysis.

Usage of the tool
• Troubleshooting
• Reporting problems
• Getting an overview

Users’ knowledge and experience of the tool
• Previous experience
• Limited knowledge
• Shallow learning curve
• Complexity

Limitations
• Knowledge requirements
• System failure

Frustrations
• Find specific information
• Narrowing the problem space
• User experience and user interface

Discoverability
• Limited awareness of available functions
• Forgetfulness

3.3.1 Usage of the tool

Based on the data gathered from the interviews, three distinct usages of the Visualizer were identified: troubleshooting, reporting problems and getting an overview. In general, the usage of the tool was determined by which team and service the user is responsible for, but in most cases the tool is used for troubleshooting. In the following sections, these usages are further elaborated.

Troubleshooting

As mentioned above, the majority of the participants use the tool for troubleshooting purposes and on average utilize the tool once a week. To troubleshoot, the users view the visual flow that illustrates the different calls sent between the different back-end services. This enables the users to oversee the communication and makes it possible to detect abnormalities to determine what should be investigated further.

“We use the tool for debugging end to end requests to check what is happening in the back-end and what payloads are being passed around”

- Participant 3

However, users also stated that they have limited knowledge of all the services and instead have deep domain knowledge of a few. The quote below has been translated from Swedish; the participant states that he thinks most users are aware of the different services in the tool but have more knowledge of their own services.

“I have a sense of the involved services, which I think most have. However, each team has control of the bugs in their own systems. So there you are better able to keep track”

- Participant 14

Therefore, they only receive reports and errors that occur on their services. In other words, the users are not able to troubleshoot all issues, only the ones that are related to their backend service.

Furthermore, when receiving an issue, the participants state that the context of the problem is crucial to locate the root cause efficiently.

“I often receive a link together with an explanation of what they think the problem could be about and that is really nice because the context is crucial for me to know where to start troubleshooting. In some cases teams have started to troubleshoot before reaching out to me, which I really like”

- Participant 15

Reporting problems

In contrast to troubleshooting, a few participants stated that they use the tool to report problems. In general, the users tend to report problems that they themselves are not able to solve due to their lack of knowledge of the tool. In short, users who are not familiar with the tool tend to report problems to other teams. However, the participants state that they have a sense of what could be the cause of the issue but are not sure. As a result, they forward the issue, together with a description of the context of the problem, to the developer responsible for the specific service that they suspect is causing the error.


“We use remote admin to check for player cookies or to look for errors during, for example, a long-run playtest. If we see an error during the night, we record the data or try to recreate the error in order to pass it on to the developers. So if we see a bug, we save it in a JSON file and send it over with some information about what happened, at what time, etc.”

- Participant 9

“I don’t spend that much time troubleshooting but if my team notice an error we try to recreate it to record and send it to the team that is responsible for the service where the error occurred”

- Participant 1

However, not all the users that report problems to other teams do it because of a lack of technical knowledge about the tool. Instead, one participant stated that when confronted with a more complex issue, they have to reach out to other teams in order to troubleshoot and solve the problem.

“For more difficult bugs, I need to talk to other teams to troubleshoot if there is a fault in our service or in theirs. Sometimes it turns out to be theirs and then we pass the bug on to them”

- Participant 15

Getting an overview

Besides troubleshooting and reporting issues a few participants have stated that they use the tool to get an overview of the back-end services. For instance, they do it out of curiosity to acquire a better understanding of how the back-end is structured and operates. Furthermore, one participant stated that he uses the tool to learn more about how different errors are visualized.

“Sometimes I troubleshoot out of curiosity to understand which requests are involved in each error”

- Participant 4

However, users do not only use the tool out of curiosity to learn about issues. It is also used to understand each service, which has the advantage of providing the users with a better understanding of the back-end as a whole.


“The flow-chart has an indirect benefit of providing the user with a visual presentation about how the different services talk to each other”

- Participant 13

To summarize, the tool is not only used for troubleshooting and reporting but also to gain a better understanding of the back-end systems, which is invaluable when working with complex back-end systems.

3.3.2. Users’ knowledge and experience of the tool

The second theme from the analysis is the user's knowledge and experience of the tool. All of the participants have stated that they have had or have issues with understanding the tool, regardless of their level of knowledge, and highlight the importance of making the tool easier to both learn and use.

Previous experience

The majority of the participants, regardless of their technical knowledge, stated that they learned to use the tool by trial and error. Especially in cases of troubleshooting, the participants stated that they have learned what to expect if an error occurs in order to detect deviations. For instance, users that have a good understanding of the tool stated that they "just know" when something is abnormal due to having learned what to expect for different issues. In other words, due to trial and error, the users are able to detect abnormalities in order to distinguish where to investigate further.

“If a call is missing then I just know that. To troubleshoot you must learn how it should look like and by yourself understand if something is missing. This is something that you just learn”

- Participant 3

Furthermore, participants stated that knowledge of the tool and previous experience are crucial in order to troubleshoot, and are attained through trial and error.

“In order to understand the messages that are sent between the different services you need to have previous experience to sort out what might seem interesting or not. And before you do that you need to sort the messages based on order because- if you don’t know you don’t know”


However, previous experience and knowledge of the tool are not only useful for detecting abnormalities but also valuable for knowing where to begin troubleshooting.

“So usually it is a similar error, so now you have kind of learned where to start troubleshooting. But as I said, this is something that I have learned after using it quite a lot.”

- Participant 15

To summarize, all of the participants stated that to be able to use the tool for troubleshooting, they had to attain profound knowledge and experience of the tool through trial and error, in order to learn to detect abnormalities and know what to expect when encountering different issues.

Limited knowledge

Regarding the knowledge and experience of working with the tool, the majority of the participants stated that they were uncertain of all the functionalities of the Visualizer. Due to this limited knowledge, the users are restricted to using the tool for simple tasks such as reporting or getting an overview of the services.

“But personally I use it for quite simple things, since I don’t really know how it works”

- Participant 9

Furthermore, the same participant stated that she wished she had a better understanding of the tool and how it could be used in order to be able to troubleshoot.

“But I would have liked to learn more about the tool! Feels like there is a lot that I don’t know about and I think that if I had known more then I would have been able to use the tool better than I do today, haha”

- Participant 12

As a consequence of the limited knowledge, the users need to click and explore to find the information they need, regardless of whether they are troubleshooting or reporting an issue. This is something the participants state is both time-consuming and a direct consequence of their limited knowledge of where to find certain information.


“Some in my team know what to start investigating for various errors, which means that it will not take them long. But on the other hand, I don't know where everything is, which means I have to click and expand a lot of information to find the right one, which of course takes a lot longer for me than it does for them because I have to look in a completely different way”

- Participant 7

“I am only interested in seeing one parameter and then I have to click through every request. You kind of learn that you should take the first one but then you don't really know where it is so you have to click around anyway. So if someone is looking for this parameter it would have taken a very long time to do it the first time because you do not know where to look. But I do not know, maybe it is because we do not know how the tool works properly and therefore just click until we find the right way”

- Participant 4

Shallow learning curve

As mentioned before, the participants stated that being able to use the tool requires previous experience gained from trial and error. Furthermore, both experienced and inexperienced users stated that gaining the required knowledge takes a lot of time.

“It takes a lot of time to understand how everything works”

- Participant 7

This points to the fact that, regardless of their previous experience, users were still not aware of all the functionality available in the tool. For instance, one participant, even though he had used the tool for two years, asked during the interview for a feature that is already integrated in the tool. This indicates that the users are continuously learning about the functionality and the tool.

“It was difficult to understand the system initially but now I understand it more and more. But there are still some things that I do not understand or know what they mean”

- Participant 15

However, although using the tool requires both time and effort, all of the participants stated that the tool is invaluable for them and their team.


“ Despite everything, the tool is still invaluable for us. I think other teams would have been interested in this to see how different services talk to each other”

- Participant 8

Complexity

Even though the tool was described by the users as invaluable, the participants highlighted that the tool is complex in the sense of how the information is visualized. For instance, one participant (Participant 3) mentioned that the visual information about the calls between the services is hard to understand and not “human-readable”. However, due to their previous experience, the users still reported being able to detect abnormalities despite the complexity.

“I don’t understand everything that happens in the tool but if something is wrong I usually can find it anyway”

- Participant 11

Furthermore, participants stated that the complexity is also a consequence of the amount of information that is visualized in the tool.

“ I spend most of my time trying to understand how everything is and how everything is related. In the beginning it was a lot of information for me to try to understand which required me to talk to other people to understand it.”

- Participant 7

In other words, because the tool provides information about all calls between Spotify's back-end services, the complexity increases.

“ I think it is difficult to see the flow between the different services, it goes without saying that it would look nicer and I think it would also make it easier if it didn’t look so complex. So right now I spend most of my time trying to understand where everything is and how they are connected, it gets complex in a way.”


3.3.3 Limitations

Despite the experience and knowledge required to use the tool, the majority of the participants stated that the tool could be beneficial for developers in other parts of Spotify's organization. However, the participants think that the tool should not be shared as it is today due to the following limitations, which are elaborated below:

Knowledge requirements

The first limitation is that the tool requires knowledge about the back-end services to be able to use it. As mentioned previously, the participants stated that the knowledge is crucial to detect abnormalities in the calls sent between the services. In other words, the user has to be able to understand the logic behind how and why the different back-end services communicate as they do.

“You have to know what is expected to happen to know when something is wrong. In order to know this you need to understand the logic behind the communication between the services”

- Participant 10

System failure

The second and main limitation stated by the participants is that the tool sometimes fails during use. For instance, the tool fails when loading large amounts of data, which causes it to “freeze” and creates confusion and frustration.

“The tool has really bad performance when looking at much traffic at the same time. As it is right now it often freezes when too much information is viewed”

- Participant 8

Furthermore, the participants stated that the most critical system failure is when calls disappear or are not visualized at all. This is especially problematic for users who are not familiar with how the calls between the services should be visualized, and could therefore cause the user to be misled. With respect to this, one of the participants highlighted that the tool should not be distributed to other teams before the unreliability of the tool has been addressed.

3.3.4. Frustrations

The fourth theme describes the frustrations stated by the participants, which were shown to include finding specific information, narrowing down the problem space and the user experience of the tool. The difference between the frustrations and the limitations theme is that the participants stated that the limitations need to be addressed before the tool can be distributed to other teams, whereas the frustrations reflect the current users' frustrations with the tool.

Find specific information

As previously mentioned, the context of an issue is crucial for the user to know where to start troubleshooting to be able to find the root cause of the problem. In most cases, based on the context of the issue, the users know which information or parameter they need to inspect first.

“I don’t always know what to look for but I know where I should start to investigate”

- Participant 15

In order to find the issue, the participants stated that they need to open every suspect call to investigate the parameters. This is something that the participants state is both frustrating and time-consuming, as they know what should be examined but are not able to locate it. The main reason is the amount of information displayed in the tool, which makes specific information difficult to find.

“I find it painful to open and collapse all the information when I know what information I need to look at, because most often I am interested in investigating only one parameter. Something that would have been wonderful to have is something that makes it possible to find specific information, maybe a search bar or something similar”

- Participant 1

Furthermore, to be able to find specific information, one participant stated that he bypasses the step of opening and collapsing information, and instead exports information to a code editor to be able to search. This requires him to download the data and move the troubleshooting to another place which is both time-consuming and creates additional complexity.

“Sometimes I have not been able to open and collapse to find information, so I have sometimes downloaded data to be able to troubleshoot in a code editor. Really ugly trick but it has worked. Especially when there is a lot of information that you can't really go through”


Moreover, to avoid opening and collapsing information, the majority of the participants use Ctrl+F to search for parameters in the call messages, which indicates that they want to be able to search for specific information.

However, the frustration experienced when trying to find specific information is affected by the experience of the user. For instance, more experienced users can locate information without opening and collapsing all the calls. On the other hand, when confronted with a complex issue, even the experienced users need to examine further, which requires them to investigate many calls.

“Now I have learned where to find specific information, which I was not able to at the beginning, so now I do not really open and collapse messages because I know in advance where to look. But as I said, if there are more difficult problems that we are troubleshooting, then of course I would need to do this”

- Participant 15
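To make the requested feature concrete, the following is a minimal sketch of a parameter search, written purely as an illustration and not as the tool’s actual implementation; the data model, the names, and the flat parameter structure are all assumptions:

    // Hypothetical data model: each call message carries a flat set of parameters.
    interface CallMessage {
      id: string;
      sourceService: string;
      targetService: string;
      parameters: Record<string, string>;
    }

    // Return only the calls whose parameters contain the query,
    // either in a parameter name or in a parameter value.
    function searchCalls(calls: CallMessage[], query: string): CallMessage[] {
      const q = query.trim().toLowerCase();
      if (q.length === 0) return calls;
      return calls.filter((call) =>
        Object.entries(call.parameters).some(
          ([key, value]) =>
            key.toLowerCase().includes(q) || value.toLowerCase().includes(q)
        )
      );
    }

    // Example usage: show only the calls that mention a hypothetical "user-id" parameter.
    // const matches = searchCalls(allCalls, "user-id");

A search of this kind would remove the need both for manually opening and collapsing calls and for the Ctrl + F workaround described above, since the matching would be done on the underlying data rather than on whatever happens to be expanded in the interface.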

Narrowing the problem space

The second frustration is narrowing down the problem space. As mentioned previously, the developers’ knowledge is limited to a few services, and they therefore only receive error reports on the services that they themselves are responsible for. In other words, the developers are not involved in all the back-end services and therefore ignore those that are not relevant.

“I only have knowledge of my own services and a little knowledge of the rest but most often I ignore those services that are not relevant for me”

- Participant 1

In order to narrow down the problem space and ignore the services that are not of relevance, the users need to remove these services from the visual flow before investigating the issue. In other words, due to the large amount of information shown, the users must manually filter out and remove the services that are not relevant for troubleshooting a specific issue from the user interface. All of the participants state that this is time-consuming, difficult and required each time they troubleshoot.

“I think filtering is the hardest thing. It takes a lot of time because I always have to shut down several services. Would have liked to avoid doing this every time.”


“My biggest struggle is that I need to remove all services that I am not interested in and I'm usually only interested in seeing how the communication between three or four service looks like. This takes a lot of time and can sometimes be hard.”

- Participant 11

“It is important to click out of the services that I know I do not use or need to check. I am pretty tired of having to deal with this every time and remove services because it takes so much time.”

- Participant 2

Furthermore, a few participants stated that removing services that are not of importance is what takes the most time when troubleshooting:

“Learning what services can be removed and doing it is what takes the most time and is most difficult”

- Participant 7

“What takes the most time in troubleshooting is definitely having to remove services. Also I think it can be difficult to decide what can be removed or not, you have to know this which is difficult at first. So yeah, it's really awkward.”

- Participant 8

However, the reason for narrowing down the problem space is not only the participants’ interest in a limited number of services. It is also to make the issue graspable by removing information and thereby obtaining an easier and better overview of the problem space. For instance, one of the participants highlighted that it would be beneficial to have a customized view for him and his team.

“I wish that I didn’t need to close down all of the services each time. It would be really nice if it could be saved to next time or if I could customize what information I saw either for me or for my team”

- Participant 3
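As a purely illustrative sketch of how such a customized view could be approached (this is not the tool’s actual design; the storage key and function names are assumptions), the services a user hides could be persisted in the browser and reapplied the next time the call flow is rendered:

    // Hypothetical sketch: remember which services the user has hidden so the
    // filtering does not have to be redone in every troubleshooting session.
    const STORAGE_KEY = "troubleshooting-tool.hidden-services";

    function saveHiddenServices(serviceNames: string[]): void {
      localStorage.setItem(STORAGE_KEY, JSON.stringify(serviceNames));
    }

    function loadHiddenServices(): string[] {
      const stored = localStorage.getItem(STORAGE_KEY);
      return stored ? (JSON.parse(stored) as string[]) : [];
    }

    // When rendering the call flow, skip the services the user has hidden before.
    function visibleServices(allServices: string[]): string[] {
      const hidden = new Set(loadHiddenServices());
      return allServices.filter((name) => !hidden.has(name));
    }

A team-wide variant could store the same list in a shared backend instead of the browser, which would correspond to the participant’s wish for a view customized either for himself or for his team.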

User Experience and User Interface

The third frustration concerns the user experience and the user interface of the tool.

“I like the visualizer but I think it could have been improved with better design. Functionality is great but the design can definitely be better”

- Participant 7

“I think that the UI is problematic and particularly when tracking data between the various services. It would have been nice if you had been able to highlight to be able to more easily follow how information moves between services”

- Participant 5

These concerns are mainly due to the amount of information visualized in the tool, which makes it difficult for the users to get an overview of all the information. A few participants stated that changing the colors and adding white space could improve the user experience of the tool.

“As there can be a lot of information displayed, it can sometimes be difficult to follow how the various services communicate. I think that just by being able to highlight or change the colors it would probably have made it a lot easier to see”

- Participant 13

“I am mostly bothered by the UX, by adding some padding, changing color and changing the buttons, I think it would have made a big difference. But the functionality itself is okay but the design can definitely be improved, so that is bothering me the most”

- Participant 8

Furthermore, one of the participants stated that the tool, as it is today, gives the impression of being very complicated and complex, and suggests that this is due to how the information is shown to the users.

“It could definitely have used a make-over to not look so daunting for new users”


Moreover, one of the participants stated that he believes a possible reason why so few use the tool is that it gives a bad first impression because of the current user interface.

“What could have been improved is definitely the UX. In fact, I even think that more people would like and be able to use it then”

- Participant 13

3.3.5. Discoverability

The last theme is the lack of discoverability, which causes users to forget or be unaware of the functionality in the tool.

Limited awareness of available functions

During the interviews, a few participants stated that they wished for certain functionality and features in the tool that are already integrated. This indicates that the users are unaware of the functions available. One of the participants, during the interview, began to explore the different features and stated:

“Aha, that's really smart! I didn’t know that I could do that!”

- Participant 15

Furthermore, the participants highlighted that they want to gain a better understanding of the tool and the features as they are not aware of all the functionality.

“I think it can be improved by making it clearer what features are in the tool. It becomes more a discoverability issue. So think it can be made more clear about what is possible and what is and isn’t possible to do”

- Participant 14

Because of the lack of discoverability, the participants wish that the functionality were highlighted in the interface, especially the feature that enables the user to compare two different call messages.

“I like that you can choose two services and that the request between them is highlighted. In contrast, it took a long time before I knew it was possible to do this but it has made it a lot easier”


This feature is accessed by left-clicking on the two services, and a few of the participants were not aware of this. For instance, one of the participants stated that he discovered the functionality by chance when clicking on the calls.

“I discovered ‘compare’ by clicking around the interface, can't remember what I did but suddenly I had it”

- Participant 11

Furthermore, one participant stated that the functionality to compare two calls has been time-saving and invaluable when troubleshooting, and he therefore wishes it were made more visible.

“Being able to compare two requests is incredibly helpful, you can have them next to each other and compare which is valuable which saves time when troubleshooting the messages. However, it was difficult to find this functionality, so it should be made clearer that this is possible”

- Participant 8

In other words, by improving the discoverability of the features, users will become aware of important functionality that could have an impact on the efficiency of troubleshooting.

“It took a long time to understand the functionality of the tool for example to compare messages and to highlight the requests”

- Participant 11

Forgetfulness

A second consequence of the lack of discoverability is that users reported forgetting about certain functionality in the tool. For instance, one of the participants, when asked to walk through the last time he troubleshot, forgot how to enable the view for comparing two different call messages. He highlighted that the reason could be that he had not used the tool for three weeks.

“I usually compare the two but now I don’t remember how to do so”


However, to be able to compare two messages he had to navigate the user interface in order to recall how to enable the functionality, and he stressed that he usually uses it when troubleshooting. Furthermore, when asked about the functionality of the tool, a few participants stated that they are aware of certain functionality but are not able to find it. This indicates that the functionality is not visible when viewing the user interface.

“I know that there are several features but I do not know where to find them or how they can be used. I know, however, that you can compare two requests but always forget how to get it up. Sometimes I succeed and sometimes I don't, but it takes a while before I can get into this”

- Participant 3

3.4 Discussion

The analysis resulted in the following five themes: usage of the tool, knowledge and experience of the tool, limitations, frustrations and discoverability. In order to discuss the analysis, the discussion has been divided into two parts: the first part discusses the usability of the tool based on Nielsen’s (2012) five components of usability: learnability, efficiency, memorability, errors and satisfaction. The second part of the discussion covers the define step in the Design Thinking model.

To summarize the analysis, the usage of the tool varies depending on the user’s expertise and the purpose of use. However, regardless of their experience, the users stated that the tool is difficult to use due to poor discoverability and the extensive amount of information visualized. This leads to frustrations for the users when trying to find specific information or to narrow down the problem space.

To begin with, learnability is defined by Nielsen (2012) as how easy it is for a user to accomplish a task the first time they are faced with the design or product. With this definition in mind, the conclusion is that the tool has poor learnability, for the following reasons: Firstly, the participants stated that they experienced the tool as difficult to use regardless of their previous experience. Furthermore, they highlighted that the first impression of the tool was appalling. Moreover, the participants stated that the tool has a steep learning curve and that they are still not aware of all the functionality of the tool. Taken together, these statements indicate that the tool has poor learnability, as the users experience it as hard to use despite their previous experience and knowledge of the tool.


Secondly, Nielsen (2012) defines efficiency as the time it takes for the user to accomplish a task. This is difficult to assess, because the time needed to troubleshoot depends on the complexity of the issue and the user’s previous experience with similar problems. However, the users stated that removing services before troubleshooting and finding specific information is not only frustrating but also time-consuming. This indicates that the efficiency of the tool can be improved by addressing these frustrations.

The third component presented by Nielsen (2012) is memorability, which he defines as the extent to which the user can reestablish proficiency after not having used the product or design for a period of time. With this definition in mind, the statements from the participants regarding the discoverability of the tool indicate that the tool has poor memorability, especially as one of the participants, during the observation, forgot how to enable a feature that he uses on a regular basis.

Furthermore, Nielsen (2012) presents errors as the fourth component of usability, defined as the number of errors users make, their severity, and how easily users can recover from them. This component is hard to assess, as the number and severity of errors depend on the issue being troubleshot; for instance, a more complex problem might cause the user to make more errors. Therefore, no conclusion can be drawn about this component of usability.

The last component that Nielsen (2012) highlights to characterize usability is satisfaction, which he describes as the user’s thoughts about the design or product. The participants stated that the tool can be improved in several ways, for instance by adding feedback to the users, improving the design and adding new features, which indicates that the users are not satisfied with the tool as it is, especially with respect to their frustrations and concerns regarding the system failures, the user experience and the amount of information visualized in the tool. On the other hand, the participants also stated that the tool is invaluable to them, which makes it hard to draw a conclusion as to whether the users are satisfied with the tool or not.

To summarize, integrating the results of the thematic analysis with Nielsen’s (2012) five components of usability leads to the conclusion that the usability of the tool can be improved, especially with respect to the efficiency component. It is also important to point out that the researcher has an active role in deciding which patterns and themes to identify (Braun & Clarke, 2006). In other words, because the purpose of the thesis and the research questions is to identify the users’ problems and how the efficiency of the tool can be increased, the thematic analysis and the interview questions have not focused on the users’ positive feelings towards the tool.

References
