
DEGREE PROJECT IN THE FIELD OF TECHNOLOGY MEDIA TECHNOLOGY

AND THE MAIN FIELD OF STUDY COMPUTER SCIENCE AND ENGINEERING, SECOND CYCLE, 30 CREDITS

STOCKHOLM, SWEDEN 2020

A usability evaluation of TRIO's e-learning modules enhancing the communication between cancer patients, clinicians and carers

MELANIE BONNAUDET

KTH ROYAL INSTITUTE OF TECHNOLOGY

SCHOOL OF ELECTRICAL ENGINEERING AND COMPUTER SCIENCE


Abstract

The involvement of carers in oncology is important for the health of people diagnosed with cancer as well as for the carers themselves. To improve this involvement, three groups (patients, their carers, and clinicians) should maintain good communication. The e-learning interface eTRIO has a learning module for each of these three groups. The design of eTRIO is based on research from psycho-oncologists. This study aims to answer the question: What are the strengths and weaknesses of the eTRIO interfaces for clinicians, carers, and patients in terms of their usability? A heuristic evaluation and think-alouds were conducted to answer this. The results of this study show that interactive activities, as well as neatly presented content, engage the user, and that buttons and content should have clear purposes. With good usability, the eTRIO interface will enhance carers' involvement by making it easy for users to retain important information. Strengths and areas for improvement are presented in this study.


Sammanfattning

Including cancer patients' carers in oncology is important for both the patients and their carers. To improve the carers' inclusion, three groups (patients, their carers, and clinicians) must maintain good communication with each other. The e-learning platform eTRIO has a module for each of these groups. The design of eTRIO is based on research by psycho-oncologists. This study aims to answer the question: What are the strengths and weaknesses of the eTRIO interfaces for clinicians, cancer patients, and carers in terms of usability? A heuristic evaluation and think-alouds were carried out to answer this question. The results of the study show that interactive activities and visually appealing content engage the users, and that buttons and content need clear purposes. With good usability, the eTRIO interfaces will improve carers' inclusion and make it easy for users to remember important information. Strengths and areas for improvement are presented in this study.


A usability evaluation of TRIO's e-learning modules enhancing the communication between cancer patients, clinicians and carers

Melanie Bonnaudet
The University of Sydney
Sydney, Australia
melbon@kth.se

ABSTRACT

The involvement of carers in oncology is important for the health of people diagnosed with cancer as well as for the carers themselves. To improve this involvement, three groups (patients, their carers, and clinicians) should maintain good communication. The e-learning interface, eTRIO, has a learning module for each of these three groups. The design of eTRIO is based on research from psycho-oncologists. This study aims to answer the question: What are the strengths and weaknesses of the eTRIO interfaces for clinicians, carers, and patients in terms of their usability? A heuristic evaluation and think-alouds were conducted to answer this. The results of this study show that interactive activities, as well as neatly presented content, engage the user, and that buttons and content should have clear purposes. With good usability, the eTRIO interface will enhance carers' involvement by making it easy for users to retain important information. Strengths and areas for improvement are presented in this study.

Author Keywords

Usability; user experience; think-alouds; heuristic evaluation; e-learning; medical teaching.

1. INTRODUCTION

Cancer incidence is rising worldwide, according to Bray et al.'s global cancer statistics 2018 [3]. As cancer affects millions of people across the world, it is important to provide accurate cancer care.

The cancer patient's medical situation and decisions have an impact on their relatives' lives and health [1]. Therefore, it is important that relatives can participate in the patient's medical consultations and treatment decision-making. The patient's carers have expressed a need for medical and behavioural information [14]. Oncologists have also reported a need to learn how to provide information adapted for carers [21]. In general, patients value the help and support of carers [12]. They need support outside the medical system, from family or friends, in the challenging cancer experience and in their interactions with the medical system. Thus, for ideal cancer care, patients, carers and oncologists need to enhance their communication and behavioural skills for medical situations.

A project with this aim is "TRIO, Clinician-patient-family working together for quality care". It is carried out by members of the Psycho-oncology Co-operative Research Group (PoCoG) from the University of Sydney and is grant-funded by Cancer Australia and Cancer Council NSW. The TRIO Framework is introduced by Laidsaar-Powell et al. [13] and involves three main persons:

• The cancer patient: a cognitively competent adult cancer patient.

• The main clinician: an oncology physician.

• The main caregiver: someone related to the patient biologically, legally or emotionally, accompanying the patient to medical consultations and assisting in the patient's care.

To enhance the communication between cancer patients, their caregivers and oncologists, the TRIO project has created an online e-learning platform, called eTRIO, with content based on the TRIO guidelines [9, 10]. Each group in the TRIO Framework has its own learning module with sections covering different topics. It is a multimedia e-learning interface, as it contains text, videos and different interactive activities. These are, for example, yes-no questions, sorting cards according to their importance, and surveys about the user's emotional profile (see Appendix A for an eTRIO walk-through). This type of content makes learning an active and engaging process. However, inadequate user experience can affect the learning outcome negatively [8]. Therefore, this study contributes to the TRIO project by evaluating the usability of eTRIO. It investigates the user experience of the TRIO e-learning interfaces to determine what interactions and presentations of content are needed to successfully meet the users' communication and information needs.

2. BACKGROUND

2.1. The TRIO Framework and their needs

Laidsaar-Powell et al. [13] introduced the notion of a TRIO Framework, also called the TRIO Triangle. It involves a cancer patient, a main clinician and a main caregiver. The purpose was to study why it is important to understand the involvement of family caregivers in cancer care. Their study shows how much caregivers are involved in treatment decisions in cancer consultations. The results show that caregivers' involvement depends on several factors and varies from person to person.


These factors can be demographic, psychological, relational, cultural and medical. Three example cases illustrate different degrees of caregiver influence on decision-making.

In the first case, a patient and oncologist discuss whether or not to undergo chemotherapy. The caregiver states they will support whatever decision the patient makes, so their influence on this decision is very small. As shown in Figure 1, the decision is focused on the patient and oncologist.

Figure 1: TRIO Framework/Triangle with focus between the patient (left) and the clinician (top). Figure from [13].

A second example case is of a patient who has to choose whether to delay chemotherapy or undergo fertility treatment. The patient discusses this with her husband, and he states he would prefer the fertility treatment. As shown in Figure 2, the decision-making focus is mainly on the patient with influence from her husband.

Figure 2: TRIO Framework/Triangle with focus between the patient (left) and the caregiver (right). Figure from [13].

A third case is where a patient with limited English proficiency has cancer and his son, fluent in English, exchanges all information with the oncologist. The son directs the conversation with the consent of the patient. The decision-making about the treatment focuses on the son, the caregiver.

How much each party is involved is important information for the TRIO e-learning website, as all three are target groups for eTRIO. Triadic communication has been shown to be helpful in medical encounters but can also be challenging [11].

To facilitate this communication, psychologists, clinicians and academics have, through years of research, established the TRIO guidelines. These have been used to design the TRIO e-learning modules. The guidelines cover how to foster collaborative caregiver involvement in oncology consultations as well as how to handle challenging interactions with caregivers [9, 10].

The needs of family members in treatment decision-making for people with chronic diseases have been studied by Lamore et al. [14]. Family members need to be provided with medical knowledge and often want to participate in decision-making discussions. In these discussions, Lamore et al. have shown that helpful behaviours are needed, such as not being dominating, providing information and supporting the patient. The patient must also be allowed to decide when they need a private conversation with the physician without their family member. Lamore's study mentions that all three parties (caregivers, patients and physicians) are positive towards the involvement of caregivers in medical consultations and decisions. However, for the involvement to work well, caregivers need both medical and behavioural information. The TRIO e-learning aims to provide this type of information, which is why it is important to study the interface before release, to ensure accurate learning and caregiver involvement that fulfils the three parties' needs.

How the partner's involvement relates to decision-making in triadic cancer consultations has been studied by Bracher et al. [2]. In these consultations, the partner has had different roles, behaviours and relationships with the patient and clinician. Some partners were dominant, interruptive and had difficulty ceding the floor, while others provided emotional support and helpful contributions such as self-initiated questions. Spouses and children of the patient are more likely to engage than other relatives and friends. Some physicians interacted more with the patient than with the partner and often shifted the conversation back to the patient. This shows how everyone involved in triadic consultations needs a better understanding of their role and of their communication with each other.

A literature review about the health of patients and their caregivers by Hodges et al. [7] showed that there is an association between their well-being. The caregivers' psychological health and stress level are strongly related to the patients' health and stress. Patients are also very likely to become distressed when their caregiver does; both experience a similar level of distress. Bevans et al. [1] have studied how a caregiver's life and health are influenced by the caregiving responsibilities for a cancer patient. These responsibilities bring stress and burdens that affect the caregiver negatively. Stressful moments are inevitable, but they can be eased. Bevans concludes that avoiding barriers between caregivers and physicians, by letting caregivers participate in the medical proceedings, is a good preventive measure. McCarthy [15] has studied carers' information needs: they need medical information and often want to hear it from the physicians. They frequently have to actively seek it from physicians and can feel ignored. Soothill et al.'s study [20] shows that caregivers have more unmet needs than the patients. This shows how important communication is for everyone's well-being. The TRIO e-learning has the important aim of targeting these problems and educating all three user groups to avoid unmet needs.

2.2. E-learning and Usability of e-learning

Ruiz et al. [18] have done a literature study on the effectiveness of e-learning and how it can be applied in the medical world. They state that e-learning enhances individual learning and can be integrated into education as well as used during duty hours. Being able to use e-learning in these circumstances can be convenient for the medical world. Ruiz's study also points out important aspects to consider when evaluating an e-learning platform, for example the ease of use, navigation, the material and interaction. Without attention to these aspects, e-learning loses its effectiveness. Therefore, eTRIO can be suitable for its user groups, if the usability is adequate.

The first steps in developing e-learning for oncologists to improve their information-giving skills have been studied by Stuij et al. [21]. They identified oncologists' learning needs regarding how to provide information to patients, as well as their training preferences within their profession. Oncologists want to be able to adapt information for patients, structure the information and deal with patients' emotions. Focus groups and interviews revealed that the preferred learning method is a digital platform with multimedia content such as videos. Feedback from peers, experts and patients would also be appreciated. They want to be able to adapt their learning to their own personal needs and have it easily accessible and simple to use. eTRIO aims to fulfil these needs by letting users choose which sections they want to do, without having to follow a particular order, and complete them whenever they want to.

Huang [8] shows that designing an interactive multimedia learning tool with dynamic content makes learning an active and engaging process. Examples of features that enhance this are being able to immediately test your knowledge, easily visualising information, and having content in different forms such as animations and videos. Huang recommends several steps to create such a tool. Firstly, it is necessary to understand the learning goal and the user needs, and then design the content and utilise adequate technology. Multimedia materials, i.e. content in different forms (text, video, images, etc.), are recommended and should be implemented in an e-learning platform. The platform should consider web standards and human factors (e.g. how humans behave with the platform) to achieve good usability. User tests are then required to be able to evaluate and improve the design. When the module is built, user tests and heuristic evaluation are needed to know how the modules perform, especially since technical problems can interfere with the intended learning outcomes. Science, education and technology knowledge must be combined to create effective educational media. The current state of eTRIO is that it needs to be evaluated to ensure a good user experience. User experience evaluations, namely think-alouds and a heuristic evaluation, are conducted in this study.

2.3. Purpose and research question

The TRIO e-learning aims to enhance the communication between all parties of the TRIO framework. This is an important objective for the well-being of patients, carers and clinicians. To achieve this, the usability of the e-learning must be adequate. Therefore, this study aims to answer the following research question: What are the strengths and weaknesses of the TRIO interfaces for clinicians, carers and patients in terms of usability?

3. METHOD

This section describes the scientific methodology used to answer the research question. The method includes a heuristic evaluation, conducting and analysing think-alouds, and a System Usability Scale questionnaire.

Prior to the think-alouds, a heuristic evaluation, a method that does not involve any users, was conducted to discover major usability flaws. In this evaluation, the eTRIO interfaces were examined to identify problems that did not comply with recognised usability principles. These principles are the heuristics determined by Nielsen and Molich [17]. The problems found were ranked according to their severity, prevalence and ease to solve.

Severity is ranked from 1-5 where 1 is the least severe and 5 the most severe in terms of usability and stopping the user from using the functions properly. Prevalence is ranked from 1-5 where 1 is an infrequently occurring problem and 5 is a frequently occurring problem. The ease to solve is ranked from 1-5 where 1 is a hard and time-consuming problem to fix and 5 is an easy problem to fix. The higher the ranking in the three categories, the more the problem is a priority to fix.
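For illustration only, the following sketch shows one way findings ranked in this manner could be ordered for triage, assuming a simple additive priority score; the data structure, the score, and the example rows (taken from Table 1) are illustrative, since the paper only states that higher rankings across the three categories mean higher priority.

# Hypothetical sketch: ordering heuristic-evaluation findings so that problems
# ranking high in severity (S), prevalence (P) and ease to solve (E) surface
# first. The additive priority score is an assumption for illustration; the
# paper only states that higher rankings mean higher priority.

findings = [
    {"problem": "Nothing happens when clicking on print button", "S": 5, "P": 5, "E": 3},
    {"problem": "Asterisk after names lead to nowhere", "S": 5, "P": 5, "E": 5},
    {"problem": "Green colour on slider for the 'bad' part", "S": 4, "P": 1, "E": 4},
]

# Sort by the summed ranking, highest priority first.
for f in sorted(findings, key=lambda x: x["S"] + x["P"] + x["E"], reverse=True):
    print(f"{f['S'] + f['P'] + f['E']:>2}  {f['problem']}")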

To determine eTRIO's effectiveness, think-alouds were conducted. They consisted of having participants use the interface and express their thoughts out loud as they went through it. Think-alouds have several advantages, as described by Nielsen [16], such as being simple to learn. This was beneficial for the collaboration with the TRIO team and the participants, who were not familiar with this method. The think-alouds were conducted by the author of this paper and members of the TRIO team. The first think-alouds were conducted in person, but the rest were conducted remotely with the video-conferencing tool Zoom1. This allowed recording the users' on-screen interactions and hearing their comments, which benefits analysis [6]. The think-aloud method measured the effectiveness and user satisfaction of the TRIO e-learning interfaces. These are two important aspects that Brooke [5] also mentions for analysing the usability of a website.

Eleven think-alouds were conducted, with one participant at a time. The inclusion criteria for participation were being a doctor or nurse within oncology, a patient diagnosed with cancer at least six months ago, or a carer of such a patient. Five think-alouds were conducted with clinicians, three with carers, and three with patients. Nine females and two males participated, between the ages of 35 and 77. Patients were on average 65 years old, carers also 65, and clinicians 47. The participants were recruited by the TRIO team.

1 https://zoom.us/


First, each participant filled out a questionnaire about their background to collect demographics. The participants had around an hour to perform tasks that consisted of going through several sections of the module, which included pages with text, videos (Figure 4), and various interactions such as sliders (Figure 5), bubbles (Figure 6) and scenarios (Figure 7). They began at the dashboard (Figure 3) and were free to choose which sections they wanted to go through, because eTRIO is designed without any obligation to go through the sections in a specific order. It was ensured that every section was gone through by at least one participant so that all potential usability issues were covered; moreover, several sections have similar layouts and interactions (see Appendix E for user progress during the think-alouds). The layouts and interactions shown in Figures 3-7 are referred to in the results section.

Figure 3: eTRIO clinician dashboard with its sections. Green: the section has been completed. Light blue: the section has not been started. Dark blue: the section has been started but not completed. In the corner of each section is the estimated time to complete the section.

Figure 4: Video activity with button to click, within a section.

The activity in Figure 4 is explained in the blue information box above the video. It consists of clicking on a button (under the video, not seen here) whenever the doctor builds rapport with the carer. On the next page, the user can see how well they have done on the activity.

Figure 5: Sliders interaction for the clinicians within a section.

The white circle of each slider in Figure 5 should be moved to a white line according to how much the user agrees with each statement. At the bottom of the page, a text appears describing their attitude towards carers depending on their answers.

Figure 6: Bubbles interaction for the clinicians.

Clicking on the initially blue bubbles in Figure 6 reveals more information and makes them turn purple. In this case, the blue bubbles contain things the carers and patients might say; the purple side of the bubble gives an example answer.

Figure 7: Scenario activity for clinicians with text box.

The green box in Figure 7 gives a scenario about a patient and a carer. In the free-text box underneath, the clinicians can reflect on whether they would encourage the carer to come. On the next page, examples of what can be done in this scenario are given. The clinicians, carers and patients have the same layout, interactions and activities but with different content.

To determine user satisfaction, the participants were welcome to give free comments about their experience interacting with the e-learning module. Lastly, the users filled out a System Usability Scale (SUS) questionnaire [5]. The SUS questionnaire has ten statements to which the participants answered on a Likert scale from strongly disagree to strongly agree (see Appendix B for the detailed statements). A version with only positive statements was used, as research after Brooke has shown that there is little evidence that the advantages of alternating positive and negative statements outweigh the disadvantages [19]. This version should lead to fewer mistakes in the participants' answers. The creator of SUS, Brooke [4], has also later endorsed this version. The SUS score, obtained through a simple algorithm, indicates whether the website is usable or not. The algorithm consists of subtracting 1 from each answer, adding up the total and multiplying it by 2.5 to get a score out of 100 [19].
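To make this scoring concrete, the following is a minimal sketch of the positive-only SUS calculation described above; the function name and example responses are illustrative and not taken from the study.

# Minimal sketch of the SUS scoring used here (all-positive variant): each of
# the ten answers is on a 1-5 Likert scale, 1 is subtracted from each answer,
# the results are summed and multiplied by 2.5 to give a 0-100 score.

def sus_score(answers):
    """Compute a SUS score from ten 1-5 Likert responses (positive-only form)."""
    if len(answers) != 10 or not all(1 <= a <= 5 for a in answers):
        raise ValueError("SUS expects ten answers between 1 and 5")
    return sum(a - 1 for a in answers) * 2.5

# Example: answering 4 ("agree") on every statement gives a score of 75.
print(sus_score([4] * 10))  # 75.0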

To analyse the collected data, the think-alouds were transcribed and important usability aspects highlighted, which allowed gathering the interesting quotes presented in the results. For an overview of the results, a table (Appendix D) of the mentioned strengths and weaknesses was made. These were then classified into categories, together with how many users had mentioned each strength and weakness. The results of the heuristic evaluation and think-alouds were then compared.

4. RESULTS

4.1. Heuristic evaluation

Using the heuristic evaluation, 44 problems were found. The areas of the identified problems were:

• Inconsistency of icons and redundancy in buttons.

• Buttons and interactions not working.

• Responsive design problems in layout.

• Presentation of content.

In these categories, 37 problems were classified (see Appendix C); less severe content problems such as spelling mistakes or inconsistencies in the use of terms are not included in this classification. Twelve out of 37 problems were seen as easy to solve with the rating 5. Seventeen out of 37 problems were very prevalent with the rating 5. Twelve out of 37 problems were classified as severe with the rating 5.

Following this evaluation, eight severe problems were fixed before conducting the think-alouds. As shown in Table 1, all of these problems were ranked as very severe (4 to 5), and six of them had a high prevalence (4 to 5).

Problem | S | P | E
Nothing happens when clicking on the print button | 5 | 5 | 3
Clicking on complete on the last page of any part makes the whole dashboard go green although several parts are not completed | 5 | 5 | 3
Nothing happens in the card activity | 5 | 1 | 3
What is written in the textbox does not get saved | 5 | 4 | 3
"Save and exit" button is only on the first page of each part | 5 | 5 | 4
Progress bar is not showing the correct progress | 5 | 5 | 2
Asterisks after names lead nowhere | 5 | 5 | 5
Green colour on slider for the "bad" part | 4 | 1 | 4

Table 1: Problems fixed after the heuristic evaluation. S: Severity (1: not severe, 5: severe), P: Prevalence (1: not often, 5: often), E: Ease to solve (1: hard to solve, 5: easy to solve).

Fourteen problems identified in the heuristic evaluation were also mentioned in the think-alouds. Table 2 below presents the problems that the heuristic evaluation and think-alouds have in common. The problems' severity rankings vary from 2 to 5, prevalence varies from 1 to 5, and ease to solve varies from 3 to 5.

Problem | S | P | E
Arrows and back/next buttons have the same function | 4 | 1 | 5
No link to the guidelines that are referred to | 2 | 2 | 5
Can be unclear what the button "save and exit" means | 4 | 5 | 4
Not possible to save on what page in a part the user stopped | 5 | 5 | 3
No margin on the left side of the text on pages, text is almost cut off on bigger screens | 4 | 5 | 3
Text is not aligned with the textbox/bar, text overlaps when opening several information boxes | 5 | 4 | 3
Some boxes get thinner than others when opened | 4 | 4 | 3
Star when bookmarked goes from green to purple; when unbookmarked it does not go back to green | 2 | 5 | 3
Fading in of text | 3 | 5 | 5
No hover-highlighted feedback on bubbles to click on | 3 | 2 | 4
Clicking on the button of the video activity gives no feedback | 3 | 4 | 4
Spelling mistakes | 3 | 3 | 5
Unclear what you get for completing the training | 3 | 1 | 5

Table 2: Problems that the heuristic evaluation and think-alouds have in common. S: Severity (1: not severe, 5: severe), P: Prevalence (1: not often, 5: often), E: Ease to solve (1: hard to solve, 5: easy to solve).

Further results of the think-alouds will be presented in the next section.

4.2. Think-alouds

Eleven think-alouds were conducted: three with patients, three with carers and five with clinicians. The participants are kept anonymous and are referred to as carers (Ca1-3), patients (P1-3) and clinicians (Cl1-5).

The frequency of strengths and weaknesses mentioned by all participants in the think-alouds has been analysed (see Appendix D for the full table). Table 3 presents comments that were mentioned by at least four out of the eleven participants (36% of the participants). They are ordered from strengths to weaknesses.

Comments about | T | + [P,Ca,Cl] | - [P,Ca,Cl]
Content the user can relate to | 10 | 10 [2,3,5] | 0
Content emphasising the importance of the carers | 8 | 8 [3,3,2] | 0
Quotes | 8 | 8 [2,3,3] | 0
Short videos | 8 | 6 [0,2,4] | 2 [2,0,0]
Easy navigation | 7 | 6 [1,2,3] | 1 [1,0,0]
Popping bubbles to reveal more information | 6 | 4 [1,1,2] | 2 [1,0,1]
Estimated time on each section | 5 | 4 [1,2,1] | 1 [0,0,1]
References | 5 | 4 [1,1,2] | 1 [0,0,1]
Pictures | 5 | 4 [1,1,2] | 1 [0,0,1]
Colours used throughout the e-learning | 4 | 3 [1,1,0] | 2 [0,1,1]
Busy slides | 6 | 0 | 6 [1,2,3]
Text size in general | 7 | 2 [1,1,0] | 5 [1,2,2]

Table 3: Frequency of comments mentioned by participants in the think-alouds. T: total number of participants mentioning the comment, +: number of participants mentioning the comment as a strength, -: number of participants mentioning the comment as something to improve. [P,Ca,Cl]: number of [patients, carers, clinicians] mentioning the comment as a strength or weakness.

The next sections present more details about the frequently mentioned comments, illustrated with participant quotes, and finish with other notable comments mentioned by participants. The quotes show the views of the participants in their own words to provide a richer picture of the tables above.

4.2.1. Comments about content
4.2.1.1. Content the user can relate to

Ten participants agree with the content, especially when they can relate to it.

“I think this is a very useful slide. When we went in to our first meeting we were just there me and my son did this.”- Carer 1

“It is definitely something that we deal with a lot day to day. And I think it’s taken my attention because I think understanding these things more definitely has a direct impact day to day on what we do.”- Clinician 2

One participant mentions they cannot relate to some of the content but still think it is useful information and good advice.

“It says group text messages or emails summarising the consul- tation, that never happened with us, although I agree it should happen.”- Carer 3

4.2.1.2. Content emphasising the importance of carers

Eight participants believe it is essential to emphasise how important the carer is. Here are statements from one participant of each user group:

“I get your message about including the family in consultations and a couple of tips and stuff like that. They are the gems.”- Patient 1

“The importance of carers like their role, I mean some people may not know what they should and shouldn’t do in order to help the person, is good first knowledge. And then I think that would make the other parts very productive for them.”- Carer 2

“Family need to be involved and there is a lot of stress involved so that was good it was pointed out“- Clinician 3

4.2.1.3. Quotes

Eight participants mention enjoying the quotes included in the e-learning. For example, a clinician likes them for the supporting evidence they give.

“I like the use of the quotes. It gives a bit of a supportive evidence to it so I like that.”- Clinician 5

A patient mentions that the quotes make the e-learning more personal and likes the reality of them. Another patient mentions they give a good break between heavier pages.

“Lovely, really nice. So breaks it up with little quotation.”- Patient 1

4.2.1.4. Short videos

Six participants mention liking the videos.

“I think really the use of videos is so important now. People are expecting to be able to click on stuff so yeah I think that’s great.”- Carer 2

One of the clinicians likes that it gives some variety to the e-learning.

“Oh a video that’s interesting, it’s sort of mixing it up, it’s nice to have the different things.”- Clinician 2

The clinician also mentions that the content of the videos and the actors are good for learning.

“I think that’s good. I think listening to her language, I think that’s what I struggle with. A lot of the struggle is how to choose the right words to say what she said. So she’s done that very eloquently and it’s good to hear that point.”- Clinician 2

Two carers mention they like that the videos are not too long.

Meanwhile, two patients do not like videos for learning.

“I’m not going to play that. I’m not a very video type people.”- Patient 1

4.2.1.5. References

Four participants like the references. One clinician likes being able to see the references by hovering over the reference number.

“I quite like the way the little references pop up, that’s quite easy to use.”- Clinician 2

One carer likes to have a document with all references.

“I’m just gonna click on the references and yeah that’s good you’ve got those.”- Carer 2

One clinician does not immediately understand how to see the references.


4.2.1.6. Pictures

Four participants like having pictures throughout the module; a patient explains the pictures' benefits.

“So instantly more engaging because there is a photo, visually appealing.”- Patient 1

However, one clinician and one carer who like the pictures do not understand what some of them represent and why they were chosen.

“What’s going on with that picture, why have you chosen that?”

- Carer 2

4.2.1.7. Busy slides

Six participants mention disliking busy and text-heavy slides.

“I think you are making this page very busy with text and it’s a bit confrontational.”- Carer 2

One patient would rather have more visually appealing content.

“It is pretty text-heavy and I guess that I am more of a visual learner so it might be nice to have some more pictures, icons, to make it a little bit more visually appealing.”- Patient 1

Two clinicians mention they prefer pages where they can quickly get the message of the text.

“When I read something apart from patients notes, I skim it. So it’s gotta be something that I can get the message with a glance.”- Clinician 1

4.2.2. Comments about usability
4.2.2.1. Easy navigation

Six participants found the navigation through the website easy and straightforward, as one carer mentions here:

“I think it’s pretty easy and straightforward. I think anybody who’s used to doing online training, modules and so on will probably find it really easy.”- Carer 1

On the other hand, one patient had trouble with the navigation, finding it overwhelming:

“Overwhelming, once you get into the thing, I have no idea what I have to do.”- Patient 2

4.2.2.2. Popping bubbles to reveal more information

Four participants like the interaction of clicking on bubbles to reveal more information. The interactivity is appreciated, as one clinician mentions below.

“See this makes me wanna click on it which is what you want us to do yeah. I like the bubbles! Yeah I quite like that.”- Clinician 5

One clinician does not understand the purpose of the bubbles and one patient does not understand that they should click on them.

“I’m not sure why there are 3 bubbles here.”- Clinician 1

4.2.2.3. Estimated time on each module

Four participants mention liking being able to see how much time each section will take. This helps them plan how to go through the module, as a carer states below (the participants refer to a section as a "part").

“Looking at this dashboard I like it that it tells you how long each part is going to take just so you know in advance. You’re busy and maybe you just have time to do half of it and then you can sort of plan how you’re going to tackle it.”- Carer 1

One clinician also likes this feature but found it odd that the estimated time is so precise, for example an estimated time of 4 minutes instead of 5 minutes.

“But the time in the top right corner here, is that like how long it should take you to complete that module? It’s very precise, isn’t it?”- Clinician 5

4.2.2.4. Colours used throughout the website

Three participants mention positive aspects of the colours and three mention colour features to improve. The comments concern both background colours and text colours.

One carer mentions they like the colour theme of the interface.

"I think blue is a good colour, the purple in the back is pleas- ant"- Carer 2

Another carer is unsure whether the colours of the interface, purple and turquoise, work well together. However, both these carers like the colours of the dashboard and the feature of colours changing when a section is completed.

“I’m really impressed by the dashboard, I like the colours and it looks clear to me where you start and where you finish."- Carer 2

“I like the colours changing as you’ve completed it”- Carer 1

One patient likes when important text is written in a different colour.

“Even with heavy text, the fact that you have highlighted them in different colours immediately makes them more accessible to focus.”- Patient 1

A carer, on the other hand, does not like text in blue; they would prefer more consistency in the choice of text colour, bolding and italics.

"It’s just in blue and in italics, it’s not bolded or yeah there’s one bolded word. I think you can probably present that better.”

- Carer 2

One clinician prefers when text is written on a coloured background, as it feels less harsh than black on white.

4.2.2.5. Text size in general

Two participants found the text size alright. Five other participants mention problems with the text size. One patient found the text too big.

“Fonts are a tiny bit big so it comes across as a header rather than a textbox.”- Patient 1


On the contrary, two clinicians and two carers found the text too small and wonder if they can change it. Two carers mention specifically where it is too small, as this carer does:

"The text might be a little bit small in these little boxes. I know you are trying to create subtext, but maybe if you just use the same font and size but just indent it.”- Carer 2

4.2.2.6. Other notable comments

Comments with positive tendencies

Overall, all participants have positive comments about the interactive layout and activities. For example, one clinician mentions they are more engaged with text when it is interactive.

“On that slide you get the principles you just go yeah yeah. This is the kind of slides you pass over a lot more quickly than the interactiveness that draws you to engage much more with what’s being written.”- Clinician 2

Three participants mention that it is practical to complete the module at their own pace and in the order they prefer. Three also mention they like the dashboard layout. Two clinicians also mention appreciating having a variety of different activities:

“The activities were good and it was good to have like a mixture of different activities in there as well. You definitely engage like a thousand percent more like I said with the activities”- Clinician 2

“I think it’s nice to have a few interactive things throughout, it breaks up it when you are just reading and reading things all the time.”- Clinician 5

Three participants enjoy being able to build and download a list of questions to ask clinicians. The participants believe it is a very important thing to be able to do.

“You can click on whatever ones you want, so this is really good. Then you download the checklist. I like that a lot.”- Patient 1

The patient mentions that this question list tool is great, as is being able to build a list of their carers and their roles.

“Some of the activities like the questions, I really liked. The ones where you wrote down what you thought the carer might do for you if you then use it as a communication tool, really good as well.”- Patient 1

Five participants shared opinions on the different video activities. Clicking on a button when something specific happens in a video was enjoyed by two participants, although its use also proved confusing for two of them. It is also an activity they had not seen before, which they enjoyed.

“Oh it’s pretty good. So that comes up on every time that I’ve clicked on it. I like this section it’s really good. I like that activity. I’ve never done one of those before that’s really good.“- Clinician 2

“Did I press enough for this one? Oh it’s every time I pressed. I only got 6 out of 12. Sometimes he was just going on with the same thing, so I didn’t want to overpress. How do I feel about pressing every time, I’m not sure about that. I wasn’t that comfortable with that.“- Clinician 3

One clinician mentions the positive aspects of having text boxes to fill in with free text.

“I think that’s quite good because at least you are giving people a little bit more of themselves, they can write free text instead of just clicking on things. That you ask people to actually write something is good.”- Clinician 4

One carer noticed the possibility of bookmarking pages and liked the feature.

“Oh! So I can bookmark my pages. So this means even when I’ve done the module, if I can’t remember everything I can read it again at a later stage so that might be handy.”- Carer 1

Three participants mentioned that the content uses simple language without being too basic. However, one participant found that it goes from simple to complex on some pages.

“I find it easy and I’m impressed with the content. I’m glad it’s not too basic, you know it’s got useful information in there.”- Carer 1

Two participants mentioned that eTRIO's content covers important topics.

“From what I’ve seen it looks like you’ve put together a very useful package.”- Carer 3

In total, there are 19 main strengths: the nine mentioned here and the ten in the frequently mentioned comments.

Comments with negative tendencies

Sliders do not work as three participants would have expected. It is not possible to slide to wherever the user wants, which proved unintuitive.

“Let’s see if you can just click and drag hopefully. Ah so just on the lines ok. I think it would be nicer if you could move it wherever you wanted.”- Clinician 2

Three participants mentioned the redundancy of arrows and next/back buttons to navigate through the pages. They mention that the downside of having arrows in the middle of the page is that content further down the page can easily be skipped.

“I could easily just press next and I would then miss the most important thing.”- Carer 1

Boxes that show more content when clicked appeared to have glitches: text overlaps and the boxes change size for no reason. This was identified in the heuristic evaluation and was also mentioned by two participants. It is not appealing for the user, as a carer states below.

“This oncologist one has overlapped with the surgeon one. And why is his box so big? Was it like this when the page first came up, I can’t remember?”- Carer 1

The meaning of the "Save and exit" button proved unclear to two participants.

“How do I get back to the other screen? Save and exit. Oh really?”- Clinician 5

In total, there are six main weaknesses: the four mentioned here (sliders, button redundancy, bugs, and the meaning of the "Save and exit" button) on top of the two from the frequently mentioned comments (busy slides and text size).

4.3. System Usability Scale questionnaire

All participants answered the ten statements of the System Usability Scale questionnaire. Table 4 shows how the SUS score can be translated into an adjective average, going from awful to excellent.

Adjective average | Awful | Poor | Okay | Good | Excellent
SUS score | < 51 | 51–68 | 68 | 68–80.3 | > 80.3

Table 4: Translation of System Usability Scale scores into an adjective average.
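As a small illustration of this translation, the sketch below maps a SUS score to the adjective bands in Table 4; the function name and the handling of boundary values are assumptions, since the table only gives approximate ranges.

# Illustrative sketch mapping a SUS score (0-100) to the adjective bands in
# Table 4. The exact treatment of boundary values (e.g. a score of exactly 68)
# is an assumption, as the table only gives approximate ranges.

def sus_adjective(score):
    if score < 51:
        return "Awful"
    elif score < 68:
        return "Poor"
    elif score == 68:
        return "Okay"
    elif score <= 80.3:
        return "Good"
    else:
        return "Excellent"

# The study's average score of 76.8 falls in the "Good" band.
print(sus_adjective(76.8))  # Good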

Table 5 shows that four out of five clinicians give the page a good adjective average and one an OK adjective average. One patient gives the page a good adjective average, one an OK adjective average and one a poor adjective average. Two out of three carers give the page an excellent adjective average and one a good adjective average. On average, the interface has a System Usability Scale score of 76.8, which corresponds to a good adjective average.

Participant | Cl1 | Cl2 | Cl3 | Cl4 | Cl5 | P1 | P2 | P3 | Ca1 | Ca2 | Ca3 | Total average
SUS score | 80 | 80 | 67.5 | 75 | 72.5 | 67.5 | 77.5 | 55 | 97.5 | 95 | 77.5 | 76.8

Table 5: System Usability Scale scores of all participants. Cl1-5: clinicians 1-5, P1-3: patients 1-3, Ca1-3: carers 1-3.

5. DISCUSSION

In general, all participants showed appreciation for the interface, especially its important aim of educating users on how to include carers in the challenging cancer experience.

Users enjoyed visual representations of the text in the form of pictures and icons, as long as they matched the topic of the page.

Quotes were also appreciated by many of the participants, as they give some reality to the content. Having a layout where a page contains only a quote gave the participants a break from pages with more content. The participants found the activities very engaging and appreciated them as long as they knew how to interact with them. This corresponds well with the study by Huang [8], which found that interactive and multimedia content makes the user feel more engaged, as long as the usability is adequate. The think-alouds also showed that seeing references was important to several participants, especially clinicians. Only one clinician did not understand from the beginning how the references were presented. Being able to see how much time a section takes was appreciated by many, as it lets them plan how to fulfil their learning.

Things to improve content-wise are text-heavy slides and consistency in the use of terms. Within usability, consistency is also needed between buttons and their icons. It needs to be clear what is clickable or not, as well as what the buttons and interactions do, as some participants had trouble with that. There were, of course, content and activities that users had different opinions on. For example, almost half of the users mentioned liking the short videos, whilst two users would skip them as it is not their type of learning. Bubbles to click on to reveal more content were appreciated by many users; only two were unsure about them and how to interact with them. Therefore, with a mix of different activities, everyone can find something they like. Several participants wanted bigger text, although one participant mentioned that some text was too big and could come across as a header. A participant also mentioned it would be great to be able to choose the text size. The activities of building a question list and a carer team were shown not only to be engaging but also to be practical tools for participants, which they highly enjoyed.

Several problems within the interfaces were identified in both the heuristic evaluation and the think-alouds (see Appendix F for a full comparison). These problems were mostly within usability, such as redundant and unclear buttons or missing user feedback when clicking on some buttons. The activity of clicking on a button during a video was appreciated for its originality; however, as noted in the heuristic evaluation, the button does not give any feedback when clicked, and in the think-alouds some participants mentioned that they were unsure if they had clicked and if they had clicked enough times. The heuristic evaluation noted that the progress bar's placement at the bottom of each page did not serve the system well, while in the think-alouds no participant acknowledged this bar.

Most participants started with the introduction and went through the parts in order, but they also liked the freedom of being able to choose the order in which to complete the module. This way they could first go through the topics that are the most interesting to them. The participants went through the sections but did not open any hamburger menus, use the bookmark function or explore their profile page.

With heuristic evaluations, some problems might not be mentioned by users, and all problems found by users might not be found in the heuristic evaluation [17]. In this study, problems only found in the heuristic evaluation were mostly about pages the users did not visit or problems that had already been fixed. On the other hand, problems mentioned by users but not identified in the heuristic evaluation were mostly content problems, which were not the main focus of the heuristic evaluation. This shows good use of heuristic evaluation, which allowed identifying major flaws of the program without having to recruit any users. Heuristic evaluation is a good method to use before think-alouds, but it is also important to consider additional usability testing methods to reinforce the credibility of the results. Exploring these methods enabled discovering which problems each reveals. The heuristic evaluation was done first, while the think-alouds were being organised and participants recruited.

The System Usability Scale score that classifies the eTRIO interfaces as good confirmed the overall feeling revealed by the users. The interfaces have good usability and content but undoubtedly have room for improvement. The scores were noticeably lower for patients, especially one of them (P3). This could be explained by their higher age and lower computer literacy than the other users. The patient (P3) with the lowest score had trouble starting the e-learning and sharing their screen via Zoom; therefore, in this particular case only, they used the e-learning through the screen sharing of the interviewer. This made the e-learning quite slow to react and could have affected the overall feeling of the e-learning for the user.

Carers had overall high scores. Two of them were participants who mentioned many things (see Appendix D for how much each participant mentioned); they were very talkative and found most of the content and activities very useful. Their positive attitude is reflected in high System Usability Scale scores. The third carer did not mention as many things as the other two. This carer was not very computer literate but still mentioned they found the navigation straightforward. Although most of their comments were not negative, this carer's score was lower. Clinicians regularly complete other e-learning in their profession and could therefore compare eTRIO to what they regularly do. Some of the clinicians were in general more critical than the other participants, thus giving lower scores than the carers.

A patient with lower computer literacy showed a lower SUS score, while a carer who also had low computer literacy found the navigation of eTRIO particularly easy. This shows that computer literacy can, but does not have to, affect the users' experience with eTRIO. The effect of different levels of computer literacy could be researched further.

5.1. Method criticism

In this study's method, the think-alouds started out being conducted in person and were then pursued online through Zoom. Using Zoom had both benefits and disadvantages. With Zoom, more people from areas further away could participate in the study. It also enabled recording the participants' on-screen actions in addition to the audio of their comments. The use of Zoom was new for both the interviewers and the participants. This resulted in technological difficulties, such as not finding how to share the screen, which was eventually solved by launching the interface on the interviewer's computer.

As mentioned earlier, interesting observations were made during the think-alouds: users did not explore the hamburger menus or the pages the menus lead to. A flaw of think-alouds is that users might not interact with or navigate the interface as they would on their own, outside the think-aloud setting. The setting of someone watching what you are doing, and having to talk while using the e-learning, can be an unnatural environment for the participants. Using it on their own could have resulted in different time spent on each page, more use of functions such as bookmarking, and perhaps exploring more pages such as their profile or their bookmarks.

The think-alouds took 40 minutes to one hour, depending on the participant. The participants went through different sections, and some went through more than others (see Appendix E). Since the participants did not go through exactly the same sections, some strengths and weaknesses may not have been mentioned by as many participants as if they had all gone through the same content.

Other methods for evaluating the interfaces include, for example, logging the users' actions and time spent on each page. This could have addressed the disadvantages of think-alouds and could have given different results. Nonetheless, this study used a combination of several usability testing methods to reinforce the credibility of the results. Using only one of them, fewer problems would have been found and the identified problems would have been less well justified. Each interface could have been evaluated with more people in each group, as no big differences between the groups were found. Instead of evaluating the whole e-learning, the study could have focused on some key interactions. These approaches would have given different, perhaps more detailed, results.

The System Usability Scale questionnaire originally alternates positive and negative statements. In this study, a SUS questionnaire with positive statements only was used. Having alternating positive and negative statements could have given less favourable results, as the purpose of the alternation is to reduce bias towards only positive answers [5]. Both the arrangement and the wording of the statements can alter answers; however, later research has shown that the benefits of the alternation might not outweigh the disadvantages of participants misinterpreting statements or forgetting to reverse their answers by mistake [19, 4].

5.2. Future research

Future research could focus on evaluating mobile medical e-learning, which was shown to be of interest to the users in this study. Another direction could be to investigate the activities and interactions in more detail. As this study can be of interest to developers of e-learning and its content, research could also be done on the extent to which these findings apply to e-learning outside the medical system.

6. CONCLUSION

The outcome has provided criteria for improving the e-learning's usability and content. Strengths of the website are the overall easy navigation and the several different interactive activities and layouts. Weaknesses are text-heavy slides, unclear instructions or buttons, and unclear purposes of some content and activities. Users want valuable and useful content that is easy to read, visually appealing, and helps them in their everyday life with cancer.

ACKNOWLEDGMENTS

I would like to express my sincere gratitude to my supervisor Judy Kay at the University of Sydney for giving me the opportunity to work on this project, for her continuous and valuable support and feedback. A special thank you to the TRIO team for the insightful collaboration, especially Rebekah Laidsaar-Powell for conducting most of the think-alouds and providing support during the entire project. I would also like to thank my institution KTH and my supervisor there, Kjetil Falkenberg, for their help. Last but not least, a big thank you to all the participants for sharing their valuable thoughts.


REFERENCES

[1] Margaret Bevans and Esther M. Sternberg. 2012. Caregiving burden, stress, and health effects among family caregivers of adult cancer patients. JAMA - Journal of the American Medical Association (2012). DOI: http://dx.doi.org/10.1001/jama.2012.29

[2] Mike Bracher, Simon Stewart, Claire Reidy, Chris Allen, Kay Townsend, and Lucy Brindle. 2019. Partner involvement in treatment-related decision making in triadic clinical consultations – A systematic review of qualitative and quantitative studies. (2019). DOI: http://dx.doi.org/10.1016/j.pec.2019.08.031

[3] Freddie Bray, Jacques Ferlay, Isabelle Soerjomataram, Rebecca L. Siegel, Lindsey A. Torre, and Ahmedin Jemal. 2018. Global cancer statistics 2018: GLOBOCAN estimates of incidence and mortality worldwide for 36 cancers in 185 countries. CA: A Cancer Journal for Clinicians (2018). DOI: http://dx.doi.org/10.3322/caac.21492

[4] John Brooke. 2013. SUS: a retrospective. Journal of Usability Studies 8, 2 (2013), 29–40.

[5] John Brooke and others. 1996. SUS - A quick and dirty usability scale. Usability Evaluation in Industry 189, 194 (1996), 4–7.

[6] Monty Hammontree, Paul Weiler, and Nandini Nayak. 1994. Remote usability testing. Interactions 1, 3 (1994), 21–25.

[7] Laura J. Hodges, Gerry M. Humphris, and Gary Macfarlane. 2005. A meta-analytic investigation of the relationship between the psychological distress of cancer patients and their carers. Social Science and Medicine (2005). DOI: http://dx.doi.org/10.1016/j.socscimed.2004.04.018

[8] Camillan Huang. 2005. Designing high-quality interactive multimedia learning modules. Computerized Medical Imaging and Graphics (2005). DOI: http://dx.doi.org/10.1016/j.compmedimag.2004.09.017

[9] Rebekah Laidsaar-Powell, Phyllis Butow, Frances Boyle, and Ilona Juraskova. 2018a. Facilitating collaborative and effective family involvement in the cancer setting: Guidelines for clinicians (TRIO Guidelines-1). (2018). DOI: http://dx.doi.org/10.1016/j.pec.2018.01.019

[10] Rebekah Laidsaar-Powell, Phyllis Butow, Frances Boyle, and Ilona Juraskova. 2018b. Managing challenging interactions with family caregivers in the cancer setting: Guidelines for clinicians (TRIO Guidelines-2). (2018). DOI: http://dx.doi.org/10.1016/j.pec.2018.01.020

[11] Rebekah Laidsaar-Powell, Phyllis Butow, Stella Bu, Cathy Charles, Amiram Gafni, Wendy Wing Tak Lam, Jesse Jansen, Kirsten Jo McCaffery, Heather Shepherd, Martin Tattersall, and Ilona Juraskova. 2013. Physician-patient-companion communication and decision-making: A systematic review of triadic medical consultations. (2013). DOI: http://dx.doi.org/10.1016/j.pec.2012.11.007

[12] Rebekah Laidsaar-Powell, Phyllis Butow, Stella Bu, Alana Fisher, and Ilona Juraskova. 2016. Attitudes and experiences of family involvement in cancer consultations: a qualitative exploration of patient and family member perspectives. Supportive Care in Cancer 24, 10 (2016), 4131–4140.

[13] Rebekah Laidsaar-Powell, Phyllis Butow, Cathy Charles, Amiram Gafni, Vikki Entwistle, Ronald Epstein, and Ilona Juraskova. 2017. The TRIO Framework: Conceptual insights into family caregiver involvement and influence throughout cancer treatment decision-making. (2017). DOI: http://dx.doi.org/10.1016/j.pec.2017.05.014

[14] Kristopher Lamore, Lucile Montalescot, and Aurélie Untas. 2017. Treatment decision-making in chronic diseases: What are the family members' roles, needs and attitudes? A systematic review. (2017). DOI: http://dx.doi.org/10.1016/j.pec.2017.08.003

[15] Bridie McCarthy. 2011. Family members of patients with cancer: What they know, how they know and what they want to know. (2011). DOI: http://dx.doi.org/10.1016/j.ejon.2010.10.009

[16] Jakob Nielsen. 2012. Thinking aloud: The #1 usability tool. Nielsen Norman Group 16 (2012).

[17] Jakob Nielsen and Rolf Molich. 1990. Heuristic evaluation of user interfaces. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. 249–256.

[18] Jorge G. Ruiz, Michael J. Mintzer, and Rosanne M. Leipzig. 2006. The impact of e-learning in medical education. (2006). DOI: http://dx.doi.org/10.1097/00001888-200603000-00002

[19] Jeff Sauro and James R. Lewis. 2011. When designing usability questionnaires, does it hurt to be positive?. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. 2215–2224.

[20] Keith Soothill, Sara M. Morris, Carol Thomas, Juliet C. Harman, Brian Francis, and Malcolm B. McIllmurray. 2003. The universal, situational, and personal needs of cancer patients and their main carers. European Journal of Oncology Nursing 7, 1 (2003), 5–13.

[21] Sebastiaan M. Stuij, Nanon H.M. Labrie, Sandra Van Dulmen, Marie José Kersten, Noor Christoph, Robert L. Hulsman, Ellen Smets, Stans Drossaert, Hanneke De Haes, Arwen Pieterse, and Julia Van Weert. 2018. Developing a digital communication training tool on information-provision in oncology: Uncovering learning needs and training preferences. BMC Medical Education (2018). DOI: http://dx.doi.org/10.1186/s12909-018-1308-x


Appendix A eTRIO walkthrough

Appendix A presents a walkthrough of eTRIO: its layout, features, interactions and activities.

eTRIO interface walkthrough

The clinician module is presented here. The modules for patients and carers have the same layout and activities, but with different content.

Clinician module

Figure 1: Clinician dashboard

Figure 1 shows the dashboard and its different sections, which in the clinician module are called Guidelines.

• Light blue sections have not been started.
• Dark blue sections have been started but not completed.
• Green sections have been completed.
• Each section is labelled as either compulsory or not compulsory.
• The estimated time to complete a section is shown in its top right corner.


• At the bottom of each page is a progression bar, with one small bar per section:
  o Grey: not started
  o Green: completed
  o Blue: started but not completed
• EST~: estimated time to complete the section.

An introduction page explains the different buttons:

• Arrows and the "Back" and "Next" buttons have the same function: navigating through the pages.
• Stars are used for bookmarking:
  o Green: not bookmarked
  o Purple: bookmarked
• Print button to print the page.
• "Save and exit" saves progress and returns to the Dashboard.
• Left hamburger menu: navigate to the different pages of the section.
• Right hamburger menu.


Layout

• Introductory info boxes in light blue.
• Guiding principles presented in boxes.
• Introductory navigation instructions in light blue.
• Icons for different sections within the page, with the text appearing underneath.
• A page with bullet points giving an overview of the program.
• Important information highlighted with bold text.


• Layout of the summary page. There is a summary on the last page of each section, and it can be downloaded as a PDF.
• Quotes from academic papers have their own page and are shown with the logo of the journal the quote comes from.
• Boxes with information: clicking the logo on the right side of a box reveals more information. In the top picture the box is opened; in the bottom picture it is closed.


• A map with the different states of Australia (the e-learning is developed for cancer care in Australia). Clicking on each state gives more information about the procedures in that state.
• A page with a picture illustrating the text.
• When starting the e-learning, clinicians get a message about how they can complete the sections and which parts are mandatory.
• When the compulsory parts of the e-learning are completed, clinicians get a message that also encourages them to continue with the non-mandatory sections.


Activities

• Questions with true or false answers.
• Sliders to indicate agreement or disagreement with statements. At the end, a short summary of your attitude towards family members is shown, based on your answers.
• Bubbles with information on them. Clicking on a bubble turns it purple and reveals more information.


• Cards: the user selects those describing the most important information needs of family carers.
  o Blue contour: card selected
  o Blue card: card not selected
• Feedback on the card activity:
  o Pink: wrong answer.
  o Green: right answer.
  o White: answer that was not selected but should have been.
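The three feedback colours above amount to comparing the set of cards the user selected with the set of correct cards. The sketch below is a minimal illustration of that comparison logic, not the actual eTRIO implementation; the function name and the example card labels are invented for this illustration.

```python
# Minimal sketch (not eTRIO code): derive the card feedback colours by
# comparing the user's selection against the set of correct cards.
def card_feedback(selected, correct):
    """Map each relevant card to a feedback colour: green, pink or white."""
    feedback = {}
    for card in selected | correct:
        if card in selected and card in correct:
            feedback[card] = "green"   # selected and correct
        elif card in selected:
            feedback[card] = "pink"    # selected but wrong
        else:
            feedback[card] = "white"   # correct but not selected
    return feedback

# Hypothetical example cards (labels invented for illustration):
print(card_feedback(selected={"prognosis", "side effects"},
                    correct={"prognosis", "practical support"}))
# e.g. {'prognosis': 'green', 'side effects': 'pink', 'practical support': 'white'}
# (key order may vary)
```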

• Clicking on numbers located on the picture reveals more information. The information is shown in a box, as in the picture.
• Clicking on a "+" located on the picture reveals more information. The information is shown in a box, like in the previous picture.


• In the green box a scenario is described. In the textbox the user can write how they would respond to this scenario.
• On the next page, feedback about the activity is given: two important points that should have been mentioned in the answer, and two example answers.
• In this video activity, the user should click the pink button located under the video every time they see the doctor build rapport with a carer.


• Feedback on the video-with-button activity:
  o Green tick: the button was clicked at the right moment.
  o Pink cross: a moment where the button should have been clicked but was not.
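The tick/cross feedback implies that each button click is matched against a set of target moments in the video, presumably within some tolerance window. The following is a small sketch of one way such matching could work; the tolerance value, timestamps and function name are assumptions made for illustration, not taken from eTRIO.

```python
# Minimal sketch (not eTRIO code): match button clicks against target moments
# in the video, within an assumed tolerance window in seconds.
def rapport_feedback(click_times, target_times, tolerance=3.0):
    """Return (hits, misses): target moments matched by a click, and targets missed."""
    hits, misses = [], []
    for target in target_times:
        if any(abs(click - target) <= tolerance for click in click_times):
            hits.append(target)    # shown as a green tick
        else:
            misses.append(target)  # shown as a pink cross
    return hits, misses

# Hypothetical example: rapport-building moments at 12 s, 47 s and 95 s;
# the user clicked at 13 s and 50 s.
print(rapport_feedback([13.0, 50.0], [12.0, 47.0, 95.0]))
# ([12.0, 47.0], [95.0])
```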

• Another video activity. The user watches the video and is asked on the next page how they would handle the situation presented in it.
• The user is given suggestions on how to handle the situation and should choose the most accurate one.
• The user then gets feedback on their answers.


• Finally, the user gets to see a video of how the situation can be handled.
• Another video activity: the user watches a video and is asked how they would handle the situation.
• The user reflects in open text on how they would handle the situation.
• Finally, the user gets to see a video of how the situation can be handled.


Appendix B System Usability Scale (SUS) Questionnaire

Appendix B shows the detailed results of the SUS questionnaire.

Detailed SUS responses (Ca = carer, Pa = patient, Cl = clinician; scores converted from 1–5 to 0–4):

| Statement | Ca1 | Ca2 | Ca3* | Pa1* | Pa2* | Pa3 | Cl1* | Cl2* | Cl3* | Cl4 | Cl5 |
| 1. I think that I would like to use the eTRIO website frequently. | 3 | 2 | 0 | 0 | 4 | 2 | 1 | 3 | 3 | 3 | 2 |
| 2. I found the eTRIO website to be simple. | 4 | 4 | 4 | 3 | 3 | 3 | 2 | 3 | 3 | 3 | 3 |
| 3. I thought the eTRIO website was easy to use. | 4 | 4 | 3 | 2 | 3 | 3 | 3 | 3 | 3 | 3 | 2 |
| 4. I think that I could use the eTRIO website without the support of a technical person. | 4 | 4 | 4 | 4 | 3 | 1 | 4 | 4 | 3 | 4 | 4 |
| 5. I found the various functions in the eTRIO website were well integrated. | 4 | 4 | 4 | 2 | 3 | 2 | 3 | 3 | 2 | 2 | 3 |
| 6. I thought there was a lot of consistency in the eTRIO website. | 4 | 4 | 3 | 1 | 3 | 3 | 3 | 3 | 3 | 3 | 3 |
| 7. I would imagine that most people would learn to use the eTRIO website very quickly. | 4 | 4 | 4 | 3 | 2 | 1 | 4 | 4 | 3 | 3 | 2 |
| 8. I found the eTRIO website very intuitive. | 4 | 4 | 2 | 4 | 4 | 3 | 4 | 2 | 2 | 3 | 3 |
| 9. I felt very confident using the eTRIO website. | 4 | 4 | 3 | 4 | 3 | 2 | 4 | 3 | 2 | 3 | 3 |
| 10. I could use the eTRIO website without having to learn anything new. | 4 | 4 | 4 | 4 | 3 | 2 | 4 | 4 | 3 | 3 | 4 |
| Sum | 39 | 38 | 31 | 27 | 31 | 22 | 32 | 32 | 27 | 30 | 29 |
| SUS score | 97.5 | 95 | 77.5 | 67.5 | 77.5 | 55 | 80 | 80 | 67.5 | 75 | 72.5 |
| SUS score (rounded) | 98 | 95 | 78 | 68 | 78 | 55 | 80 | 80 | 68 | 75 | 73 |

Notes:
• The average SUS score across all participants is 77.
• Responses were given on a 1–5 scale (1: strongly disagree, 5: strongly agree) and converted to 0–4 before summing.
• Interpretation of SUS scores: above 80.3 = Excellent; between 68 and 80.3 = Good; 68 = OK; between 51 and 67 = Poor; below 51 = Awful.
• * = audio recorded.
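The SUS scores in the table follow the standard computation: each 1–5 response is converted to a 0–4 score and the sum over the ten items is multiplied by 2.5. Since all statements here are positively worded, the conversion is assumed to be simply the response minus 1 (the all-positive variant of SUS scoring). The sketch below illustrates the calculation; it is not part of eTRIO, and the variable names are chosen for this example only.

```python
# Illustrative sketch of the SUS calculation behind the table above (not eTRIO code).
# All statements are positively worded, so each 1-5 response is assumed to be
# converted to 0-4 by subtracting 1; the sum over ten items is scaled by 2.5.
def sus_score(responses_1_to_5):
    """Compute a SUS score (0-100) from ten 1-5 responses to positively worded items."""
    if len(responses_1_to_5) != 10:
        raise ValueError("SUS requires exactly ten item responses")
    converted = [r - 1 for r in responses_1_to_5]  # 1-5 -> 0-4
    return sum(converted) * 2.5                    # 0-40 -> 0-100

# Example: participant Ca1's converted scores in the table are 3, 4, 4, 4, 4, 4, 4, 4, 4, 4
# (sum 39), i.e. raw responses one point higher on each item.
ca1_raw = [4, 5, 5, 5, 5, 5, 5, 5, 5, 5]
print(sus_score(ca1_raw))  # 97.5, matching the table
```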
