
How does the UX Design of video conferencing software affect student engagement in online education?

Main Subject area: Informatics
Specialisation in: User Experience Design and IT Architecture
Author(s): Jing Zhang and Vlad Vamoș


Certificate of Completion

This final thesis has been carried out at the School of Engineering at Jönköping University within User Experience Design and IT Architecture. The authors take full responsibility for the presented opinions, conclusions, and results.

Examiner: Vladimir Tarasov
Supervisor: Bruce Ferwerda

Scope: 30 hp (second-cycle education)
Date: 2021-06-10


Attestation of Authorship

We hereby declare that this submission is our own work, based on the research that we have conducted.

To the best of our knowledge and belief, it contains no material published or written by another person – except where explicitly defined in the Acknowledgements or listed in the References and properly cited.

Nor does it contain any material of ours that, to a substantial extent, has been submitted for the award of any other degree or diploma of a university or other institution of higher learning.

Jing Zhang Vlad Vamoș


Acknowledgments

We would like to express our gratitude to our supervisor Bruce Ferwerda for his guidance and swift feedback, which allowed us to move forward with confidence.

We would also like to thank Vladimir Tarasov, Robert Ian Day and all the “opposite voices” from the “Comrades-in-arms” group for the advice and feedback they offered us throughout the making of this paper, especially during the pandemic: they provided an organized environment that kept the entire thesis process running smoothly. Moreover, we appreciate all the kind participants who helped us complete the surveys and interviews.

The author Jing Zhang thanks her family, boyfriend, and friends for their understanding and support during her two years of studying abroad.


Abstract

Even before the spread of COVID-19, video conferencing software had seen a steady rise in use. Because it lets participants see each other live while talking, it is easy to see why this kind of software has become more and more widely used over the years. During the pandemic, video conferencing software is used more than ever before, especially in learning environments. Nevertheless, studies show that engagement is rather low among university students who take part in online learning. In this paper, we investigate the reasons behind this lack of engagement and how it can be addressed from a User Experience Design standpoint. Based on findings from several previous studies and the student problems and needs identified in those papers, we created a prototype to test which features and design elements affect student engagement.

Keywords

User Experience (UX) design, User Interface, Online student engagement, User Engagement Scale, Video Conference (VC) System, Synchronous Online Learning, Asynchronous Online Learning


Table of Contents

1 Introduction
1.1 Problem Statement
1.2 Purpose and Research Questions
1.3 Scope and Delimitations
1.4 Outline
2 Theoretical Framework
2.1 Online Learning
2.2 Video Conferencing Software
2.3 Online Student Engagement
2.4 User Experience (UX) Design
2.5 User Engagement (UE)
3 Method and Implementation
3.1 Method Choice
3.1.1 Literature review
3.1.2 Survey
3.1.3 Semi-structured interview
3.2 Artifact Creation
3.2.1 Prototype
3.2.2 Competitor Benchmark
3.2.3 Scenarios
3.3 Sampling
3.4 Data Collection
3.5 Data Analysis
3.5.1 User-centered analysis
3.5.2 Thematic analysis
3.6 Ethical Considerations
4 Results and analysis
4.1 Literature Review Data
4.2 Confirmatory Survey Data
4.3 Interview Data
4.3.1 FA (Focused attention)
4.3.2 PU (Perceived usability)
4.3.3 AE (Aesthetic appeal)
4.3.4 RW (Reward factor)
4.3.5 S (Structure)
5 Discussion
5.1 Method Discussion
5.1.1 Literature Review
5.1.2 Confirmatory Survey
5.1.3 Interview
5.1.4 Prototype
5.1.5 Reliability and validity
5.2 Results Discussion
5.3 Related Findings
6 Conclusions and Further Research
6.1 Conclusions
6.2 Further Research
7 References
8 Appendices
8.1 Appendix 1: Prototype link via Figma
8.3 Appendix 3: Confirmatory survey questions
8.4 Appendix 4: Course design findings from literature review
8.5 Appendix 5: Competitor implementation
8.6 Appendix 6: Theme analysis

List of Figures

Figure 1: Learning-related features and the experiential learning modes (Correia et al., 2020)
Figure 2: Whiteboard in the Meeting room
Figure 3: Access a specific course
Figure 4: Create a group in Chat
Figure 5: Schedule a meeting in Chat
Figure 6: Change accent color of the messages
Figure 7: Design process (Pacholczyk, n.d.)
Figure 8: Problems with online learning mentioned by students
Figure 9: Positive features: Tagging “@” chat, Whiteboard & Annotations, and Breakout rooms
Figure 10: Asynchronous features: progress bars and stats area
Figure 11: Asynchronous feature: timeline
Figure 12: Unnoticed features: countdown and friendlier UX writing
Figure 13: Asynchronous features
Figure 14: User behavior: checking the details of the most recent lecture
Figure 15: User behavior: joining the first breakout room

List of Tables

Table 1: Search combinations used
Table 2: Variables of UES (Heather et al., 2018)
Table 3: Features translated from literature review


1 Introduction

With the rapid spread of the Internet around the world, the potential of video conferencing software (further referred to as VC) to reach students around the world is greater than ever. VC has become a vital part of online learning, which now offers educational resources and media in addition to ways for students to connect through a visual medium with their teachers and colleagues.

As Farrell and Brunton (2020) note, because online education provides access to education more conveniently and flexibly across more geographical regions, it has become one of the fastest-growing areas of education. Statistics presented by Chen (2021) state that by June 2020, 97% of college students in the US had switched to online education due to the COVID-19 pandemic. This put a lot of strain on existing VC, with developers needing to adapt quickly to the sudden surge in the number of users. Rop and Bett (2012) pointed out that VC has seen an increase in popularity due to cheaper internet connections and better technologies, despite the several disadvantages this technology brings. One of those disadvantages, as showcased by Rop and Bett (2012), is that university students may become lazier because they attend classes from home, and consequently they end up lacking in self-discipline. In recent years, numerous studies have investigated how online courses influence student performance, for instance through questionnaire surveys in various regions and periods, and have tried to present the differing requirements and needs of online versus traditional face-to-face college courses. One of those findings, as Jaggars and Xu (2016) pointed out, is that more interaction opportunities should be created in online courses between students and students, as well as between students and instructors. Most universities use VC that was not necessarily built to facilitate online education but simply to stream video from one point to another. This could be a potential factor causing the discontent university students feel towards their new way of being instructed. Moreover, Farrell and Brunton (2020) have recorded that online degree programmes have had lower completion rates when compared to their traditional counterparts. One of the factors that leads to these low completion rates is student engagement, and it is important that the needs of online university students are better understood for them to enjoy a better education; as Raine and Gretton (2013) stated, “learning gains are almost exclusively linked to engagement”. A large body of research has discussed student engagement in the online environment from perspectives such as those of instructors, public health, educational psychology and students. However, defining engagement in any satisfactory and comprehensive manner is controversial, just as Maroco et al. (2016) explained: “engagement is a multidimensional construct with both behavioral, emotional and psychological components.” In addition, there are also large differences in the ways of measuring engagement, such as interviews and observations.

Due to the nature of online learning, the medium through which students interact with their teachers and with their study material is completely digital. The interaction between the different software used and the students therefore has to be as good as possible to facilitate a good education. This is where UX comes in: to offer an optimized experience for students and thus help facilitate quality education in an online environment.

Since the third wave of human-computer interaction, user experience (UX) design has been defined as the process of designing the interaction between a user and a product; it addresses many current challenges in various areas in order to meet the specific needs of the user. The term “user engagement” is part of UX and, according to Sutcliffe (2016), is used when speaking of short-term user experience and is influenced by interaction design. User engagement focuses on getting users more involved with a product; this is done through user-centered design and the techniques that come with it, namely scenarios, storyboards, mockups, and prototypes used to test designs with users and explore their reactions.

1.1 Problem Statement

Since the beginning of the pandemic, VC has become the main medium through which education is delivered. Despite the convenience brought by this learning-from-home situation, many university students feel that the quality of the education has degraded compared to in-person teaching. One of the possible factors causing this feeling of degraded education could be, as pointed out by Zou et al. (2020), that current VC such as Microsoft Teams, Zoom and Cisco Webex is not designed specifically with eLearning in mind. These pieces of software are simply made to facilitate video calling and not much more than that.

Teachers have adapted as well as possible to this new and mostly unfamiliar teaching environment, but statistics presented on EducationData.org (2021) state that for 13% of the university students who have converted to online education, the quality of the instruction received is worse compared to in-person instruction.

As found by Hill and Fitzgerald (2020), moving all education completely online due to the pandemic has taken its toll on the engagement between students and lecturers. They discovered that with emotional engagement the students experience a sense of belonging, enjoyment and interest, which boosts their active participation in the lectures. However, because of the nature of online learning, students spend a lot less time with their peers, which makes it hard to achieve those feelings, especially given all the recent changes happening in their lives.


We could not find research on how the UX of VC can influence student engagement during online education. We found studies related to how to improve student engagement, but none of them looked at the UX perspective; some studies revolved around how teaching methods can be changed to maximize engagement. We believe that these studies failed to explore the role of the software, which seemed to be seen as just a means to an end and not as a relevant actor on the stage of e-learning. Because of that lack of studies, and because some research points towards that gap, we strongly believe that this paper can fill the knowledge gap.

1.2 Purpose and Research Questions

Following from the problem statement, student engagement can benefit from better UX. Consequently, the purpose of this study is: to understand how VC affects student engagement in online classes. We are looking for ways to influence online university student engagement through new possible features and user-centered design.

The first part of our research consists of identifying students’ problems and pain-points regarding online learning. Identifying these will then allow us to propose solutions to these problems and to test which solutions affect engagement. Hence, the study’s first Research Question is:

RQ 1: What are the students’ pain-points regarding online education?

It will be necessary to translate the findings from the previous research question into the prototype’s creation. Furthermore, the prototype will have to be tested with students to ascertain which findings proved to make students feel more engaged and which did not. Hence, the study’s second Research Question is:

RQ 2: Which UX features and design choices could affect the student’s engagement?

1.3 Scope and Delimitations

This thesis covers how VC affects student engagement in online classes and ways in which, from a UX standpoint, it could be improved. This means that aspects like programming, hardware, business approaches of existing VC, software awareness and the like are not the focus of this thesis. Throughout this paper, “student(s)” refers to higher education students. Data was gathered by reviewing existing literature on the subject of student engagement in online mediums in order to ascertain existing shortcomings and experiences, while also taking into account previously reached conclusions and findings. The result is a set of ways in which the engagement level is affected in online learning from a UX standpoint.


1.4 Outline

The paper proceeds as follows: Section 2 contains the theoretical framework, the review of relevant literature, and the knowledge gained from it. Section 3 describes our methods and the prototype creation process that we tested with online students. Section 4 presents the findings from the three approaches and describes the data analysis. In Section 5, we discuss the adopted methods, the main findings, and related findings. We conclude in Section 6 by summarizing our study together with some important directions for future research.


2 Theoretical Framework

This section first outlines the overall concepts of Online Learning and Video Conferencing Software. Then, it presents Online Student Engagement, User Experience Design and User Engagement in general as well as in the online education environment.

2.1 Online Learning

Today, web-based learning is used as an alternative to face-to-face education. Online learning (also referred to as e-learning) is, according to El-Seoud et al. (2014), “any learning that involves using internet or intranet.” This tool is used in most international universities around the globe. The “e” in e-learning is also understood as an abbreviation for “evolving, enhanced, everywhere, every time and every-body” (Jaggars and Xu, 2016), and there has been more effort put into advancing the technology than ever before.

Zou et al. (2020) state that there are two types of online learning: synchronous and asynchronous. Synchronous online learning refers to virtual classes which make use of VC. These classes happen in real time, are live and scheduled, and are influenced by three factors: the classroom, the media and the conference (Shahabadi and Uplane, 2015). Asynchronous online learning refers to pre-recorded lectures and study materials that do not require the simultaneous presence of a teacher and the students for education to be conducted. Shahabadi and Uplane (2015) state that while synchronous online learning requires all participants to be available at the same time for a class held through VC, asynchronous online learning facilitates the sharing of information without that disadvantage. Platforms that offer asynchronous online learning do so through pre-recorded videos of lectures, study materials, threaded discussions, instant messaging and blogs (Shahabadi and Uplane, 2015). Universities make use of similar products for their students to hand in assignments and access documents, pre-recorded lectures and more; popular such tools are Canvas and Moodle. The features offered by both synchronous and asynchronous e-learning provide the classroom experience of exchanging information not only between teachers and students but between students and their peers as well (Shahabadi and Uplane, 2015). Online learning has become the predominant way of learning amidst the COVID-19 outbreak.

When choosing from different e-learning platforms based on features and purposes, the faculty has numerous options (Zou et al., 2020). Once the online learning process has begun, maintaining online student engagement remotely becomes the next major challenge. In this thesis, we are mainly focusing on the synchronous aspects of online learning while touching on some asynchronous aspects as well.

2.2 Video Conferencing Software

Rop and Bett (2012) defined VC as a method of communication between two or more locations where video, audio and data signals are exchanged electronically to provide interactive communication to their users. Compared with traditional audio conferences, VC is a synchronous communication tool (Correia et al., 2020) that makes real-time communication between faculty and students more personalized and effective (Rop and Bett, 2012).

It is a cost-effective solution (Martin and Parker, 2014) that is becoming more and more popular. Even before the COVID-19 pandemic, activities such as meetings, data sharing and interviews had already adopted VC instead of the traditional in-person meeting. There are several widely known VCs currently available to the public: Zoom, Cisco Webex, Microsoft Teams, Skype, Google Meet, and the list goes on. All of these offer an alternative way of communication that is safe and convenient for the users. With education having moved online, VC has seen a surge in usage, and making it easy to run and easy to use has been the main focus of the companies which provide these pieces of software. Correia et al. (2020) examined VCs such as Zoom, Skype and Teams. They summarized 14 learning-related features which are directly related to e-learning activities and cross-referenced them with the experiential learning cycle elements: concrete experience, reflective observation, abstract conceptualization, and active experimentation. The details can be seen in Figure 1 below:

Figure 1: Learning-related features and the experiential learning modes (Correia et al., 2020)

2.3 Online Student Engagement

There are several pieces of literature that have attempted to understand the needs and learning styles of individual learners and instructional design from various perspectives. Student engagement is defined in different ways by different authors. Farrell and Brunton (2020) defined it as “a student’s emotional, behavioral and cognitive connection to their study” and stated that it has a direct impact on student success and achievement. Maroco et al. (2016) defined student engagement as the willingness of a student to take part in school activities, such as attending classes, submitting assignments, and following teacher instructions during class. Online student engagement follows the same definitions, with the added fact that the experiences of the students have changed drastically during the pandemic, when everything has been moved online.

Farrell and Brunton (2020) stated that there are several psychological factors that influence online student engagement. One of those factors is the peer community students find themselves in; in the case of online classes, the amount of time spent with that community is low. This influences the sense of belonging felt by students, which in turn affects their engagement level. Another factor is the teacher, who has to put in more effort than ever to keep the students engaged and interested, especially since it is much easier to get distracted during online classes. Lastly, course design is a structural factor that has a high degree of influence on engagement. As mentioned by Farrell and Brunton (2020), online students would benefit even more from deliberately orchestrated opportunities to engage with their peers.

2.4 User Experience (UX) Design

UX design describes a series of iterative decisions that achieve successful results through interaction and through an efficient and satisfying process of reaching those results. Roth (2017) mentioned that UX represents a set of concepts, guidelines and workflows, and that it builds on related frameworks such as graphic design, human-computer interaction, information visualization and usability engineering. Xu (2011) stated that UX work is committed to effectively solving overall user experience problems, anticipating development in order to influence the strategic direction of the product, and actively exploring UX to identify new areas where it can be applied.

Patil et al. (2016) summarized five UX activities that fulfill the key fundamentals of UX:

• User interviews and personas, where a persona represents a set of users that are different from each other
• User context, which helps to consider the features and limitations
• Low- and high-resolution wireframes
• Graphical user interface (GUI) design
• Usability tests via mock-ups

2.5 User Engagement (UE)

Lalmas et al. (2014) define user engagement as the attribute of UX which emphasizes the positive aspects of interacting with an application, particularly the desire felt by users to use that application repeatedly and for long periods of time. With the fact that “successful applications are not just used but are engaged with” (Lalmas et al., 2014) in mind, the two terms (UX and UE) should not be confused: just because an application has a great user experience does not automatically imply a high degree of engagement, and vice versa.

Sutcliffe (2016) has stated that UE is part of UX. Whereas UX is concerned with why people adopt and then continue to use a certain design over the span of years, UE is more concerned with why people feel a certain attraction to using interactive products. Moreover, UE aims to explain why some applications are more attractive for people to use and how good UE design can make interaction exciting and fun.

As mentioned by Sutcliffe (2016), UE has three main components: interaction, media and presence. UE’s purpose is to excite and attract the user. This is done by understanding how the perceived experience is affected by motivation and emotion resulting from design cues that promote interest, novelty, and the potential to fulfill goals of a task-oriented or experiential nature.


3 Method and Implementation

This section outlines the methods used and the implementation followed while carrying out this research. The first part describes the methods that supported the overall research process. The second part describes the artifact that has been created as a byproduct of this work. The third to fifth parts describe the research strategies and techniques used for collecting and analyzing the data resulting from testing the artifact and from the survey, together with the selection of research participants. The last part of this section discusses the ethical considerations that have been adopted in this study.

3.1 Method Choice

This part covers the concepts and implementation processes. The literature review method gains insights from existing work, insights which are then confirmed by the confirmatory survey. These two methods provide an answer to the first research question. The semi-structured interview and the User Engagement Scale (UES), which measures user engagement through a questionnaire, are the methods used to collect data while testing the prototype in order to answer the second research question.

3.1.1 Literature review

Snyder (2019) states that literature reviews are an important building block of all kinds of research. Literature can serve as a knowledge basis, provide evidence of an effect or occurrence, and even spark new ideas for further research. By studying literature, research gaps can be found and the scope of research can become more precise. A literature review is “a more or less systematic way of collecting and synthesizing previous research” (Snyder, 2019), and Snyder further mentions that a literature review can address research questions with a lot more power than a single study could. This is achieved by making use of the empirical findings and perspectives discovered throughout the process. We chose this method because a number of studies have been made on online student engagement, covering many areas and purposes. Table 1 lists the search combinations, databases and filters that were used to find student needs and pain-points in an online learning environment.


Search engine: Google Scholar
Search combinations:
“Evaluate videoconferencing system” OR “Evaluate user engagement” OR “Evaluate online student engagement”
“User experience design” AND (“eLearning” OR “Online Learning” OR “Remote study” OR “Synchronous eLearning system” OR “Online environment”)
“eLearning” AND (“Online student motivation” OR “Students’ motivation” OR “User engagement” OR “Online student engagement”)
Filters used: from 2000 to 2021

Search engine: IEEE Xplore
Search combinations:
“User experience design” AND (“eLearning” OR “Online Learning” OR “Remote study” OR “Synchronous eLearning system” OR “Online environment”)
“Videoconferencing” AND (“User experience” OR “User Engagement”)
Filters used: from 2000 to 2021

Search engine: SpringerLink
Search combinations:
“eLearning” AND (“Online student motivation” OR “User engagement” OR “Online student engagement”)
“User experience design” AND (“eLearning” OR “Online Learning” OR “Remote study” OR “Synchronous eLearning system” OR “Online environment”)
Filters used: from 2000 to 2021

Table 1: Search combinations used

3.1.2 Survey

Ponto (2015) defined survey research as “the collection of information from a sample of individuals through their responses to questions”. Saunders et al. (2016) also pointed out that surveys are easy to interpret and understand, which helps to collect quantitative data and to ensure that the sample is representative. Survey research is associated with a variety of strategies: according to the specific scope and purpose, it can use quantitative research strategies (such as a questionnaire), qualitative research strategies (such as semi-structured interviews), or mixed methods.

An online survey was distributed through a link (the survey questions can be found in Appendix 3) to students at Jönköping University to confirm or deny, as well as possible during the ongoing pandemic, the needs and pain-points identified in the literature review. The survey made use of a Likert-type scale, where the questions ask respondents for their opinion on the identified data. Questions 4 to 12 inquired about the 9 key findings from the literature review, while questions 1 to 3 inquired about the students’ age, whether they were current students or not, and their past learning experience. The last question was open-ended, giving respondents a chance to provide us with data that we might not have found through the literature review. Some problems with this method are that once a survey is out it cannot be changed, that ensuring a good sample size can be challenging, and that, because the surveys are unsupervised, their validity might be questionable.

3.1.3 Semi-structured interview

Saunders et al. (2016) explained that in semi-structured interviews “the researcher has a list of themes and possibly some key questions to be covered, although their use may vary from interview to interview”. This means that, depending on the direction the interview is going, the researchers can choose to leave out some questions from that specific interview. In our case, the semi-structured interviews proceeded by testing our prototype with students, listening to their feedback as they executed predefined scenarios, and asking them to complete the short version of the UES questionnaire, followed by a free discussion session. This helped us assess the effect of the design choices and features we had implemented from the literature review and conclude which ones made students feel more engaged.

To measure engagement, Heather et al. (2018) elaborated two questionnaires in their paper, which they showed to be useful when measuring user engagement on the e-shopping platforms they tested. According to the authors, the questions have been designed to provide a score for the engagement felt by users of any kind of software, not just e-shopping websites. The UES measures 4 aspects of user engagement; their abbreviations, names and explanations can be found in Table 2 below.

FA (Focused Attention): refers to the user feeling absorbed in the interaction with the software.

PU (Perceived Usability): refers to the aspects experienced by the user as the result of interacting with the software and the amount of effort expended to use the software.

AE (Aesthetic Appeal): refers to the perceived attractiveness and visual appeal of the interface.

RW (Reward Factor): refers to the feeling of being rewarded for their actions that users might experience while using the software.

Table 2: Variables of UES (Heather et al., 2018)

Measuring the user engagement of the interviewed users was done by making use of the short form made by Heather et al. (2018). Despite there being another, longer form available in their paper, Heather et al. (2018) argue that the form present in Appendix 2 is “sufficiently short to be useful to other researchers without being open to the problems of single-item scales”. Furthermore, their paper shows that the short form provides good internal reliability.

The form has a total of 12 questions with answers ranging from “Strongly Disagree” to “Strongly Agree”, encoded with scores from 1 to 5; each consecutive set of 3 questions measures one of the 4 aspects of user engagement: FA, PU, AE and RW. Heather et al. (2018) specify that for questions 4 to 6, which regard PU, the scoring should be reversed: if a student marks “Strongly Disagree”, it translates to a score of 5 instead of 1. This is because those questions are formulated in a rather negative tone.
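As a small illustration, the sketch below encodes a single answer under the rules described above. The answer labels in the dictionary (in particular the middle one) and the helper name encode are our own illustrative choices, not part of the instrument.

# Sketch of encoding one UES short-form answer, with questions 4-6 (the PU
# items) reverse-coded as described above; answer labels are assumed.
LIKERT = {"Strongly Disagree": 1, "Disagree": 2, "Neither Agree nor Disagree": 3,
          "Agree": 4, "Strongly Agree": 5}

def encode(question_number, answer):
    """Encode one answer as 1-5, reversing the negatively worded PU items."""
    raw = LIKERT[answer]
    # For questions 4-6, "Strongly Disagree" should count as 5, not 1.
    return 6 - raw if 4 <= question_number <= 6 else raw

print(encode(4, "Strongly Disagree"))  # 5 (reversed PU item)
print(encode(1, "Strongly Disagree"))  # 1 (regular item)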

When administering this form, the authors state that the results will be influenced by when participants are asked about their experiences, giving the examples of “immediately following an interaction with a system” versus “completing a knowledge retention test”. In our case, the UES form was administered right after the interviewees completed the usability test, with the addition that after completing the form the participants were asked to motivate their answers, providing us with a deeper understanding of why we obtained the scores we did.

3.2 Artifact Creation

Based on the student needs, pain-points, proposed features and suggestions gathered from our literature review and confirmed by the survey, we followed the UX design process taught to us during our studies and created a prototype to be tested during the interviews, helping us conclude which ideas would benefit student engagement.

3.2.1 Prototype

A prototype was made in Figma, where we put into practice our new UX ideas for the interface of VCs. The goal of this prototype is to take the possible solutions to the problems students face from concept to practice, so that users can get a more well-defined idea of how those features and solutions could help improve their experience.

Figma is a web-based prototyping tool that we used to work together on building our mock-ups. We have chosen this software due to its very good performance in team projects and due to both of us having a fair degree of experience with it. Another reason for our choice is the fact that Figma allows the sharing of the prototype online through a link that allows the users to experience the design first-hand instead of being limited to just screenshots.


3.2.2 Competitor Benchmark

During this process we studied the way other software implements and makes use of features that affect student engagement during online lectures. The features studied are presented in chapter 4.1; several of them are present in different software, but no single product makes use of the entire set.

The ways in which existing platforms such as Zoom, Skype and Teams solve the pain-points identified in the literature review are presented in the table in Appendix 5.

3.2.3 Scenarios

To test the usability of our prototype we devised four test scenarios that users went through. Throughout these scenarios we asked them to think out loud so that we could understand their process of exploring the interface and understanding the application. This is a relevant process since usability plays a role in user engagement, and easier-to-use programs have a higher chance of becoming engaging.

Thus, the 4 scenarios are as follows:

• Scenario 1: Join a lecture.

In this scenario the user starts from the home page and is supposed to join a lecture from a specific course. From there the users had several tasks: to access the whiteboard (as can be seen in Figure 2), to open the chat section and send a message in which the student tags the teacher, and lastly, to choose a breakout room to join.

Figure 2: Whiteboard in the Meeting room

• Scenario 2: Check the latest lecture that you had.

This scenario involves the user accessing a specific course (Figure 3), checking the chat and participants areas as well as an attached video with chapters, and expressing their opinions and feelings about it. A sub-scenario is accessing an upcoming lecture from the upcoming section without us, the testers, specifying how to do it, in order to assess the usability of our navigation solution.

Figure 3: Access a specific course

• Scenario 3: Schedule a meeting with Jody Gun.

During this scenario users are supposed to navigate to the chat section of the program and access a chat with a specific user, where they are supposed to schedule a meeting with the user Jody Gun (as illustrated in Figures 4 and 5). As in the other scenarios, it was crucial to remind users to think out loud during every step of the way.

Figure 4: Create a group in Chat

Figure 5: Schedule a meeting in Chat

• Scenario 4: Check the number of lectures you have participated in this semester.

Lastly, this scenario involves the users navigating to their profile page, checking different stats such as time spent in lectures or messages sent in lecture chats, and changing an appearance setting such as the theme (light/dark) or the accent color of the messages (Figure 6).

Figure 6: Change accent color of the messages

3.3 Sampling

Determining the sample is a significant part of research. Since we used an interview and a questionnaire to collect data from a smaller number of cases, which also means that the information is more detailed (Saunders et al., 2016), probability samples are the ones most associated with our research strategies. Following the process of probability sampling (Saunders et al., 2016), we sorted out our semi-structured interview sampling information as follows:

• Identify the sampling frame: The target population in our case is higher education students who are involved in online courses, which means they should already have experience in using VC daily.

• Sample size: 10

• Select sampling technique and the sample: The sampling was done by asking students from different schools, at random, whether they would like to participate in an interview while they were on campus. All the students were studying at Jönköping University, since this is the university we are studying at and it was the most likely place for us to find students to interview and test our prototype with.

• Check the sample is representative of the target population: One advantage of Jönköping University is that, according to the official website (2021), there were 796 international students from close to 100 countries last year. The participating students have different nationalities and study different subjects. This provides us with a representative and varied sample, made up only of students, which helps ensure validity despite the small sample size.

The survey followed the process above, with the following differences:

• The survey was distributed through social media communities of Jönköping University students, and whoever wanted to answer it could do so anonymously.
• Sample size: 23

3.4 Data Collection

Throughout this thesis, data was collected through several methods. Firstly, data about student pain-points and needs was gathered through the literature review. Next, the data collected from the literature review had to be confirmed, so a survey was made via Google Forms and shared among students, with the responses collected on Google Drive. Following these processes, the interview session data was collected by recording the interviews, transcribing them, taking notes during the user testing part of the interview, and asking the participants to complete the UES questionnaire after the user test. The raw data for each user, meaning interview transcripts and notes, was stored in a document together with the UES scores resulting from the questionnaire.

3.5 Data Analysis

There are three sets of data to be analyzed. We chose the user-centered analysis method to translate the literature review insights, such as the previous research samples and learning-related features, into a final prototype. For the data resulting from the interviews and user testing we used the thematic analysis method, while for the data resulting from the UES questionnaire and survey we used the Likert scale scoring method to obtain scores ranging from 1 to 5.

3.5.1 User-centered analysis

Pacholczyk (n.d.) states that since user analysis allows users to influence the product, the insights gained can help avoid time-consuming decision-making processes. User analysis provides indubitable facts that cannot be argued with; after careful analysis they can bring several advantages to the table, such as superior products and greater ease of use, sometimes with a very gentle learning curve for the users. In Figure 7, Pacholczyk (n.d.) illustrated the design process of which user-centered analysis is part:

Figure 7: Design process (Pacholczyk, n.d.)

In our case, we followed the steps below:

• Creating personas to understand the mindsets of potential users.

• Creating a Moodboard for better understanding the mood the prototype should transmit to the users.

• Drawing user stories to map out how they might use VC.

• User tests for the prototype via completing the specific scenarios.

3.5.2 Thematic analysis

Deductive and inductive analysis are two basic approaches to analyzing qualitative data. Researchers can use the deductive method to look at “what all respondents answered to the same question”, or “to confirm or refute research hypotheses, or interactions within the data that the researcher presumed” (Harrell and Bradley, 2009). In general, the text analysis process includes:

• Selecting the text to be examined and identifying the themes
• Matching the text with a specific basic theme and subthemes

• Identifying patterns among the themes to support and understand relationships between themes and what led to a specific theme.

In our case, we selected the text from the interview and deductively identified the themes which are associated with each question.

3.6 Ethical Considerations

To ensure an ethical way of working during this study, the only personal data asked of the confirmatory survey participants was their age. As for the interviews, participants were assured that their names would not be used, and each interviewee was asked for consent prior to starting the recordings. Consent was also asked regarding the usage of their nationality and the location of their previous places of study, if they had any.

For the sake of academic writing, profanities and strongly-worded sentences resulting from the interviews were either not used or rephrased in such a manner that the meaning was not altered or lost.


4 Results and analysis

This section presents results from the literature review, survey and interview. The literature review results are presented in the form of a table that showcases the identified needs and pain-points, the features which would solve them, and the implementation of those features. The survey presents confirmatory data for the needs and pain-points identified in the literature review, together with other pain-points specified by the responding students. Lastly, the interview data section presents the themes and subthemes that resulted from performing thematic analysis on the interviews.

4.1 Literature review data

As mentioned in the introduction section, there is no single correct way of defining engagement due to it being “a multidimensional construct with both behavioral, emotional and psychological components” (Maroco et al., 2016). The literature data gathered in this chapter aims to summarize and make use of previous studies carried out for the sake of providing students with more engaging and fulfilling e-learning experiences. Thus, we highlighted 9 key findings from the papers regarding students’ experiences with online learning and transformed them into 12 features and design ideas (Table 3) that guided our design work. This data has been collected from several studies regarding student engagement, synchronous and asynchronous e-learning, online learning platforms, and ways of gamifying learning, in total covering over 2000 students at over 7 universities in over 7 countries, with studies and papers ranging from 2011 to 2020.


1. Need/pain-point: One main requirement stated by students is being able to view the lecture chat after the lecture is over (Laskaris, 2016).
   Feature: Save chat
   Implementation: The lecture chat would be saved and made accessible to students automatically.

2. Need/pain-point: One drawback of the existing real-time chat is that it cannot manage “the effective retrieval of information from the participant after the accession to the historical question and answer” (Luo et al., 2016; Martin and Parker, 2014).
   Feature: Tagging “@” chat
   Implementation: Tagging users and/or instructor(s) with an “@” sign when discussing things in the chat could be a useful feature to draw their attention and to give a more person-centered feeling to the conversation.

3. Need/pain-point: To simulate a learning environment more similar to traditional learning (Correia et al., 2020).
   Feature: Whiteboard
   Implementation: Teachers could make use of a whiteboard area where they would be able to sketch whatever they need for the lecture.
   Feature: Annotations
   Implementation: Teachers could use this feature when presenting information-rich slides by highlighting certain places on the presentation or the screenshare.

4. Need/pain-point: Teachers should encourage students to interact with each other through rewarding systems such as bonus points, which could have a positive effect on the final grade of a specific course (Najmul Islam, 2011). Rewards for proper behavior and social engagement are mentioned as a good motivator (Muntean, 2011).
   Feature: Stats area
   Implementation: A way of keeping track of engagement stats such as time spent contributing to an online lecture (time spent speaking, for example), contributions in the online chat, and messages approved by the teacher. Rewards can come in the shape of virtual rewards (such as badges) or real rewards (such as extra points).

5. Need/pain-point: Making use of shortcuts and search features would help minimize the amount of time spent finding the specific bit of information students require. Moreover, helping students spend less time finding relevant information aids their engagement (De Lera et al., 2013).
   Feature: Searching
   Implementation: Search fields used throughout the prototype to find messages in the chat and to search for general items across a large scope.
   Feature: Recording timestamps
   Implementation: The option for the person recording to add timestamps to recorded lectures, visible inside the program, so students don’t have to go through an entire lecture when they need to re-watch a certain portion of it.

6. Need/pain-point: Students need constant and prompt feedback to keep them informed of their course progression (Muntean, 2011). Gonguet et al. (2013) stated that the addition of a countdown before the start of a lecture would benefit student engagement.
   Feature: Progress bars
   Implementation: Shows how many online lectures will be conducted during a course, how many have already taken place, and how many the student took part in.
   Feature: Countdown
   Implementation: A countdown could be shown in the waiting room to the students present before the beginning of the lecture.

7. Need/pain-point: Hill and Fitzgerald (2020) suggested a solution to meet students’ requirements, such as needing a small group space to study and needing to achieve a sense of belonging, which is harder to achieve with virtual social interaction.
   Feature: Breakout rooms
   Implementation: Instead of being randomly assigned to a group, students could benefit from being able to choose who to go into a breakout room with.

8. Need/pain-point: If students feel that they have participated in the look, feel and design of their learning environment, they will feel more in control of it. The authors’ point is that it would be beneficial for student engagement if each student had their own personalized design (De Lera et al., 2013).
   Feature: Customizability
   Implementation: Customizable features such as changing the accent colors and the dark/light theme.

9. Need/pain-point: Make use of more community-oriented language to make the experience feel more involving for students (De Lera et al., 2013).
   Feature: More friendly UX writing in the waiting room
   Implementation: Instead of using an impartial machine tone like “joining meeting”, we adopt a friendlier tone like “[the name of the teacher], [one or two students] and [the number of students currently present in the meeting] are waiting for you.”

Table 3: Features translated from literature review

4.2 Confirmatory Survey Data

The following are the results gathered from the survey used to confirm the student needs and pain-points identified in the literature review. This confirmatory survey was completed by 23 students from Jönköping University. Questions 1 to 3 revealed that 100% of the respondents were students with ages ranging from 19 to 34 and that 91.7% of them had had their learning experience mostly in an online environment during the past 2 years. Thus, the respondents are representative of the target population, as both consist of online students.


Table 4 shows the scores calculated using the Likert scale. Since the questions ask about the respondents’ level of agreement or disagreement with the statements, we took 3 (± 0.5) as the neutral value when calculating the average scores. This means that if an average score lies between 2.5 and 3.5, the average student response is indifferent. A score above 3.5 indicates an average opinion that agrees with the statement, while a score below 2.5 indicates one that disagrees with it. If the average opinion of the respondents agrees with a statement from the survey, the pain-point or need identified in the literature review is indeed real for the majority of the respondents.

Each score was calculated by assigning values from 1 to 5 to the answers, which ranged from “strongly disagree” to “strongly agree”. For each question, the scores of all responses were summed and then divided by the number of respondents, i.e., the arithmetic mean. For example, for 23 respondents we have 23 scores for question 4, each ranging from 1 to 5; all the scores are added and the sum is divided by 23, resulting in an average score of 4.04. Since 4.04 is above 3.5, students on average agree with Q4 (wanting to read the chat of an online lecture even after it is over), so the feature “save chat” from the literature review is confirmed.

Q4 Q5 Q6 Q7 Q8 Q9 Q10 Q11 Q12

4.04 3.56 4 4.22 4.74 4.09 3.91 4 3.57

Table 4: Survey Likert scale score
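To make the averaging above concrete, the short sketch below reproduces the calculation in Python. The list of raw answers is made up for illustration (only its mean matches the reported Q4 average of 4.04), and the function names are our own.

# Minimal sketch of the Likert averaging and neutral-band interpretation
# described above; the raw Q4 answers are illustrative, not the real data.
def likert_average(scores):
    """Arithmetic mean of the 1-5 Likert scores for one question."""
    return sum(scores) / len(scores)

def interpret(avg, neutral=3.0, band=0.5):
    """Classify an average score against the 3 (+/- 0.5) neutral band."""
    if avg > neutral + band:
        return "agree (need/pain-point confirmed)"
    if avg < neutral - band:
        return "disagree"
    return "indifferent"

# Made-up answers from 23 respondents whose mean equals the reported 4.04.
q4_scores = [4, 5, 3, 4, 5, 4, 3, 5, 4, 4, 3, 5, 4, 5, 3, 4, 4, 5, 3, 4, 5, 4, 3]
avg = likert_average(q4_scores)
print(f"Q4: {avg:.2f} -> {interpret(avg)}")  # Q4: 4.04 -> agree (...)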

Out of all the participants only 5 provided an answer to the open-ended question inquiring about other problems encountered during their online learning experience. The answers can be seen in Figure 8 below:

Figure 8: Problems with online learning mentioned by students

4.3 Interview Data

The interview data was gathered from 10 university students from 7 countries. The calculated questionnaire data is shown in Table 5.


As for the UES questionnaire, the score for each of the four subscales can be calculated by adding the values of the responses to the three items contained in that subscale and dividing the sum by three. For example, “Focused Attention” is calculated by adding the responses to the FA-related questions and then dividing by three.

The Focused Attention score for U1 is calculated like this:

FA(U1) = (Q1 + Q2 + Q3) / 3 = (3 + 4 + 3) / 3 = 3.33

An average score for each subscale can be calculated by adding the scores of all 10 users for that subscale and then dividing by 10.

The average Focused Attention score is calculated like so:

FA(Average) = (U1 + U2 + ... + U10) / 10 = (3.33 + 2.67 + 3.33 + 4.33 + 4.67 + 4.33 + 3 + 3.67 + 4.67 + 4.33) / 10 = 3.83
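The sketch below expresses the same subscale arithmetic in Python. It assumes the 12 items come in consecutive triples (FA, PU, AE, RW) and that the PU items have already been reverse-coded as described in Section 3.1.3; U1’s FA items (3, 4, 3) come from the worked example above, while the remaining item values are invented so that they reproduce U1’s reported subscale scores.

# Sketch of the UES subscale scoring described above; item order and the
# non-FA item values for U1 are assumptions, not the real interview data.
SUBSCALES = ["FA", "PU", "AE", "RW"]

def subscale_scores(items):
    """items: the 12 item scores (1-5) of one participant, in FA, PU, AE, RW order."""
    assert len(items) == 12
    return {name: sum(items[i * 3:(i + 1) * 3]) / 3
            for i, name in enumerate(SUBSCALES)}

def subscale_averages(participants):
    """Average each subscale over a list of per-participant score dictionaries."""
    return {name: sum(p[name] for p in participants) / len(participants)
            for name in SUBSCALES}

u1 = subscale_scores([3, 4, 3, 5, 4, 4, 3, 4, 3, 4, 3, 4])
print(f"FA(U1) = {u1['FA']:.2f}")  # FA(U1) = 3.33
# subscale_averages([u1, u2, ..., u10]) would give the per-subscale means of Table 5.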

Heather et al. (2018) advise using only the subscales of the UES when engagement as a holistic construct is not being measured. Each score represents the performance of the prototype in affecting a type of engagement for the interviewees. The interpretation process is the same as for the previous survey: a higher score (above 3.5) means higher engagement felt by the users, while scores between 2.5 and 3.5 indicate no influence and scores below 2.5 a negative influence on engagement. For instance, the students felt more engagement in terms of AE (4.23) than FA (3.83). Each user’s own scores are shown in order to showcase that not all students are the same and that they do not perceive the prototype identically. Finally, the average scores represent the overall effect of the subscales across all 10 participants.


Participants  FA    PU    AE    RW
U1            3.33  4.33  3.33  3.67
U2            2.67  3.67  3.67  3.67
U3            3.33  2.33  3.67  4
U4            4.33  4     3.67  4.33
U5            4.67  4.33  4     3
U6            4.33  5     5     4.33
U7            3     4     4.33  5
U8            3.67  3.67  5     3.67
U9            4.67  4.33  5     5
U10           4.33  4.33  4.67  4
Average       3.83  4     4.23  4.07

Table 5: UES Scores

The text data was processed with the thematic analysis method into 5 main themes: FA, PU, AE, RW and S (structure), together with a series of subthemes (Appendix 6).

4.3.1 FA (Focused attention)

This theme covers what, according to the interviewees, affects their feeling of being absorbed in the interaction with the software. It comes in the form of features that they stated affect them in either a positive or a negative manner, and it has 3 subthemes that refer to students’ opinions and feelings about the features which, according to them, would make them feel absorbed in their software interaction.

• R@: with the associated feature of tagging a person inside the chat section of the lecture. Out of 5 users who expressed their opinion on this feature, U4, U7 and U9 stated that “it would be easier to attract the teacher’s attention” by tagging them in the chat. U10 explained that they would not make use of this feature to talk to a teacher but rather to a fellow student, while U8 commented on a problem encountered previously with his teachers by saying: “it’s the best to see who you send a message to and then they see a notification, teachers don’t see the messages sent by students early enough.”

• FA & FW: associated with the annotations and whiteboard features, which would enable the teacher to highlight information on the shared presentation live during the lecture. 6 students expressed their opinion on this feature, with 4 out of 6 stating that it would help them have their attention directed by the teacher, comparing it to how a teacher would use a laser pointer during traditional lectures. U7 and U10 also stated that this would aid their engagement, U7 mentioning “I am a visual learner, and it would be easier for me to learn by seeing the teacher draw on the slide”. U10 stated that “it would be useful to highlight details because when the teacher reads the PowerPoint you can get lost and this could highlight details, especially when there is a big slide presented”.

• FC: referring to the students’ perception of the countdown and the list of participants shown before entering the meeting room in the prototype. Out of 4 interviewees who expressed themselves on this matter, U7 and U10 consider this feature unnecessary because they always join lectures on time. U8 and U9 found it useful for two different reasons: U9 explained that the countdown would help them get mentally ready, especially if there were a sound effect after each passing second during the last 10 to 15 seconds, while U8 stated that they could feel more prepared by being able to see who else would be present in the lecture before entering it.

4.3.2 PU (Perceived usability)

This theme refers to the aspects experienced by users as the result of interacting with the software and the amount of effort expended to use it. It is composed of subthemes that all refer to the users’ opinions and feelings related to several features encountered throughout the prototype, as well as to behaviors observed by the interviewers.

• RGB: refers to the behavior observed when users were asked to join the first available lecture. All users used the green button present near the available lecture as a clue that it was the available one, since all the other buttons labeled “Join” were gray.

• FT: with the associated timestamp feature, which would enable teachers to edit recorded lectures so that different chapters are easily accessible on the scrolling timeline. All 10 of the interviewed users expressed positive opinions on this feature, the general opinion being that it would help them save great amounts of time when searching for only a specific part of a lecture. Furthermore, there were 3 extra mentions: the suggestion of being able to control the playback speed so that students could save even more time (U6), and appreciation for having only the shared-screen part of the lecture recorded rather than the teacher’s and the other students’ cameras, mentioning that they have no need to see anything but the presentation slides during a recorded lecture (U7, U10). Despite this positive feedback, 3 interviewees expressed doubts about teachers being willing to put in the time to divide a recording into chapters, with one of them stating that “no one ever put in the effort to do that” (U3).

• FCh: refers to the students’ expressed opinion about the retention of chat sessions from online lectures. Out of 4 students who expressed themselves, all saw this as a useful feature with one of them (U5) mentioning that they would also like to be able to see the chat section while watching the recorded lecture as well.

• RMT-UI & RMT-P: two related themes which refer to 5 users’ comments on how they identified the teacher inside the meeting room. RMT-UI refers to students recognizing the teacher through the layout of the camera viewfinder; this process was identified in 3 students. The other 2 identified the teacher through the picture used inside the viewfinder.

• UBW: refers to the similar behavior adopted by users when asked to open the whiteboard: all the users looked through the icons present on the bottom row and, after not finding the desired icon, directed their attention to the “+” button on the right of the interface. There was one outlier (U3) who tried to right-click on the share screen icon first before observing the plus button and thinking that it could be there. Two other users (U7, U5) mentioned that they couldn’t see the “+” from the start since it was far out of their peripheral vision. U2 clicked on the “+” button almost instantly after being asked to open the whiteboard; when asked about this behavior, the user stated that “I have used Teams before, and I am used to software hiding buttons in places”.

• RBR: refers to the users’ behavior when asked to join the first available breakout room. During this task we noticed that all participants hovered the cursor over the first breakout room and then noticed the lock icon, thus realizing that the room was locked. Out of the 10 users, 2 clicked on it anyway: one (U1) did not realize that the room was locked and the other (U3) thought that they were already assigned to that room. One user stated a preference for a written indicator such as “Locked” instead of a lock icon.

• Predictability: a subtheme that all users mentioned and that we as interviewers also noticed. It refers to the quality of the interface that makes it easy to predict where certain elements can be found. A good example is that when users were asked to open a recording of a past lecture, they all “knew” that the recording would be in the attachments section.

• ETU: identified among all 10 participants, relates to the “Ease of Use” of the prototype. The comments were mainly positive, such as: “I could find everything easily” (U6), “I could deduce where things are” (U4), “Stuff was easily accessible” (U10), and “I experienced no distractions from the prototype, I saw exactly what I needed” (U10). There were of course some negative opinions as well: “It wasn’t so easy to use when I had to find the stats page” (U3), “The plus button was in my peripheral view and I didn’t notice it” (U7), and “It was not obvious that I can click on profile” (U8).

The last subtheme, UB (user behavior), was examined across different tasks.

• UBC: when asked to find a person called Jody, 6 users followed an identical path as their first option, with similar thinking processes such as: “Should probably go into the chat section” (U1) and “I should go into my chats because I’m chatting with a private person” (U2). Of the 4 remaining users, 3 first tried looking for Jody using the search bar, followed by looking into the participants section of a previous lecture. The last participant followed the same two steps in reverse order.

• UBS: when asked to find stats such as lectures attended during scenario 4, 4 students thought almost immediately to check the profile page, 3 of them stating that they expected to find such information there due to their previous experience with Duolingo and video games, while the 4th stated that since they could not see it on the homepage it must be in the profile page. 3 students wanted to count the lectures in the courses section by adding up the numbers in the progress bars, and 3 students wanted to count the lectures in the upcoming section.

• UB-ROPU & UB-ROP: two subthemes concerning the ordering of the upcoming and previous lecture sections. Users expected the order of the two sections to be reversed, with the previous lectures section in the lower part of the page and the upcoming section in the upper part. 5 participants expected a descending order of the previous lectures, while the other 5 checked the dates of the lectures and deduced the ascending order. During the user testing, 7 users hovered over the first course in the previous section despite only 5 mentioning that they expected it to be there.
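As mentioned under the FT subtheme, the chapter/timestamp idea can be made more concrete with a small illustration. The following is a minimal sketch, assuming a hypothetical data model in which a recorded lecture stores a list of teacher-defined chapter markers; the names (Chapter, chapterAt) are illustrative only and do not describe the actual prototype implementation.

// Hypothetical chapter marker: a title and the playback time (in seconds)
// at which the chapter starts on the recording's scrolling timeline.
interface Chapter {
  title: string;
  startSeconds: number;
}

// Given the chapters of a recording and the current playback position,
// return the chapter the student is currently in (or null before the first one).
function chapterAt(chapters: Chapter[], positionSeconds: number): Chapter | null {
  const sorted = [...chapters].sort((a, b) => a.startSeconds - b.startSeconds);
  let current: Chapter | null = null;
  for (const chapter of sorted) {
    if (chapter.startSeconds <= positionSeconds) {
      current = chapter;
    } else {
      break;
    }
  }
  return current;
}

In such a sketch, clicking a chapter on the timeline would simply seek the video player to the chapter’s startSeconds, which is what allows students to jump to a specific part of a lecture without scrubbing through the whole recording.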

4.3.3 AE (Aesthetic appeal)

This theme refers to the perceived attractiveness and visual appeal of the interface and is built on the comments and opinions identified during the interviews.

Out of 10 users, 6 expressed themselves on this theme. The comments were mostly positive, with 2 users mentioning that the visual look resembled a game, although their opinions were opposite: “Looks like a game so I like it more” (U9); “It should be less colorful, it looks like a game for kids” (U4). U1 stated that the use of rounded edges makes the interface more pleasing to look at. U6, U7 and U8 mentioned that the colors, the minimal design, and the ability to change chat colors make the software feel more friendly and personal while also looking clean and modern. It is also worth noting that they stated the personal feeling was also influenced by the course progress bars, as they realized that other students’ bars might be more or less full than their own.

4.3.4 RW (Reward factor)

This theme refers to the observed trends where students identified a feeling of being rewarded for their actions while using the software. During the interviews, users mentioned that they would be motivated by progress bars (MB-P), stats (MB-S), unity (MB-U) and the feature timeline (MB-FT).

• MB-P: the progress bars seemed to be the main motivating factor during the interviews, with 7 out of 10 interviewees expressing their opinion on them. The main trend was that students would feel more motivated to complete a course and attend more lectures as they noticed the bar filling up. The constant feedback provided by the progress bars also seemed to provide a feeling of satisfaction, as shown by the following quotes: “when you see there is not much left of the course it is more appealing to finish it” (U2); “It is very satisfying to see that you’re getting towards the end, you’re seeing where you are” (U3); “I would feel super motivated by seeing my progress in all courses in one place” (U6). However, there were negative voices as well: U8 said that if the bars “are empty but should be fuller” they would not want this feature, despite recognizing its usefulness, while U10 said the bars “would motivate me to join more lectures if they do not count the final grade. Otherwise, I would be stressed by them”. A minimal sketch of how such a progress value could be derived is shown at the end of this subsection.


• MB-S: U1 thought the stats feature could make them more engaged with their studies. Meanwhile, U4 thought it was weird to see how many minutes they had spoken during lectures and asked about privacy issues; U4 also mentioned that they could not see the point in having such a section unless those numbers counted towards the final grade. U6 stated that it would be more useful to compare the stats with data from previous semesters, to help them realize whether they are doing better or worse than before. U7 stated that such a feature would be useless for them, but that they could see the point if they were a competitive student, going on to say that “it would be a good way to motivate yourself.”
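As referenced under MB-P, the following is a minimal sketch of how the per-course progress value that participants responded to could be derived. The Lecture and CourseProgress types and the attendance-based counting are assumptions made purely for illustration and are not taken from the tested prototype.

// Hypothetical data model: a course is a list of lectures,
// each flagged as attended or not by the student.
interface Lecture {
  id: string;
  attended: boolean;
}

interface CourseProgress {
  attended: number;
  total: number;
  percent: number; // 0–100, the value a progress bar would display
}

// Derive the progress-bar value from the lectures of one course.
function courseProgress(lectures: Lecture[]): CourseProgress {
  const total = lectures.length;
  const attended = lectures.filter((l) => l.attended).length;
  const percent = total === 0 ? 0 : Math.round((attended / total) * 100);
  return { attended, total, percent };
}

Whether the value counts attended lectures only, or also assignments and other course activities, is a design decision that directly touches U10’s concern about stress: a purely informational bar and a grade-relevant bar would likely be experienced very differently.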

4.3.5 S (Structure)

The last identified theme is structure. It relates to the trends observed in the interviews where users commented on structural design choices throughout the interface. As part of this theme we identified 2 subthemes:

• IA: Information Architecture refers to the way we arranged the information inside the prototype to facilitate a good and intuitive user experience. 6 students expressed their opinion on several IA-related items, stating that the prototype “looks much more organized than the competition” (U1), “eliminated the need to constantly look for a lecture link” (U3), “provided short paths for users to do what they wanted” (U5) and “presented just the right amount of information” (U8).

• Un: unity relates to 5 users’ comments on how much they appreciated having lectures, courses, workshops, lecture materials, lecture descriptions, lecture dates and lecture durations all in one place. These 5 users stated that they loved seeing the lecture duration, date, and description before joining, all within the same software used to participate in the lecture, with U10 stating that seeing the description would help them decide whether to join a lecture or not. U5 also stated that they often chose not to join a lecture because they needed to open too many platforms (timetable app, Canvas/Moodle, Zoom/Teams/Skype), further stating that “During the course you need to open too many platforms, everything should be combined”.


5 Discussion

This section presents the discussion of the methods used and how they affected the reliability and validity of the study. Following the method discussion comes the discussion of the results and how they provide an answer to the second research question together with possible implications. Lastly, the related findings section discusses findings that did not provide an answer to any research question but are worth mentioning nonetheless.

5.1 Method Discussion

This part discusses the methods used throughout the study and their implications for reliability and validity. Several limitations and unforeseen events were encountered, and their implications are discussed.

5.1.1 Literature Review

Some of the literature review sources date back to 2011 and could have provided outdated findings, as the pain-points of previous student generations might not match those of the current generation. However, after conducting the interviews and analyzing the data, the opposite proved true: the progress bar feature, inspired by Munteanu’s findings from 2011, was appreciated and desired by all participants, showing that these student needs still seem to be unaddressed by current platforms.

5.1.2 Confirmatory Survey

The survey comprises 13 questions, available through the link in Appendix 4. It was shared within the Jönköping University community over a span of 4 days in order to confirm or refute the findings from the literature review. With only 23 respondents, the sample size might not be representative enough. Regardless, the results further indicate that pain-points and needs experienced by online students years ago are still real and unsolved.
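As a rough illustration of why 23 respondents limits generalizability (assuming simple random sampling and the worst case p = 0.5, neither of which is guaranteed here), the margin of error for a single survey proportion at a 95% confidence level would be approximately:

MoE = z * sqrt(p(1 - p) / n) = 1.96 * sqrt(0.5 * 0.5 / 23) ≈ 0.20

that is, roughly ±20 percentage points around any reported proportion, which is why the survey results are treated as confirmatory rather than conclusive.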

5.1.3 Interview

Before each interview, the users were briefed about the structure of the session: performing the tasks, completing the UES questionnaire, and having a free discussion. They were made aware that they would be required to think out loud throughout the whole task-solving process and that they would be reminded to do so if needed. They had access to the interviewer’s laptop, on which the Figma prototype was the only thing available on the screen. The laptop was connected to an external display that mirrored the laptop’s screen and was observed by the interviewers at the opposite end of the table. To preserve the collected data as-is, the interviews were recorded, and we as interviewers also took notes on the behaviors we noticed.

References
