
Designing a Verification Tool for Easier Quality Assurance of

Interoperable Master Format Packages

Martin Sjölund

VT20

Master Thesis in Interaction Technology and Design
Supervisor: Ulrik Söderström

External Supervisor: Emil Edskär
Examiner: Thomas Mejtoft

Department of Applied Physics and Electronics


Abstract

With today’s global distribution of movies, series, documentaries, and more, the need for a standardised system for storing content has emerged. Over-the-top media services such as Netflix, HBO, and Amazon Prime store large amounts of content, and by providing it internationally, the content multiplies when it has to conform to regional standards and regulations. In light of this, the Society of Motion Picture and Television Engineers (SMPTE) has created a standard called the Interoperable Master Format (IMF). This component-based format lowers storage costs drastically by only storing and managing the media elements that are unique between versions.

In the management of media content, one of the tasks is verification, a process where the content is checked for errors. By incorporating this process into an IMF workflow, efficiency could be considerably improved. The objective of this thesis is to explore the use of IMF today and design a tool for the verification of IMF package data, solving present problems in the verification workflow.

By looking more deeply into the IMF standard and the needs of people working with verification, a prototype could be created that attends to the needs of the user while simultaneously conforming to the IMF workflow. The prototype was received well by design experts, and there is potential for its further development.

Svensk Sammanfattning

Med dagens globala distribution av filmer, serier, dokumentärer, m.m., så har behovet av ett standardiserat system för att lagra innehåll uppstått. Over-the-top-mediatjänster som Netflix, HBO och Amazon Prime lagrar stora mängder innehåll, och eftersom de är internationellt verksamma så blir den mängden innehåll mycket större när det måste rätta sig efter de olika regionala standarder och regler som finns. Organisationen Society of Motion Picture and Television Engineers (SMPTE) har på grund av det skapat standarden Interoperable Master Format (IMF). Den standarden gör innehållet komponentbaserat, vilket sänker lagringskostnader drastiskt genom att bara lagra och hantera media-element som är unika mellan versioner.

Genom att titta närmare på IMF-standarden och behoven hos personer som arbetar med verifiering har en prototyp skapats som möter användarens behov samtidigt som den är anpassad för ett IMF-arbetssätt. Prototypen mottogs väl av designexperter och det finns potential för vidare utveckling av den.


List of Abbreviations

CPL Composition Playlist.

IMF Interoperable Master Format.

IMP Interoperable Master Package.

OPL Output Profile List.

PKL Packing List.

QA Quality Assurance.

QC Quality Control.

SMPTE The Society of Motion Picture and Television Engineers.

UCD User Centered Design.

UI User Interface.

UX User Experience.


Contents

1 Introduction 1

1.1 Objective 2

1.2 Codemill 2

1.3 Research questions 2

2 Background 3

2.1 The transition from 35mm film to digital video 3

2.2 Mastering of audiovisual content and the use of IMF 3

2.3 Quality control and quality assurance 4

3 Theory 5

3.1 Interoperable master format 5

3.1.1 Interoperable master package 5

3.1.2 Composition 6

3.1.3 Transcoding and distribution 7

3.2 User Experience 8

3.2.1 User centered design 9

4 Method 12

4.1 The general method approach 12

4.2 Literature study 13

4.3 Expert interview 13

4.4 Remote workshop 14

4.4.1 Workshop agenda 14

4.4.2 Workshop tools 15

4.5 Design iteration one - hand-drawn sketches 15

4.6 Design iteration two - digital wireframes 16

4.6.1 Expert evaluation of digital wireframes 16

4.7 Design iteration three - final prototype 16


5 Results 18

5.1 Summary of the expert interview 18

5.2 Results of the remote workshop 18

5.2.1 Analysis 20

5.3 “How Might We”-statements 20

5.4 Results of design iteration one - hand-drawn sketches 21

5.4.1 Analysis of design iteration one 21

5.5 Results of design iteration two - digital wireframes 22

5.5.1 Results of the expert evaluation 25

5.5.2 Analysis of design iteration two 26

5.6 Results of design iteration three - final prototype 26

6 Discussion 31

6.1 Discussion of results 31

6.2 Future work 32

6.2.1 The prototype 32

6.3 Testing methodology 33

6.4 The Coronavirus pandemic and changes to the methodology 33

6.5 Difficult interview with SVT 34

7 Conclusion 35

8 Acknowledgements 36


1 Introduction

Cameras used for cinematography have, for over 100 years, used 35mm film stock to capture the frames that make up a movie or TV show [1]. Even though the image quality of 35mm film is of no concern, the capture and distribution of movies and series have in the 2010s become mostly digital. The transition to digital has resulted in a shift in how we handle finished content, from rolls of film to files on a hard drive. Considering how sizeable high-definition footage can become, this can be challenging for file management systems. Especially today, when movies and TV series are distributed to so many different platforms and territories worldwide, destinations that require various adaptations such as dubbing of language or restrictions of explicit content.

The need for a standardised specification for file-based workflows arose with the difficult task of managing all of these versions and files. This need gave birth to the Interoperable Master Format (IMF), a standard for managing and processing multiple content versions, created by the Society of Motion Picture and Television Engineers (SMPTE)1 [2]. This standard made it possible to re-use common footage and audio between versions, thus saving a significant amount of storage space and reducing time spent on quality management. IMF allows a master version of a show to be created and stored, and when a different version is needed, only the difference between the new version and the master is added to the file system [3].

This is made possible mainly by the Composition Playlist (CPL), an XML file with instructions on how to combine the available video and audio elements that make up a version [4].

IMF is used for internal and business-to-business relations and is not intended for delivery directly to the consumer. That means that a production company can package its finished content and any existing versions in an IMF package and send it to e.g. Netflix, who will ingest the content and process it into files suitable for streaming to the consumer [5]. However, before sending an IMF package of finished content, Quality Assurance (QA) needs to be performed. The QA verifies the content of the IMF package, assuring it contains the correct files and that no errors have been introduced into the files [6]. This thesis aims to make that process more efficient by designing a tool for the professionals working with that verification process. The design process will consist of a literature review, interviews with professionals, and a workshop with experts in the field, identifying the requirements of the tool.

1https://www.smpte.org


1.1 Objective

The objective of this thesis is to explore the use of IMF today and design a tool used for verification of IMF package data, solving present problems in the verification workflow.

1.2 Codemill

This master thesis was written in collaboration with Codemill2 at their office in Umeå during the spring of 2020. Codemill is a Swedish IT consulting firm founded in 2007, with expertise in technical video solutions. With colleagues holding MSc degrees in computer science or interaction technology and design, they work with major international companies, mostly in the media and broadcasting industry. They aim to keep building their footprint across the media and broadcast industry internationally while keeping the heart and head office in Umeå.

1.3 Research questions

This master thesis will explore the field of IMF, used in the business of video content production and distribution. With the knowledge gathered, a tool for the verification and quality assurance of IMF packages will be designed with the help of UX methodologies and principles. To make this possible, information about IMF needs to be acquired from a technical standpoint but also from the point of view of professionals in the field. To guide the gathering of information and the master thesis as a whole, research questions have been composed: a main question followed by a couple of sub-questions.

How would a tool used for verification of IMF packages be designed to solve present issues while verifying content?

• What is IMF, and how is it used today?

• How is verification of IMF packages performed today?

• Who is performing quality assurance and would be the user of a QA-tool?


2 Background

This chapter will cover some brief history of video production to then narrow it down to the process of quality management. The purpose is to give some context to the task of verification and the need for the Interoperable Master Format (IMF) standard in the field of video production.

2.1 The transition from 35mm film to digital video

About a decade ago, the majority of image capture in movie productions was done using 35mm gauge [1]. 35mm gauge is a film format used for capturing and projecting movies, documentaries, series, etc. The process of either capturing or projecting requires several rolls of the plastic film [7]. However, in the last decade, the conversion to digital screens in cinema has made film stock a minority format [8]. Even though opinion is split over which format offers better video quality [9], it is difficult for the handling of a large number of film rolls to compete against the simplicity of the digital workflow.

2.2 Mastering of audiovisual content and the use of IMF

Audiovisual content today, such as movies and TV series, is distributed to many different territories and devices [10]. Because of this, the content must comply with the different rules and regulations of those territories and the technical constraints of the various devices [11]. This is done through the concept of mastering. When producing a piece of content, e.g. a movie, a master version is created. This version is the original and contains the director’s artistic intention, unaffected by external restraints. This master later needs to be altered to fit the regulations of an area, perhaps with censoring of profanity or suggestive content, or dubbing to the local language. The master must also conform to different technical requirements for video and audio, depending on which devices the content is supposed to be consumed on. For example, a 1080p version for Blu-ray, a 4K version for Netflix streaming, and other codecs and formats for various end destinations [12]. All these requirements lead to a large number of versions of the original master, a phenomenon Netflix has chosen to call “Versionitis” [13].

Storing all these masters and versions requires a lot of storage; however, the use of an IMF (see section 3.1) workflow can create dramatic savings in storage and file transfer. This is possible because IMF allows the storage system to only house the content that is unique between versions. This storage system avoids the


replication of the same content, which, besides saving storage space, simplifies content tracking and content quality management [14].

2.3 Quality control and quality assurance

Quality control (QC) and quality assurance (QA) are components of the quality management process that exist to ensure the quality of a product. QC seeks to identify and correct defects in a product, whereas QA tries to prevent defects by looking into the making of the product [6]. Validation is an example of QC, while verification is an example of QA; they could be defined as follows [15]:

Validation: “The assurance that a product, service, or system meets the needs of the customer and other identified stakeholders. It often involves acceptance and suitability with external customers. Contrast with verification.”

Verification: “The evaluation of whether or not a product, service, or system complies with a regulation, requirement, specification, or imposed condition. It is often an internal process. Contrast with validation.”

This thesis will look further into the QA and verification process, where the task is to assure the quality of the video, audio, and metadata files in an IMF package after the process of QC.
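To make the distinction concrete, a minimal verification check could confirm that each file in a package exists and is bit-for-bit intact by comparing hashes against those recorded in a packing list. The sketch below is only an illustration, not SMPTE’s specified procedure; the SHA-1 choice and the plain name-to-digest mapping are assumptions made for the example.

```python
import hashlib
from pathlib import Path

def verify_package(package_dir: str, expected: dict[str, str]) -> list[str]:
    """Return a list of verification errors for files in package_dir.

    `expected` maps file names to their expected SHA-1 hex digests,
    as they might be recorded in a packing list (illustrative only).
    """
    errors = []
    root = Path(package_dir)
    for name, digest in expected.items():
        path = root / name
        if not path.is_file():
            errors.append(f"missing file: {name}")
            continue
        # Hash the delivered file and compare against the recorded digest.
        actual = hashlib.sha1(path.read_bytes()).hexdigest()
        if actual != digest:
            errors.append(f"hash mismatch: {name}")
    return errors
```

A run that returns an empty list would mean the package passed this (simplified) verification step; any entries describe concrete defects for the QA worker to report.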


3 Theory

This chapter contains the results of an extensive literature study aimed at grasping the technical and practical aspects of the Interoperable Master Format standard, as well as the User Experience design methodology.

3.1 Interoperable master format

Interoperable Master Format (IMF) is a standard created by the Society of Motion Picture and Television Engineers (SMPTE) for the mastering and interchange of file-based, multi-version, audiovisual content [14]. Or:

“The IMF is a file-based framework that allows these high-quality ver- sions, called Compositions, to be efficiently represented, managed, played back, processed, and transformed on file-based systems.” [16]

3.1.1 Interoperable master package

For delivery of content between businesses, the unit Interoperable Master Package (IMP) is used [14, p. 8]. It contains a Packing List (PKL), an XML file listing all elements contained in the IMP [4, p. 144], along with video, audio, and metadata elements. An IMP can be complete or partial. A complete IMP contains all files necessary for one or more compositions, while a partial IMP does not. A practical use for this is when a new version is to be sent and the recipient already has all the files for the original version; then only the difference between the versions needs to be sent with a partial IMP [14, p. 9].

The files contained in the IMP are essence data wrapped in MXF containers, and metadata written in XML files. An IMP will at least contain the following [14, p. 8]:

• Packing List (PKL) XML file

• Assetmap (ASSETMAP) XML file

• Essence MXF file(s)

• Composition Playlist (CPL) XML file(s)
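The minimum contents listed above lend themselves to a mechanical check. The sketch below is a hypothetical illustration: the file-name patterns (e.g. that a packing list is named `PKL*.xml`) are assumptions made for the example, not requirements of the standard.

```python
from pathlib import Path

# Minimum contents of an IMP, per the list above. The file-name
# patterns are assumptions for illustration, not mandated naming rules.
REQUIRED = {
    "packing list": "PKL*.xml",
    "asset map": "ASSETMAP.xml",
    "composition playlist": "CPL*.xml",
    "essence": "*.mxf",
}

def missing_parts(imp_dir: str) -> list[str]:
    """Return human-readable names of required parts not found in imp_dir."""
    root = Path(imp_dir)
    return [name for name, pattern in REQUIRED.items()
            if not list(root.glob(pattern))]
```

An empty result means every required part was found; otherwise the list names what is missing, which is exactly the kind of feedback a verification tool could surface to the user.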


Figure 3.1: An Interoperable Master Package (IMP) containing MXF and XML files.

3.1.2 Composition

A version of some content in IMF is called a composition and is defined by a unique Composition Playlist (CPL) and the track files it references [13]. So, e.g. a movie’s original composition (version) is defined by a CPL, the original video track, and the original audio track. The German version is instead defined by another CPL but the same original video track. The German composition may also contain a German opening video track and a German audio dub that replaces the original audio track.

The CPL is an XML file that contains references to essence track files and instructions on how to combine them to create a composition.
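To illustrate the idea of a CPL as a list of track references, the sketch below parses a deliberately simplified CPL-like XML document. Real CPLs use namespaced SMPTE schemas and reference track files indirectly by UUID via the asset map; the flat structure and element names here are assumptions for illustration only.

```python
import xml.etree.ElementTree as ET

# A simplified, hypothetical CPL-like document. Real CPLs are namespaced
# and reference essence by UUID; this flat form is only for illustration.
SIMPLIFIED_CPL = """
<CompositionPlaylist title="Example Movie (German)">
  <Segment>
    <Resource kind="video" file="movie_video.mxf"/>
    <Resource kind="audio" file="movie_audio_de.mxf"/>
  </Segment>
</CompositionPlaylist>
"""

def referenced_tracks(cpl_xml: str) -> dict[str, str]:
    """Map each track kind to the essence file the composition references."""
    root = ET.fromstring(cpl_xml)
    return {r.get("kind"): r.get("file") for r in root.iter("Resource")}
```

For the German composition above, this yields the original video track plus the German audio dub, mirroring how a composition re-uses shared essence while swapping version-specific tracks.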


Figure 3.2: A composition where the CPL is referencing the track files.

3.1.3 Transcoding and distribution

Video content is not only consumed on television screens; there are several different video output devices, e.g., mobile phones, airline entertainment systems, and Blu-ray discs. Each has different transcoding requirements. The transcoding instructions, such as image cropping and colour transforms, can be stored in XML files called Output Profile Lists (OPLs). They are contained in the IMP, and there can be one or more OPLs per composition; however, an OPL references only one CPL. For example, one OPL may contain instructions for the Blu-ray transcoding of a composition, and another OPL may contain transcoding instructions for HDR TV transmission of the same composition.
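The one-CPL-to-many-OPLs relationship described above can be sketched as a simple grouping. The CPL identifiers and OPL names below are hypothetical examples, not values from any real package.

```python
from collections import defaultdict

# Hypothetical (cpl_id, opl_name) pairs: several OPLs may reference the
# same CPL, but each OPL references exactly one CPL.
OPLS = [
    ("cpl-original", "bluray-1080p"),
    ("cpl-original", "hdr-tv"),
    ("cpl-german", "bluray-1080p"),
]

def opls_per_cpl(pairs):
    """Group OPL names by the single CPL each one references."""
    grouped = defaultdict(list)
    for cpl_id, opl_name in pairs:
        grouped[cpl_id].append(opl_name)
    return dict(grouped)
```

Grouping this way shows each composition alongside every output profile derived from it, which is how the example in the text (Blu-ray and HDR TV outputs of one composition) would be represented.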


Figure 3.3: The process of transcoding a composition to different outputs using an OPL processor.

3.2 User Experience

User experience (UX) design is a process used to create products that provide relevant and meaningful experiences to the customer by considering all aspects of the product, from the User Interface (UI) to marketing and branding [17]. The total user experience is distinct from the UI alone, even though the UI is an important part of it. To provide a good user experience, the product must, beyond having a pleasant UI, also deliver in the areas of branding, usability, design, and function. Alternatively, as the Nielsen Norman Group puts it:

“User experience encompasses all aspects of the end-user’s interaction with the company, its services, and its products.” [18]

The seven facets of UX by Peter Morville [19] have also been widely adopted and cited as a framework for UX design [20]. The model is also used to describe UX as more than just usability (see figure 3.4).


Figure 3.4: The UX honeycomb by Peter Morville [19].

The seven facets are described below with the help of Macpherson [21].

Useful Does the product, service or feature serve a purpose for your target customers?

Usable Is the user able to effectively and efficiently achieve their end goal?

Desirable The branding, image, identity and other elements of emotional design [22].

Findable The ease of navigation. Is the information the user seeks easy to find? Is the navigational structure intuitive?

Accessible The product or service should be accessible to users with disabilities.

Credible The ability of the user to trust in the product or service provided.

Valuable The product or service must deliver value to both the business and the user.

3.2.1 User centered design

User centered design (UCD) is an iterative design process that looks for ways to satisfy the needs of the user with the thing being designed (e.g. a product or service) [23]. The process involves the user through various research and design techniques


such as interviews, usability testing, and observations [24].

The Double Diamond

One way to look at the design process is through a visual representation of it called the Double Diamond (figure 3.5). The process is divided into two diamonds. The first is one of exploration, where an issue is looked into widely or deeply. The second diamond is about taking focused action and delivering based on what was found during the first diamond. The design process is not a linear one: discoveries in the exploration phase or testing of solutions may send the researcher back to the beginning, where the process is reiterated for improvement. Each diamond is divided into two phases, which tell more about the process.

Discover is about researching the issue and understanding the problem, rather than assuming it. This is done by speaking to and spending time with the people affected by the issues.

Define is about defining the problem in a new way from the insight gathered from the discover phase.

Develop is about seeking, through collaboration with a wide range of people, different answers to the clearly defined problem. This is often started with some sort of brainstorm to gather many ideas.

Deliver is the further improvement or rejection of a solution depending on whether it passes small-scale testing. The testing determines if the solution will work and is performed on an incrementally increasing number of people based on how close to the finished product you are.


Figure 3.5: Design Council’s Framework for Innovation, 2019 [25].


4 Method

This section describes the method this thesis deployed to reach its final result. First, the general method approach is explained; then the specific methods are described further in their own sections. Chapter 5 will follow a similar layout to this one, which should enable the reader to jump between the two to see the results of previous steps and the decisions made before continuing with the next step (see figure 4.1).

Figure 4.1: Visualisation of recommended reading order.


studies. Also, an understanding is needed of who the users are, what their tasks are, and what issues they face performing their tasks. This understanding was achieved by interviewing a professional and conducting a workshop with experts in the field.

Define

Here, the problem(s) was defined using “How Might We”-statements [26].

Develop

Using the “How Might We”-statements from before, solutions to the problem are created in the form of prototypes that are later tested in the Deliver phase. These prototypes were wireframes created using pen and paper, Balsamiq Wireframes1, and Sketch2.

Deliver

Here, testing of the wireframes was performed by conducting expert evaluations. The step between Develop and Deliver is iterative and is repeated until a desired result is achieved.

4.2 Literature study

By performing a literature study, a better understanding of the field could be achieved. The aim of the literature study was mainly to get a deeper understanding of the Interoperable Master Format (IMF) by reading technical specifications and reports. The aim was also to look into current research in the field of User Experience (UX) and incorporate it into the report and design process. Relevant research and technical specifications were predominantly found using the following sources.

1. SMPTE3

2. DPP4

3. Google Scholar5

4. Diva-Portal6

4.3 Expert interview

To get first-hand information on how people work with verification, an interview was conducted with a person working in the TV production department at SVT7.

1https://www.balsamiq.com

2https://www.sketch.com

3https://www.smpte.org

4https://www.thedpp.com/

5https://www.scholar.google.com

6https://www.diva-portal.org

7https://www.svtplay.se


The interview was exploratory and qualitative. The questions were more discussion points that we could talk over, e.g., “How does the verification process function at SVT?” or “Who are the people working with verification at SVT?”. This approach to interviewing is called “semi-structured interviews” [27]. The goal of such an interview is to gather information about a set of central topics while also allowing exploration when new topics present themselves. This approach was used because this was the first real look into the field, and thus, limited knowledge had been attained beforehand, making it too difficult to prepare more specific interview questions.

4.4 Remote workshop

To get a rough understanding of who the users of the verification tool are, a workshop [28] was put together with experienced participants from Codemill. A workshop was chosen because only one interview could be arranged, and more information about the users and their tasks needed to be acquired. The goal of the workshop was to identify who the users could be and what kinds of problems they are, or could be, facing in their work. For the workshop to be effective, a couple of things needed to be planned.

The objective of the workshop

With a predefined goal for the workshop, it becomes much easier to plan the activities and invite suitable participants. The goal for this workshop was, with the help of the Codemill employees, to define a persona of a user of a verification tool within the video production business. The persona would then be used to formulate a use case on which to base a design proposal.

Which participants to invite

All participants were from Codemill, and to get as much out of the workshop as possible, the participants were invited based on their role in the company. Two designers and two salespeople were invited.

How the activities were designed

With the workshop goal in focus, the workshop activities were designed so that each activity helped the participants with the next one.

4.4.1 Workshop agenda


After the first two activities, the participants now had one or more places and people on their minds, which would make the third activity more approachable. The third activity asked the participants to think of potential issues and obstacles a user would encounter while verifying content. Sticky notes would also be used here for participants to share their thoughts.

The last activity focused on getting the participants to come up with ideas for solutions to the problems found in the previous activity. This was done by splitting the participants into groups of two, where they could discuss solutions and ideas together and later present them to everyone in the workshop.

Main activity questions:

1. How did previous or current clients of yours work with verification?

2. Who are the people working with verification at these companies?

3. What issues are these workers facing while performing their tasks?

4. What does a verification tool need to resolve these issues?

4.4.2 Workshop tools

There were some requirements on the tools used in the workshop because it was being performed remotely, with colleagues participating from home on their computers. Working remotely meant there was no office space in which to communicate and no whiteboard on which to collect ideas. To overcome this obstacle, two tools were used to digitally emulate a physical meeting.

Zoom8

Zoom is a video conferencing tool that was used for communication by web camera, microphone, and screen sharing. The tool also has functionality for creating breakout rooms, to which participants can be sent to discuss in smaller groups. The breakout rooms would become very useful for the last activity.

Miro9

An online collaborative whiteboard with rich functionality, e.g. frames, sticky notes, and timers. This tool was the centre of the workshop. Every activity had its allocated area on the Miro board. By screen sharing in Zoom, the workshop facilitator could communicate to all participants how to continue with the next activity, and participants could share their screens and present their ideas.

4.5 Design iteration one - hand-drawn sketches

With the results of the workshop summarised, simple hand-drawn sketches could be made. These sketches would be used as an early concept proposal to show the thesis supervisor at Codemill. Hand-drawn sketches were chosen because of the ease of iteration when creating them. Additionally, according to Walker et al. [29],

8https://www.zoom.us

9https://www.miro.com


there are few differences in feedback given when comparing low and high-fidelity sketches on paper or monitor.

With the approval of the direction of the concept and some feedback given, it would be possible to continue with a second iteration.

4.6 Design iteration two - digital wireframes

To quickly test different versions of the same idea, the next iteration used a digital tool to create wireframes from the original sketch of iteration one. This allowed for easy modification of design elements without having to redraw all other parts.

The tool used was Balsamiq Wireframes. The tool consists of pre-made elements drawn in a low-fidelity fashion, which made it easier to focus on layout and hierarchy rather than the aesthetic finish of the prototype. The purpose of the wireframes was to have experts evaluate them before starting with the third and final iteration.

4.6.1 Expert evaluation of digital wireframes

To test the validity of the wireframes, an expert evaluation was deployed by getting feedback from four design experts at Codemill. The number of evaluators was chosen because Nielsen et al. [30] state that the return in evaluation results grows rapidly with up to five evaluators but reaches the point of diminishing returns at about ten evaluators.

The evaluation process was as follows: the facilitator explained the use case on which the wireframes are based and then let the evaluator look at the wireframes, ask questions if needed, and give input on things that are good and things that could be improved.

The input from the evaluators was noted and considered for the final iteration.

4.7 Design iteration three - final prototype

With the feedback received from the expert evaluations (see section 5.5.1), a more refined prototype could be made using the vector-based design tool Sketch. This tool allows designs to be created from scratch; however, it requires more time.


5 Results

This chapter covers the results of the thesis. As mentioned in chapter 4, this chapter follows a similar layout to the method chapter, allowing the reader to jump between the two and follow the decisions made before continuing with the next iteration.

5.1 Summary of the expert interview

The following are the main takeaways from the interview with SVT.

The person interviewed was a technical producer with several years of experience at SVT. He explained that, when material such as an episode of a TV show is received, it is sent to an in-house media centre. Before Quality Control (QC) of the material can be performed, they check the codec of the files and transcode them to their “house format”, which enables them to perform QC. The QC is a linear and manual process where employees check the quality of the material in real time using a media composer, which is a very time-consuming task. The material is checked for e.g. loudness, head and tail can be cut, and audio can be remapped. All this is done before the material can be sent to their digital archive.

If there is something severely wrong with the material, SVT asks for a new delivery with the problems fixed; however, a minor error can be taken care of by the SVT staff, e.g., lowering the volume of parts of the material.

The process of verifying the material is something SVT wants to look more into in the future and possibly develop a better solution.

5.2 Results of the remote workshop


Tasks of a person working with verification

• Review of audio, video and subtitles.

• Reporting of errors and issues.

• May resolve small issues by themselves.

• Mark files as approved or not.

Traits of a person working with verification

• Some sort of education in media. Not necessarily a technical one.

• 25-45 years old.

• Working with repetitive tasks.

• Uses a big screen (at least 24”), but sometimes a laptop.

• Needs to minimise distractions.

• Is thorough and determined.

Issues a person working with verification may face

• Tight deadlines.

• Large amounts of content to verify.

• Collaboration can be difficult.

• Software not working properly.

• Working in old and outdated systems.

• Functionality in the software is hidden behind “hot keys”.

• Must work in several different systems.

Things that could help a person working with verification

• Responsive software.

• Software tailor made for the purpose of the task.

• Customisable UI for specific users.

• Self explaining UI, no hidden functions.

• Make communication easier.

• Automatic and manual verification in harmony.

Figure 5.1: The template used in the workshop for creating personas.


Figure 5.2: Participants filled the board with issues a person working with verifi- cation may face.

5.2.1 Analysis

Using the workshop summary, an adequate picture of a suitable persona and a good understanding of the day-to-day tasks of a person working with verification were acquired. Going forward with prototyping, tasks such as reviewing audio and video in a quality assurance manner will be explored, i.e., how quality assurance can quickly verify the work done in the previous and more thorough task of quality control.

5.3 “How Might We”-statements

“How Might We”-statements could be created based on the results of the interview and workshop (HMW = How Might We, QAW = Quality Assurance Workers):

• HMW make it easy for QAW to step between episodes/movies?

• HMW make it easy for QAW to select parts of content to watch?


5.4 Results of design iteration one - hand-drawn sketches

Because the focus of the thesis is to design a tool to be used by users who do quick final checks before sending or right after receiving content, the tool needed to enable the user to go through the content of an IMF package easily. Because the content of an IMF package is in itself difficult to comprehend only using a file explorer, the tool would have to help the user understand the compositions by using graphical elements.

The idea of this design concept (shown in figure 5.3) was to enable the user to quickly review each composition (CPL) one at a time and colour code them depending on the status of the review. Also, each essence file is shown linearly on the timeline, with each track separated, to clearly show how the composition is put together. The idea is that the user will be able to review the tracks individually, and tracks already reviewed in previous compositions do not need to be reviewed again.
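To make the timeline concept concrete: a CPL is essentially a playlist of references to essence track files. The sketch below is a heavily simplified, hypothetical CPL-like document and a few lines of Python that extract the linear timeline a tool like this would draw; real IMF CPLs (SMPTE ST 2067-3) are namespaced XML with segments, sequences, and UUID track-file references, so all element names and ids here are illustrative assumptions, not the actual schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical, heavily simplified CPL-like document. Real CPLs are far
# more elaborate; this keeps only the core idea that a composition is a
# playlist of references to essence track files.
CPL_XML = """
<CompositionPlaylist id="urn:uuid:aaaa-0001">
  <Segment>
    <Sequence kind="MainImage">
      <Resource trackFileId="urn:uuid:video-ep1" duration="1440"/>
    </Sequence>
    <Sequence kind="MainAudio">
      <Resource trackFileId="urn:uuid:audio-en" duration="1440"/>
      <Resource trackFileId="urn:uuid:audio-en-recap" duration="240"/>
    </Sequence>
  </Segment>
</CompositionPlaylist>
"""

def list_timeline(cpl_xml: str):
    """Return (sequence kind, track file id, duration) tuples in
    playlist order, i.e. the linear timeline the tool would draw."""
    root = ET.fromstring(cpl_xml)
    timeline = []
    for seq in root.iter("Sequence"):
        for res in seq.iter("Resource"):
            timeline.append((seq.get("kind"),
                             res.get("trackFileId"),
                             int(res.get("duration"))))
    return timeline

print(list_timeline(CPL_XML))
```

Each tuple corresponds to one coloured bar on the timeline, grouped per track kind, which is exactly the "stitching together" the sketch tries to visualise.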

Figure 5.3: First sketch of the verification tool.

5.4.1 Analysis of design iteration one

The concept was received well, and the decision to continue on that path was made.

Some feedback was given, namely that the order of the UI elements should be reconsidered to better reflect their importance in the tool. For example, the metadata field could be placed closer to the CPL list.


Also, letting the user step through a list of CPLs was a feature worth exploring further going forward, as it may prove useful.

5.5 Results of design iteration two - digital wireframes

The layout of the wireframes (figures 5.4-5.8) is based on the first sketch, which is similar to the design of a traditional media composer application, familiar to the people working with verification, as mentioned in the expert interview in section 5.1. The wireframes are made around the use case that the user has received a package of episodes from a TV show and is supposed to verify that the quality control has been done right. The timeline is supposed to change depending on which composition the user chooses in the composition queue (top left in figure 5.4 and top row in figures 5.5-5.8), and the purpose of the timeline is to visualise the “stitching together” of video and audio files. Additionally, the fields “Essence track metadata” and “Audio levels” contain information deemed important according to the workshop (see section 5.2) and the expert interview, and the information displayed in those fields is supposed to be that of the tracks the play head is currently on.

Another feature (seen in figure 5.7) is the ability to mark tracks as approved or not after the user has verified them. This is supposed to make it easier for the user to keep track of which tracks have been verified and which have not, by changing the colour of the track according to the decision. Because IMF can re-use tracks from another composition without duplicating them, the user should not need to verify already verified content. One thought is, therefore, to have the application remember which tracks have already been verified and colour code those tracks on the timeline, saving the user some time.
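The "remember which tracks are verified" idea above can be sketched very simply: because every essence track file in IMF has a unique id, verification status can be keyed on that id rather than on the composition, so a track reused by a later episode is already coloured when that episode is opened. The class and colour names below are hypothetical illustrations of the concept, not part of the thesis prototype.

```python
from enum import Enum

class Status(Enum):
    """Review status mapped to the timeline colour it would be drawn in."""
    UNREVIEWED = "grey"
    APPROVED = "green"
    REJECTED = "red"

class VerificationRegistry:
    """Hypothetical registry keyed on the track file's unique id, so a
    track reused across compositions keeps its status everywhere."""
    def __init__(self):
        self._status = {}  # track file id -> Status

    def mark(self, track_file_id: str, status: Status) -> None:
        self._status[track_file_id] = status

    def colour_for(self, track_file_id: str) -> str:
        # Tracks never seen before fall back to the unreviewed colour.
        return self._status.get(track_file_id, Status.UNREVIEWED).value

registry = VerificationRegistry()
registry.mark("urn:uuid:audio-en", Status.APPROVED)  # verified in episode 1

# Episode 2 reuses the same audio track file, so it is already green:
print(registry.colour_for("urn:uuid:audio-en"))   # -> green
print(registry.colour_for("urn:uuid:video-ep2"))  # -> grey
```

Keying on the track file id rather than the composition is what saves the user time: the approval travels with the file, matching IMF's storage of only version-unique content.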

In summary, the user is supposed to be able to easily verify the sound and video quality, the metadata, and the subtitles of all compositions in an IMF package, mark them as verified, and not have to verify the same content more than once.


Figure 5.4: First version of the QA tool wireframe in Balsamiq.

Figure 5.5: Second version of the QA tool wireframe created in Balsamiq.


Figure 5.6: Third version of the QA tool wireframe created in Balsamiq.


Figure 5.8: Third version of the QA tool wireframe with right-click functionality, marking as “flawed”.

5.5.1 Results of the expert evaluation

This section will present the summarised results of the expert evaluations (see section 4.6.1) conducted at Codemill.

The use of tabs (see figures 5.5-5.7) to keep track of compositions in the IMF package got mixed reviews, leaning towards it being understandable and preferred over the file explorer alternative (see figure 5.4). However, the size of the tabs could be smaller, giving more space to other elements.

The use of several metadata fields could make the interface messy while reviewing metadata or moving a lot in the timeline. While the metadata field in itself was understood by the evaluators, the number of metadata fields could be reduced from two to one by default, and the metadata within the fields needs separating.

The user might want to go full screen with the video preview window or detach it completely from the timeline. This allows the user to see the video in full screen on one monitor and the timeline on another.

Using coloured bars in the timeline to visualise the IMF “stitching together” of essence tracks worked well. Also, the concept of colour coding the tracks depending on their status was well received. However, an argument was made that when changing the status of a bar (as seen in figure 5.7), the whole essence track should be seen as reviewed and change colour depending on the status. For example, if an error is found in the first part of the track, the entire file should be considered defective, because it must still be replaced entirely.

The findability [31] of the “mark as approved/flawed”-function may be low considering the high importance of the function.

Other remarks:

1. The top file directory in the file explorer is redundant.

2. The review of audio and video transitions may be interesting.

3. You cannot see the name of the IMF package being reviewed in the tool.

4. The audio tracks may use more audio channels, requiring more audio levels to be verified.

5. Having subtitles in the video player instead of in a separate field was preferred.

6. There is no ability to see which track is currently active.

7. Use the word “rejected” instead of “flawed”.

8. The ability to mute tracks or audio altogether could be useful.

5.5.2 Analysis of design iteration two

The feedback received from the expert evaluations proved very valuable, and all of it was to be addressed in the final design iteration. The point of marking the entire essence track instead of just a part of it was a major insight, as was how the current state (design iteration two) of the metadata fields could be confusing.

With the evaluations summarised, a more refined prototype could be created with changes to combat the design weaknesses discussed with the evaluators.

5.6 Results of design iteration three - final prototype

Figure 5.9 to figure 5.14 show the results of the final design iteration. These wireframes, made in the design tool Sketch, are improved versions of the wireframes made in design iteration two (see section 5.5), based on the expert evaluations conducted (see section 5.5.1). The use case is still that the user is to verify an IMF package containing six compositions of a TV show. However, in this use case, the show is a dubbed version in German.

The following list specifies the changes made from the previous iteration:


7. The audio track in the audio levels field changes depending on where the play head is positioned.

8. The verification options are now “Mark as approved” and “Mark as rejected” (figure 5.11).

9. When marking a track as either approved or rejected, the whole track is now coloured instead of just the track block (figure 5.10 & 5.11).

10. A check mark (approved) or a block sign (rejected) is added next to the track file name when marking tracks approved or rejected.

11. When all tracks are approved, the colour of the composition tab is changed to green (figure 5.14).

Figure 5.9


Figure 5.10

Figure 5.11


Figure 5.12

Figure 5.13


Figure 5.14


6 Discussion

6.1 Discussion of results

There are three things I would like to analyse to determine whether the results of the thesis are acceptable or not. Those three things are connected to the theory used in this thesis:

1. Is the solution adapted for use with IMF packages?

2. How do Peter Morville’s seven facets of UX correspond to the results?

3. Has a User Centered Design process been used?

Is the solution adapted for use with IMF packages?

The tool is designed to let the user go through the CPLs of an IMF package as well as verify the individual essence tracks connected to the CPLs. The tool also takes into consideration that the essence tracks may have been previously verified. This plays to the great strength of IMF: the need to only store content that is unique between versions of audiovisual content. By having those features, metadata, and visual aids that make compositions comprehensible, I would argue that the solution is well adapted for verification of IMF packages.

How do Peter Morville’s seven facets of UX correspond to the results?

Because the results are still in the prototype phase, some of Morville’s facets are redundant in this case. However, whether the solution is Useful, Usable, and Findable are facets worth discussing. I believe the usefulness of the solution is best described in the paragraph above; furthermore, the purpose of the tool is well established in the research phase. However, that the tool is useful does not mean that it is usable: because no user tests have been performed, it is difficult to deem the tool usable based only on the opinions from expert evaluations.

The findability of the “approve/reject” feature was briefly mentioned in the results section. There is a findability weakness to it because it is hidden behind a “right-click”, which means that new users can have difficulty finding it. The findability of all features should be looked into further in future work.

Has a User Centered Design process been used?

The Design Council’s Double Diamond has inspired the entire design process from research to final prototype. The research was done to understand the users and their issues as accurately as possible instead of relying on assumptions. The prototypes were made to solve the issues found in the research phase, which are, by extension, the users’ issues. The weakness in this user centered design process was the lack of real users to interview and test prototypes on, which forced adaptations to the method.


In conclusion on the last point, while the method was user centered, it could have been better if access to real users had been possible.

6.2 Future work

The final prototype for this thesis is finished; however, that does not mean that it is the final solution for the tool. It just means that there is no more time to test and reiterate. If work on it were to continue, three things could be done:

1. Test it on real users.

2. Adapt it to the workflow of a client.

3. Design it using Codemill’s design system.

More valuable input could be gathered and used to improve the prototype by testing it on real people working with verification tasks in production. More importantly, the validity of the concept would also be tested, to find out if the use case is as legitimate as first thought. This ties in to the second point of the list, adapting it to the workflow of a client. As companies have different workflows and routines, the validity of the concept may change; it may work better in some companies than in others, depending on how their workflow aligns with the prototype. The use case for this prototype is based upon readings on the quality management of some companies, the technical specifications of the IMF standard, the results from the remote workshop, and the interview with SVT. However, if one were to continue designing and developing this tool for a client, an extensive study of that client’s workflow would have to be made to align the tool with their needs. As there are many parties involved and many systems used in the production of audiovisual media, it is almost impossible to have one solution that fits all.

Lastly, when reiterating this prototype, effort should be put in to make it follow the design system. That way, if it were to be developed, much less effort would have to be put in to make it harmonise with the rest of Codemill’s products than if the adaptation to the design system would be made afterwards.

6.2.1 The prototype

This section will discuss the further designing of the prototype.



The visibility of the “approve” and “reject” functions could be further explored. One could argue that the function is hidden in its current state, and given the importance of the task in this use case, the tool may benefit from making the function more findable. One suggestion is to have it in the same row as the play and skip buttons, or next to the essence track name as a check mark and an “X”. The benefit of having it as a “right-click” option is that it is simpler to describe the function in text rather than symbols.

The file explorer field in this prototype has not been given much thought beyond providing an overview of the IMF package. The field would likely also be suitable for file selection: instead of only being able to select files on the timeline, the user would also be able to select files in the explorer to review, and perhaps even approve tracks directly from the file explorer.

More functions will have to be added to the video player, simple things such as muting, going full screen, and detaching the player. This would add to the overall legitimacy of the tool.

6.3 Testing methodology

The original plan was to contact companies with potential users of the tool and test the prototype on them. This would probably have given some valuable insight. However, it proved difficult to reach such people, and a backup plan had to be deployed because of time constraints. The backup plan was to have design experts at Codemill evaluate the prototype, experts who also have insight into the verification process through the clients they have been working with. The feedback received was regarded as valuable, even though feedback from real users would have been more valuable. However, because users may not be used to giving design feedback, the amount of feedback might have been smaller if only users had been involved.

6.4 The Coronavirus pandemic and changes to the methodology

When this thesis was written during the spring of 2020, the world was undergoing a pandemic caused by the Coronavirus2. This pandemic led to global measures being taken to prevent the rapid spread of the virus, one of which was “social distancing”, which led to many offices in Sweden, Codemill included, having their employees work from home.

Working from home meant some changes had to be made to continue with this thesis, and some challenges presented themselves. The original plan was to interview some companies working with verification to gather information about users and workflows; however, getting a response proved difficult. Because of the difficulty of organising interviews and the thesis being past its time frame, it was decided that a workshop would be conducted instead. Instead of being conducted at the office, the workshop had to be done remotely using digital tools because of the work-from-home recommendations. Performing the workshop remotely did not prove to be a significant disadvantage compared to performing it at the office; however, the planning of it is believed to have taken one or two weeks longer.

2 https://www.who.int

Also, the feedback and expert evaluations had to be received via voice and video chat. Even though this worked, it was a bit more of a hassle organising the meetings and creating the digital workspaces instead of meeting in an office.

The effect of the Coronavirus pandemic on this thesis can be summarised as having to perform the interviews, the workshop, and the evaluations remotely, while also having the time frame delayed a couple of weeks because of readjusting to working from home and requiring more planning time for thesis tasks.

6.5 Difficult interview with SVT

I would like to add that the interview with SVT proved quite difficult. Because it was the first insight into a real verification workflow, the questions had to be more like discussion points for exploring the verification area. The goal of the interview was to get more insight into the workflow and who the people are; however, the conversation steered more into technical aspects I had little to no knowledge of. It would be interesting to talk with them again, ask more specific questions about their process, and compare it to the knowledge I have gained since last time.


7 Conclusion

The objective of this thesis was mainly to design a tool for verification of IMF package data. That part of the objective is met with the prototype presented in the results, evidenced by the tool being adapted for IMF, which is explained further in section 6.1. The other part of the objective is that the tool is to be designed to solve present problems in the verification workflow. The requirements for that part of the thesis are mostly met through research on the verification workflow via the interview and workshop; however, user tests on the prototypes would have made the results more reliable.

Because of the user centered design approach that was used for this thesis, the main research question formulated in the introduction could be answered in a legitimate way. Three of the four sub-questions could also be answered and helped in creating the final solution.

• What is IMF, and how is it used today?

• Who is performing quality assurance and would be the user of a QA-tool?

• What current issues are people working with verification facing today?

The remaining sub-question could not be answered because of the lack of access to people working with IMF. Instead, more general research about verification had to be done, which gave adequate results.

• How is verification of IMF packages performed today?


8 Acknowledgements

I would like to thank the Codemill team for welcoming me to their office and helping me throughout the thesis. A special thanks goes out to my supervisor at Codemill, Emil Edskär, who guided me through several issues. Last but not least, I would like to thank my dear friends Filip Bark, Fredrik Östlund, Petter Poucette and Oscar Thorwid, who provided support and motivation during the thesis, as well as during the five years of university.


References

[1] Ivan Radford. Don’t keep it reel: why there’s life after 35mm. The Guardian, 11 2011.

[2] Telestream. A guide to the Interoperable Master Format (IMF). http://www.telestream.net/pdfs/datasheets/App-brief-Vantage-IMF.pdf. Accessed: 2020-02-21.

[3] DPP. The Interoperable Mastering Format. https://www.thedpp.com/imf/imf-overview. Accessed: 2020-06-10.

[4] Entertainment Technology Center. Interoperable Master Format (IMF). Technical Report 1, Entertainment Technology Center, 509 West 29th Street, Los Angeles, California 90007, USA, 2 2011.

[5] Rohit Puri, Andy Schuler, and Sreeram Chakrovorthy. The Netflix IMF workflow. Netflix Tech Blog, 4 2016.

[6] Diffen.com. Quality assurance vs. quality control. https://www.diffen.com/difference/Quality_Assurance_vs_Quality_Control. Accessed: 2020-06-01.

[7] Matthew Wagenknecht. The actual costs of film. http://www.matthewwagenknecht.com/the-actual-costs-of-film/. Accessed: 2020-06-01.

[8] Leo Barraclough. Digital cinema conversion nears end game. Variety, 6 2013.

[9] Caroline Siede. Maybe the war between digital and film isn’t a war at all. AV Club, 8 2018.

[10] Olsberg/SPI. Building sustainable film businesses: the challenges for industry and government, pages 27–30. Olsberg/SPI, 2012.

[11] CLAi.tv. Film, video & internet mastering. https://www.clai.tv/film-video-internet-mastering/. Accessed: 2020-05-30.

[12] Charles S. Swartz. Understanding digital cinema: a professional handbook, pages 83–115. Taylor & Francis, 2005.

[13] Rohit Puri, Andy Schuler, and Sreeram Chakrovorthy. IMF: A prescription for versionitis. Netflix Tech Blog, 3 2016.

[14] Digital Production Partnership. IMF operational guidance. https://cdn.digitalproductionpartnership.co.uk/wp-content/uploads/2019/07/DPP005-IMF-Operational-Guidance-2019-07-08.pdf. Accessed: 2020-02-18.

[15] IEEE draft guide: Adoption of the Project Management Institute (PMI) standard: A guide to the project management body of knowledge (PMBOK Guide)-2008 (4th edition). IEEE P1490/D1, May 2011, pages 1–505, 2011.

[16] Dom Jackson. The Interoperable Mastering Format. https://www.smpte.org/sites/default/files/section-files/BBTB2017-W04DominicJohnson-IMF.pdf. Accessed: 2020-02-19.

[17] Interaction Design Foundation. User experience (UX) design. https://www.interaction-design.org/literature/topics/ux-design. Accessed: 2020-05-30.

[18] Don Norman and Jakob Nielsen. The definition of user experience (UX). https://www.nngroup.com/articles/definition-user-experience/. Accessed: 2020-05-30.

[19] Peter Morville. User experience design. http://semanticstudios.com/user_experience_design/, 2004. Accessed: 2020-05-30.

[20] Peter Morville. User experience honeycomb. https://intertwingled.org/user-experience-honeycomb/, 2016. Accessed: 2020-05-30.

[21] Ellen Macpherson. The UX honeycomb: Seven essential considerations for developers. Medium, 10 2019.

[22] Donald A. Norman. Emotional design: Why we love (or hate) everyday things. Basic Civitas Books, 2004.

[23] Elizabeth B.-N. Sanders. From user-centered to participatory design approaches. In Design and the Social Sciences, pages 18–25. CRC Press, 2002.

[24] Interaction Design Foundation. User centered design. https://www.interaction-design.org/literature/topics/user-centered-design. Accessed: 2020-05-31.

[28] Kate Kaplan. UX workshops vs. meetings: What’s the difference? Nielsen Norman Group, 2 2020.

[29] Miriam Walker, Leila Takayama, and James A. Landay. High-fidelity or low-fidelity, paper or computer? Choosing attributes when testing web prototypes. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting, volume 46, pages 661–665. SAGE Publications, Los Angeles, CA, 2002.

[30] Jakob Nielsen and Rolf Molich. Heuristic evaluation of user interfaces. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pages 249–256, 1990.

[31] Jen Cardello. Low findability and discoverability: Four testing methods to identify the causes. Nielsen Norman Group, 9 2019.
