
DEGREE PROJECT IN THE FIELD OF TECHNOLOGY DESIGN AND PRODUCT REALISATION
AND THE MAIN FIELD OF STUDY MECHANICAL ENGINEERING, SECOND CYCLE, 30 CREDITS

STOCKHOLM, SWEDEN 2020

Mapping a digital-physical development process

Case study of a small company

ELLA ANDERSSON

KTH ROYAL INSTITUTE OF TECHNOLOGY


Mapping a digital-physical development process

Case study of a small company

Ella Andersson

Master of Science Thesis TRITA-ITM-EX 2020:407

KTH Industrial Engineering and Management
Machine Design
SE-100 44 STOCKHOLM


Examensarbete TRITA-ITM-EX 2020:407

Kartläggning av en digital-fysisk utvecklingsprocess: En fallstudie på ett litet företag

Ella Andersson

Godkänt: 2020-06-21
Examinator: Sofia Ritzén
Handledare: Johan Arekrans
Uppdragsgivare: Orexplore AB
Kontaktperson: Sandra Egardt

Sammanfattning

Gruvindustrin är en gammal industri med etablerade traditioner och arbetssätt. Trots att digitaliseringen länge har funnits på kartan i många andra industrier står gruvindustrin först nu inför den. Digitaliseringen har påverkat vilken typ av produkter som skapas och har gett upphov till komplexa produkter, där elektronik och mjukvara integreras i fysiska produkter för att samla in och dela data. Dessa produkter påverkar utvecklingsprocesserna och skapar en komplex utveckling, där de olika processerna måste beakta varandras begränsande faktorer.

Denna masteruppsats studerar den kombinerade digitala och fysiska utvecklingsprocessen genom att kartlägga utvecklingsprocessen hos ett litet företag i spetsen för branschens digitala omvandling. Kartan lägger grunden för att undersöka hur agila värderingar och arbetssätt används hos företaget, i syfte att utforska hur de agila värderingarna och arbetssätten påverkar den komplexa utvecklingsprocessen på företaget. Detta är en kvalitativ studie med semistrukturerade intervjuer som primärdata och ett teoretiskt ramverk i form av litteratur inom forskningsområdena produktutveckling, processteori och agil anpassning.

Den resulterande processkartan går att se i kapitel 4.2 The Development Process. Kartan visualiserar beroenden mellan de olika funktionerna genom att beskriva aktiviteter, input, output, information och resurser för varje funktion. Processmognaden för den studerade processen var låg. Strukturen och dokumentationen anses enligt författaren vara de agila tillämpningar som påverkar den studerade processen mest. Organisationens storlek, det individuella drivet och minimal dokumentation möjliggör snabb kommunikation och problemlösning mellan funktionerna. Detta var en studie av ett litet företag med en låg mognadsnivå i sin utvecklingsprocess, vilket gjorde det svårt att göra några antaganden om effektiviteten i deras process. Eftersom detta är en fallstudie på ett litet företag inom ett växande forskningsområde kan ingen generalisering göras. Resultaten av denna studie kan dock vara värdefulla för det studerade företaget för att utveckla sina processer vidare.


Master of Science Thesis TRITA-ITM-EX 2020:407

Mapping a digital-physical development process: Case study of a small company

Ella Andersson

Approved: 2020-06-21
Examiner: Sofia Ritzén
Supervisor: Johan Arekrans
Commissioner: Orexplore AB
Contact person: Sandra Egardt

Abstract

The mining industry is an old industry with established traditions and ways of working. Even though digitalisation has been well established in many other industries, the mining industry has only just started adopting it. Digitalisation has impacted the types of products created, giving rise to complex products that consist of electronic components and software that collect and share data. These products influence the development and, when combined, create a complex development process. However, the digital and the physical development processes each need to take the limiting factors of the other into account.

This thesis explores the field of digital-physical development by mapping the development process at a small case company at the forefront of the industry's digital transformation. The map lays the foundation for examining how agile practices are adopted at the case company, with the purpose of exploring their effects on the complex development. This is a qualitative study with semi-structured interviews as primary data, and literature within the research fields of product development, process theory and agile adoption as the theoretical framework.

The resulting process map can be found in Chapter 4.2, The Development Process. The map visualises the dependencies between the different functions by describing the activities, inputs, outputs, information, and resources of each function. The maturity of the processes found was low due to the ad hoc nature of the development. The structure and the artefacts are considered to be the agile practices that affect the studied process the most. The size of the organisation, the individual drive and the minimal documentation allow fast communication and problem solving between the functions. This was a study of a small company with a low maturity level in its development process, which made it difficult to make any assumptions about the efficiency of the process. Since this is a single case study in a growing research field, no generalisation can be made. However, the results of this study can be valuable to the case company for further developing its processes and general practices.


NOMENCLATURE

The abbreviations used in this master's thesis are presented below.

Abbreviations

CAD Computer Aided Design

DFA Design For Assembly

DFM Design For Manufacturing

eBOM Engineering Bill of Materials

Gitlab a code management tool

Hermod a Product Data Management tool

Mattermost an online chat service

mBOM Manufacturing Bill of Materials

Monitor an Enterprise Resource Planning system tool

SLP System Level Planning


TABLE OF CONTENTS

1 INTRODUCTION
1.1 Background
1.2 Purpose
1.3 Delimitations

2 FRAME OF REFERENCE
2.1 Process Theory
2.2 Product Development
2.3 Agile Development

3 METHODOLOGY
3.1 Research Approach
3.2 Data Collection
3.3 Method of Analysis
3.4 Validity and Reliability

4 RESULTS AND ANALYSIS
4.1 The organisation
4.2 The Development Process
4.3 Analysis

5 DISCUSSION
5.1 The Development Process Map
5.2 Agile Adoption
5.3 General

6 CONCLUSION AND FUTURE WORK
6.1 Conclusion
6.2 Future work

7 REFERENCES

APPENDIX A: The Agile Principles


1 INTRODUCTION

This chapter introduces the study, conducted as a master's thesis at KTH Royal Institute of Technology together with the company Orexplore AB. The chapter starts with the background to the study in relation to current research within the subject and later transitions to the purpose and delimitations of the study.

1.1 Background

The mining industry is an old industry with established traditions and ways of working. Even though digitalisation has been an established phenomenon in many other industries, the mining industry has only just started adopting it, and Orexplore is working in this transformation. Orexplore is a small Swedish-Australian company developing products that produce and visualise drill core data for their customers. This data allows the customers to analyse element and mineral concentrations faster than before. The products Orexplore provides are X-ray machines and a software tool called Insight. The products require both digital and physical solutions to create value for the customers.

Digitalisation has impacted the types of products created, giving rise to complex products that consist of electronic components and software that collect and share data. These products influence the development and, when combined, create a complex development process. However, the digital and the physical development processes have limiting factors which need to be considered when the two are merged. Traditional and mature physical development processes, such as waterfall and gate processes, focus on predictability, planning and optimising resources to minimise lead times and costs (Measey, 2015; Ullman, 2010), while today's software development, such as the practices under the agile umbrella, focuses on speed and flexibility to assure quality (Beck et al., 2001; Berger and Eklund, 2015). These are examples of factors that make it difficult to optimise both types of development in the same setting. Various development processes have been uncovered and studied in the literature. However, when it comes to combining digital and physical development in one practice while remaining efficient in both, the results have varied (Berger and Eklund, 2015; Hendler and Boer, 2019). Adopting agile practices in more traditional development is one example of such an effort, and the associated challenges and coordination mechanisms have been studied to some extent. The literature studying the phenomenon of digital-physical development and the adoption of agile in physical development is still immature, but it is emerging (Hendler and Boer, 2019).

Orexplore has explained that it does not have a formal product development process but has managed to establish an informal way of working with fragments of Scrum. Orexplore considers itself agile, though this is not perceived as such throughout the whole organisation. The company has mentioned difficulties in integrating the activities of different functions and mapping out the dependencies between activities. Orexplore's development process is manageable at today's scale but would not be sustainable if the organisation and its activities were to expand further.

1.2 Purpose

This is a goal-oriented study, and the purpose is to investigate Orexplore's product development activities and relate their dependencies in a process map. The map will serve as a foundation to track where the physical and the digital development interact and how this is managed by the case company. The case company considers itself agile, so to complement the findings of the map, the study will also investigate how agile practices and values are applied in the company's development. Potential challenges and areas of improvement will also be examined. This will be put in the perspective of how those practices affect and support the development process. The findings will be compared to literature on adopting agile practices in physical development, to find similarities and differences. Through this comparison, this single case study will strengthen and provide a small company's perspective on the emerging research field of digital-physical development, and on the field of process theory.

1.3 Delimitations

This study delimits the time period for investigating the development process to 20 weeks and the geographical scope to Stockholm, Sweden. This limits the interviews to the employees situated in the Stockholm area. The interviewees are further delimited to full-time employees.

The development process studied is delimited to start when a new feature or new data needs to be developed, and does not include front-end development. The process is delimited to end when the feature or data can be generated and analysed.

The map will be delimited to visualise the current state of the organisation's operative development processes. Support and management processes will be discussed but are not intended to be the focus of this report.

Since the case company develops digital-physical solutions and the research area is emerging, the literature collection is not delimited to a certain industry.


2 FRAME OF REFERENCE

The following chapter presents the literature relevant to this study. It starts by explaining the subject of process theory, to set the foundation for how to structure and conceptualise a process map. It then moves on to the subject of product development, presenting common frameworks and proceeding to the potentials and limitations of digital-physical development. This is done to identify commonalities with the processes in Orexplore's practices and to provide a body of knowledge. Lastly, theory about the agile mindset, practices and adoption is presented, since the company considers itself agile.

2.1 Process Theory

A process is described as a repeating sequence of activities transforming a number of inputs into a set of outputs (Ulrich and Eppinger, 2012; Ljungberg and Larsson, 2001; Egnell, 1994). An activity has a set of inputs which are transformed into an output for a customer, internal or external (Anjard, 1998). The result of the activity, i.e. the output, becomes the input to the activity that comes next (Egnell, 1994). A process is part of a system, and a system consists of many processes or sub-systems (Stoner and Freeman, 1989). A customer is anyone who receives a process output; customers can be internal or external (Anjard, 1998). There is always a set of activities that will repeat in an organisation, such as product development, even though the order and end result (e.g. the product) will differ (Bergman and Klefsjö, 1994).

Ljungberg and Larsson (2001) define a process by five key components: input, activity, resource, information, and output; see Figure 2.1. An input starts the activity and can take the shape of a need expressed by an internal or external customer, which the activity solves. An activity is the transformation of an input into a result. A resource is everything that is needed to perform the activity, e.g. time and money, and is tied to the activities rather than to the objects flowing through a process. Information acts as a support for the process but is not required for the process to start, e.g. methods, frameworks, and tools. The output is the transformed result of the activity and the resources invested, and it ends an activity.

Figure 2.1. The components of a process, adopted from Ljungberg and Larsson (2001, Figure 6.3)
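To make the five components concrete, the sketch below models them as a simple data structure. It is an illustrative interpretation of Ljungberg and Larsson's (2001) terminology, not part of their work; the class name, field names and the example activities are chosen for this illustration only.

```python
from dataclasses import dataclass


@dataclass
class Activity:
    """One step of a process, described by the five components."""
    name: str
    inputs: list[str]        # needs or objects that start and are transformed by the activity
    outputs: list[str]       # results handed on to the next activity or to a customer
    resources: list[str]     # what is used to perform the activity: time, money, people, equipment
    information: list[str]   # supporting methods, frameworks and tools (not required to start)


# A hypothetical fragment of a process: the output of one activity
# becomes the input of the next, which is what links a process together.
design = Activity(
    name="Mechanical design",
    inputs=["functional specification"],
    outputs=["concept drawing"],
    resources=["designer time", "CAD licence"],
    information=["design guidelines"],
)
review = Activity(
    name="Design review",
    inputs=design.outputs,
    outputs=["approved concept"],
    resources=["review meeting"],
    information=["DFA/DFM checklists"],
)
```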

Egnell (1994) describes three types of processes: operative, supporting, and managerial. The operative process is the process that refines organisational resources to fulfil customer needs, e.g. the product development process. Support processes are the activities that support the operative processes, such as information and maintenance processes. Management processes are the activities that manage the operative and support processes, so that the processes follow organisational goals and strategies. Strategic planning is an example of a management process.

Anjard (1998) suggests that the process map should be developed with a top-down approach, breaking down the process into smaller subprocesses and the subprocesses into activities. The number of levels depends on the complexity and the scope of a process (Anjard, 1998; Egnell, 1994); Figure 2.2 shows how a process can be divided. As a rule of thumb, a subprocess can be divided into 5-15 activities. The number of levels also depends on the purpose of the mapping (Anjard, 1998).

Figure 2.2. The levels of a process adopted from Egnell (1994, Figure 3.10)
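As a small illustration of the top-down breakdown described above, the sketch below models a process hierarchy and checks the 5-15 activities rule of thumb. The structure and names are invented for this example and are not taken from Anjard (1998) or Egnell (1994).

```python
from dataclasses import dataclass, field


@dataclass
class ProcessNode:
    """A node in a top-down breakdown: process -> subprocesses -> activities."""
    name: str
    children: list["ProcessNode"] = field(default_factory=list)

    def is_activity(self) -> bool:
        # Leaves of the breakdown are activities.
        return not self.children


def check_rule_of_thumb(node: ProcessNode) -> list[str]:
    """Flag lowest-level subprocesses that break the 5-15 activities rule of thumb."""
    warnings = []
    if node.children and all(c.is_activity() for c in node.children):
        if not 5 <= len(node.children) <= 15:
            warnings.append(f"'{node.name}' has {len(node.children)} activities")
    for child in node.children:
        warnings.extend(check_rule_of_thumb(child))
    return warnings


# Hypothetical two-level breakdown of a development process.
process = ProcessNode("Product development", [
    ProcessNode("Concept development", [ProcessNode("Gather requirements"),
                                        ProcessNode("Sketch concepts")]),
])
print(check_rule_of_thumb(process))  # ["'Concept development' has 2 activities"]
```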

The structure of an organisation can outline how to divide and coordinate activities (Mintzberg, 1983). Process theory focuses on the cross-functional workflow rather than on single functional work, and the customer's needs and requests are central (Loinder and Rentzhog, 1994). Quality, cost reduction and flexibility add value in a process (McCormack and Johnson, 2001). According to Melan (1992), physical manufacturing processes are well established. A manufacturing process is described as good if it has the following characteristics: a process owner, well-defined interfaces between the different processes within the organisation, documented activities and tasks, and regular measurements of the process in order to manage it. According to Anjard (1998), processes in cross-functional business practices are rarely documented, standardised, measured or continuously improved.

A map of the processes in an organisation can help create an understanding of the process and how each activity correlates to the end result, which helps to achieve a common vision (Bergman and Klefsjö, 2002; Curtis et al., 1992). Transparency is increased by visualising the process (Klotz et al., 2008), and a map is, according to Egnell (1994), the best tool for visualisation. Through mapping, connected activities can be identified and understood, and problems can be uncovered (Ljungberg and Larsson, 2001). Transparency can help with communication and collaboration within an organisation for the continuous improvements recognised by lean theory, as well as support decision-making for more involved stakeholders (Womack and Jones, 2003). In process theory, it is the process that produces the products and services in an organisation, and the performance of an individual is limited by the processes, i.e. an employee can only be as good as the process allows (Anjard, 1998; Egnell, 1994). Mapping a process does not in itself implement changes or improvements, but it is the first step in developing the organisation (Egnell, 1994); understanding is the first step to change. From current-state maps, future-state process maps, i.e. improvements on the current-state map, can be created (Rother and Shook, 1999).

Process maturity models can be used by organisations to gain an understanding of their processes and to help with their improvement and management (Okręglicka, 2015). The maturity of a process can be described as the degree to which it is defined, managed, measured and continuously improved (Dooley et al., 2001). The process maturity levels are shown in Figure 2.3. The levels are Ad-hoc, Defined, Linked, Integrated and Extended, and a process can advance through the prerequisites also shown in Figure 2.3. For example, a process at the Ad-hoc maturity level can move up the maturity chain to Defined once the process is specified with basic definitions, as this is the requirement for the adjacent higher maturity level.

Figure 2.3. Stages of process maturity (Dahlin, 2017, Figure 4)
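As a small illustration of how such a maturity scale can be used, the sketch below encodes the five levels from Figure 2.3 as an ordered enumeration and expresses the rule that a process advances one level at a time. The code is an interpretation for illustration only; the prerequisite in the comment is the Ad-hoc to Defined example from the text, and no other prerequisites are assumed.

```python
from enum import IntEnum


class ProcessMaturity(IntEnum):
    """The five maturity levels, ordered along the maturity chain."""
    AD_HOC = 1
    DEFINED = 2
    LINKED = 3
    INTEGRATED = 4
    EXTENDED = 5


def can_advance_to(current: ProcessMaturity, target: ProcessMaturity) -> bool:
    """A process can only advance to the adjacent higher level, by fulfilling its
    prerequisite (e.g. Ad-hoc -> Defined once basic definitions are in place)."""
    return target == current + 1


assert can_advance_to(ProcessMaturity.AD_HOC, ProcessMaturity.DEFINED)
assert not can_advance_to(ProcessMaturity.AD_HOC, ProcessMaturity.LINKED)
```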

2.2 Product Development

Product development is the set of activities an organisation deploys to commercialise an idea. Multiple activities set in a sequence of steps that transform a series of inputs into a number of outputs constitute a process, and for a commercialised product these activities become the product development process (Ulrich and Eppinger, 2012). A well-defined development process can, according to Ulrich and Eppinger (2012), help with quality assurance, coordination, planning, management, and improvement.

The following sections introduce the ideas of physical product development and digital development and different approaches to them. These are visualised as simplified models and frameworks, which in practice are often adapted to an organisation's own practices.

2.2.1 Traditional product development

Traditional product development, i.e. the process describing the development of physical products, can most simply be described through the linear waterfall plan or the Stage-Gate process (Ullman, 2010). Ulrich and Eppinger (2012) describe the generic product development process as consisting of six key activities: planning, concept development, system-level design, detail design, testing and refinement, and production ramp-up. This is visualised in Figure 2.4.

Figure 2.4. The generic product development process (adopted from Ulrich and Eppinger, 2012, EXHIBIT 11-9)

Each phase ends with a decision point, or gate, where it is decided whether the product project should move on to the next phase or not. The gates are sometimes referred to as design reviews, where the design team meets in formal meetings to describe their progress and how the criteria are met (Cooper, 2008; Ullman, 2010). The gates act as a control mechanism for the development process, so that it delivers the right output on time and resources are not put into activities that will not be profitable. Gate processes create a standard and a common language for the developing teams (Cooper, 2008), and provide efficient work through planning (Tonnquist, 2019). Teams in Stage-Gate models work in parallel with each other and are cross-functional, which allows early recognition of integration problems (Ullman, 2010). It is, however, still a sequential and bureaucratic process with similarities to the waterfall process; the gates can create hold times in the workflow and, according to researchers, hinder creativity and innovation (Christensen, Kaufman and Shih, 2008). Iteration is not absent from the waterfall or Stage-Gate methods; it is built into the process and planned within the phases (Ullman, 2010).

2.2.2 Software development

In software development, changes are allowed late in the development process and can be applied more easily because time is the main resource. In contrast to mechanical products, software code is easier and less expensive to prototype, and limiting design decisions can be taken later in the development (Ullman, 2010). According to Kruchten (2000), software development has six best practices: iterative development, requirements management, component-based architecture, visual modelling of software, continuous verification of quality, and change control. Pfleeger and Atlee (2006) describe the importance of communication between customer and developers for successful software development, as the customer's requirements often change during the development process.

The software waterfall plan is similar to the traditional waterfall method, where each step needs to be completed before moving on to the next phase. This requires the developers to guarantee the completion of a step, which takes time to assure and can cause bigger mistakes later on for more complex problems (Pfleeger and Atlee, 2006). Just as with the waterfall method for physical product development, the software version was discarded early, and other approaches to software development have surfaced throughout the years, such as the Rational Unified Process (RUP), Scrum and hybrid models.

RUP is a software development framework created to adapt to requirement changes through a more incremental and iterative approach (Kruchten, 2004). The framework consists of four phases and nine disciplines, of which three are support disciplines and six are technical. Figure 2.5 visualises the phases, disciplines, and iterations in RUP, how the activities change in intensity, and which activities are executed in parallel. Integration is performed after each iteration in order to minimise the risk of bigger integration failures compared to one big integration at the end (Kruchten, 2004).

Figure 2.5. The RUP framework and its components (Kruchten, 2000, Figure 2)

During the 1990s and 2000s, the agile manifesto (Beck et al., 2001) was introduced as an alternative software development approach, as a reaction to the strict structures of the approaches at the time. Different frameworks were established from the agile manifesto. Measey (2015) describes that many agile frameworks overlap and use different terminology. However, the frameworks still share similarities in the structure of identifying what needs to be done, doing it, measuring and reviewing, and inspecting and adapting. The agile approach to product development provides a flexible playground when the context is unclear, which allows the developers to form a product at the same time as the customers get a better idea of what they truly need (Tonnquist, 2019; Schwaber and Sutherland, 2013). One of the most used agile methods is Scrum (Diebold and Dahlem, 2014). Scrum provides a different set of methods to manage product development. Scrum consists of a set of activities that are iterated in loops, or sprints, managed through daily meetings and reprioritisation (Boehm and Turner, 2005). This is summarised in three roles, four types of meetings and three artefacts (Ovensen, 2012). The Scrum cycle and its components are visualised in Figure 2.6.

Figure 2.6. Visualisation of the Scrum software development method (Ovensen, 2012, Figure 2.2).

2.2.3 Complex systems

In research, traditional product development is considered the development of physical products. Due to digitalisation, products have become more complex, with integrated technological parts and embedded systems combining electronics, mechanics, and software. A function is no longer set in one practice but in a system of many (Ullman, 2010). Developing physical products has its limitations in terms of physical resources, from test resources to manufacturing, which require time and money to allocate material from producers and suppliers. Digital tools, such as CAD and CAM, are available to test products without any physical materials, which allows a later decision point for setting the design. The design, however, needs to be set at some point to create a physical product. Integration and parallel work are a vital part of such systems, according to Ulrich and Eppinger (2012). The same authors visualise a complex system development approach, which is a development of the traditional, linear model, see Figure 2.7. In a complex system, the system-level design is critical as it lays the foundation for the following phases, such as decomposing the system into subsystems and components and assigning teams. This also requires an integration and test phase to combine the work of the parallel development (Ulrich and Eppinger, 2012).


There are a few studies on agile approaches integrated with more traditional models, so-called hybrids. Cooper and Sommer (2016) describe three cases of how Scrum has been adopted in traditional Stage-Gate models to various degrees, such as adopting the Scrum product development roles, artefacts and/or the iterative stages for the project team while keeping the gate structure for the projects.

2.3 Agile development

The term agile is an umbrella for definitions, methods, and practices. Agile can be considered a recent phenomenon, though agile aspects such as iterative and incremental practices have been found in use earlier (Abbas, Gravell and Willis, 2008; Measey, 2015). The Toyota Production System (Lean), Total Quality, and Design Thinking are frameworks from which agile thinking and its practices stem (Denning, 2018; Measey, 2015). The authors of the Agile Manifesto, Beck et al. (2001), summarised the agile values as:

• Individuals and interactions over processes and tools

• Working software over comprehensive documentation

• Customer collaboration over contract negotiation

• Responding to change over following a plan

To complement the values, twelve principles were also established (Beck et al., 2001); these can be found in Appendix A: The Agile Principles. In Diebold and Dahlem's (2014) study of agile practices in use, the most common and universally used agile practices were identified; these are shown in Figure 2.8.

Figure 2.8. Overall usage of agile practices from Diebold and Dahlem’s study (2014, Figure 2)

It is argued that agile is more of a mindset than frameworks and sets of tools and is therefore often described as agile thinking or an agile mindset (Measey, 2015; Denning, 2016). The agile mindset is accomplished through the agile components, which are visualised in Figure 2.9. The mindset is, however, required to efficiently adopt agile values, principles, practices, and tools (Measey, 2015). To adopt an agile mindset, strong leadership and management are required to accustom the organisation to the values and principles of agile. Agile thinking is a cultural aspect and adopting it takes time, which is why in younger, born-agile organisations it has been shown to be a criterion in recruitment (Denning, 2016).

Figure 2.9. The agile components (Measey, 2015, Figure 2.1)

There are studies on the implementation of agile methodologies outside the pure software domain and in combined settings of digital-physical development processes, with varying results and no optimal solution (Hendler and Boer, 2019). It has been found that software development in a combined setting adapts to physical development more easily and that the physical development approach is more dominant when the two are combined (Cordeiro et al., 2007; Eklund and Bosch, 2012). This is a result of the adaptiveness of agile development methods and can take the form of compensating for physical design errors (Hendler and Boer, 2019). Hendler (2018) proposed that in a setting where a stability-optimised physical and an adaptability-optimised digital development process are combined, either one must adjust to the other, i.e. become more stable or more adaptive, or both must change; in all cases, the performance of one or both subprocesses is affected. The same author proposes a combination of adjustments to minimise the negative effects of adapting to only one, together with effective communication and coordination between stakeholders (Hendler, 2018).

In Berger and Eklund's (2015) study on scaling agile outside software development, two main challenges were found: inflexibility in the test environment and the need for an open mindset towards agile principles in the existing organisational structure. The inflexibility of the test environment constrains the speed of feedback on new implementations and affects the cycle times. The different cycle times of physical and digital development practices make it difficult to find an optimal coordination frequency (Könnölä et al., 2016). Different suggestions for coordination mechanisms are found in research, such as cross-functionality (Könnölä et al., 2016; Eklund and Berger, 2017), early collaboration, understanding of design and schedule constraints, the cost of delay and the impact of uncertainty (Hendler, 2018), and platform development to speed up the cycle time of the physical development (Eklund and Berger, 2017). Further suggestions are to accept slower software development and incomplete components in the digital-physical prototypes, to minimise interdependencies between digital and physical development (Eklund and Berger, 2017), and to apply agile practices throughout the whole organisation (Könnölä et al., 2016), in order to align the digital process with the physical process cycles.

The designers' knowledge of possible solutions and technologies increases during the design process, but as the designers' knowledge increases, the design freedom decreases. This is called the design process paradox and is visualised in Figure 2.10. In the beginning, the design freedom is high as no design decisions have been made and few resources have been invested. However, as knowledge is gained, decisions are made and capital is invested, the design freedom disappears. According to Ullman (2010), this is why the goal is to acquire as much information as early in the physical design process as possible, thereby minimising the increasing costs in the later stages of the process. Yet, the early bindings that are common in physical development limit the knowledge creation and adaptability in digital development and consequently reduce the potential value of the digital-physical product (Hendler, 2018).


3 METHODOLOGY

In this chapter, the working process is described. This includes the methodology used to find information, collect data and transform it into the results presented in this report. The chapter starts by describing the research approach and the methods for data collection and analysis, and ends with a discussion of the validity and reliability of the study.

3.1 Research Approach

Lee (1999) suggests a qualitative approach for studying organisations' processes and structures. As this thesis studies one organisation's structure and way of working at an individual level, the data that need to be gathered are subjective to the employees and to how they perceive the studied topic. The data obtained are non-numerical in nature, and the study did not aim to quantify any data or results. Therefore, this study uses a qualitative research approach for data collection. With a qualitative approach to data collection, an inductive approach is needed for the analysis.

3.2 Data Collection

The data used in this study is primary data collected through interviews with the employees at Orexplore in Kista, and the theoretical framework was attained through a literature study. The following sections describe the data collection methods further.

3.2.1 Literature

A literature study was carried out before the interview study to gain the knowledge needed to create the interview guide, as well as to examine the existing body of knowledge within the subjects of process mapping, development processes, and the agile mindset. The literature comprised both printed and digital materials found through several search engines, as suggested by Wu et al. (2012). The search tools used in this study were Google Scholar, KTH Primo, and Scopus, with the following keywords:

process theory, process mapping, product development, product developing process, agile, agile practices, scaling agile, digital-physical development, agile hardware, traditional product development, Stage-Gate

Based on the literature found through the keyword search, backward and forward reference searches were conducted. If the search results were not satisfying, a Boolean search was conducted to find more specific results. The results from the searches were screened, first by reading the abstract and then by reading the whole source if it was found relevant.

3.2.2 Interviews

The data was collected through interviews conducted in two rounds: first a main interview study, followed by a smaller follow-up interview study. The main study consisted of 12 semi-structured interviews of between 40 and 75 minutes. One of the interviews was with management, three with the operations function, and seven with the software function; this reflects the organisational structure, hence the different number of interviews per function. The interviewees were full-time employees. The interviews were semi-structured, i.e. the interview guide acted as a guide for the interview but allowed the interviewees to talk more freely around the subject and on their own terms. The interview guide included a short introduction to the purpose of the study and the interview, as well as a description of the terminology used, and it was presented before each main interview. The "virtual walkthrough" method for data collection suggested by Ljungberg and Larsson (2001) for process mapping influenced the interview guide. In this method, the interviewees describe their part of the process step by step and explain each activity and its inputs and outputs. The interview guide was structured to allow the interviewees to describe the activities, inputs, outputs, resources, and information that influenced their work, which are the key components of a process (Ljungberg and Larsson, 2001). The first interview acted as a test of the interview guide, and small changes were made for the remaining interviews. The guide became shorter, as the first version had questions that had already been answered beforehand.

The follow-up study consisted of four interviews with interviewees from the first study. The goal of the follow-up study was to verify the results of the main study and to clarify uncertainties. The interviewees for the second study were selected based on their knowledge of the subject, as judged from the first study. This decision was made by the author to get the most out of the second interviews and to save time. The interviews were between 10 and 40 minutes. The follow-up interviews were also conducted with a semi-structured approach, and the interview guide was created after the answers from the first study had been summarised and analysed to find uncertainties. This interview guide was shorter and adapted to each interviewee to find specific information.

3.3 Method of Analysis

The analysis was intended to connect the collected data with the literature in Chapter 2, Frame of Reference, to find similarities and differences and to produce new thoughts and ideas that can benefit research within the studied field.

The transcribed interviews were coded using a method described by Attride-Stirling (2001). This method codes the text by categorising it into basic themes and then organising those themes. The themes were associated with concepts within process theory, perceived problems, and agile practices. The themes regarding process theory were used for the conceptualisation of the process map, while the other themes were used to analyse the agile adoption around the development process.

To define and map the development process, the method described by Ljungberg and Larsson (2001) was used. This method provides step-by-step instructions on how to create the map in a structured and efficient manner (Ljungberg and Larsson, 2001). It consists of the following eight steps:

1. Define the scope of the process, its purpose and when it starts and ends.
2. Brainstorm all activities in the process.
3. Sort and arrange the activities in the right order.
4. Merge and add activities.
5. Define inputs and outputs for each activity.
6. Check that each activity is connected through inputs/outputs.
7. Check that each activity is on the right and same level.
8. Analyse and correct the process descriptions until satisfied.

The process was defined and limited in discussion with the industrial supervisor. The examined process was decided to be the new product development process, which starts with a customer request and ends at the decision on series production. The data gained through the interviews and later coded was used for the brainstorming step, which was conducted by the author. The interviews could act as a brainstorming session since they were conducted in a semi-structured manner. Post-it notes were used to visualise the map of activities and dependencies. Additionally, a PDCA analysis was used to sort and arrange the different subprocesses and activities. PDCA stands for Plan, Do, Check and Act (Ljungberg and Larsson, 2001).

The tools used to visualise the process map and the figures in Chapter 4, Results and Analysis, were SmartArt in Microsoft Word and the online diagramming tool Lucidchart (Lucidchart.com).

3.4 Validity and Reliability

Given the qualitative nature of this study, there are subjectivities and biases to be considered. It can be argued that multiple interviewers increase the reliability of a study and minimise the subjectivity of the author (Collis and Hussey, 2009), though this was not possible in this case due to the number of researchers. To minimise the human errors that can occur with only one researcher, interview guides were made to keep a consistent approach to each interview. By using the scripted introduction about the study and the terminology in the interview guide, all interviewees received the same information and could therefore approach the subject equally.

All interviews were conducted in Swedish, and the main interviews were recorded and later transcribed. This approach allowed the author to be present during the interviews and provided reliability, as the interview answers could be revisited. The follow-up interviews can be considered more informal than the main interviews as they were not recorded. However, the interviews in the follow-up study were kept short, and the answers were documented during the interview to assure their reliability. In the analysis, the author has tried to identify value-laden formulations, and through the recordings the interviews can be revisited to verify and spot personal opinions. The aim was to conduct all interviews face to face, but one of the interviews in the main study had to be held over video.

The validity of this study can be discussed in terms of the possibility of generalising the findings (Sanders et al., 2016). Since this is a single case study with a qualitative data approach and a goal-oriented purpose, it is difficult to claim any generalisation of the results. However, the validity of the research was considered through the sample size of the interviews. Most employees were interviewed, which allows a broad data collection within the case company and validates the map.


4 RESULTS AND ANALYSIS

In the following chapter, the results obtained with the methods described in the previous chapter are compiled, analysed, and compared with the existing theory presented in the frame of reference chapter. The chapter starts with a description of the organisation and the organisational functions. It later transitions to the studied development process and ends with an analysis of how agile values and practices relate to the development process.

4.1 The organisation

The main product Orexplore offers its customers is core data. Two types of products are needed to provide the customers with the data: the X-ray machine and the visualisation tool Insight. The customers access their data through Insight. The cores are scanned either by the customers themselves, through a leasing contract for an X-ray machine, or at the office in Kista, where scanning is provided as a service.

The organisation has a flat structure with one level of management. The structure of the development at the company is visualised in Figure 4.1. The development has three branches in Kista and one in Perth, Australia, though the latter branch is not studied in this report. The branches in Sweden are the main part of the development and will be described in more detail in later sections. Everyone in the Stockholm branch is situated in the same office in Kista. The R&D Manager and the Managing Director have the main responsibility for the organisation and the development. The R&D Manager is the CTO, is responsible for the development and has the final say regarding technology. The Managing Director is responsible for the business development aspects and answers to the company owners and board. The Managing Director and the R&D Manager have the main contact with external stakeholders and customers. Together they are referred to as the management throughout the rest of the report.

Figure 4.1. The organizational structure

The organisational structure requires the employees to work outside their role descriptions. This is described as putting on the hats of different functions and is an effect of the company size. The organisation below the management level is self-organised under supervision. The branches organise themselves into working teams for each project; they do not have a team leader or assigned roles. The following sections describe the different branches with their main functions, responsibilities, activities, and resources.

4.1.1 Software – Insight

This function's main responsibility is the visualisation software tool, Insight. The main purpose of the development of Insight is to provide working software that visualises the data generated by the X-ray machines. Insight makes it possible for the users to interact with their data in 3D, which requires the development of the user interface and front-end design. The Insight function's main activities are specifying requirements, concept creation, implementation, testing and review, merging code, and release. The data used in the software tool is the data generated by the machine together with the algorithms created by the physics function.

All ideas, requests and problems regarding Insight are collected as issues in the source code management tool Gitlab. Gitlab is also the tool used to implement code for both Insight and the machine, and the teams use a Gitflow workflow. Gitlab allows traceability of work and monitoring of progress through checkboxes. Gitlab and Mattermost are used by this function for communication, the latter for fast communication through chat.

4.1.2 Software – Machine

The software for the machine can be divided into the core and the physics functions but is presented as one due to the similarities in activities. The main responsibility of the core function is the embedded software, i.e. the software and drivers that enable the X-ray machine to generate and collect data. The algorithms function uses this data and is responsible for creating the algorithms that analyse it. These two functions' main activities are concept assessment and creation, implementation, testing (mechanical and analytical), review, merge, and release.

The physics function also carries out activities that support the organisation with information, such as chemical lab comparisons, contact with the Swedish Radiation Safety Authority (Strålsäkerhetsmyndigheten, SSM), writing scientific articles and code reviews for other in-house software developers. The function provides the mechanical designers, i.e. the Operations function, with functional specifications and scientific input. The machine software function uses the same software as the Insight team for the same purposes, i.e. Gitlab and Mattermost. The function also needs an X-ray machine to perform tests on.

4.1.3 Operations – Mechanical Design

The mechanical design function is responsible for all development and design of physical products. The activities within this function are establishing specifications and concept development, mechanical design, design review, creating 2D and 3D documents, and testing, both computer-aided (digital test assembly and failure modes and effects analysis) and physical (physical test assembly).

This function needs information from the physics function regarding functional specifications and form factors (i.e. tolerances, materials, and geometric part specifications). Information regarding assembly, manufacturing and suppliers is given as input to this function by the Production and Sourcing function. This is exchanged through review meetings dedicated to the topics of Design for Assembly (DFA) and Design for Manufacturing (DFM). The mechanical designers have some contact with suppliers in order to develop the manufacturing knowledge needed to apply DFM to Orexplore's parts and products.


The mechanical designers use the CAD tool SolidWorks, Redmine to report issues, and Hermod for product data management, such as CAD models and 2D documentation. Every physical change to the machine needs to be added to the CAD model and the 2D documents in Hermod. For a change to be made on a machine, such as an upgrade or a new feature, the CAD model and 2D documentation for the change must be created before it can be implemented or tested. This is a principle they follow to minimise the development time. The operations function uses progress statuses, which are Work-In-Progress, Approved, Committed and Release. This subprocess starts at Work-In-Progress and ends at Committed. The studied process is considered to end once the status changes to Committed.

4.1.4 Operations – Production and Sourcing

The production and sourcing function is responsible for manufacturing, production, sourcing and ordering of the materials and tools needed for assembly. The term production refers to their in-house assembly line. Most of the manufacturing is outsourced to different suppliers, or off-the-shelf parts are used. This function acts as a filter between the company and the suppliers and distributes supplier knowledge to the rest of the organisation. This is done to ease the workload of the mechanical designers and to keep all supplier contact within one function. Other functions can support this dialogue when detailed information is needed.

The main activities within this function are supplier contact (i.e. finding suppliers, exchanging information, making orders, purchasing, and measurement protocols), designing the assembly order, assembly tests, writing instructions and developing the production line (i.e. tools, space). The function supports the mechanical design team through input during DFA and DFM meetings.

The output from this function is a manufacturing bill of materials, mBOM, which is a modified engineering bill of materials, eBOM, with the correct assembly order and additional working and assembly descriptions. The function uses Hermod for product data (2D documentation, CAD models, tolerances, article numbers) and Monitor for managing supplier and production data. All tests require a machine or module to test new parts on.
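To illustrate the relationship between the two bills of materials described above, the sketch below models an eBOM line being enriched with an assembly order and instructions to form an mBOM line. The field names and the example parts are invented for this illustration and do not reflect Orexplore's actual data model in Hermod or Monitor.

```python
from dataclasses import dataclass


@dataclass
class EbomLine:
    """Engineering view of a part: what the design consists of."""
    article_number: str
    description: str
    quantity: int


@dataclass
class MbomLine:
    """Manufacturing view of the same part: how it is assembled."""
    article_number: str
    description: str
    quantity: int
    assembly_step: int          # position in the assembly order
    assembly_instruction: str   # additional working/assembly description


def to_mbom(ebom: list[EbomLine], order: dict[str, tuple[int, str]]) -> list[MbomLine]:
    """Build an mBOM by annotating each eBOM line with its assembly step and instruction."""
    mbom = []
    for line in ebom:
        step, instruction = order[line.article_number]
        mbom.append(MbomLine(line.article_number, line.description,
                             line.quantity, step, instruction))
    return sorted(mbom, key=lambda m: m.assembly_step)


# Hypothetical example.
ebom = [EbomLine("A-100", "bracket", 2), EbomLine("A-101", "cover plate", 1)]
mbom = to_mbom(ebom, {"A-100": (1, "mount brackets to frame"),
                      "A-101": (2, "fit cover plate over brackets")})
```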

Electronics is a function of its own but is not considered in this study, as the electronics for the machine are mainly outsourced to a consulting firm. However, Orexplore has recently started to bring this function in-house. The transition starts with the development of a new electrical cabinet; the development of the electrical parts is still outsourced but is reviewed in-house. The communication with the external firm consists of daily meetings, and documents are shared on a wiki site. The documents are blueprints, smaller blocks of text and photographs of the interfaces. The physical parts of the electronics have been made by consultants, while the firmware is developed by the software function. Most of the electrical parts are already established, and new product development works around them.

4.2 The Development Process

The development process that allows the customer to access their core data is the main process of the case company. The products the company offers are continuously developed through new features, and this is the process for developing such a feature until it can be accessed by the customers through the data presented in Insight. This development process can be divided into two operative subprocesses: the development of the data, and the development of the visualisation tool. The first subprocess can in turn be divided into the development of the mechanical design, the core software, and the algorithms. All processes are visualised at a high level in Figure 4.2. It shows the order of work needed to create the final product, divided by function and ordered by the dependencies for finalising the product. In order to develop the data that is accessible to the customers, the development of the prior functions must be completed. Within the limitation of the studied process, the physical product can be considered finished when the Insight development starts. The operative software development starts once the mechanical design is set.

Figure 4.2. High level visualisation of the development process.

However, this is not a representation of how each function works throughout the whole development process. The figure does not show the dependencies between functions within each subprocess, such as information dependencies. Figure 4.3 was created to visualise the information dependencies; in it, each subprocess is presented as a separate lane alongside the others, with the shared phases System Level Planning and Deploy. The development of the mechanical design and the visualisation tool do not interact, and the data software development acts as the intermediary between the two processes, hence its placement in between. The following sections describe all the subprocesses visualised in the figure and the information dependencies.

Figure 4.3 Information dependencies between the subprocesses.

There are several ways for a development project to be initiated. External input from users and customers in the industry is collected through workshops and direct contact with the company's management. The physicists work with scientific research to find new technical opportunities and provide internal input. Technical ideas are tested both physically and through simulations. The market potential is assessed through industry contact. It is the management that decides which ideas should be developed. This is determined by which ideas create the most value for the customers and are therefore profitable. These ideas are new physical features or new data collection methods, and the chosen idea becomes the initial input for the development process, presented as an EPIC.

4.2.1 System level planning, SLP

All development starts with system level planning (SLP), which is shared across all functions. The input is the EPIC, which is the development idea and can be classified into projects. The activity of this subprocess is to assess the idea, find possible ways to approach the idea or problem, and consider how it fits today's products, for example through brainstorming. The SLP is executed through a meeting, usually with the machine software function attending, i.e. physics and core. In this stage, the development is planned and a system-level design is decided, such as the character of the change, i.e. physical or digital, and how data is saved and stored. The EPIC is broken down into smaller issues, all of which need to be deployed before a new feature is considered finished. The issues are written in Gitlab and Redmine. The issues are assigned to an appropriate employee or function, or claimed by an interested party. Teams self-organise around the tasks. Basic specifications and requirements, concepts and system constraints are established in the SLP and act as the output of the activities. The specifications are usually described as a wanted function, e.g. faster scanning of cores or depth adjustment of an already scanned core, but the idea of how to do it is conveyed either through the meeting or explained ad hoc afterwards.
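As a small illustration of the breakdown described above, the sketch below models an EPIC being split into issues that must all be deployed before the feature is considered finished. The class names, fields and example issues are invented for this illustration and are not taken from Orexplore's Gitlab or Redmine.

```python
from dataclasses import dataclass, field


@dataclass
class Issue:
    """A smaller piece of work derived from an EPIC, tracked in Gitlab or Redmine."""
    title: str
    function: str        # e.g. "mechanical design", "core", "algorithms", "Insight"
    deployed: bool = False


@dataclass
class Epic:
    """A development idea that enters the process at system level planning (SLP)."""
    description: str
    issues: list[Issue] = field(default_factory=list)

    def is_finished(self) -> bool:
        # A new feature is considered finished only when every issue is deployed.
        return bool(self.issues) and all(issue.deployed for issue in self.issues)


# Hypothetical EPIC broken down at an SLP meeting.
epic = Epic("Faster scanning of cores")
epic.issues = [
    Issue("Redesign sample holder", "mechanical design"),
    Issue("Update scan driver timing", "core"),
    Issue("Recalibrate analysis algorithm", "algorithms"),
]
assert not epic.is_finished()
```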

4.2.2 The Data Development

The following section describes the process for developing data when a physical change is needed to generate new scanning data. This corresponds to the first three fields in Figure 4.2. If only a digital change is needed, i.e. the required data is already possible to generate, then the first phase can be skipped. The development process for the data is more complex because the data the customers will access is generated by the machine and can therefore not be developed by one function alone, thus resulting in a digital-physical development. Many activities are executed in parallel, and the functions work with varying intensity during the whole process. Each function works in an iterative manner, and the activities can be summarised as specifying requirements, design and implementation, test, and review. See Figure 4.4.

Figure 4.4. Iterative activities.



The development process for new data generation that requires a physical change is described in terms of activities and information and is visualised in Appendix B: Data Development Process; this process is shared by the operations and the machine software functions. The Operations function's activities are presented in Figure 4.5.

Figure 4.5. Operations functions workflow in development.

The physical development process initiates after the SLP and starts with a meeting between the mechanical designers and the machine software function to further specify the requirements for the physical part. The machine software function supports the mechanical designers with technical and scientific input, so that the mechanical designers know the important design parameters. The communication after this meeting is ad hoc or through email if more input is needed. After the first meeting, the mechanical designers work on a concept, which is reviewed by the operations function through meetings, i.e. DFA and DFM meetings. The concept development is iterated until it is found satisfactory by the involved functions.

The concept is the input for production to create a preliminary mBOM and to contact suppliers and manufacturers. This is done to start the manufacturing process at the suppliers and to get feedback regarding the design. The feedback is given to the mechanical designers and, if needed, changes are made. Once the concept is accepted by all parties, the status is changed to Approved. From this point, the eBOM and a purchase order are sent to the suppliers for manufacturing. The output from the manufacturers is the physical product and a measurement protocol to assure manufacturing quality and traceability. Once the output from the manufacturers is approved, the new part is tested on the existing machine, i.e. physical test assembly. The physical assembly tests are performed by the operations function. The tests assure the quality of the design and manufacturing, and thereby verify the initial DFA assumptions. If the results are satisfying, the physical part can be considered finished and the mechanical designers' work can be put on hold. The final eBOM is given to the Production and Sourcing function. Once the part is assembled, the software function can start, as presented in Figure 4.6. The core part of the software function develops drivers, if needed, in a similar procedure to that shown in Figure 4.4.


Figure 4.6. The machine software function's workflow.

The tests performed here verify the mechatronics and assure the quality of the generated data. The tests validate the physical design, and the work is iterated back to the mechanical designers if an acceptable result is not found and cannot be achieved by implementing new code. This iteration depends on the character of the test result and can go back to the start, i.e. the concept generation, if needed. After the hardware tests, the development of the algorithms can start, in the same manner as for the core software. Once the right data is generated and analysed, the mechanical part can be considered finished and the status is changed to Committed. Figure 4.7 is created to visualise how the information flows within the workflow and tests. The information is in the form of feedback on the mechanical design; Hardware Test refers to the software drivers that let the machine generate data, and Data Test refers to the algorithm tests.

Figure 4.7. Information flow for physical development.

The production and sourcing function is involved in the creation of the eBOM by giving the mechanical designers input regarding assembly, so that the design meets the DFA criteria. From the preliminary eBOM, production can create a preliminary assembly order. This order can later be changed after the assembly test, when the new parts are delivered and assembly issues are found, as well as later when the new feature is finished after development. This function acts as a support process to the mechanical designers during their development process and finalises the product by ordering, assembling and installing the new part. The status changes to Release when suppliers have been found for all involved parts and the product is considered ready for serial production.
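Taken together, the paragraphs above describe a small status lifecycle for a new part: the concept is iterated until it is Approved, it becomes Committed once the generated data is accepted, and it reaches Release when suppliers are found for all parts. The sketch below expresses that lifecycle as a minimal state machine; the transition rules are the author's interpretation of the mapped process, not a formal specification used by the company.

```python
from enum import Enum, auto

class PartStatus(Enum):
    CONCEPT = auto()
    APPROVED = auto()   # design accepted by all parties; eBOM and purchase order sent
    COMMITTED = auto()  # the machine generates and analyses the right data
    RELEASE = auto()    # suppliers found for all parts; ready for serial production

# Allowed forward transitions, as interpreted from the mapped process.
_TRANSITIONS = {
    PartStatus.CONCEPT: PartStatus.APPROVED,
    PartStatus.APPROVED: PartStatus.COMMITTED,
    PartStatus.COMMITTED: PartStatus.RELEASE,
}

def advance(status: PartStatus) -> PartStatus:
    """Move a part one step forward in the lifecycle; raise if it is already released."""
    try:
        return _TRANSITIONS[status]
    except KeyError:
        raise ValueError(f"No further transition from {status.name}") from None
```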

If there is no need for a physical change to the machine, the development of new data can consist only of the process shown in Figure 4.6. If the data already exists, only the algorithms subfunction is needed to make sense of the data. Both subfunctions share similarities in activities and dependencies, and the development needs to consider the existing design. The input to these processes is the new part and the machine driver respectively, and the specifications have either travelled with the product or come from the SLP. There is no external input to the software development; it relies on the internal knowledge of the developers.


Before the new code is merged into the main git branch and the software development is finished, it is reviewed by two other developers within the organisation and passes with a "Looks good to me" or "LGTM" message. The newly developed code is deployed to the machines, and the deployment is planned together with the customers and working partners to fit their scanning schedule. This is the final step in the development process, i.e. Deploy.
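The review rule described above, where two other developers must approve a change with an "LGTM" before it reaches the main branch, can be expressed as a simple merge gate. The sketch below illustrates that policy only; the marker strings and the threshold of two approvals are taken from the text, while everything else is a hypothetical construction rather than the company's actual tooling.

```python
REQUIRED_APPROVALS = 2
LGTM_MARKERS = ("looks good to me", "lgtm")

def ready_to_merge(author: str, review_comments: dict[str, str]) -> bool:
    """Return True when at least two developers other than the author have left an LGTM."""
    approvals = sum(
        1
        for reviewer, comment in review_comments.items()
        if reviewer != author and comment.strip().lower() in LGTM_MARKERS
    )
    return approvals >= REQUIRED_APPROVALS

# One approval is not enough; the change stays out of the main branch.
print(ready_to_merge("dev_a", {"dev_b": "LGTM"}))                               # False
print(ready_to_merge("dev_a", {"dev_b": "LGTM", "dev_c": "Looks good to me"}))  # True
```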

4.2.3 The Insight Development

The development of the visualisation tool is seen as a development process separate from the rest, as it operates on the data developed in the previously described process. Most commonly, the data is fully developed by the algorithms function before it is delivered to this process. The Insight development is initiated if a new feature is needed in Insight, either because new data has been generated or because of a market request. One input is the market input from management, which consists of what type of data should be presented in Insight. User input is collected from the internal geologists and partner projects and contains user experience. The input is collected and assessed in an Insight meeting to specify the requirements. Customers can use the customer portal Zendesk to report issues or ideas, though it is not well established today, mainly because of the small customer base of around 20-50 users. User information and input are also gained from the in-house users, such as the physicists. Today, the main user-experience input comes from one geologist, who also sits on the company's board and provides suggestions either in physical meetings or through email. The subprocess is visualised at a high level in Figure 4.8, together with the information inputs. This function uses two branches in Gitlab: one master branch, which is the one the users use, and a development branch, which is later merged into the master branch.

Figure 4.8. The Development Process of the Visualisation Tool
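The two-branch setup mentioned above, with a master branch that the users run and a development branch that is later merged into it, is a common Git pattern. As a hedged illustration, the sketch below shows one way such a merge could be carried out; the branch names follow the text, but the commands, remote name and repository layout are assumptions and do not describe the company's actual routine.

```python
import subprocess

def merge_development_into_master(repo_path: str) -> None:
    """Illustrative only: merge the development branch into master and push the result."""
    def git(*args: str) -> None:
        subprocess.run(["git", "-C", repo_path, *args], check=True)

    git("checkout", "master")
    git("pull", "origin", "master")         # make sure master is up to date
    git("merge", "--no-ff", "development")  # bring in the finished development work
    git("push", "origin", "master")         # the users run the updated master branch
```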

4.3 Analysis

The following section presents findings from the research that relate to agile practices. The findings were identified through an analysis of the interviews in which, according to the author, agile practices were mentioned or implied. The areas were grouped around the agile values, principles and practices referred to in 2. Frame-of-Reference. The names of the following sections are influenced by the agile values.

4.3.1 Structure

The organisational structure and location were mentioned in interviews to provide a holistic awareness of the work that is going on and to enable fast communication. The employees work in cross-functional teams when digital-physical development is needed and have a high level of self-governance. The self-governance can be seen in the creation of working teams and in the making of design decisions. The size and structure of the organisation were suggested by an interviewee to be success factors for their way of working, but the interviewee also saw potential issues if the organisation were to grow. Problems that occur are easily managed today through the fast communication and the knowledge of what everyone else is working on.

The organisational structure depends on the individual initiative of the employees, which was mentioned as an asset considered in new employments. The individual drive applies to the level of done, i.e. the qualification to move on in the process, and to when and how activities and tasks are carried out. All levels are set individually, which allows different levels between workers, but this is controlled through code-review principles for the software development and through concept meetings, such as DFA and DFM, for the mechanical designers. The code review requires verification by two people, and a status change is decided at the meeting.

The development contains fragments of Scrum, such as the terminology of backlog, Roadmap and EPIC, and practices like backlog meetings. The company used to follow a Scrum framework but has applied it more lightly as the organisation grew. In the company's beginnings, most development was outsourced to consultant firms, except for the development of the software tool and the algorithms. The company started employing new functions when it found it necessary to have the functional knowledge in-house.

4.3.2 Artefacts

The documentation is minimal in the case company's development process and can be summarised as code, 3D models and 2D documents (BOM). This is collected in Gitlab, Hermod and Monitor. The operations function has policies about working front-heavy and creating the 3D and 2D basis before testing. This process was mentioned to cut time from the operations cycle, as the front-heavy way of working lets the manufacturers get involved earlier. The platforms for documentation differ between the functions, as some information is in Gitlab, Redmine, Hermod or Monitor and other information in individual emails. This was described as making it difficult to trace activities and issues to parts of the product. Each function usually does not visit the platforms it does not work on.

The minimal documentation allows more time for development and creation, which was mentioned as a positive aspect of the workplace, i.e. that it is fun to work there. Most days of the week, the employees can continue their work throughout the whole day without meetings. This was thought to support creativity, as more time can be put into the development, allowing an efficient workflow. It was expressed that the minimal documentation and the lack of formal processes and practices have caused problems, but that it was worth it because of the creativity and motivation gained.

Decisions are rarely documented, as they are deployed in the product right away. It was mentioned that this information sometimes could be forgotten but could always be found somewhere on the different platforms or by asking people around. It was also mentioned that reviews were occasionally forgotten and delayed because the developers had started on a new task and had not reminded the reviewers. The same applies to the introduction platform for new physical parts that need to be tested.

4.3.3 Quality

The finalised products are not tested on customers before they are released. The customer requests are set at the beginning, in the SLP phase, together with the customers and end-users. Many interviewees explained that they rarely had contact with the customers or worked in direct user
