
Towards Zero Defects

in the Aerospace Industry

through Statistical Process Control

A Case Study at GKN Aerospace Engine Systems

Hugo Andrén

Industrial and Management Engineering, master's level 2020

Luleå University of Technology


Acknowledgements

Among those who have helped me along the way, making it possible for me to conduct this master's thesis and complete my degree in engineering, several people have my gratitude. I owe special thanks and appreciation to Sören Knuts, my supervisor and representative at GKN Aerospace Trollhättan. Without Sören's tireless interest and efforts in pointing me in the right direction, this thesis would not have been possible. Second, I would like to thank Professor Erik Vanhatalo at Luleå University of Technology for supervising my work and providing invaluable feedback through an academic lens. I would also like to thank my friend and opponent Albert Stenbäck Juhrich for his feedback throughout the duration of the work. Together, Sören, Erik, and Albert have helped me enhance both the relevance and the quality of my thesis. Lastly, I would like to thank all the people who participated in the interviews, enabling me to collect the qualitative data that I see as a cornerstone of the thesis, and of course GKN Aerospace Trollhättan for letting me conduct my thesis despite the outbreak of the COVID-19 pandemic.


Abstract

With the ongoing transformation of modern manufacturing systems in an Industry 4.0 environment, industrial actors may see great improvements in quality towards a state of near zero defects. For the aerospace industry, where increased quality and reduced risk are strongly related, new manufacturing technologies may be used to meet the increasing demands on products. The safety requirements, as well as the manufacturing complexity of products and processes, make the collected measurement data an integral asset for enterprises within the aerospace industry. Collected data may be analysed using statistical tools and methods to improve process capability and, by extension, product quality. Communicating the need for zero defects, original equipment manufacturers demand increased capability from product and component manufacturers. Hence, zero defects is typically operationalised as a process capability of Cpk = 2.0. In response to this challenge, GKN Aerospace needs to raise the traditional process capability target of Cpk = 1.33.

By employing an exploratory research strategy with a deductive approach, the thesis combines theoretical knowledge from the literature with empirical findings in a thematic analysis. The thematic analysis was conducted in six phases, as suggested by Braun and Clarke (2006), and resulted in the identification of 50 codes from a total of 459 data extracts. Based on the empirical interview data, a framework for how zero defects is interpreted at GKN Aerospace was developed, which describes zero defects as a cycle. Given that zero defects is operationalised through Cpk = 2.0, the cycle consists of six phases that start with a vision and are completed by delivering a true and reliable Cpk of 2.0. In addition, the codes from the thematic analysis were collated into a thematic mind map focusing on key aspects of working with statistical process control (SPC) to support zero defects. Two main themes are presented in the mind map: a statistical approach to improvement work, highlighting necessary aspects of statistical process control and measurability, and removing barriers to improvement, highlighting fundamental organisational barriers that impede proactive quality improvement.

To support the findings and give a practical example of how process data may be presented and analysed using tools and methods within statistical process control, an SPC study was conducted on a set of data. In the SPC study, the construction and analysis of individuals Shewhart control charts and moving range charts were described in detail. These procedures provide better insight into process behaviour through statistical thinking and thus better knowledge of how to approach more proactive process improvements.

KEYWORDS: Zero Defects, Statistical Process Control, Quality Management, Industry 4.0, Aerospace Industry.


Abbreviations

Abbreviation  Meaning

AI    Artificial Intelligence
CPS   Cyber Physical Systems
GAS   GKN Aerospace Sweden, Trollhättan
IoT   Internet of Things
KC    Key Characteristic
KPI   Key Performance Index
SPC   Statistical Process Control
ZD    Zero Defects
ZDM   Zero Defects Manufacturing
QSYS  Internal database of process data at GAS


Contents

1 Introduction . . . 1

1.1 Background . . . 1

1.2 GKN Aerospace Engine Systems . . . 3

1.3 Problem Discussion . . . 3

1.4 Aim . . . 5

1.5 Delimitations . . . 5

2 Literature Overview . . . 6

2.1 Industry 4.0 . . . 6

2.2 A Review on Quality within Industry . . . 6

2.3 Statistical Process Control . . . 8

2.4 Zero Defects . . . 11

2.5 Organisational Implications . . . 14

3 Methodology . . . 16

3.1 Research Approach . . . 16

3.2 Literature Overview . . . 17

3.3 Interviews . . . 18

3.4 Thematic Analysis . . . 19

3.5 SPC Study . . . 22

3.6 Research Quality . . . 23

4 Results and Analysis . . . 25

4.1 Zero Defects at GKN Aerospace . . . 25

4.2 Thematic Analysis . . . 33

5 SPC Study . . . 42

5.1 Current Process Control . . . 42

5.2 Distribution Fitting . . . 43

5.3 Control Charts and Analysis . . . 45

5.4 Capability Study . . . 48

6 Findings and Recommendations . . . 49

7 Discussion . . . 51

8 References . . . 52

Appendix A Interview Guide . . . i


1 Introduction

The introduction provides a background on zero defects and quality management, as well as how Industry 4.0 is transforming modern manufacturing systems and affecting quality management practices. What follows is a short introduction to GKN Aerospace Engine Systems, a problem discussion, and finally the aim of the thesis, the research questions, and the delimitations.

1.1 Background

Quality is of critical concern for organisations in order to meet customer expectations and counter the threat of competitors' products (Dogan & Gurcan, 2018). Foidl and Felderer (2015) argue that increasing customer requirements and competitiveness make quality management an essential prerequisite and key to sustained economic performance. For original equipment manufacturers (OEMs) supplying products that are subject to meticulous safety requirements, the concern for product quality is amplified. The aerospace industry is an example where the safety of products is of great concern. Within the aerospace industry, the quality of a product is measured by data on multiple geometric specifications, and the quality of the manufacturing processes is characterised by process data sets (Wang, 2013). Quality can be referred to as the ability to reduce variation to the point where customer expectations are met or even exceeded. Within industrial manufacturing, however, defects are a fairly tangible way to measure quality. Defects are described by Montgomery (2012) as ”nonconformities that are serious enough to significantly affect the safe and effective use of the product” (p. 9). Being affected by facilities, equipment, and manufacturing processes, the quality of a product is subject to several sources that have the potential to cause errors that generate defects (Wang, 2013). To supply an organisation with the right prerequisites for enhancing quality monitoring and optimisation, Wang (2013) suggests shifting the focus from product data to process data.

Zero defects (ZD) is a concept practised within manufacturing for the purpose of minimising defects in a process by doing things right from the very beginning, ultimately aiming for zero defective products (Wang, 2013). According to Tatipala et al. (2018), it is the increasing demands on produced parts that have led to the escalating importance of zero defects in manufacturing. Although the concept can be traced back to the 1960s (Montgomery, 2012), it was during the 1990s that ZD saw wide implementation efforts, when automotive companies wanted to cut costs by reducing quality inspections while simultaneously increasing the quality demands on suppliers. Wang (2013) addresses the necessity of a system for zero defect manufacturing (ZDM) to prevent failures and increase reliability and safety for manufacturing industries. Tatipala et al. (2018) suggest that part of such a system is the ability to control product and process parameters with the use of connected manufacturing technologies and control systems that handle machine and other process data. Such systems, however, require the ability to collect and handle large amounts of data, supported by advanced and reliable internet, IT, and other technologies. These technologies have only recently become advanced and reliable enough to support the required scalability within industrial manufacturing systems. In 2014, Lasi, Fettke, Kemper, Feld, and Hoffmann (2014) anticipated that the industrial community was soon to experience a new paradigm. It was expected that ”smart” machinery and products would emerge as a result of recent advances in digitalisation within industries combined with internet technologies (Lasi et al., 2014). The German government started preparing for a fourth industrial revolution and coined the term ”Industrie 4.0”.

They expected that in the near future industrial actors could have the economic benefits of mass production while producing single products in a batch size of one. Producing with a high degree of customisation was to be made feasible by manufacturing systems where products control their own manufacturing (Foidl & Felderer, 2015; Lasi et al., 2014; Rüßmann et al., 2015). Lasi et al. (2014) described Industry 4.0 as a future project defined by two directions of development: application-pull, triggered by social, political, and economic changes, and technology-push. The push for new technology can take different forms, one of them being a result of digitalisation and networking. According to Lasi et al. (2014), the increasing digitalisation of manufacturing systems provides increasing amounts of data that may be used to control and analyse industrial processes. However, analysing massive sets of raw data is a major challenge for ZDM, as stressed by Wang (2013). Sall (2018) addresses the fact that many manufacturing processes are so complex that there are simply too many steps and responses to measure, highlighting the importance of finding out which processes need monitoring and which ones to leave out. In doing so, Sall (2018) suggests that statistical process control (SPC) may be used to approach the problem through three areas:

• Process Change: concerning methods for examining how engineering changes result in different process variable changes.

• Process Stability: concerning methods for identifying unstable processes and how these shift over time. Quality engineers need to find patterns across multiple process variables such as simultaneous shifts within or between systems.

• Process Capability: concerning ways of determining which processes are meeting customer specifications.

Resource-intensive compensation strategies that focus on rectifying quality post production have led both practitioners and researchers to study methodologies that simultaneously improve product quality and process capability (Magnanini et al., 2019). Being a structured and scientific collection of improvement tools that provide a statistical approach to process improvements, SPC has traditionally been considered a good option for identifying nonconforming products and process deviations through the identification of special (or assignable) causes of variability for a given quality characteristic (Eleftheriadis & Myklebust, 2016). Quality or key characteristics (KCs) are variables or attributes whose variation significantly affects product fit, form, function, performance, or producibility (SAE International, 2018a).

A common method for achieving higher quality is the use of sigma limits. The Greek letter sigma (σ) is often used as a statistical term for measuring how much a process deviates from its target, described in multiples of standard deviations, where a margin of six standard deviations reduces manufacturing defects to 3.4 parts per million (ppm) (Dogan & Gurcan, 2018). Products consisting of many components tend to provide many opportunities for defects or other failures along the different process stages (Arsuaga Berrueta et al., 2012; Magnanini et al., 2019), not least as a result of variation (Montgomery, 2012). In order to meet strict demands on such products, the six sigma concept was developed in the 1980s by Motorola (Montgomery, 2012). The six sigma concept seeks to reduce the variability of a given process until the specification limits lie six standard deviations from the process mean. According to Gultom and Wibisono (2019), six sigma seeks to reduce process variation and increase process capability by aiming for zero defects. However, having limitations in dealing with complex data sets, six sigma approaches are not sufficient to reach ZDM (Wang, 2013).
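The arithmetic behind the 3.4 ppm figure can be made concrete: it is the upper-tail probability of a normal distribution at six standard deviations, combined with the conventional allowance for a 1.5-sigma drift of the process mean. A minimal sketch (the function name and the one-sided-tail convention are illustrative assumptions, not taken from the thesis):

```python
import math

def defect_ppm(sigma_level, shift=1.5):
    """Approximate defects per million for a process whose specification
    limit sits `sigma_level` standard deviations from target, allowing the
    mean to drift `shift` standard deviations (one-sided tail, the
    convention behind the well-known 3.4 ppm figure)."""
    z = sigma_level - shift
    tail = 0.5 * math.erfc(z / math.sqrt(2))  # P(Z > z) for a standard normal
    return tail * 1e6

print(round(defect_ppm(6.0), 1))  # about 3.4 ppm for a "six sigma" process
```

With no drift (`shift=0`), the same six-sigma margin corresponds to roughly 0.001 ppm, which is why the 1.5-sigma allowance matters so much to the quoted figure.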

Within the aerospace industry, quality is often defined as conformance to specifications (Rolls Royce, 2016). It is thus customary to describe zero defects in terms of the capability of a given process (AESQ, 2020), implying that a capable process is bound to produce non-defective components or products. Finding frequent application, the process capability ratio Cp can be used to evaluate the potential capability of process variables, but it does not supply any information about why a process is not meeting specifications (Montgomery, 2012; Sall, 2018). To measure the actual capability, Cpk can be used and compared to Cp to account for the centring of the process (Montgomery, 2012). In addition, Sall (2018) suggests that Ppk describes how the process actually performs, whereas Cpk describes how well it could perform if process drift issues were fixed. The difference between these measures is whether a long-run classical estimate of the standard deviation is used (Ppk) or a short-run local estimate (Cpk) (Sall, 2018). On the contrary, Montgomery (2012) points out that Pp and Ppk were developed for describing processes that are not in statistical control, with the result that these indices essentially tell you nothing. Montgomery (2012) further claims that all process capability ratios (including Cp and Cpk) should be used with great care, stating that they describe complex circumstances in a too simplified way. Thus, producing products that are altogether free from defects may not be possible solely by utilising tools and methods within SPC. Therefore, in order to make certain that nonconforming products are identified so that defects are not delivered to customers, inspection must be present.

1.2 GKN Aerospace Engine Systems

As part of GKN Aerospace, Engine Systems is a division manufacturing aircraft engines, engine components, space rockets, and stationary gas turbines. In addition, the division provides maintenance work on aircraft engines for military use (GKN, 2020a). GKN Aerospace is an important actor on the market, responsible for the production and maintenance of engines for JAS 39 Gripen. Having specialised in specific products, GKN Aerospace has components present in over 90% of new larger commercial aircraft (GKN, 2020a). GKN Aerospace Sweden (GAS) has a manufacturing plant with workshops and national offices located in Trollhättan. Because the time for developing and manufacturing products and components is limited at GAS, the time available for analysing process capability is also limited within the organisation. GAS has started the journey towards an Industry 4.0 environment, having equipped new manufacturing processes with sensors for controlling and measuring process outcomes (GKN, 2020b). Compared to processes requiring more manual operations, as well as semi-automated processes, these new sensor-equipped processes have resulted in increasing amounts of data being collected.

1.3 Problem Discussion

Ideally, any manufacturing enterprise would prefer to produce products completely free from defects. Generating no defects whatsoever is, however, rarely the case, since all systems and processes are inevitably subject to some variation (Montgomery, 2012). Employing six sigma limits and having 3.4 defective products for every one million produced is in many regards considered good performance. However, depending on the number of components in a product or the number of process steps, one may still end up producing several defective products (Eleftheriadis & Myklebust, 2016; Montgomery, 2012). It is also necessary to point out that some industries simply cannot allow any defects at all, because the final products are used in high-performing systems or in situations where defective components imply catastrophic outcomes such as putting lives at stake. The aerospace industry is such an industry, where the quality of the end product is heavily dependent on inspection of products and components. From a quality engineering perspective, controlling and monitoring processes to the point where zero defects is achieved is therefore of interest. Yet it seems insufficient to employ the traditional six sigma limits. Although both Dogan and Gurcan (2018) and Gultom and Wibisono (2019) suggest that the aim of six sigma limits is to move towards zero defects, Montgomery (2012) simply points out that the goal is a 3.4 ppm defect level. Attaining a Cpk equal to 2.0 is often viewed as having zero defects (Martens, 2011), but a Cpk = 2.0 implies the employment of six sigma limits, which only delivers a defect level of 3.4 per million units produced (Dogan & Gurcan, 2018; Martens, 2011). It follows that one cannot expect a guarantee of zero defects solely by employing six sigma limits or achieving Cpk = 2.0. From a quality perspective, it is therefore necessary to establish how zero defects should be quantified, interpreted, and pursued in an organisation such as GAS.


Since no clear or quantifiable definition has been found in the literature, ZD may be considered nothing more nor less than what its name implies, i.e. 100% non-defective. Thus, ZD may be quantified as a defect rate of 0 ppm. Even though it is not synonymous with ZD, operationalising through Cpk = 2.0 is considered a reasonable trade-off within the industry, where it is expected to give a maximum of 3.4 defective products for every one million produced. Given that the goal of ZD in reality is operationalised through Cpk, GAS needs to follow strict product specifications to achieve the required quality. The capability index is enhanced by eliminating excess variability and maintaining a random fluctuation around the mean of each process (Montgomery, 2012). Capability studies are a well-established part of SPC methodology. Since the only terms capable of describing variability are statistical, Montgomery (2012) claims that statistical methods are key to improving quality. Foidl and Felderer (2015) stressed that several elements of Industry 4.0 may provide great benefits for quality management. It was, however, discovered by the authors that one of the main challenges following the technological advance is the immense amount of information that needs filtering and processing (Foidl & Felderer, 2015). As data collection within manufacturing industries increases, larger amounts and more types of measurements follow, which can be used to evaluate processes. According to Sall (2018), such amounts quickly become too much to handle from a quality perspective.

In combination, considering requirements from customers and on safety together with the ongoing transformation, it is of interest to study how customer demands on ZD are interpreted within GKN Aerospace Engine Systems. Traditionally, ZD has been described in terms of a Cpk equal to 2.0, but from an organisational perspective it is necessary to evaluate whether other SPC measures, such as Ppk, Pp, Cp, or a stability index, can be used to quantify and describe zero defects. For GKN Aerospace, having processes that are mainly operated manually, semi-automated processes, as well as sensor-equipped automated processes, an efficient way of collecting and monitoring process data continuously is paramount for meeting the customer demand of ZD and preparing for a future Industry 4.0 environment. Figure 1 illustrates a framework in which different actors, systems, concepts, and methods may describe and/or interpret the concept of zero defects in different ways.


In the literature, ZD has been described in various ways over time. The existing literature has mainly focused on managerial aspects of achieving the outcome. Emerging literature predicts that ZD might be fully realisable with the right technology in Industry 4.0, where big data and new technologies such as artificial intelligence (AI), machine learning, and cyber physical systems (CPS) can enable smarter manufacturing by interconnecting the physical world with digital systems. Another aspect is the customers, who continuously demand better and better products and often have differing demands. GKN Aerospace must try to meet these demands by providing the proper training, management, and a culture where work is done correctly, to make sure no defective products are delivered to customers. Focusing on defects, SPC plays a central part in explaining what defects are in terms of variation and thus what ZD actually means in statistical terms.

1.4 Aim

The aim of the master’s thesis is to investigate how methods for statistical process control (SPC) can support zero defects within manufacturing processes at GKN Aerospace Trollhättan. To support the above aim, the following research questions have been formulated:

• RQ1: How is ZD interpreted at GKN Aerospace Sweden?

• RQ2: How can tools within SPC support the work towards ZD?

• RQ3: How should tools and methods for SPC be used to approach ZD in complex/modern manufacturing systems?

1.5 Delimitations

There are, according to Wang (2013), two reasons for pursuing products and processes of zero defects: safety and customer expectations. This thesis will focus on the latter, looking at how customer requirements on zero defects can be secured using tools and methods within SPC. The main focus of the thesis will not concern the details of Industry 4.0. However, since GAS and the industrial community at large are preparing to operate in such an environment in the near future, this thesis and the recommendations made herein have been developed with Industry 4.0 in consideration.


2 Literature Overview

The following chapter is a literature overview treating different areas that help describe the topic of the master’s thesis. The literature overview will later be used for analysing the empirical data in this report. The main sections of the chapter concern Industry 4.0, a brief review of quality management within industry, statistical process control, zero defects, and managerial implications of improving quality towards ZD.

2.1 Industry 4.0

With roots stemming from a project undertaken by the Federal Ministry of Education and Research in Germany, Industry 4.0 is a term denoting what has become commonly known as the fourth industrial revolution (Lasi et al., 2014). The new paradigm within the industrial community will be powered by technological advances, mainly within digitalisation: a broader digitalisation of manufacturing industries that, together with internet technologies, allows factories, machinery, processes, and products to communicate with one another. According to Lasi et al. (2014), industrial actors, if capable of setting up such systems, can expect the economic benefits of mass production even while maintaining a high degree of product and process customisation in small-batch production (Foidl & Felderer, 2015; Lasi et al., 2014; Rüßmann et al., 2015). Lasi et al. (2014) defined Industry 4.0 by two directions of development: application-pull, triggered by social, political, and economic changes, and technology-push. The push for new technology can take different forms, one of the most important being a result of digitalisation and networking. According to Lasi et al. (2014), the increasing digitalisation of manufacturing systems provides increasing amounts of data that can be used in efforts to control and analyse industrial processes.

Ferreira et al. (2018) consider Industry 4.0 an advanced manufacturing initiative focused on achieving zero defective products throughout manufacturing processes, implying that ZDM is feasible once manufacturing industries have developed environments of advanced and interconnected technologies. In such environments, enabled by new technologies, devices, innovations, and contexts, humans can act on and drive the production of data. The result, according to Ferreira et al. (2018), is a quick and radical change to the perspective and the capacity we have to deal with both known and unknown data. In order to do so, the collection of data must be planned to meet requirements and enable engineers to analyse and control the variation, stability, and capability of processes (SAE International, 2018a). A capacity to better and more efficiently discover and use data should be viewed as a key add-on to emerging decision systems, according to Ferreira et al. (2018).

Aircraft, machine tools, and other high-technology industries depend largely on manufacturing processes that are safe, efficient, and adaptive. The development of products and processes within these industries is therefore vital, especially in the aerospace industry, considering its need for cutting-edge technology and its ability to support growth in Europe (Eleftheriadis & Myklebust, 2016). Further, production within the aerospace industry must continually improve to ensure reliable and safe products that meet or exceed customer and regulatory requirements (SAE International, 2018a).

2.2 A Review on Quality within Industry

The quality of a product or service may be defined in different ways using different perspectives (Bergman & Klefsjö, 2012). A traditional definition of quality, based on the somewhat narrow viewpoint that products and services need to meet user requirements, is fitness for use, as suggested by Juran (1999). Montgomery (2012) suggests that there are two aspects to that definition: quality of design and quality of conformance.


Montgomery (2012) stresses that the definition suggested by Juran (1999) has become more associated with the latter, leading to quality work solely focusing on conformance to specifications and thus a reduced customer focus. Instead, Montgomery (2012) prefers to define quality as inversely proportional to variability. The preference is based on reduced variability being directly translated into cost reduction as a result of less waste, rework, effort, and time. Another, and perhaps broader, definition of quality is provided by Bergman and Klefsjö (2012), suggesting that the quality of a product is its ability to satisfy, and preferably exceed, customers’ needs and expectations.

It is also of interest to consider how what quality encompasses has changed over time, as suggested by Radziwill (2018), who proposes four phases of progression:

1. Quality as inspection: Seeking to sort out nonconforming products once they have been produced.

2. Quality as design: Preventive efforts, inspired by W. Edwards Deming, to consider quality in the design of products and services.

3. Quality as empowerment: Continuous improvement through empowering individuals, creating a concern for quality as everyone’s responsibility by applying approaches such as Total Quality Management (TQM) and six sigma.

4. Quality as discovery: In the adaptive and intelligent environments enabled by Industry 4.0, Radziwill (2018) suggests that quality depends on the ability to find and combine new sources of data as well as the effectiveness of discovering root causes and new insights about products, organisations, and ourselves.

We can see that the above definitions of quality may fit into one or more of these progressions. It is also apparent that the concern for quality has gradually broadened from focusing on product inspection, process design, and organisation and people, to discovering how situational factors may be utilised in favour of quality improvement. Quality could thus take the shape of both incremental and breakthrough innovations, as indicated by Box and Woodall (2012), who propose that quality and innovation have developed strong similarities from an engineering viewpoint. For example, Box and Woodall (2012) suggest that six sigma is an approach that, from the outset, was primarily focused on reducing defects. Since then, it has evolved via a focus on process efficiency into what is now more of a methodology for developing new products and services for new markets. Thus, Box and Woodall (2012) suggest that six sigma is more focused on innovation today, while stating that it is still widely associated with defect reduction.

It is the author’s view that the definition proposed by Bergman and Klefsjö (2012) is broad enough to encompass all four of Radziwill’s (2018) progressions, yet vague enough to leave open the question of how customer needs and expectations should be met and exceeded. Montgomery (2012) offers a more hands-on approach. Thus, the thesis will use the following definition of quality:

”Quality is the ability to reduce variation to the point where customer expectations are met and preferably exceeded.”

The above definition is considered suitable because of its relevance to how ZD is described in the thesis. With ZD operationalised through Cpk = 2.0 as a result of customer requirements, the definition relates both to the aspect of customers’ needs and to the importance of reducing variation as a means to enhance the Cpk index and quality in general.


Being an integral part of economic performance in an environment where competitiveness and customer requirements are continually increasing (Foidl & Felderer, 2015), quality management is of critical concern for any organisation (Dogan & Gurcan, 2018). This is even more so for organisations producing components and products that are subject to a high degree of stress or are part of systems relying on safety. Aerospace, for example, is an industry where product safety is especially important (Magnanini et al., 2019; SAE International, 2018a). The quality of a product or component is described by data on multiple geometric specifications, and the quality of a process is characterised by process data sets (Wang, 2013). Being affected by facilities, equipment, and manufacturing processes, the quality of a product is subject to several sources that have the potential to cause errors or defects (Wang, 2013). Papacharalampopoulos et al. (2019) argue that the quality of a product is dependent on the quality and efficiency of the manufacturing process, such that deviations in production can result in defective products. Montgomery (2012) suggests that a defect is one, or a set of, nonconformities serious enough to have a significant impact on the safe and effective use of the product.

Box and Woodall (2012) highlighted how quality and innovation may be viewed as closely related, stating that an innovative system can be developed by combining statistical tools for continuously added business value. Information technology and the combination of several sources of data are often used in such systems. However, innovation usually finds greater interest among business leaders than quality. Still, traditional quality tools are often useful for important incremental innovations. Combining key ideas of SPC with other areas of statistics has resulted in innovations such as time series methods, change-point methods, and operations research (Box & Woodall, 2012). For the most part, SPC methods are based on the assumption that observed data are stationary and independent over time; Box and Woodall (2012) acknowledge that such assumptions often describe reality inadequately. Further, they argue that the 3.4 ppm defect metric, distributional assumptions, the 1.5 sigma shift, and specification limits do more harm than good today.

2.3 Statistical Process Control

Continuous improvement is an overall philosophy for improving quality, and part of that philosophy pertains to Statistical Process Control (SPC) (Mohammed et al., 2008). The objective of SPC is to increase the knowledge of a process so that it can be directed towards a specific target or behave in a desired way. This is done by reducing the variation of the process or product so that performance can be enhanced (SAE International, 2018b). As nonconformities can be reduced by having less variation, SPC-methods are suitable for improving quality. Further, since variability can only be described in statistical terms, statistical methods are in fact an integral component of quality work (Montgomery, 2012). In statistical science, one primary driving force has been the need to adapt and develop methods and theory that can handle practical problems in various areas (Box & Woodall, 2012). For new products and processes, Box and Woodall (2012) suggest that statistical thinking and methods are often necessary for handling both the design and analysis of the measurement systems needed. The increasing amount of data has also served as a force driving developments in statistical science, together with increases in computational power, making computationally intensive methods and analysis of larger data sets possible (Box & Woodall, 2012). Utilising observed data to monitor and control processes requires some groundwork. First, appropriate methods and tools must be selected that comply with the design characteristics and process variables that represent product quality (SAE International, 2018a). Common practice is to identify measurable key characteristics (KCs), which are variables or attributes whose variation significantly affects product fit, form, function, performance, or producibility. In the second step, analytical studies of process effectiveness are conducted with regard to process stability and capability, as well as the actions needed to handle potential problems.
In order to do so, the collection of data must be planned so that it can be used to understand the process. The data must also be generated and analysed using statistical techniques to interpret stability, capability, and variation (SAE International, 2018a). Before continuing to the third and last step, improvements are made to the process in order to address problems with stability, capability, and variation due to special causes. Once a state of statistical control is established, one can continue to the third step, where process monitoring and control are applied continually in production with the goal of maintaining the stability and capability of the process. Figure 2 illustrates the different steps of process control (SAE International, 2018a).

Figure 2: Process Control Activities (inspired by SAE International (2018a), p.7)

In figure 2, the second and third step may be interpreted as phase I and phase II control chart applications respectively. Phase I involves gathering and analysing data in retrospect to determine whether or not the process has been in a state of statistical control. If not, efforts to identify and eliminate assignable causes of variation are carried out so that reliable control limits can be established for future monitoring. Once the process is brought to a state of statistical control, phase II utilises control charts to monitor the process and maintain its stability.
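To make the two-phase logic concrete, the sketch below is a minimal illustration (not drawn from the thesis's own data; the function names, the choice of an individuals chart, and the use of the average moving range with the standard constant d2 = 1.128 are assumptions made for this example): phase I estimates the centre line and three-sigma limits retrospectively from reference data, and phase II then monitors new observations against the frozen limits.

```python
from statistics import mean

def phase1_limits(x):
    # Phase I: individuals chart limits from retrospective reference data.
    # Sigma is estimated from the average moving range divided by d2 = 1.128
    # (the control chart constant for ranges of two consecutive points).
    cl = mean(x)
    moving_ranges = [abs(b - a) for a, b in zip(x, x[1:])]
    sigma = mean(moving_ranges) / 1.128
    return cl - 3 * sigma, cl, cl + 3 * sigma

def phase2_signals(x_new, lcl, ucl):
    # Phase II: return the indices of new observations outside the limits.
    return [i for i, v in enumerate(x_new) if not (lcl <= v <= ucl)]
```

In practice, phase I limits would be recomputed after points with identified assignable causes are investigated and removed, whereas in phase II the limits are held fixed during monitoring.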

According to the aerospace standard on process control methods (AS13006) developed by the Aerospace Engine Supplier Quality (AESQ) group, proactive process controls are preferred over reactive approaches such as inspection (SAE International, 2018a). Post-production identification (inspection) of product defects and process deviations often result in compensation strategies that consume excess resources such as time, material, and energy (Magnanini et al., 2019). Thus, when possible, processes should be managed into a state of control as early as possible in the production cycle rather than employing quality control by inspection at the end of production (SAE International, 2018b). Quality inspection will not enable a defect level of zero. Rather, it motivates a culture of only finding solutions to urgent problems and not long-term quality and customer satisfaction (SAE International, 2018b).

2.3.1 Control Charts

The typical control chart, often referred to as a Shewhart control chart after its inventor Walter A. Shewhart, represents a plot of data over time (Shewhart, 1931). A centre line (CL) and two horizontal control limits are added to the control chart. The centre line is often represented by the data set mean, and the control limits are usually computed by adding and subtracting three standard deviations to and from the mean of the data set (Mohammed et al., 2008). The Greek letter sigma (σ) is often used to denote the standard deviation; therefore these control limits are sometimes referred to as three-sigma limits. For practical reasons, it is not always possible to have negative data points. In that case, it is customary to set the lower control limit to 0. If the data points are plotted within these limits and without any unusual pattern, the process is considered to be in a state of statistical control. Statistical control is an important construct within SPC, meaning that a process only shows variation due to common causes (Mohammed et al., 2008) and exhibits a random and predictable pattern, within its natural behaviour (SAE International, 2018b). Variation caused by sources that are not considered part of the system or process itself is commonly referred to as special (or assignable) causes (SAE International, 2018a). If such variation is present, the process is said to be out of (statistical) control. There is a variety of additional guidelines for determining how a process is performing based on these types of control charts. However, as more rules are used, one can expect an increase in false alarms. Three main rules are widely accepted (Mohammed et al., 2008):

• Eight or more consecutive points on one side of the centre line.

• Two out of three consecutive points plotted beyond either the upper or lower two-sigma limit.

• A trend of eight or more consecutive points either increasing or decreasing.
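The three rules above lend themselves to direct implementation. The following sketch is illustrative only (the function names and the handling of ties at the centre line are choices made for this example, not prescribed by Mohammed et al.):

```python
from itertools import groupby

def run_one_side(x, cl, run=8):
    # Rule 1: eight or more consecutive points on one side of the centre line.
    # Points exactly on the centre line are grouped with those below it here.
    return any(len(list(g)) >= run for _, g in groupby(v > cl for v in x))

def two_of_three_beyond_two_sigma(x, cl, sigma):
    # Rule 2: two out of three consecutive points beyond the same two-sigma limit.
    for i in range(len(x) - 2):
        window = x[i:i + 3]
        if sum(v > cl + 2 * sigma for v in window) >= 2:
            return True
        if sum(v < cl - 2 * sigma for v in window) >= 2:
            return True
    return False

def trend(x, run=8):
    # Rule 3: eight or more consecutive points steadily increasing or
    # decreasing, i.e. run - 1 successive differences of the same sign.
    signs = [1 if b > a else -1 if b < a else 0 for a, b in zip(x, x[1:])]
    return any(s != 0 and len(list(g)) >= run - 1 for s, g in groupby(signs))
```

Applying all rules at once illustrates the trade-off noted above: each added rule increases detection power but also the overall false-alarm rate.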

The observations or collected data that serve as data points in a control chart can be divided into two main categories: variable data, which are continuous in nature, and discrete attribute data, which are based on counts or classifications (SAE International, 2018a). These types of data can be divided further based on sample size, data type, and the size of the shift in mean that is to be detected (Montgomery, 2012); see figure 3.

Figure 3: Univariate Control Charts Decision Tree (inspired by Montgomery (2012))

All the charts in figure 3 have in common that the data are collected and presented sequentially over time. Mohammed et al. (2008) suggest that it is necessary to make certain that data are independent over time. In this regard, independence implies that successive data points do not show any relationship, or autocorrelation. There are two types of autocorrelation according to Mohammed et al. (2008). Positive autocorrelation is present when high values tend to be followed by high values and low values tend to be followed by low values. Negative autocorrelation is present in a data set when low values are followed by high values and vice versa (Mohammed et al., 2008).
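A simple way to screen data for such dependence is the lag-1 sample autocorrelation. The sketch below is an illustrative aid, not a method prescribed by the cited works:

```python
from statistics import mean

def lag1_autocorrelation(x):
    # Sample autocorrelation at lag 1: positive values indicate that high
    # (low) values tend to follow high (low) values; negative values the
    # opposite; values near zero are consistent with independence.
    m = mean(x)
    d = [v - m for v in x]
    return sum(a * b for a, b in zip(d, d[1:])) / sum(v * v for v in d)
```

As a rough screening rule, an absolute lag-1 autocorrelation larger than about 2/√n (n being the number of observations) suggests dependence worth investigating before control limits are trusted.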


2.3.2 Process Capability

In order to provide a quantitative measure of process potential and performance, capability indices such as Cp and Cpk are used within manufacturing industries (Pearn & Chen, 1999). Cp may be used to determine the potential capability of a process if it is centred about the mean, but it does not supply any information about why the process is not meeting specifications (Montgomery, 2012; Sall, 2018). For measuring the actual capability, Cpk may be used and compared to Cp in order to give an idea of process centring (Montgomery, 2012). In order to provide information about process capability, i.e. how well the process could perform if process drift issues are fixed, Sall (2018) suggests using Cpk. For determining how well a process is actually doing, Sall (2018) suggests using the performance index Ppk. Cpk is computed using a short-run local estimate of the process standard deviation, while Ppk is computed using a long-run classical estimate (Sall, 2018). Being easy to apply and understand, indices are often used to draw conclusions without considering the underlying data, distribution, and sampling errors, according to Pearn and Chen (1999). One of the main prerequisites for computing reliable capability indices is that the studied process is in a state of statistical control. Montgomery (2012) argues that the indices Pp and Ppk were developed to describe processes that are out of statistical control, implying that they essentially tell you nothing. According to Deming (1986), it is pointless to predict the outcome of a process that is out of statistical control since it has no capability. Because they describe complex phenomena in a very simplified way, both Montgomery (2012) and Pearn and Chen (1999) stress that indices of process performance and capability should be used with great care. Formulas for computing the above-mentioned indices are provided below, where USL is the upper specification limit, LSL is the lower specification limit, µ is the process mean, σ is the short-run local standard deviation, and s is the long-run sample standard deviation.

Cp = (USL − LSL) / (6σ)      (1)

Cpk = min{(USL − µ) / (3σ), (µ − LSL) / (3σ)}      (2)

Pp = (USL − LSL) / (6s)      (3)

Ppk = min{(USL − µ) / (3s), (µ − LSL) / (3s)}      (4)

2.4 Zero Defects

Increasing demands on produced parts have led to an escalation in the importance of reducing defects drastically within manufacturing (Tatipala et al., 2018). In fact, to gain customers and market shares in modern competitive markets, Wang (2013) claims that reaching the goal of so-called zero defect products is essential. Zero Defect Manufacturing (ZDM) is a concept practised within manufacturing that seeks to minimise defects in processes by doing things right from the very beginning, ultimately seeking to reduce the defect rate to zero (Wang, 2013). By reducing variability during production, Papacharalampopoulos et al. (2019) suggest that ZDM attempts to achieve better and more sustainable production systems on both economic and environmental grounds. By minimising defective output, cost, time, raw materials, and other resources are reduced (Papacharalampopoulos et al., 2019).


Although it can be traced back to the 1960s (Eleftheriadis & Myklebust, 2016; Montgomery, 2012), it was during the 1990s that zero defects saw wide implementation efforts, when automotive companies wanted to cut costs by reducing quality inspections while simultaneously increasing the quality demands on suppliers. When it was introduced by the US Secretary of Defence, zero defects (ZD) was an approach for eliminating defects derived from human error (Eleftheriadis & Myklebust, 2016). Meant to inspire all levels of the organisation to do their jobs right the first time, ZD was heavily focused on underlining that it was not meant to be viewed as a technique for evaluating employees, a speed-up program, nor a substitute for quality control (Eleftheriadis & Myklebust, 2016). An important point made by Eleftheriadis and Myklebust (2016) is that ZD, as a concept, recognised that dedication, sufficient training, and tool proficiency among employees did not necessarily guarantee products and processes free from defects. However, the focus on everyone's individual accountability for quality, on a personal and collective level, was later disclaimed by Deming (1986), who stated that the majority of causes of low quality are derived from the system itself and not the workforce. Wang (2013) addresses the necessity of a system for ZDM to prevent failures and increase reliability and safety in manufacturing industries. Tatipala et al. (2018) suggest that part of such a system is the ability to control product and process parameters with the use of connected manufacturing technologies and control systems that handle machine and other process data. The modern technologies that make up an Industry 4.0 environment supply the right conditions for the concept of ZDM to emerge and, as described by Papacharalampopoulos et al. (2019), ”...be established as a basic pillar of the new era.”.
By increasing the digitalisation and inter-connectivity of processes and production lines, Industry 4.0 can provide strong tools for achieving ZDM. Papacharalampopoulos et al. (2019) mention data-based models, simulation-based engineering, online measurement, and data gathering systems as tools that can be used to work towards ZD.

Although quite a lot has been written on the subject, no clear or quantifiable definition of ZD or ZDM has been found in the literature. It is therefore considered to be nothing more and nothing less than what its name implies, i.e. 100% non-defective. Thus, ZD may be quantified as a defect rate of 0 ppm. It is however important, as suggested by Deming (1986), to consider where in the chain of process steps defects are recorded and how they are measured. Likewise, it is of interest to determine who decides what is defective and how the interpretation of defects is shared among management, worker, and inspector. Considering how Montgomery (2012) defines defects, it is implied that such are recorded when a product has been delivered to the customer. With good process knowledge, it should be possible to identify these sooner rather than later. Altogether, we may view ZD as a concept which seeks to reach a rate of 100% conforming and deliverable products or services. One could also argue that ZD is a state of being, for an organisation or for a process, of producing 100% conforming and deliverable products or services. Nevertheless, without advanced and interconnected manufacturing systems that can control production to perfection, guaranteeing customers deliveries of 100% conforming products may only be feasible with the use of suitable tools and methods within SPC, accompanied by thorough inspection.
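The gap between a 0 ppm ideal and capability-based operationalisations such as Cpk = 2.0 can be made tangible under the (strong) assumptions of a stable, normally distributed process: a given Cpk then implies an approximate nonconforming fraction via the normal tail beyond the nearer specification limit. The sketch below is illustrative only; the function name is invented here.

```python
from math import erf, sqrt

def ppm_from_cpk(cpk):
    # One-sided normal tail beyond the nearer specification limit,
    # expressed in parts per million, assuming a stable, normally
    # distributed process with no mean shift.
    def phi(z):  # standard normal CDF
        return 0.5 * (1.0 + erf(z / sqrt(2.0)))
    return (1.0 - phi(3.0 * cpk)) * 1e6
```

Under these assumptions, Cpk = 1.5 corresponds to roughly 3.4 ppm, while Cpk = 2.0 corresponds to about 0.001 ppm, illustrating why a demand of Cpk = 2.0 can serve as a practical stand-in for near zero defects.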

2.4.1 Modern Manufacturing Requirements

Up until recently, statistical tools and the collection of data have been considered excellent, or at least sufficient, indicators for potentially improving quality. Today’s complex manufacturing systems, enabled by technologies such as the internet of things (IoT) and cyber-physical systems (CPS), together with increasing amounts of data, require a better understanding of the inter-operability between the different elements of a manufacturing system (Eleftheriadis & Myklebust, 2016). With the increasing complexity of multi-stage manufacturing systems, product quality characteristics have developed inter-dependencies among process stages (Magnanini et al., 2019). Despite rigorous tests and controls during manufacturing, there is still a proportion of nonconforming products that are cleared and delivered to customers (Papacharalampopoulos et al., 2019). Therefore, Papacharalampopoulos et al. (2019) suggest that there is a need for real-time monitoring and more accurate, adaptive process control methods. Considering the steadily increasing customer demands and forecasts that an Industry 4.0 environment will facilitate a higher degree of customisation and a reduction in batch sizes down to one (Lasi et al., 2014), traditional methods for process control need to be supplemented. Although viewed as a necessary control instance allowing users to measure, evaluate, and assist long-term improvements, Statistical Process Control (SPC) is at best partially applicable in small-batch production (Eleftheriadis & Myklebust, 2016). SPC requires a certain number of measurement values of a quality characteristic and is traditionally used for monitoring the characteristic so that it is possible to distinguish between the natural variability of a process and the variability assignable to special causes (Eleftheriadis & Myklebust, 2016). Assignable (special) causes are then to be eliminated and can provide information about additional improvement of the process (Montgomery, 2012). Thus, zero defect approaches such as enhanced real-time process control and improved medium- and long-term process improvement are also necessary for achieving ZDM (Eleftheriadis & Myklebust, 2016).

Looking at multi-stage manufacturing systems for producing products with complex features, such as engine components for aircraft, Magnanini et al. (2019) suggest that it is important to consider the quality strategy together with the overall production strategy. Due to large part sizes and high production costs, Arsuaga Berrueta et al. (2012) state that the impact of defective parts in aerospace production is paramount. Final product quality is a result of the ability of the different operations, in the different stages of the process chain, to produce satisfactorily. Further, Magnanini et al. (2019) argue that the capability of detecting defects, or the possibility of them, along the process chain is related to time and cost savings. Controlling quality only at the end of the manufacturing line allows defects produced in earlier process stages to propagate to the following ones (Arsuaga Berrueta et al., 2012). For the aircraft industry, complex parts, large numbers of parameters, and generally short series imply that it is insufficient to adjust production by employing SPC-methods (Arsuaga Berrueta et al., 2012).

As a result of what are considered feeble attempts to improve quality post-production, practitioners and researchers have started to focus on developing methodologies for simultaneously increasing product quality and process capability, rather than treating the two separately (Magnanini et al., 2019). Since the different stages of a manufacturing process are subject to variability that can result in deviations or defects, Magnanini et al. (2019) suggest that characterising the controllable and uncontrollable causes of variability in the final product is a key challenge. Magnanini et al. (2019) explain that the aim of ZDM is to reduce defective product by having an integrated strategy capable of identifying errors sooner rather than later, to successfully avoid the propagation of defects along the process chain. Thus, strategies for ZDM aim at improving the quality of both product and process. However, developing quality-oriented strategies that integrate products and systems requires a good understanding of the manufacturing system with regard to both organisational and technological complexity. According to Magnanini et al. (2019), that understanding provides better reactiveness to customer requirements and thus market competitiveness.

As proclaimed by Eleftheriadis and Myklebust (2016), creating a guideline of suitable quality tools for new and complex ZDM systems should not be considered straightforward. It was however discovered that systematic approaches to validating, structuring, and storing acquired data proved to be important. Eleftheriadis and Myklebust (2016) suggest selecting control tools that are critical to quality with respect to machine tolerance and collecting end-user process knowledge. By doing so, a guideline for handling what is referred to as Vital Process Parameters (VPP) can be developed. The development of new and intelligent ZDM systems is not likely to take place in one instance. Rather, the process is likely to take form gradually over time, where the organisation and management will learn concurrently. Although Arsuaga Berrueta et al. (2012) suggest SPC-methods are insufficient for reaching a state of zero defects, several authors indicate that key issues are possible to handle using these methods. At the very least, typical components of SPC, such as control charts and capability studies, seem necessary to remedy existing quality problems.


2.5 Organisational Implications

As noted, several different points are made within the literature that try to describe what ZD is and how it is obtained. Regardless, there are organisational aspects that need to be considered in order to support the desired trajectory of an enterprise. One important factor discussed by Deming (1986) is the adoption and institution of leadership, where the focus on outcome must be relinquished. Among the examples mentioned are work standards, management by numbers, meeting specifications, and zero defects. Supervising on outcome should be replaced by a focus on sources of improvement, on the intent of quality of product, and on translating that intent into design. In order to do so, barriers that hinder employees from taking pride in their work must be removed. Further, it is necessary that leaders are familiar with the work they are supervising, so that they in turn are able to inform their managers what needs to be corrected, such as inherited defects or maintenance of tools and machines. An example described by Deming (1986) is treating every non-conformance as a special cause when the process is stable. The result is recurring trouble, when the only remedy should be improving the system by reducing process variation or adjusting its level.

Deming (1986) suggested that only acting on the system itself can result in substantial improvement. Exhortations, slogans, and targets are therefore not advised. Not only must the specific process or organisation in question be free from defects, all the suppliers down the chain must supply no defectives either. ”Do it right the first time” is a phrase related to ZD (Eleftheriadis & Myklebust, 2016; Wang, 2013). According to Deming (1986), doing it right the first time is nearly impossible, since one has to be able to make certain that inputs are totally free from defects, and the tools and machines for producing and measuring must also be flawless. Further, besides denouncing rates and incentive pay, Deming (1986) suggests that exhortations and targets are directed at the wrong audience, only showing that management has not identified the barriers preventing the work from being carried out properly, as a result of management expecting that employees can improve quality and accomplish ZD solely by trying harder. It is the responsibility of management to improve the system so that it enables employees to do their jobs. What management should communicate to the employees is what they are doing for the employees, and how they are improving the system so that the job can be carried out and better quality provided. Setting up and communicating numerical goals to the people within the organisation is therefore self-defeating without a clear and detailed road map on how to reach them. Deming (1986) suggests that quotas affect productivity and quality negatively, since they only result in increased costs of operation and curb pride of workmanship. Further, Deming (1986) suggests that more engineers are often occupied with developing work standards and counting defectives than engaged in production.

Numerical goals should not only be eliminated for the employees, but for management too. Employees, including management, need to be allowed to be proud of their work. Therefore, resources and prerequisites need to be secured in order for the people in the organisation to make the right quality. The right quality cannot be produced if employees have to spend their time inspecting and correcting defective product. Inspection cannot create or improve quality because it comes too late, as well as being ineffective and costly. Inspection is planning for defects, accepting that the process does not have the capability for the required specifications. But if there is no quality, inspection may be the only alternative. It is sometimes inevitable to have inspection, for economic reasons or because of customer demands. Inspecting smaller samples is also necessary for producing the control charts that help achieve or maintain statistical control of processes. If systems and processes are not stable, and machines, gauges, and other equipment are out of order, the only thing that can be produced is defective product. Referring to such efforts as ”putting out fires”, Deming (1986) stresses that this is not improving a process. Neither is identifying and removing a special cause detected by a point out of control; it merely resets the process to where it should be. By removing barriers that inhibit the organisation from producing quality product, people will feel important to the job they are executing, resulting in pride of accomplishment and a willingness to improve the system.


According to Deming (1986), typical obstacles are leaders who do not have sufficient knowledge to give leadership, inadequate training in technology, inadequate documentation on how the job is done, or excessive and complicated documentation on how the job is done, resulting in people not knowing what their jobs are. Management can remove these barriers by listening to advice and taking action on suggestions from the people who do the job. Referring once again to system stability, Deming (1986) describes that it is useless to set up a goal if the system is stable. The outcome will be what the system is capable of delivering. If output is normally distributed, there will always be a percentage of output that is below the mean, just as there will always be a percentage that is above the mean. On the other hand, if the system is not stable it has no capability, and there is therefore no way of knowing what it will put out. Setting a goal is pointless in this case as well. In other words, outcome is an ineffective measure to focus on when seeking to improve processes.

As implied by both Montgomery (2012) and Deming (1986), variation is an aspect greatly affecting the quality of products, processes, and the business as a whole. As mentioned, statistical methods are key to describing variation and the issues thereof. Continuous improvement is not derived from conformance to specifications; it is a result of the ability to reduce variation about the nominal value. In order to learn about the different sources of variation in an organisation, Deming (1986) suggests that barriers between staff areas need to be broken down. According to Deming (1986), production and delivery are prioritised to the extent that engineers have to make production trade-offs. Doing so inhibits them from learning about production and design problems. To a large extent, this is a result of controlling the workforce on metrics such as KPIs or ratios, ultimately resulting in them merely making to specification. The remedy, as described by Deming (1986) in point nine of the 14 points for management, is that people in research, design, purchasing, sales, etc. must learn from each other and from one another's struggles. Quality should be built in at the design stage; therefore collaboration between departments is fundamental. Quality starts with the intent, which comes from management according to Deming (1986). For the aerospace industry, where product and design complexity is a fact, transferring the intent of quality is highly relevant, especially between design and manufacturing (Arsuaga Berrueta et al., 2012; Magnanini et al., 2019).


3 Methodology

The following chapter presents the methodology of the thesis. First, the research approach is described. Choices regarding data collection and analysis approaches are motivated, where methods that have been used are discussed in relation to alternative methods. Lastly, research quality in terms of reliability and validity of the thesis is also treated within the chapter. Figure 4 represents the main steps of thesis methodology.

Figure 4: Main methodological process steps.

3.1 Research Approach

The research approach of the master’s thesis may be considered deductive. A deductive approach is usually focused on causality (Deborah, 2013). The approach may be considered deductive since ZD is a concept which, especially during the last two decades, has been studied extensively. The empirical data collected for the thesis is considered sufficiently detailed and rich to be used for the purpose of exploring the phenomenon of ZD and explaining it by developing themes in the thematic analysis.

In order to answer the research questions and fulfil the aim of the thesis, the research strategy is of exploratory character with a qualitative approach to collecting data. First, a literature overview was conducted in order to map out different views of and approaches to ZD and SPC. The literature overview resulted in a developed model, which can be described by the framework in figure 1. The framework depicts different actors, systems, and concepts that may describe ZD in different ways. The thesis used existing theory from the literature to describe the concept of ZD in general, as well as how ZD may be approached by utilising tools and methods for SPC, answering the third research question. The general description was then compared to a qualitative study consisting of 14 interviews in order to gain deeper and more specific knowledge of the phenomenon at GKN Aerospace. The comparison was carried out through a thematic analysis, addressing the first, and partially the second and third, research questions. Developing a general view through literature and comparing it to empirical observations implies that the research is of a more deductive character according to Saunders et al. (2007). According to Saunders et al. (2007), deduction may be viewed as theory generating data and induction as data generating theory. In addition, Saunders et al. (2007) suggest that an important characteristic of deductive approaches is that concepts have to be operationalised to provide measurability. As the thesis assumes that ZD may be operationalised through Cpk, the deductive approach is thought to be suitable.

In order to support the second research question with a practical example, an SPC study was also conducted. The SPC study provided an understanding of how GAS works with the collection, reporting, and analysis of process data. The quantitative data used was collected from the internal system for process data called QSYS. Thus, it was not generated for the purpose of the study, but rather collected as a means for identifying and analysing how process data is currently managed within the organisation. In combination with the interviews and the case study, the framework was then used as an aid in assessing the current situation at GKN Aerospace Trollhättan with regard to working towards zero defects from an SPC-perspective. Table 1 summarises how the different studies address the three research questions of the thesis.

Table 1: Relation between the studies of the thesis and the research questions

Study                  Research Questions
Literature Overview    1, 2, 3
Initial Analysis       1, 3
Thematic Analysis      2, 3
SPC Study              2

3.2 Literature Overview

It is, according to Eisenhardt (1989), essential to compare emerging concepts and theories with existing literature for the purpose of internal validity and generalisability. Additionally, if different sources of literature are conflicting, it may drive the research to become more creative and frame-breaking, resulting in deeper insight while reducing the degree of generalisability of the study (Eisenhardt, 1989). The literature used in the thesis mainly consists of peer-reviewed articles in scientific journals. Articles from established journals were of special importance, as were articles that have been cited to a higher extent, since such articles are more likely to be part of the contemporary views on the topic within the literature. Additionally, the articles can be considered quite recent, since none of them are more than 20 years old, thus portraying contemporary concepts and theories on the subject. To a larger extent, the scientific articles describe contemporary views within the literature. These articles have been compared to existing theories and concepts from previous literature. Among the previous literature, Deming (1986) and Montgomery (2012) are considered well-established sources that are broadly held to be valid and reliable. In addition, international aerospace standards have been used in order to include what can be viewed as an industry perspective. These standards have been used to consider the theories and concepts that are practised within industry.

Two databases were used for finding articles in scientific journals: Scopus and Google Scholar. Scopus served as the primary database on the basis of its structure and citation traceability, which make it easy to move back and forth between articles through their citations. Another reason for choosing Scopus as the primary database was the ability to formulate more specific search strings using Boolean operators and truncation. Restrictions in accessibility through Scopus meant that some articles were instead collected through Google Scholar, given its extensive coverage.

In order to find articles treating subjects such as ZDM, SPC, quality management, and Industry 4.0, different keywords were used. Some of the more frequent ones included defect, zero defects, manufacturing, aerospace, process control, statistical process control, process monitoring, industry 4.0, and quality improvement. Some of the keywords generated large numbers of results from an array of industries and practices. Filtering on aerospace or vehicle transport, statistics, and manufacturing and engineering was therefore employed to reduce the number of less relevant hits and focus the results towards the project's aim and the specific industry. A wide variety of keyword combinations was also employed, using Boolean operators and truncation.
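As an illustration of the kind of query described above, a Scopus advanced-search string combining Boolean operators and truncation might look as follows. This is a reconstructed example based on the keywords listed, not one of the exact strings used in the study:

```
TITLE-ABS-KEY ( "zero defect*" AND ( manufactur* OR aerospace )
                AND ( "statistical process control" OR "process monitoring" ) )
```

Here the asterisk truncates word stems (e.g. manufactur* matches both manufacturing and manufacturer), while quoted phrases restrict matches to the exact terms.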


3.3 Interviews

The collection of qualitative data consisted of 14 semi-structured interviews conducted with company representatives who were either involved in quality work or had good knowledge of particular processes. According to the literature, working towards ZD requires engagement and action from management at all levels. The interviewees therefore include top management, managers from areas such as design, manufacturing, and quality, and managers of specific products, processes, and projects. The aim of interviewing employees in different types of managerial positions was to cover different areas of the organisation and thereby gain a comprehensive picture of the views within the organisation as a whole. Further, engineers within both design and manufacturing, as well as production technicians, were interviewed to gain an understanding of the more operative aspects of quality work, since these individuals work close to problems and the more practical aspects of the business.

According to Longhurst (2003), semi-structured interviews are partially structured, self-conscious verbal interchanges in which information is extracted through questions. They can be used as a supplementary method, as a way to triangulate in multi-method research, or as a stand-alone method for collecting empirical data (Longhurst, 2003). The interviews conducted in this thesis were a way of gaining deeper knowledge of process management with regard to data collection and analysis, as well as of more organisational aspects such as work routines and communication, in order to better understand how the interviewees work towards zero defects. Table 2 lists the interviewees in the order in which the interviews were conducted, together with the approximate number of years each has worked for GKN Aerospace, years in the current position, and the approximate interview duration.

Table 2: Interviewee information and interview duration.

Number  Title                                         Years in organisation  Years in position  Duration
1       Manager Quality Assurance                     35                     1                  55 min
2       Quality Assurance Product Development         38                     13                 51 min
3       Engineering Management & Support              36                     3                  59 min
4       Engineering Management & Support              3                      1                  54 min
5       Manager Area Manufacturing Engineer           35                     0.5                46 min
6       Quality Engineer                              24                     8                  58 min
7       Robust Design Engineer                        8                      8                  60 min
8       Director Operational Excellence               12                     1                  85 min
9       Director Manufacturing Engineering & Quality  25                     1                  60 min
10      Director Quality                              36                     4                  71 min
11      Chief Manufacturing Engineer                  25                     3                  64 min
12      Chief Manufacturing Engineer                  13                     1                  44 min
13      Technology Insertion Project Manager          34                     0.5                48 min
14      Product Manufacturing Engineer                32                     24                 59 min

The qualitative approach with semi-structured interviews was chosen to obtain empirical data that could be compared across interviews during analysis while still enabling the interviewees to speak freely about what they felt was most important for approaching ZD by utilising tools and methods within SPC. Kallio et al. (2016) suggest that the interview guide of a qualitative semi-structured study reflects the objectivity and trustworthiness of the study as well as the plausibility of its results. In order to develop a rigorous interview guide, the five steps suggested by Kallio et al. (2016) were followed:

1. Identifying prerequisites for using semi-structured interviews
2. Retrieving and using previous knowledge
3. Formulating the preliminary semi-structured interview guide
4. Pilot testing of the interview guide
5. Presenting the complete semi-structured interview guide

The interview guide used during the interviews can be found in Appendix A and contained four general questions, 33 more specific questions, and three statements on which the interviewees were asked to share their opinions. Both the general and the specific questions were developed with the help of the literature to enable comparative analysis of the literature and the collected interview data; the literature overview identified key areas related to ZD, broadly represented by its subsections. First, 20 questions were developed relating to how ZD could be interpreted within GKN Aerospace. These questions were mainly developed with the help of company representatives and concerned earlier improvement initiatives and the internal systems at GKN Aerospace; the literature overview helped define them with regard to improvement initiatives, measurability, and data reliability. Three specific questions were developed for the purpose of defining what defects are and how they are detected. Six questions related to SPC and to how Cpk is interpreted and enhanced within the organisation. Four questions related to Industry 4.0 and how modern technologies may affect the work towards ZD. The three statements were based on difficulties in working with SPC within the aerospace industry, identified in AESQ's Guidance Materials standard (SAE International, 2018b). Follow-up questions were asked on some of the more specific questions in an attempt to obtain further explanations.

Each interview was initiated with the general questions, followed by the first couple of specific ones, after which it unfolded in a conversational manner, letting the interviewees speak more freely (Longhurst, 2003). This less structured approach to interviewing was chosen for the purpose of gaining deeper knowledge about how the company representatives think about, work with, and interpret process control methods and the measures used, as well as their interpretation of how these aspects relate to ZD. Each interview was recorded and transcribed within 24 hours so that the interviewer remained focused on that specific interview, and transcription produced a documented data set for the thematic analysis. Each recording was first played back once, with notes being taken, to serve as a reminder of the interview. It was then played back a second time and transcribed in detail for the purpose of clearly depicting what each interviewee shared.

3.4 Thematic Analysis

As a relatively straightforward form of qualitative analysis, thematic analysis is a method for identifying, analysing, and highlighting patterns within data (Braun & Clarke, 2006). These patterns are the themes formed through the analysis. Thematic analysis can be used for different purposes according to Braun and Clarke (2006), and Riessman (1993) suggests that the thematic approach can be employed to theorise by identifying common themes across the participants in the research and what they share. Although it may be used with one or more theoretical frameworks, it does not require a pre-existing theoretical framework for conducting analysis. Thematic analysis can be used to report meanings, experiences, and reality (an essentialist or realist method) or to examine how meanings, experiences, realities, or events are the effects of different aspects of society (a constructionist method). Thematic analysis can thus both reflect and explain reality (Braun & Clarke, 2006). The conducted thematic analysis uses theories and
