INDUSTRY 4.0: A REFERENCE MODEL FOR A DIGITAL TWIN OF A COLLABORATIVE MANUFACTURING CELL

Master Degree Project in Virtual Product Realization
One Year Level, 18 ECTS
Spring term 2020

Thomas Erthle

Supervisors: Bertil Thorvaldsson, Masood Fathi

Examiner: Amos Ng


Abstract

With the advance of the fourth industrial revolution, the manufacturing industry is changing rapidly. The intelligent networking of machines and the collection of real-time data open up new possibilities for data-based decision making. One of these applications is the digital twin: a digital copy of a physical object that represents the behavior of its real-world counterpart and therefore consists of several models describing this behavior. The use of digital twins for virtual commissioning has been proven to reduce real commissioning time, as errors can be identified and fixed before they occur. In the manufacturing industry in particular, virtual commissioning is used for expensive assets such as complete manufacturing lines. This thesis investigates how a three-dimensional model of a manufacturing cell can be connected to its real-world counterpart, with the aim of developing a reference model. A virtual commissioning model of a real manufacturing cell is created to evaluate the main components of the reference model. The outcome of this study is a theoretical reference model for connecting a collaborative manufacturing cell with a digital twin. Through this connection, the three-dimensional simulation model acts as a live video of the real cell. The reference model can serve as a base for further research, such as remote troubleshooting, or to improve existing models.

Keywords: digital twin, communication, reference model, architecture, middleware


Acknowledgments

First I would like to thank my thesis supervisors Bertil Thorvaldsson and Masood Fathi of the School of Engineering Science at the University of Skövde. They guided me through the entire process in the best way possible with insightful ideas and constructive feedback.

Furthermore, I would like to thank my parents and my girlfriend for the support and encouragement throughout my study and the process of researching and writing this thesis. This accomplishment would not have been possible without them.

Skövde, June 2020

Thomas Erthle


Certificate of Authenticity

Submitted by Thomas Erthle to the University of Skövde as a Master Degree Thesis at the School of Engineering.

I certify that all material in this Master Thesis Project which is not my own work has been properly referenced.

Thomas Erthle


Table of Contents

1 Introduction ... 1

1.1 Background... 1

1.2 Problem definition ... 1

1.3 Research aim and objectives ... 2

1.4 Delimitations ... 3

1.5 Thesis organization ... 3

2 Sustainable development ... 4

3 Frame of reference ... 6

3.1 Digital twin ... 6

3.1.1 Concepts of digital twin ... 6

3.1.2 DT modeling and simulation ... 7

3.1.3 Data fusion ... 8

3.1.4 Application of DT ... 8

3.2 Internet of things ... 8

3.3 Cyber-physical system ... 9

3.4 Data exchange ... 10

3.5 Industrial solutions ... 11

3.5.1 Programmable logic controller ... 11

3.5.2 Automated guided vehicles (AGV) ... 12

3.5.3 Industrial robots ... 12

3.5.4 Simulation software ... 12

4 Methodology ... 13

4.1 Research design ... 13

4.2 Research strategy ... 14

4.3 Data collection and data analysis... 15

4.4 Research quality ... 18


5 Reference model ... 20

5.1 Digital twin architecture ... 20

5.1.1 Architecture conclusion ... 22

5.2 Communication ... 22

5.2.1 Communication conclusion ... 23

5.3 Requirements ... 24

5.4 Reference model ... 26

6 Digital twin creation ... 28

6.1 The case ... 28

6.1.1 Layout ... 28

6.1.2 Process ... 29

6.2 Simulation model ... 29

6.3 Limitations ... 30

6.4 Challenges for the connection ... 30

7 Evaluations ... 31

8 Conclusion ... 34

References ... 36

Appendix ... 42


Table of Figures

Figure 1 - Three pillars of sustainability (Nayyar and Kumar, 2019, p. 54) ... 4

Figure 2 - Sustainable development goals ... 5

Figure 3 - Information mirroring model (Grieves, 2018) ... 7

Figure 4 - Relation DT, CPS, IoT (Lu et al., 2020) ... 10

Figure 5 - Average RTT OPC UA, MQTT, DDS based on Profanter et al. (2019) ... 11

Figure 6 - Search process ... 18

Figure 7 - Overview of architecture models based on Redelinghuys et al. (2019); Yi et al. (2020); Zheng and Sivabalan (2020) ... 20

Figure 8 - Three-layer reference model ... 27

Figure 9 - Tact 1 (considered part of the cell) ... 28


Index of Tables

Table 1 - Search limitation ... 16

Table 2 - First keywords search ... 16

Table 3 - Final keywords search ... 17

Table 4 - Key features digital twins ... 25

Table 5 - RTT acknowledge mode bool and integer variables ... 31

Table 6 - Assignment data type OPC UA to data type SIMIT (SIEMENS AG, 2018) ... 32


Terminology

AGV	Automated Guided Vehicle
AI	Artificial Intelligence
API	Application Programming Interface
CNC	Computerized Numerical Control
CPS	Cyber-Physical System
DDS	Data Distribution Service
DT	Digital Twin
HMI	Human-Machine Interface
I/O	Input/Output
IoT	Internet of Things
MQTT	Message Queuing Telemetry Transport
OPC UA	Open Platform Communications Unified Architecture
PKI	Public-Key Infrastructure
PLC	Programmable Logic Controller
RFID	Radio Frequency Identification
SC	Smart Component
SDG	Sustainable Development Goals
SHM	Shared Memory
VC	Virtual Commissioning
VRC	Virtual Robot Controller
XR	X-Reality


1 Introduction

1.1 Background

Industry 4.0 stands for the intelligent networking of machines and processes in industry using information and communication technology. The term originates from ‘Industrie 4.0’, which was introduced in 2011 at the Hannover Fair in Germany. The idea of Industry 4.0 is to create highly flexible, automated production that can deliver highly customized products.

Artificial intelligence is used for self-optimization, control, and decision support. Autonomous things, such as robots and autonomous vehicles that use artificial intelligence (AI), are taking over more and more work previously done by humans. This goes beyond mere automation because the individual devices interact with their environment. This cooperation is based on communication between devices, which is made possible by the Internet of Things (IoT). AI is used to analyze the data collected through IoT; machine learning and deep learning algorithms analyze the data in real time (Panetta, 2020). A further technology in Industry 4.0 is X-Reality (XR), e.g. virtual reality or augmented reality. It is used by companies to optimize products and processes. For example, Volvo tests its future interiors in a virtual model: besides seeing the interior, the designer can also experience it while sitting in the virtual car. Another application of XR can be found in the field of education, especially to train people for situations that are rare but involve high risks, such as fire fighting or nuclear control rooms. This technology saves time and costs and leads to higher process and product quality (Hadwick, 2019).

1.2 Problem definition

The development from mass production to customized products leads to shorter product lifecycles. This requires more flexible production lines that need to run as much as possible to be economical, and production equipment consequently becomes more complex. Machine manufacturers are increasingly specialized in certain technical solutions, so today's original equipment manufacturer (OEM) production lines consist of several sub-systems from different machine suppliers. As these subsystems need to communicate with each other, there is a high risk of errors, especially during the installation and ramp-up phase. Moreover, due to the desired lot size of one, machines need to be stopped for new setups, and many failures can occur while the new programs are being implemented. With the digital twin (DT), most of those errors can be uncovered in the digital world while the real-world counterpart is still running the old product or does not yet exist. For this reason, it is possible to reduce the commissioning time to a minimum.

To optimize existing machines it is necessary to have models that describe and simulate the behavior of the real-world machines as exactly as possible. A further point is to know the current state of the product. Due to the progress in the connection between machines and systems on the shop floor, this connection must be mirrored in the virtual world. With the increasing connectivity of manufacturing lines, the connection between the different spaces becomes more complex. Medium-sized enterprises in the mechanical engineering industry in particular use highly complex manufacturing lines but rarely make use of digital twins. To get a valid prediction of the behavior, the digital world has to equal the real world, including the connections between the machines. Virtual commissioning has been used successfully in recent years, mainly for the first installation of large manufacturing lines. The creation of such simulation models with all their connections is time-consuming and thus expensive, but it has the potential to save between 75% (Reinhart and Wünsch, 2007) and 80% (Daniel, 2020) of the real commissioning time; the more it is used over its lifetime, the more value it has. The connection between the digital and the real world is the heart of the digital twin concept. The DT model can benefit from this connection as the fidelity can be improved, but new applications such as remote support or final testing of single machines in their future environment also become possible. The basis for these applications is high-fidelity 3D models that are controlled by real machines.

1.3 Research aim and objectives

The aim of this thesis is the development of a digital twin reference model. To evaluate this model, a high-fidelity digital twin of an existing manufacturing system is created. Therefore, this study looks for an answer to the following research question:

How is it possible to connect a digital twin 3D visualization to its real-world counterpart to mirror the current state of the machines in their virtual environment?


1.4 Delimitations

The reference model is theoretical: since the real manufacturing cell is not accessible within the given timeframe, the model cannot be verified against the real manufacturing cell. The human representation in the 3D model is highly simplified, and only the presence or absence of the worker is considered in the case. Furthermore, the visualization of the product is simplified and only the main components of the product are used. The Programmable Logic Controller (PLC) and robot programs are written by the researcher on the basis of models and videos, as the real PLC and robot programs are not accessible.

1.5 Thesis organization

In chapter 1: ‘Introduction’, the background and the problem description of this study are presented. Furthermore, the research question and the delimitations of this thesis are given. The second chapter: ‘Sustainable development’ gives a short overview of the Sustainable Development Goals (SDGs) of the United Nations and relates the study to them. The third chapter: ‘Frame of reference’ outlines the key concepts and serves as the base of the study. Chapter four: ‘Methodology’ describes how the research was conducted. It includes the research design, research strategy, data collection and analysis methods, and the research quality. In the fifth chapter: ‘Reference model’, the results of the systematic literature review and the requirements for the digital twin are presented; based on these, the reference model is developed. In chapter 6: ‘Digital twin creation’, the considered real case is presented along with how the digital twin for the case was created. In the following chapter: ‘Evaluations’, the reference model is evaluated against the key features using the created digital twin. The overall outcome and future work directions are stated in the final chapter: ‘Conclusion’.


2 Sustainable development

Sustainable development means satisfying the current generation's needs without restricting future generations' possibilities to meet their needs (European Commission, 1999). This becomes more important as the human population grows while natural resources stay the same. While the concept was originally focused on the environmental aspect, it has been extended with social and economic aspects. These three parts form the pillars of sustainable development (Figure 1). Each of these components needs to be considered equally (European Commission, 1999).

Figure 1 - Three pillars of sustainability (Nayyar and Kumar, 2019, p. 54)

Industry plays an important role in sustainable development, while at the same time sustainable development measures can have a negative impact on the actual objectives and competitiveness of companies. There are 17 challenges in sustainable development, such as hunger, renewable energy, economic growth, or responsible consumption (Nayyar and Kumar, 2019), which are also known as the Sustainable Development Goals (SDG) (United Nations, 2020). An overview of all categories is given in Figure 2.


Figure 2 - Sustainable development goals

Industry 4.0 offers several approaches that can support the achievement of the sustainable development goals. For instance, to achieve SDG 2: Zero Hunger, which tackles the challenge of ending hunger, sensors and drones can be used in smart farming to produce quantities according to demand. As the digital twin concept considers the whole lifecycle of a product, it can contribute in different ways. A DT can already exist before the real-world object is created. Thus, different scenarios can be tested in the virtual world to uncover issues that would otherwise lead to more resource consumption. Therefore, the DT can contribute to SDG 12: Responsible Consumption and Production, which targets a substantial reduction of waste generation by 2030 through actions such as prevention, reduction, recycling, and reuse (United Nations, 2020).

With virtual commissioning, for example, possible collisions that would damage parts in the real world can be discovered. Further, the number of physical mock-ups can be reduced. While the product is in use, its condition can be monitored and anomalies can be detected. Hence, it is possible to do preventive maintenance, and only when it is really needed.

This leads to a longer lifetime of the product. At the product's end of life, the DT offers the advantage that parts can be reused or recycled more effectively since, for instance, all the components of the product are known. Digital twins can also support service technicians in remote troubleshooting by offering the possibility to be on a virtual shop floor where virtual machines behave simultaneously like the real-world machines. This can lead to less travel, which means less CO2 emissions and energy consumption (United Nations, 2020).


3 Frame of reference

This chapter helps to understand the important terminologies in this thesis and the different concepts that are involved in this project. The frame of reference includes the topics digital twins, Internet of Things, cyber-physical systems (CPS), data exchange, and industrial solutions.

3.1 Digital twin

There is no uniform definition of a digital twin in the literature; rather, many different interpretations can be found. In the following, three of the most common definitions are given.

"The Digital Twin is a set of virtual information constructs that fully describes a potential or actual physical manufactured product from the micro atomic level to the macro geometrical level. At its optimum, any information that could be obtained from inspecting a physically manufactured product can be obtained from its Digital Twin." (Grieves and Vickers, 2017, p. 94)

"A Digital Twin is an integrated multiphysics, multiscale, probabilistic simulation of an as-built vehicle or system that uses the best available physical models, sensor updates, fleet history, etc., to mirror the life of its corresponding flying twin" (Glaessgen and Stargel, 2012, p. 7)

“[...] digital twin is a real mapping of all components in the product life cycle using physical data, virtual data and interaction data between them.” (Tao et al., 2019, p. 3938)

In this thesis, a DT is considered to be a computer-based model that is bi-directionally connected with its physical entity. Thus, it can be used for monitoring, controlling, and optimizing. Future states such as errors or damage can be predicted, and through close-to-reality simulations different scenarios can be tested to optimize the product.

3.1.1 Concepts of digital twin

The most popular digital twin concept is shown in Figure 3 and was introduced by Dr. Michael Grieves. In the latest update, it is called the Virtual Twin concept, but it consists of the same three dimensions: real-world products, virtual world products, and the linkage between them to exchange data and information in both directions (Grieves, 2018). The connection is the essential part of the digital twin that brings it to life, as it is required to keep both parts mirrored. Therefore, different models are required in the virtual world to describe the real-world behavior (Schleich et al., 2017). Grieves and Vickers (2017) divided the DT into DT prototype, DT instance, and DT aggregate for the different lifecycle phases. Each lifecycle stage has its specific needs.

Figure 3 - Information mirroring model (Grieves, 2018)

Tao et al. (2018) extended this general DT concept to a five-dimension concept. Besides the real space entity, the virtual space entity, and the connection between them, they added a service model and a data model. The physical entity consists of several subsystems that perform tasks while sensors acquire the subsystems' states. The virtual space entity includes several models such as geometry, material properties, etc. to describe the behavior. The service model is used for operation optimization of the real-world twin and to calibrate the virtual model. The data model combines all data from the different models to obtain a high-fidelity picture of the system. The connection model in this approach is more complex, as it connects the data model as the core with the service, real-world, and virtual world models. The service model is directly connected to the real space entity and the virtual space entity.

3.1.2 DT modeling and simulation

The DT concept arose from the progress in simulation. Simulations are used to elaborate ‘what-if’ scenarios for real-world objects, but a DT can monitor, diagnose, and predict what will happen based on real-time data. Therefore, the different models need to be of high fidelity, and as there are different tools to create them, they also need to be interoperable. Depending on the needs, the scale has to be adjustable, and as a DT can change over its lifetime, the models need to be expandable (Schleich et al., 2017).

Moreno et al. (2017) described a five-step process to virtualize a DT model. In the first stage, three-dimensional models of the parts of the object are created. In the second step, the object behavior is extracted as knowledge. The third part is to model the interaction of the dynamic parts of the object and include it in the knowledge base. This is followed by operation modeling. In the end, a simulator is created where all virtual parts are integrated to generate close-to-real behavior.


3.1.3 Data fusion

A DT has to combine a huge amount of data from different sources. While IoT technology enables continuous data collection, these data need to be handled by the DT. It must acquire the current state of the physical twin, maintain historical information over the whole lifetime, and generate the data needed to simulate future behavior (Hu et al., 2014).

3.1.4 Application of DT

As the digital twin shall be used over the whole lifetime of an object, it serves different purposes during this time. In the manufacturing sector, the main application fields are virtual commissioning (VC), optimization, and maintenance. VC is the testing and optimizing of machine programs and behavior with virtual machines before they are implemented in the real world. For this, emulation technology is applied: 3D models are used and controlled by virtual PLCs in a virtual environment. For realistic behavior of the models, further mathematical models are used to describe this behavior. The better the models describe the real behavior, the higher the fidelity of the model. Thus, it is possible to obtain a close-to-reality model which can uncover problems, especially when different components of a machine or several machines are interrelated. While individual components or machines can function without problems on their own, interaction with other components can cause problems. Consequently, it is possible to eliminate problems and reduce the effort of changes compared to testing on real machines (Reinhart and Wünsch, 2007). Besides the testing of automation programs and processes, safety functions can be tested. VC is mainly used for expensive assets in the early product development phase. In the case of optimization and product changes, VC can also be used during the growth and maturity phase of a product (Reinhart and Wünsch, 2007). In this period, however, DTs are mainly employed to monitor the state of an asset. Further, the continuously collected data can be used for predictive and preventive maintenance. For this purpose, other models and algorithms are needed to process the huge amount of data and find patterns and anomalies (Booyse et al., 2020).

3.2 Internet of things

IoT is the approach of connecting information sensing devices such as sensors and radio frequency identification (RFID) through internet technology. These embedded systems collect data and share it with other web-enabled devices or transfer it to storage and analysis systems, e.g. in a cloud. Depending on the architecture, data can also be analyzed on edge devices and only the result is sent to a further instance. IoT offers the possibility to monitor whole production lines and to support the operator in decision making. Besides manufacturing and logistics, it is used in every industry sector such as agriculture, healthcare, or finance (Rouse, 2019).

Cloud computing is an on-demand computing model. It offers computing power and data storage (clouds) over the internet to many users at the same time and from different locations. Computing in clouds enables the processing of large amounts of data (Federal Office for Information Security, n.d.).

3.3 Cyber-physical system

A cyber-physical system is a system to monitor and control physical processes. It is based on environment perception by integrating computation, communication, and control. Therefore it uses feedback loops between physical and computing processes. The interaction with the physical system is realized by networks. CPS is a centralized system that includes several physical systems (Liu et al., 2017). A digital twin is a part of this system (Schroeder, 2016). Global networking of plants enables high flexibility for capacity utilization and production planning (Broy et al., 2011).

CPS architectures have to be adjusted to the existing system structures. This includes the physical, network, and computer structure. Liu et al. (2017) defined a three-layer model that consists of a user system, information system, and physical system. The physical layer represents the embedded systems to collect and transmit current states. The core of CPS is the information system layer which processes the collected data. The user layer is the interface to the operator.

According to Lu et al. (2020), the relation between IoT, CPS, and DT is shown in Figure 4. The IoT handles the connection between machines in the real world. This is mainly an information exchange, but Rouse (2019) mentions that IoT can also be used for controlling. CPS and DT can be seen as similar approaches, as they represent the real world in the virtual world and both have a bi-directional connection.


Figure 4 - Relation DT, CPS, IoT (Lu et al., 2020)

3.4 Data exchange

As DTs can consist of different subsystems, interoperable communication between the systems is required. Further, the DT must communicate with the real world in real time. To bridge the gap between the systems, so-called middleware is needed. Middleware is application-neutral software whose purpose is to ease the communication between other applications. It offers the advantage that components from different providers can be used. There are three types of middleware: object-oriented, message-oriented, and service-oriented middleware technologies (Balador et al., 2017).

The Open Platform Communications Unified Architecture (OPC UA) is a standard protocol for interoperable process automation. It focuses on machine-to-machine communication in industrial automation and applies a server/client pattern. The protocol is open and cross-platform, and it is used in industrial robotics, manufacturing, and process control (Leitner and Mahnke, 2006).
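To make the server/client pattern concrete, the following minimal sketch reads a single variable from an OPC UA server using the python-opcua library; the endpoint URL and node identifier are placeholders and not taken from the thesis setup.

```python
from opcua import Client  # python-opcua (FreeOpcUa) client library

# Hypothetical endpoint of an OPC UA server, e.g. one embedded in a PLC
client = Client("opc.tcp://192.168.0.10:4840")

try:
    client.connect()
    # Address a variable node by its NodeId (namespace and identifier are assumptions)
    node = client.get_node("ns=2;s=Conveyor.PartPresent")
    print("Conveyor.PartPresent =", node.get_value())
finally:
    client.disconnect()
```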

Message Queuing Telemetry Transport (MQTT) is a lightweight messaging protocol that is used for IoT and machine-to-machine communication. It is mainly employed for unreliable networks and uses the publish-subscribe concept. The communication takes place between clients and brokers: clients can publish messages as well as subscribe to and unsubscribe from topics, and the distribution of information is controlled by the brokers. It has the advantage of requiring low bandwidth (Dasbach et al., 2019).
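As a sketch of the publish-subscribe concept, the snippet below connects to a broker, subscribes to a topic, and publishes a message on the same topic. It assumes the paho-mqtt 1.x client API and a locally running broker (e.g. Mosquitto); the topic name is invented for illustration.

```python
import paho.mqtt.client as mqtt

BROKER = "localhost"               # hypothetical broker address
TOPIC = "cell/tact1/agv/position"  # hypothetical topic name

def on_message(client, userdata, message):
    # Called by the network loop for every message on a subscribed topic
    print(f"{message.topic}: {message.payload.decode()}")

client = mqtt.Client()             # paho-mqtt 1.x style constructor
client.on_message = on_message
client.connect(BROKER, 1883)       # 1883 is the default unencrypted MQTT port
client.subscribe(TOPIC)
client.publish(TOPIC, "AGVpos1")   # the broker relays this to all subscribers
client.loop_forever()              # process network traffic and dispatch callbacks
```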


Data Distribution Service (DDS) is a middleware for distributed systems. It uses a data-centric publish-subscribe concept. Published data can be received by subscribers without knowing the structure or the origin of the data. This information is given by the data itself (Profanter et al., 2019).

Profanter et al. (2019) compared the performance of different communication protocols, among other things by round-trip time (RTT) tests between two identical Linux machines. During this test, a signal is sent from one machine to the other, which returns the signal after receiving it; the time this takes is the round-trip time. The RTT test was performed in echo mode (the response message has the same data bytes) and in acknowledge mode (the response is only one byte). Figure 5 shows the average RTT (in µs) of the acknowledge mode with different signal loads for OPC UA, MQTT, and DDS while the systems are in an idle state.

Figure 5 - Average RTT OPC UA, MQTT, DDS based on Profanter et al. (2019)
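The measurement principle of such an RTT test can be illustrated with a plain TCP echo connection. The sketch below is not the middleware-level benchmark of Profanter et al. (2019); the peer address, port, payload size, and sample count are assumptions.

```python
import socket
import time

HOST, PORT = "192.168.0.20", 5005   # hypothetical address of the echoing machine
PAYLOAD = b"\x01" * 64              # example signal load of 64 bytes

with socket.create_connection((HOST, PORT)) as sock:
    samples = []
    for _ in range(1000):
        start = time.perf_counter()
        sock.sendall(PAYLOAD)
        received = 0
        while received < len(PAYLOAD):              # echo mode: peer returns the same bytes
            received += len(sock.recv(len(PAYLOAD) - received))
        samples.append((time.perf_counter() - start) * 1e6)   # microseconds
    print(f"average RTT: {sum(samples) / len(samples):.1f} µs")
```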

3.5 Industrial solutions

3.5.1 Programmable logic controller

A PLC is an industrial hardware device for controlling automation systems. It receives input signals, e.g. from sensors or devices, and runs a program that reacts to those signals by setting output signals to activate devices such as motors or lamps. The PLC works in real time. Besides these hardware devices, there are so-called SoftPLCs: software solutions with the same functions but running on a computer (3S-Smart Software Solutions GmbH, 2019).
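The cyclic read-inputs, execute-program, write-outputs principle described above can be sketched as follows. This is a simplified Python illustration of the scan cycle, not an IEC 61131-3 program, and the signal names are invented.

```python
import time

inputs = {"start_button": False, "part_present": False}   # image of the input signals
outputs = {"motor": False, "lamp": False}                  # image of the output signals

def read_inputs():
    # In a real PLC this samples the physical input terminals; here it is a stub.
    return dict(inputs)

def write_outputs(out):
    # In a real PLC this drives the physical output terminals; here it only prints.
    print(out)

def program(inp):
    # User program: run the motor only while a part is present and start is pressed.
    return {"motor": inp["start_button"] and inp["part_present"],
            "lamp": inp["part_present"]}

while True:                     # the scan cycle repeats continuously
    image = read_inputs()       # 1) read all inputs into the process image
    result = program(image)     # 2) execute the user program on that image
    write_outputs(result)       # 3) write the results to the outputs
    time.sleep(0.01)            # illustrative 10 ms cycle time
```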


3.5.2 Automated guided vehicles (AGV)

AGVs are mobile robots that are mainly used for autonomous material transport in almost every industry. The AGV follows a predefined path, and there are different navigation techniques to move the AGV along this path. With the guided tape method, for example, magnetic or colored tape is installed on the floor; sensors on the AGV detect this tape and control the drive system to stay on track (Tanchoco, 2013).

3.5.3 Industrial robots

An industrial robot is defined as an automatically controlled manipulator mechanism consisting of three or more axes. It can be either mobile or stationary. Due to their programmable control systems, industrial robots can be used for different purposes. The control system can also communicate with its environment and can thereby form a larger machine system together with other machines or robots (The International Organization for Standardization, 2012).

3.5.4 Simulation software

For virtual commissioning, different simulation software packages are required to simulate the different parts of a real system.

ABB Robotstudio is an offline programming and simulation tool. It includes virtual robot controllers (VRC) for ABB robots which are identical to the real ones. With this simulation tool, it is possible to create digital twins, write real robot programs, and test them in a virtual environment (ABB Robotstudio, 2020).

ABB Automation Builder is a software for system automation. It offers a toolkit for configuring, programming, and testing automation projects. It includes PLC programming, control panels, drives, and motion (ABB Automation Builder, 2020). It includes a Codesys PLC programming environment.

SIEMENS SIMIT Simulation Platform (V10.0) is a software tool for different simulations, such as holistic systems, signals or devices, and plant behavior. It works as an input and output simulator and can link different simulations as well as visualize them. To connect other simulation software, it offers a wide range of connection possibilities such as OPC UA server/client or shared memory (SHM) (SIEMENS AG, 2018).


4 Methodology

This chapter describes the methodology used in this study. First, it gives a general overview of the methodology, divided into research design and research strategy, followed by a more detailed description of the data collection and data analysis methods. The last section is a critical reflection on the quality of the thesis.

4.1 Research design

The empirical part of the thesis pursues the following key objectives:

1. Creation of a high fidelity digital twin simulation of an existing collaborative manufacturing cell

2. Development of a reference model to connect the digital twin with its real-world counterpart.

The research design of this study is of exploratory, deductive, and qualitative nature. According to Bryman (2012), an exploratory and qualitative research design is appropriate when the purpose of the study is to obtain more information and understanding of a phenomenon.

The research is based on a qualitative approach, which is usually built up using words and not numerical data (Bryman, 2012). The analysis of qualitative research can, therefore, be done by interpretation. The purpose of the chosen approach is to understand ‘how or why’ in order to gain a contextualized understanding of a topic. This can further explain certain behaviors or actions.

Through this common perspective on the relationship between theory and research, the researcher is required to form a hypothesis from practical concepts and to explore it with methods of data collection. The process begins with theory, continues with data collection, and is followed by the presentation of the results. In the end, the hypothesis can be confirmed or rejected and the theory possibly modified (Bryman, 2012). The approach of this study is based on deductive theory because the starting point of this thesis is the hypothesis that a high-fidelity simulation model can be linked to its real-world twin and mirror its current state.

This thesis is based on the collection of secondary data that was previously gathered by other researchers. The use of this data allows the researcher to start the study process more easily and is also a more time-saving way to get answers to the research question. The collection of primary data is usually a very time-consuming process that can be avoided (Bryman, 2012).

The basic idea of the digital twin concept is adopted and then narrowed down, as only the part of the DT that is used for virtual commissioning is considered. On this basis, the previously mentioned hypothesis was set up. If the hypothesis can be confirmed, new applications for such models become possible. The investigation is supported by the creation of a virtual commissioning model of an existing manufacturing cell. This cell is a collaborative system consisting of robots, AGVs, conveyors, and humans. The model serves to get a better understanding of the requirements for the connection and to set the boundaries for the research. Further, it is the base for testing the connection by linking it to the real manufacturing cell.

4.2 Research strategy

The research strategy always needs to be closely related to the purpose of the research design. As this thesis investigates how the 3D model can be connected to the real-world twin and develops a reference model, it requires, as mentioned above, an exploratory research strategy.

Therefore a combination of Design and Creation strategy and Case Study strategy is chosen (Appendix A). Design and Creation is the dominant strategy in this case. It aims at the generation of new IT artifacts. Those can be constructs, models, methods, instantiations, or a combination of them (Oates, 2005).

The Design and Creation strategy consists of five iterative steps: awareness, the step of problem recognition; suggestion of how the problem can be tackled; development of the suggested tentative solution; evaluation of the developed tentative solution; and conclusion of the gained knowledge.

While working on the steps new insights can be uncovered which can influence the other steps (Oates, 2005).

The case study strategy enables the researcher to get deeper insights and an in-depth understanding of a certain case (Oates, 2005). The thesis aims to develop a reference model to connect the digital twin visualization with its real-world counterpart. To achieve this goal, the researcher first creates the virtual commissioning model, which is denoted as the case in this thesis. According to Bryman (2012), it is viable to associate a case study with theory testing, which is appropriate for this thesis and thus contributes to the investigation of the hypothesis and the related research question. Even though case studies are difficult to generalize (Oates, 2005), the results can be transferred to other cases, as nowadays many manufacturing cells consist of combinations of AGVs, robots, humans, and further automation systems. The outcome of this study is a reference model that can serve as the base for further research or be used to monitor such manufacturing cells.

4.3 Data collection and data analysis

The data collection and analysis are the heart of research methodology (Saunders et al., 2009). The study is based on documents. This data collection method uses existing data that has already been collected for another purpose. Documents can be internal documents of a company, prior research, or other publications (Oates, 2005).

For the case study, which includes the creation of the digital twin model, already existing simulation models, pictures, and videos of the real manufacturing cell are used. These data are made available by the company. Since the program codes are not accessible, the gathered data serve as the base for further documents such as process plans and robot and PLC programs. Furthermore, the model formation, based on the available data, is carried out according to the researcher's own knowledge.

For data collection of the Design and Creation strategy, a systematic literature review is used to collect and analyze secondary data. Such data can include quantitative and qualitative raw data but also summaries. It can be defined as a replicable, scientific method by using peer-reviewed published studies that minimize bias. A transparent search process and data evaluation are required to make the decisions, steps, and interpretations of the researcher comprehensible (Tranfield et al., 2003).

The systematic literature research is particularly important for students, as it allows the collection of large amounts of data, but at the same time offers freedom in terms of time management and the formulation of research questions as well as the analysis and interpretation of the data. In addition, high-quality data can be collected in a short period of time, which in the case of surveys or different industrial and geographical areas might not be possible in the given time frame. Further, this may allow more relevant conclusions to be drawn (Bryman, 2012), which may be important in Industry 4.0, as it is still an emerging issue that is discussed globally.

The purpose of the thesis was initially formulated with the support of a research question. This builds the foundation to define keywords and narrow the search for relevant studies. The researcher limited the focus of the topic to reduce the analysis of an extensive amount of data. This ensures the validity and consistency of the results.


The chosen database for the data collection is Scopus, as it is the biggest abstract and citation database of peer-reviewed scientific literature (Scopus, 2020). The asterisk wildcard ‘*’ is used to cover keywords with the same word root, and the Boolean operators ‘AND’, ‘AND NOT’, and ‘OR’ are applied to combine the keywords in the search. The search is limited to the title field and to the English language to narrow it down further. Additionally, the search is limited to the time frame 2010 until 2020, as the digital twin concept in manufacturing was published for the first time under this name in 2010 (Piascik et al., 2010). To obtain high validity, only peer-reviewed articles are considered. All limitations are stated in Table 1 below.

Table 1 - Search limitation

Limitations | Reason
Field | Digital twin
Search area | Title
Source type | Peer-reviewed articles
Filter out | Reviews, working papers, commentaries, book chapters
Time | 2010 – April 2020

Based on the purpose of investigating the possible ways to connect the digital twin with its real-world counterpart, keywords are defined (Table 2). First, a wider search was done to identify the components needed to link the real world and the virtual world.

Table 2 - First keywords search

Themes | Keywords
Digital twin | “digital twin” OR “virtual twin” OR “virtual replica*”
OR
Cyber-physical systems | CPS OR “cyber-physical system”
AND
Connection | connect* OR communicat* OR link* OR “data exchange”

After the first search analysis, the two components communication layer architecture and communication technology were identified. Therefore, the literature review is split into those two parts and more specific keywords were defined (Table 3). The initial search returned a total of 1178 articles based on the criteria below. Due to the vast number of papers, the researcher further refined the search results by limiting the articles with ‘exact keywords’ (communication and architecture) within the scope of the digital twin. Exact keywords can be used to increase the search specificity without carrying out a new search (Scopus, 2020). The number of documents derived from this search was 31.

Table 3 - Final keywords search

Themes | Keywords
Digital twin | “digital twin” OR “virtual replica*” OR “cyber-physical system”
AND
Architecture | “application architecture” AND “application layer” OR structure OR “reference model”
OR
Communication | communication OR “data exchange” OR “data transmission” OR protocol

The 31 articles were screened by the relevance of their title and abstract, which led to 13 documents that were analyzed in full. Here, the focus was on the digital twin in combination with architecture layers and communication technologies. Articles that do not contribute to the purpose of the study were excluded, which resulted in a total of six relevant papers for the thesis; both categories contain three documents each. The search scheme is presented in Figure 6 below. The entire analysis was based on the researcher's subjective interpretation of relevance to narrow it down. By analyzing the secondary data in a qualitative form, it is possible to uncover or evaluate new interpretations of the given data that were not considered by previous researchers (Bryman, 2012).


Figure 6 - Search process

4.4 Research quality

The systematic literature review is chosen to identify different approaches to connect a digital twin with its real-world twin. Due to the large amount of information, collecting and structuring the data in a qualitative analysis is rather difficult. The emerging issues from the initial data search, but also the relation to the frame of reference, can help the researcher to interpret the resulting data. However, since the data can be influenced by the researcher's bias, it is important to emphasize the subjectivity of qualitative data analysis. This, however, leads to a weakening of internal validity.

In a systematic literature search, reliability is linked to replicability. This means that it is possible to repeat the study in the same way but in a different environment and to obtain the same results (Bryman, 2012).

For this research study, all the steps necessary to collect the data were described in full to enable replicability. Since the analysis of the selected articles is based on the subjective assessment of the researcher, this thesis loses some of its replicability in this part. Nevertheless, all decision factors and methods were recorded during the entire research and can therefore be tracked. This ensures transparency and external reliability (Bryman, 2012).


The study is limited to a given time period, which meant that only limited data could be collected and analyzed. To ensure a thorough analysis of all relevant data, more time would be necessary to strengthen the thesis with more valid results.

According to Oates (2005), it is difficult to generalize case studies. However, since similar technologies are used nowadays, the results can be transferred to other cases. In this study, the model is greatly simplified in its construction. These simplifications of the model are documented and thus contribute only a small decrease in external validity. Consequently, the result can be generalized when certain factors are taken into account.


5 Reference model

This chapter investigates the possibilities and the requirements for the reference model, which are derived from the literature and the given case. Further, the key features for the digital twin are summarized and the reference model is presented.

5.1 Digital twin architecture

One of the key enablers of Industry 4.0 is the digital twin, as it is the linkage between the real and virtual worlds. This requires a strong architecture to deal with the complexity and real-time data transmission between and within the two worlds. The analysis of the literature shows that there are different layer architectures for digital twins or cyber-physical systems. Figure 7 shows an overview of three such architectures.

Figure 7 - Overview of architecture models based on Redelinghuys et al. (2019); Yi et al. (2020); Zheng and Sivabalan (2020)

[Figure 7 layers – Redelinghuys et al. (2019): Layer 1–2 physical twin, Layer 3 local data repositories, Layer 4 IoT gateway, Layer 5 cloud-based information repositories, Layer 6 emulation and simulation; Zheng and Sivabalan (2020): physical layer, data extraction and consolidation, cyberspace, interaction; Yi et al. (2020): physical space, interaction, virtual space.]

Redelinghuys et al. (2019) present a six-layer architecture to enable real-time data and information exchange in horizontal and vertical directions. Layers one and two represent the physical twin with its actuators and sensors and the controlling units such as the PLC. Layer three is the interface to the virtual world and uses OPC UA. Layer four converts the data into information for the upper layers. Layer five is cloud-based storage for information. The sixth layer is the replica of the physical object and thereby the interface to the user. The architecture was implemented in a case study for a pneumatic gripper as one part of a manufacturing system. The gripper position is detected with limit switches in the end positions. Further, a position sensor is used to detect whether a part is gripped, and a pressure sensor and a flow sensor are added. The gripper is mounted to a fixed plate. On the second layer, a SIEMENS S7 PLC is located to which the sensors' signals are connected. The third layer communicates with the PLC, the IoT gateway, the cloud, and the emulation software Plant Simulation on layer six. For communication, the OPC server KEPServerEX is used. The OPC UA server continuously sends the current values of the sensors to the datalogger on layer five. With Kepware open database connectivity as a client, it is also possible to transmit data in the opposite direction. The IoT gateway also works as a client to communicate with layer three. For the SQL database in the upper layer, MySQL communication is used. Redelinghuys et al. (2019) recognized some issues regarding the date and time stamps on the different layers. The cloud is only used for data storage and has not been further evaluated. SIEMENS Tecnomatix PS is used for the emulation of the digital twin. It offers an interface to the database and can also retrieve data directly from the OPC UA server. To simulate the movement of the gripper, the components in Tecnomatix PS are moved by a scheduled translation function which is activated by the OPC UA input. The model can also be simulated using stored cloud data. It was proven that the structure is close to real-time capable, and the standardized communication approach makes it usable with different subsystems (Redelinghuys et al., 2019).

Zheng and Sivabalan (2020) proposed a four-layer cyber-physical system architecture. The first layer is the physical layer; besides the physical systems such as machines, sensors, and human-machine interfaces (HMI), it includes communication protocols and interfaces. Layer two bridges the gap between the physical and the cyber layer by converting the received data to a machine-readable format and sending it to the cloud in layer three. This layer forms the core of the architecture, as it includes the digital twin application programming interface (API). The interaction and visualization happen on the fourth layer. They proved their architecture to be working by implementing it on a 3D printer. An RS232 serial interface is used to connect the printer with a Raspberry Pi, which is used for the connection with the cyberspace. In the virtual space, the data is stored in a cloud and the digital twin is fed from this cloud. The limitation of this approach is a time delay of two seconds to update the model when it is used for monitoring. As the data gathering works in real time, this delay leads to data being skipped. Still, the approach can be applied to collect data to optimize the fidelity of the virtual representation.

Yi et al. (2020) presented a three-layer architecture for the smart assembly of complex products. It contains a physical space layer, an interaction layer, and a virtual space layer. The physical space layer represents the shop-floor entities. The interaction layer bridges the gap between the physical and virtual layers; it is a bi-directional real-time connection for transferring prediction data from virtual to physical space. A further part of this layer is the data processing module, which is responsible for data collection, preprocessing, and storage. The virtual space layer includes a 3D virtual model of the assembly process and digital twin based applications such as simulation and prediction. The proposed architecture was implemented in a simplified satellite assembly. In the first step, the production process was defined with the digital model of the satellite assembly. During the performance of the real assembly, process data are collected via laser scanners, bar code readers, and sensors. The acquired data is compared with the virtually generated data. Thus, the current assembly status and precision can be determined. Further, the data can be used to predict the next assembly step and show the instruction to the worker. This closed feedback loop leads to assembly guidance in the physical world and condition monitoring in the virtual world (Yi et al., 2020).

5.1.1 Architecture conclusion

The comparison of the three architectures shows that the core elements are the three components real space, virtual space, and the connection for information exchange between them. This corresponds with the original information mirroring model introduced by Grieves (2006). While the architecture of Yi et al. (2020) is identical to Grieves' information mirroring model from 2002, the model of Zheng and Sivabalan (2020) corresponds to the revised model from 2005, as it has the extension of the interaction layer, which equals the virtual simulation space in Grieves' revised information mirroring model (Grieves, 2006). The architectures of Zheng and Sivabalan (2020) and Redelinghuys et al. (2019) are more detailed. The bi-directional communication enables both worlds to leverage each other to achieve a higher precision of the DT as a base for optimizing the real object. To handle the huge amount of data, clouds are used for data storage. While such cloud solutions improve the availability and accessibility of the digital twin, they lead to higher requirements regarding data security.

5.2 Communication

Liu et al. (2019) proposed a cyber-physical machine tool prototype where OPC UA is used for the communication between machine tools and various software applications. The prototype was implemented with a drilling machine that is controlled by iWindow, a software for the real-time control of Computerized Numerical Control (CNC) machines. The software acquires continuous data from the machine such as spindle speed, axis positions, or alarms. These data are transferred to the OPC UA server via memory-mapped files. Additionally, data from external sensors, e.g. vibration sensors, are preprocessed and transferred to the OPC UA server. This server encodes the data into standardized OPC UA binary messages and sends them in real time to the OPC UA clients on request. The research was done on a local network; thus, the security of the system was not considered and requires further investigation (Liu et al., 2019).

In the study by Yun et al. (2017), a digital twin platform called uDiT was developed. This platform is especially used for dynamic simulations where many different simulators are included. The presented platform applies a communication middleware based on the Object Management Group Data Distribution Service. It is used to connect co-simulations, real-time data from the asset's environment, and the digital twin. The DDS includes a data-centric publish-subscribe and a real-time publish-subscribe mechanism. The data-centric publish-subscribe connects to both the simulation and the physical world, while the real-time publish-subscribe serves as a connection to other parts of a bigger system. The real-time capability of the interconnection of the physical world and the cyber model was stated as the main requirement (Yun et al., 2017).

Sonawala et al. (2017) implemented a control and monitoring platform with the MQTT protocol. The focus was on real-time data exchange with industrial sensors and visualization on mobile devices. The sensor data is processed by a microcontroller and transmitted via WiFi to the MQTT broker, which works as a server. The sensors publish the data under a publishing topic through the WiFi module to the server, and the application devices subscribe to the topics at the server to get the data. The Mosquitto broker is used for testing the system to monitor the humidity, temperature, and pressure in an industrial environment. During the test, it was possible to show the sensor data in real time with different software applications. Two different modes were implemented: an offline mode to use the system in a local network and an online mode for location-independent usage (Sonawala et al., 2017).

5.2.1 Communication conclusion

DDS, OPC UA, and MQTT are all middleware solutions that are used to connect virtual and real-world systems (Yun et al., 2017; Liu et al., 2019; Sonawala et al., 2017). The main requirements on the middleware are real-time capability, security, and scalability. While DDS and MQTT are real-time capable (Yun et al., 2017; Sonawala et al., 2017), OPC UA requires an extension for this (Liu et al., 2019). Especially in the case of a linkage to the internet, the security aspect needs to be considered (Sonawala et al., 2017). Further, scalability plays an important role, as the cyber-physical system needs to change over its lifetime to represent the real-world assets (Yun et al., 2017).


5.3 Requirements

The model for virtual commissioning, which can be created in Robotstudio, can simulate real behavior. However, the included components are idealized models, meaning that they do not behave exactly like their real-world counterparts (Rosen et al., 2020). While the movements of the parts are simplified, the control of the parts is high fidelity, as the SoftPLC and the virtual robot controllers can imitate the real ones and the programs can be the same as in reality. The data type of the I/Os that are necessary for this connection is either Boolean, Integer, or Double. Depending on the purpose of the digital twin, there are different requirements both for the fidelity of the model and for the connection of the real and the virtual object. To control a system with the DT, a secure, bi-directional, and real-time communication is compulsory (Liu et al., 2017; Yun et al., 2017). In monitoring mode, however, data only needs to be transferred from the real to the cyber world, and the delay of the data exchange in this mode is not as critical as for the control mode (Zheng and Sivabalan, 2020).

The architecture is a prerequisite for the connection of the real world and the virtual world, as it forms the frame. The literature shows that the underlying concept for cyber-physical systems is always identical to Grieves' information mirroring model. Depending on the application, the architecture needs to be more specific. The huge amount of data needs to be stored over the whole product lifecycle; therefore, local or cloud storage can be used. Especially the implementation of cloud solutions needs enhancements and thus further layers. For the given case, no storage of data is required, as the purpose of the DT is to act as a mirror image of the real cell and it only represents the current state.

With the rise of IoT technology, more and more data are collected and transferred. Thus, the security of those data plays an important role in Industry 4.0 (Chhetri et al., 2017). Data security must be ensured for local networks and local servers, and even more so when data is transferred or stored via the internet. While with an internet connection the security concern is mainly data theft, on a local network it is mainly data manipulation (Chhetri et al., 2017).

To bring a digital twin to life means to control the digital 3D model with the data of the real world (KUKA Aktiengesellschaft, 2020). The basic task is to mirror the current movements and behavior of a real object. This requires at least a secure and close to real-time communication from the shop floor to the virtual space. Nowadays, production lines have to be designed flexibly to meet customer requirements. This means that the individual components of a line change during its life cycle. Thus, the digital twin must also adopt these changes, and therefore it must be easy to change or extend (McBurnett, 2020). This also means that new software solutions may have to be integrated into the system. Consequently, interoperability plays an important role in addition to scalability (Balador et al., 2017).

The key features for a digital twin to monitor a physical asset are summarized in Table 4:

Table 4 - Key features digital twins

Feature | Description
Latency | The time interval between a cause and the response to it.
Security | Security of a system means that it is shielded from unauthorized access which could change the system (Industrial Internet Consortium, 2016).
Scalability | The capability of a system to still perform well when the workload or scope increases.


5.4 Reference model

The following model (Figure 8) is proposed to satisfy the minimum requirements for a digital twin. In this case, it is used to monitor the running manufacturing process in a virtual environment on a local network. For visualization, Robotstudio2019 is used. The connection between the real PLC and Robotstudio is realized by an OPC UA server/client and shared memory (SHM). The PLC ABB AC500 V3 comes with an embedded OPC UA server, and for the robot controller the IRC5 OPC UA server is used. These servers are connected to SIMIT, which takes over the client function. SIMIT itself is connected to Robotstudio via shared memory. The variables from the PLCs can be used directly with the corresponding smart component. The signals for the virtual robot controller, however, need a special naming in the SHM: the name starts with 'Robot', followed by the system name, followed by the I/O device name. The I/O signal in Robotstudio requires the same address as the corresponding signal in the SHM. The movements of the mechanisms that are controlled by smart components only need the right connection to the SHM signals. The robots are controlled by the virtual programs, which in turn are controlled by the input signals. The robot outputs are controlled by the real controllers, and therefore the setting and resetting of the digital outputs in the robot programs must be commented out.
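As a minimal sketch of this signal routing, the following Python snippet illustrates the SHM naming convention and a simple read of a PLC variable via OPC UA. The python-opcua client, the endpoint URL, the node identifier, and the example system and device names are assumptions for illustration only; in the actual reference model, SIMIT performs this client role.

```python
from opcua import Client  # python-opcua package (assumed available)

def shm_signal_name(system_name: str, io_device: str) -> str:
    """Build the SHM name for a virtual robot controller signal:
    'Robot' + system name + I/O device name (as defined in the reference model)."""
    return f"Robot{system_name}{io_device}"

def read_plc_signal(endpoint: str, node_id: str):
    """Read one PLC variable via OPC UA (illustrative endpoint and node id)."""
    client = Client(endpoint)            # e.g. "opc.tcp://192.168.0.10:4840"
    client.connect()
    try:
        node = client.get_node(node_id)  # e.g. "ns=2;s=Application.GVL.AGVready"
        return node.get_value()
    finally:
        client.disconnect()

# Example: name under which a signal of a robot system 'R1' on I/O device 'D652'
# would be expected in the shared memory (hypothetical names).
print(shm_signal_name("R1", "D652"))     # -> RobotR1D652
```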

The digital representation can be presented on a screen in Robotstudio. Furthermore, there is the possibility to connect VR glasses to Robotstudio, whereby the DT can act as an accessible live video.


Figure 8 - Three-layer reference model


6 Digital twin creation

The digital twin model is built as one twin for the whole cell. The software used is ABB RobotStudio2019 for the 3D model and ABB Automation Builder for the PLC programming.

6.1 The case

The considered case is a collaborative manufacturing cell for the production of robots (Appendix B, Appendix C). It consists of six ABB heavy-duty robots, nine AGVs, and two conveyors. The AGVs have a carrier plate which is height adjustable and can be turned by 180°. Navigation is based on a magnetic fixed-path guidance system. Further, the cell is divided into six tacts with one worker each. In this study, only the first tact is considered, as shown in Figure 9.

6.1.1 Layout

The first tact includes two robots: one is equipped with a four-finger gripper and the other with an inspection and screwdriver unit. A conveyor delivers the robot base part on a euro pallet into the cell, and an AGV carries it through the cell. The second robot is a sequentially collaborative robot, as it shares its working area with the worker but not at the same time.

Figure 9 - Tact 1 (considered part of the cell)


6.1.2 Process

An empty AGV enters the cell and moves to position AGVpos1. At the same time, the conveyor supplies the base part of the product to the pickup position. Then robot R1 picks the part from the conveyor, places it on the AGV, and moves to a safe position. The AGV moves on to AGVpos2.

When this position is reached, the worker enters the cell and inserts the screws to mount the base part to the carrier plate of the AGV. When the worker has left the safety zone, robot R2 checks the position of the base part with a vision system and tightens the screws. The worker enters again and places further parts onto the base part, and robot R2 tightens the bolts. When this is done, the AGV turns the carrier plate by 180°. The worker mounts further parts, leaves the cell, and confirms that the tact is finished.

When all tacts are confirmed, the AGV system starts and the AGV moves to AGVpos4.
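To make the sequence easier to follow in the later chapters, the tact 1 process can be summarized as an ordered list of steps. The following Python enumeration is only an illustrative sketch of this sequence, not code taken from the actual PLC or robot programs.

```python
from enum import Enum, auto

class Tact1Step(Enum):
    """Illustrative sequence of tact 1 as described above (not the real PLC program)."""
    AGV_TO_POS1 = auto()              # empty AGV moves to AGVpos1
    CONVEYOR_SUPPLIES_BASE = auto()   # base part arrives at the pickup position
    R1_PICK_AND_PLACE = auto()        # R1 places the base part on the AGV
    AGV_TO_POS2 = auto()
    WORKER_INSERTS_SCREWS = auto()
    R2_INSPECT_AND_TIGHTEN = auto()   # vision check, then screws are tightened
    WORKER_ADDS_PARTS = auto()
    R2_TIGHTEN_BOLTS = auto()
    CARRIER_PLATE_TURN_180 = auto()
    WORKER_FINAL_MOUNT_AND_CONFIRM = auto()
    AGV_TO_POS4 = auto()              # after all tacts are confirmed
```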

6.2 Simulation model

For the creation of the virtual representation of the manufacturing cell, ABB Robotstudio is applied. Besides the 3D visualization of the cell, the virtual robot controllers of Robotstudio are used to control the robots. To simulate the movements of the tools, the conveyor, and the human, the smart component (SC) function is utilized; sensors and buttons also make use of this function. ABB Automation Builder is used for PLC programming; the included programming environment for the PLC (AC500 V3) is Codesys V2.3. Furthermore, this software contains a virtual PLC which is connected to Robotstudio using the 'connectivity' smart component. This SC utilizes a shared memory function to exchange the data between Automation Builder and Robotstudio.
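The shared-memory exchange can be pictured with the following minimal Python sketch. It only illustrates the concept of two processes reading and writing the same memory block and does not reproduce the actual layout used by the 'connectivity' smart component; the block name, size, and byte offset are assumptions.

```python
from multiprocessing import shared_memory

# Writer side (stands in for the PLC/Automation Builder end of the exchange).
block = shared_memory.SharedMemory(name="cell_signals", create=True, size=16)
block.buf[0] = 1          # e.g. a Boolean output such as 'motor on' at offset 0

# Reader side (stands in for the Robotstudio 'connectivity' SC end).
view = shared_memory.SharedMemory(name="cell_signals")
motor_on = bool(view.buf[0])
print(motor_on)           # -> True

# Clean up (only the creating process should unlink the block).
view.close()
block.close()
block.unlink()
```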

The path of the AGV system is designed with the curve function. For the stopping positions, a 100 x 10 x 10 mm part was created and placed on the ground. The AGV has a magnet sensor to navigate along the path; therefore, the smart component 'move along curve' is used. To detect the stop-position markings, an additional smart component 'line sensor' is added at the magnet sensor. The AGVs only move when the motor output is active and the emergency stop is inactive. The motor is deactivated when the sensor detects a marking position, when the emergency stop is triggered, or when the worker enters the cell. The robot program starts when the input AGVready and the respective position signal (robot 1 -> AGVpos1, robot 2 -> AGVpos2) are active.
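The interlock conditions described above can be expressed compactly as Boolean logic. The following Python functions are a sketch of that logic; the parameter names mirror the description, but the real behavior is implemented in the PLC program and the smart components.

```python
def motor_enabled(motor_output: bool, emergency_stop: bool,
                  marking_detected: bool, worker_in_cell: bool) -> bool:
    """AGV drives only while the motor output is set and no stop condition holds."""
    stop_condition = marking_detected or emergency_stop or worker_in_cell
    return motor_output and not stop_condition

def robot_program_start(agv_ready: bool, at_position: bool) -> bool:
    """Robot 1 uses AGVpos1 and robot 2 uses AGVpos2 as 'at_position'."""
    return agv_ready and at_position

# Example: the AGV stops at a marking even though the motor output is still set.
print(motor_enabled(True, False, True, False))   # -> False
print(robot_program_start(True, True))           # -> True
```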


6.3 Limitations

The model does not represent the original process exactly, as the PLC and robot programs were written by the researcher. The programs are based on a video of the real cell. The real manufacturing cell uses a SIEMENS PLC, which was substituted with the ABB AC500 V3. Neither the original PLC nor its programs were accessible within the time frame of the project.

6.4 Challenges for the connection

Even when the digital representation has a high fidelity, some challenges need to be considered to achieve close to real-time mirrored models, especially in the case where the virtual representation is started while the real twin is already running. The state of static objects such as sensors can be mirrored immediately through the connection of I/Os, given that the data exchange happens in real time. Obtaining the position of dynamic objects, e.g. robots and AGVs, is more challenging, as this requires data such as the joint angles of the robot or the position data of the AGV.
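As an illustration of what such a synchronization payload could look like, the following data classes sketch the state that would have to be read from the real cell at start-up. The field names and units are assumptions made for this sketch, not signals that exist in the current model.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class RobotState:
    """Snapshot needed to pose the virtual robot like the real one (assumed fields)."""
    joint_angles_deg: List[float]   # one value per axis, e.g. six axes

@dataclass
class AGVState:
    """Snapshot needed to place the virtual AGV on the path (assumed fields)."""
    path_position_mm: float         # distance travelled along the magnetic path
    carrier_plate_angle_deg: float  # 0 or 180 after a turn
    carrier_plate_height_mm: float

# Example snapshot used to initialize the virtual cell (hypothetical values).
r1 = RobotState(joint_angles_deg=[0.0, -30.0, 45.0, 0.0, 60.0, 0.0])
agv = AGVState(path_position_mm=1250.0, carrier_plate_angle_deg=0.0,
               carrier_plate_height_mm=400.0)
```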

References
