
School of Innovation, Design and Engineering

RISE Research Institutes of Sweden

Defining infrastructure requirements for the creation of Digital Twins

Master thesis work

30 credits, Advanced level

Product and Process Development: Production and Logistics

Maryam Noora Jay

Report code: PPU503

Commissioned by: RISE Research Institutes of Sweden
Tutor (company): Kristian Sandström & Álvaro Aranda Munoz
Tutor (university): Francesco Flammini


ABSTRACT

Along with the evolution of new technologies such as the industrial internet of things (IIoT), big data, cloud computing, and artificial intelligence (AI), the amalgamation of the cyber and physical worlds in the industrial field has become necessary to realize the smart factory and increase its productivity. The emergence of the Digital Twin (DT) concept as a technology that ties the physical and digital worlds together has gained significant attention around the world in recent years. However, the concept is relatively new; the literature related to it is limited, and its application is still under development and requires further participation from both industry and academia.

This thesis project presents the main requirements and steps for building a DT. Three research questions have been formulated and answered separately to fulfill the objective of this research study. The answers to the first two research questions are mainly based on a survey of the scientific literature, exploring the concept's background, main infrastructure, related technologies, applications in the manufacturing domain, open issues, and some of the opportunities and challenges that hinder its implementation. The last research question is answered by proposing a general methodology with detailed steps for the DT building process and validating this methodology against an existing case study to show that it works in practice. Finally, several aspects needed for future work are also addressed.


ACKNOWLEDGEMENTS

Firstly, I would like to take this opportunity to express my sincere gratitude to my supervisor at MDH, Professor Francesco Flammini, for guiding me on the right path during this thesis project. Without his valuable support, constructive criticism, and experience, this project would not have been possible. Secondly, I am immensely grateful to Álvaro Aranda Munoz, my supervisor at RISE, for his endless support, ideas, and tips during this project, and special thanks go to Kristian Sandström for his support in the technical field at the beginning of this research study. Thirdly, I would like to extend warm thanks to my friends, especially Maria Lönnerek, for supporting my endeavors. Finally, I would like to thank my dearest husband, Mohammed Noori, for supporting me so that I could focus on my research, and for his encouragement and belief in my capability to overcome the challenges I have faced in this research study.

Maryam Noora Jay

Eskilstuna, September 2020


Contents

1. INTRODUCTION ... 7

1.1. BACKGROUND ... 7

1.2. PROBLEM FORMULATION ... 8

1.3. AIM AND RESEARCH QUESTIONS ... 8

1.4. RESEARCH LIMITATIONS ... 9
2. RESEARCH METHOD ... 10
2.1. LITERATURE STUDY ... 10
2.2. RESEARCH PROCESS ... 13
3. THEORETICAL FRAMEWORK ... 14
3.1. THE OVERVIEW OF DT TECHNOLOGY ... 14

3.2. THE ORIGIN OF DT, HISTORY, AND DEFINITIONS ... 16

3.3. SMART FACTORY AND INDUSTRY 4.0 CONCEPT ... 18

3.4. OVERVIEW OF SOME ENABLING TECHNOLOGIES FOR DT IMPLEMENTATION ... 18

3.4.1 INDUSTRIAL INTERNET OF THINGS (IIOT) AND CPS ... 19

3.4.2 CLOUD COMPUTING AI, MACHINE LEARNING, AND BIG DATA ANALYTICS ... 21

3.5 DT MODELLING AND SIMULATION ... 22

3.6 THE MAIN REQUIREMENTS FOR BUILDING A REFERENCE MODEL FOR DT ... 23

3.7. MAIN PARTS OF DT ... 24

3.8. THE FUNDAMENTAL REQUIREMENTS FOR ENABLING EACH PART OF DT ... 25

3.9 DATA LIFECYCLE MANAGEMENT ... 27

3.10. ARCHITECTURES FOR BUILDING THE DT ... 28

3.10.1 FOUR-LAYERED CPS ARCHITECTURE FOR BUILDING THE DT ... 28

3.10.2 SIX-LAYERED ARCHITECTURE FOR BUILDING THE DT ... 29

3.11. DIGITAL TWIN SHOP-FLOOR (DTS) ... 30

3.12. DIGITAL TWIN BASED PRODUCT DESIGN (DTPD) ... 31

3.13. OPPORTUNITIES, OPEN ISSUES AND CHALLENGES ... 32

3.14. CURRENT PRACTICES AND APPLICATIONS OF DT ... 33

4. METHODOLOGY... 36

4.1. A DESCRIPTION OF STEPS COMPLEMENTED WITH A HIGH-LEVEL FLOWCHART ... 36

4.2. VALIDATING THE METHODOLOGY WITH A SIMPLE REFERENCE CASE STUDY ... 39

5. DISCUSSION AND CONCLUSIONS ... 42


List of figures

FIGURE 2.1: DIAGRAM OF SEARCHING PROCESS ... 11
FIGURE 2.2: NUMBER OF DOCUMENTS PUBLISHED PER YEAR (SOURCE: SCOPUS DATABASE) ... 11
FIGURE 2.3: NUMBER OF DOCUMENTS PUBLISHED PER YEAR PER SOURCE (SOURCE: SCOPUS DATABASE) ... 12
FIGURE 2.4: NUMBER OF DOCUMENTS PER PUBLICATION TYPE (SOURCE: SCOPUS DATABASE) ... 12
FIGURE 2.5: RESEARCH PROCESS PLAN ... 13
FIGURE 3.1: THREE DIMENSIONS MODEL OF DT (GRIEVES, 2014) ... 15
FIGURE 3.2: THE CLASSIFICATION OF DIGITAL TWIN LEVELS AND ITS FIVE PARTS (QI, ET AL., 2018) ... 15
FIGURE 3.3: DATA FLOW IN THE THREE CATEGORIES (KRITZINGER, ET AL., 2018) ... 16
FIGURE 3.4: THE INTEGRATION BETWEEN PHYSICAL AND CYBER WORLDS IN CPS AND DT (TAO, ET AL., 2019C) ... 21
FIGURE 3.5: SIMULATION MILESTONES (ROSEN, ET AL., 2015) ... 22
FIGURE 3.6: DT REFERENCE MODEL (LU, ET AL., 2020) ... 23
FIGURE 3.7: DATA SOURCES AND DATA MANAGEMENT, UPDATED FROM (QI & TAO, 2018) ... 28
FIGURE 3.8: FOUR-LAYERED CPS ARCHITECTURE FOR BUILDING THE DT (ZHENG & SIVABALAN, 2020) ... 29
FIGURE 3.9: SIX-LAYERED ARCHITECTURE FOR BUILDING THE DIGITAL TWIN (REDELINGHUYS, ET AL., 2020) ... 30
FIGURE 3.10: CONCEPTUAL MODEL FOR DIGITAL TWIN SHOP FLOOR (DTS) (TAO & ZHANG, 2017) ... 31
FIGURE 4.2: STEPS FOR BUILDING GENERAL DT ... 38
FIGURE 4.3: THE FLOW OF USING BICYCLE-SHARING PARADIGM (TAO, ET AL., 2019C) ... 40
FIGURE 4.4: REDESIGNING A NEW GENERATION BICYCLE BASED ON DIGITAL TWIN-DRIVEN PRODUCT DESIGN (DTDP) (TAO, ET AL., 2019C) ... 41

List of tables

TABLE 2.1: SEARCHING PROCESS ... 11
TABLE 3.1: DT DEFINITIONS IN THE SCIENTIFIC LITERATURE ... 15
TABLE 3.2: THE EMERGENCE OF THE DT CONCEPT IN THE LITERATURE REVIEWS ... 15


ABBREVIATIONS

AI Artificial intelligence
APIs Application programming interfaces
CPS Cyber-physical systems
DT Digital twin
DTPD Digital twin based product design
DTS Digital twin shop-floor
IIoT Industrial internet of things
IoT Internet of things
RFID Radio frequency identification devices
SoS System of systems


1. INTRODUCTION

This chapter introduces the research topic and includes the background, problem formulation, aim and research questions, and the project limitations.

1.1. Background

Nowadays, the massive data generated from different sources in the smart factory have increased the need for real-time data integration and connection between the virtual and physical worlds (Rosen, et al., 2015; Radchenko, et al., 2018; Tao, et al., 2019d). The Digital Twin (DT) is considered one of the rising and prominent technologies that enable cyber-physical integration through bidirectional communication and real-time synchronization between the physical and digital worlds (Rosen, et al., 2015; Tao, et al., 2019b). It paves the way towards smart manufacturing and Industry 4.0 (Pileggi, et al., 2019). Further, it has attracted significant attention from both academia and industry and has become a necessity for improving and developing international manufacturing, thanks to the value it brings to the manufacturing domain and to other domains such as aviation, aerospace, medicine, and healthcare (Barricelli, et al., 2019; Lim, et al., 2019).

DT is considered one of the promising technologies applicable in the manufacturing field for supporting improvement, control, and maintenance purposes (Cimino, et al., 2019). It is a sophisticated technology that can provide a comprehensive digital description of real-world entities, including their functions, behaviors, and operational data (Qi & Tao, 2018). This means that it mirrors the real status inside the physical entities during their whole lifecycle, predicts failures, and proposes appropriate control actions to improve their processes and operational conditions (Barricelli, et al., 2019; Saddik, 2018; Tao, et al., 2019b). The DT concept originally started to be used in the aviation industry to reflect the aircraft lifecycle; it has also been used for health management and maintenance purposes. Later on, it was adapted to manufacturing settings to reflect real-world systems and aid in their improvement and decision making. It is worth mentioning that DT has been realized along with the ubiquitous evolution of new technologies such as sensor technology, the internet of things (IoT), cloud computing, AI, and cyber-physical systems (CPS). It is expected that half of the manufacturing companies will utilize DT by 2021 for optimizing their systems and increasing productivity, and the DT market is expected to reach around $15.66 billion by 2023 (Hasan, et al., 2020). According to the International Data Corporation (IDC), the companies that utilize DT in their production systems will achieve 30 percent optimization in their production processes during the coming five years. However, the current applications of DT have focused only on optimizing the product design during its lifecycle, while its application in terms of production systems or processes is limited and still under experimentation (Shao & Kibira, 2018).

In the manufacturing sector, DT means creating a virtual replica of the production systems by using various simulation models, enabling real-time synchronization between the physical and virtual worlds (Negri, et al., 2017). DT consists of five parts: the physical world, virtual models, data, services, and connection (Schleich, et al., 2017). Bidirectional communication between the physical and digital worlds is essential for building a complete DT and enabling service exchange (Kritzinger, et al., 2018). By using different enabling technologies such as sensors, IoT infrastructure, data processing and analysis technologies, and smart platforms, the data gathered will be processed, analyzed, and fed to the DT models in the virtual world (Qi, et al., 2019). Utilizing DT can make industries capable of controlling their systems, predicting failures, improving their production systems and processes, and increasing the quality of their products (Lu 2, et al., 2020; Qi, et al., 2019). Further, DT can offer and suggest services to the real systems according to the available information and improve the current services, such as monitoring the lifespan of systems, maintenance, etc. (Tao, et al., 2019b).

To conclude, creating a DT and making it applicable in practice requires comprehensive knowledge about this concept, its key components, and the enabling technologies that support its implementation (Barricelli, et al., 2019). Although DT has been developed over the years and has been applied in several industries, the concept still lacks a reference model. Furthermore, no efforts have been devoted to investigating the main guidelines and the specific tools and technologies that can be used for creating and implementing DT (Cimino, et al., 2019; Tao, et al., 2019d). To gain thorough knowledge about DT technology, this research study provides a systematic literature review to investigate the definitions, key components, and state of the art related to DT technology. Further, this research study provides a general methodology with the main guidelines for building a DT and validates this methodology through an existing case study.

1.2. Problem formulation

This research study is mainly based on performing a systematic literature review. Since DT is a relatively new concept, many companies are unaware of its importance and of the benefits it can bring to the manufacturing field. This research study can therefore be considered an initial step towards defining a general methodology with some guidelines for developing DT.

1.3. Aim and Research questions

The objective of this research study is to investigate the main requirements and steps for building a DT. In order to fulfil this objective, the scientific literature is surveyed to gain thorough knowledge about the current state of the art related to this research topic, such as its definitions, origin, enabling tools and technologies, and applications. Thereafter, based on the results obtained from the scientific literature, the next step is to develop a general methodology that captures the main requirements and steps for building a DT applicable in any industry. The goal of this methodology is to provide guidelines, supported by the main requirements and steps that should be followed in order to create a DT, so that anyone who does not have prior knowledge of, or is not familiar with, this concept can get a general overview. Therefore, the greater part of this research study is dedicated to reviewing the scientific literature, and the rest is embodied in building the DT methodology. In order to specify the scope of this research study and fulfil its main objective, the following research questions have been formulated:

RQ1: What is the definition and origin of the DT concept?

RQ2: What are the main enabling technologies that support the implementation of DT?

RQ3: What are the main guidelines for building a DT that can be applied in the industrial field?

As is evident from the research questions, surveying the scientific literature will contribute to answering the first and second research questions, and the answers will be presented in chapter 3. The results obtained from surveying the scientific literature will then be leveraged in answering the third research question.

1.4. Research limitations

It was initially decided to carry out a case study for this research work at a Swedish small and medium-sized enterprise (SME). A total of 5 interviews were conducted with factory personnel at the start of the research study. However, due to the circumstances related to COVID-19 as well as time limitations, it was not possible to keep in contact with the company. Accordingly, the research study was adapted and scoped to the literature review. Therefore, the boundary of this research study was based on what could be found in the scientific literature. DT technology is a concept that can be applied in diverse areas, for instance product design, production planning, improvements, failure detection, etc. Hence, the methodology part is built in such a way that it can fit all these areas in any industry.


2. RESEARCH METHOD

This chapter presents the methods that have been used in order to fulfil the objective of this research study; it covers the literature study and the research process.

When building any research project in a structured way, the aim of the project must be defined in order to specify the trajectory that the project must follow. A research methodology can be defined as a structured way of planning how the research project is going to be performed (Hagelbäck, 2017).

2.1 Literature study

The main method of this research study is performing a systematic literature review to investigate the current state of the art related to the research topic and to build the theoretical framework. Therefore, this research project is considered a secondary study based on collecting data from secondary sources.

The systematic literature study has significant benefits: it helps in understanding the research problem, narrowing the research topic, formulating and answering the research questions, expanding knowledge, and improving the research methodology (Hagelbäck, 2017).

A great part of the time has been invested in surveying the scientific literature. The literature review focused mainly on the DT topic and was bounded to providing an overall background and knowledge about this concept, its history, main requirements, and current practical applications in the industrial domain. Moreover, the review covered other concepts and technologies that can be involved in the adoption of the DT, such as IIoT, CPS, cloud computing, AI, etc.

In the beginning, three search databases were used, namely IEEE Xplore, Scopus, and Mälardalen University Library, in order to get relevant, high-quality publications and ensure a variety of data sources. The first keyword used was “Digital twins”, and the result was a large number of papers. Not all of these papers were relevant to the research topic, and some of them included the keywords “digital” or “twin” but did not actually refer to the Digital Twin concept. Thereafter, the search process was repeated and updated many times using different search strings with different keywords in order to find more relevant papers. In order to cover all the related technologies and concepts, many keywords were used in parallel with Digital Twin, such as CPS, IIoT, etc.

Lastly, the search process was unified and divided into three steps. In the first step, the database was limited to Scopus, the search query was unified to “Digital Twin AND smart factory AND (CPS) OR (IIoT) AND simulation”, and around 230 documents were found.


The second step was to apply the exclusion and inclusion criteria, as shown in Table 2.1. The time span was limited to 2015-2020 to get recent knowledge related to the research topic, the subject area was limited to engineering and computer science, the types of papers were limited to scientific journal articles and conference papers, and the keywords were limited to Digital Twin, smart factory, IIoT, CPS, and simulation. Further, the language was limited to English, and papers not related to the research topic were excluded. Once the exclusion/inclusion criteria had been applied and duplicated papers removed, a total of 130 papers were excluded.

The third and last step was to select the right papers from the 100 papers that remained; this step included two phases. The first phase was to check the title and go quickly through the abstract, conclusion, and discussion; if the paper was relevant, the whole content of the paper was then read. In this phase, a subset of 44 papers was read in full. In the second phase, the reference lists of the collected papers were also reviewed to ensure the quality of these papers as well as to snowball other useful papers to be included in this research study. By applying the same criteria to the newly found papers, ten additional papers were collected. Accordingly, the total number of papers used in this research study was 54. Figure 2.1 clarifies the steps that were followed in the searching process.

Searching information

Database: Scopus
Subject area: Engineering and computer science
Type of papers: Scientific journal articles and conference papers
Keywords: Digital Twin, smart factory, IIoT, CPS, simulation
Time span: 2015-2020
Language: English

Table 2.1: Exclusion and inclusion criteria.


Figures 2.2, 2.3, and 2.4 present the results obtained from the Scopus database, with details related to the papers obtained after applying the filtering criteria. Figure 2.2 shows the number of papers published per year, Figure 2.3 shows the number of documents published per source per year, and Figure 2.4 shows the number of documents per publication type.

Figure 2.2: Number of documents published per year (source: Scopus database).


Figure 2.4: Number of documents per publication type (source: Scopus database).

2.2 Research process

This section provides a description of the steps that were followed in the research process. As shown in figure 2.5, the research process begins with identifying the research topic and thereafter performing a systematic literature review: the scientific literature was surveyed to capture the state of the art and investigate the main gaps, challenges, open issues, and opportunities. Accordingly, specific research questions were formulated. Afterward, the general methodology for building a DT was created and validated using a simple reference case study application. Finally, the discussion of the results obtained and the conclusions drawn from the research study are presented.


3. THEORETICAL FRAMEWORK

This chapter surveys the state of the art in the scientific literature related to DT in order to give an overall overview of the research topic and to investigate the current practices, main gaps, challenges, open issues, and opportunities.

3.1 The Overview of DT technology

DT is one of the promising technologies that have attracted academia thanks to its capability of integrating the cyber and physical worlds, which is considered a key aspect of realizing and developing smart manufacturing and Industry 4.0. The concept was recognized more than 15 years ago and has been exploited in different areas such as healthcare, the aerospace industry, etc. (Tao, et al., 2019d). The viewpoints regarding DT research are not unified among researchers: some, such as Gabor, et al. and Weyer, et al., consider DT as a concept that entirely depends on the simulation aspect, while others, such as Qi & Tao (2018) and Tao (2018), believe that it is a three-dimensional model in which the physical systems and their counterparts are correlated and collaborate through a means of connection (Barricelli, et al., 2019).

DT can be exploited for different purposes such as control, diagnostics, prediction, and improvement, and it must be fed by enabling technologies such as sensors, IIoT, cloud computing, machine learning, etc. DT is known for its capability of evaluating the current behaviors inside the real systems, analyzing their previous behaviors, anticipating the cause of failures, and proposing appropriate solutions and decisions to optimize system processes and products, thanks to the real-time synchronization and the analysis of multidimensional sources of data (Qi, et al., 2019 ; Qi & Tao, 2018 ; Tao, et al., 2019b). According to Rosen, et al., 2015, any sort of prototype that can be utilized for mirroring and simulating the real status and behaviours of an object can be seen as its twin.

The first model of DT was created by Grieves (2014). This model has three main parts: the physical part in real space, the virtual part in virtual space, and the bidirectional connection part, which connects the two other parts and enables the data transmission from the physical space to the virtual space and the feedback of information from the virtual space to the physical space, as shown in figure 3.1 (Grieves, 2014 ; Tao, et al., 2019d). The physical space includes the physical entities, such as machines, products, systems, or humans, that can interact with the physical environment (Barricelli, et al., 2019). The virtual space incorporates the high-fidelity virtual models of the physical entities that are created for controlling and improving the operating conditions and behaviors of the physical entities (Qi & Tao, 2018). The physical space is crucial for building the digital space and provides it with data. Likewise, the digital space is crucial for giving feedback information for controlling and improving the physical space (Tao, et al., 2019d).

DT has three kinds of connection: physical-physical, virtual-virtual, and virtual-physical. The physical-physical connection refers to the interconnection and coevolution between the physical entities in the physical environment to perform a complex task. The virtual-virtual connection is one in which different kinds of virtual models collaborate to build information networks. In the virtual-physical connection, the physical entities and virtual models are synchronized with each other: the virtual models are adapted and optimized based on the physical entity status, and likewise, the physical entity is optimized according to the recommendations and feedback it gets from the virtual models (Tao, et al., 2019d).

Figure 3.1: Three dimensions model of DT (Grieves, 2014).
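To make the three-dimension model more concrete, the following is a minimal Python sketch of a physical part, a virtual part, and the bidirectional connection between them. It is an illustration only, not code from the thesis or from Grieves' work; all class and attribute names (PhysicalPart, VirtualPart, temperature_delta, etc.) are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class PhysicalPart:
    """Real-space entity: exposes sensor readings and accepts control commands."""
    temperature: float = 20.0

    def read_sensors(self) -> dict:
        return {"temperature": self.temperature}

    def apply_command(self, command: dict) -> None:
        # e.g. a cooling command lowers the temperature
        self.temperature += command.get("temperature_delta", 0.0)

@dataclass
class VirtualPart:
    """Virtual-space model: mirrors the latest physical state and derives feedback."""
    state: dict = field(default_factory=dict)

    def update(self, sensor_data: dict) -> None:
        self.state.update(sensor_data)

    def feedback(self) -> dict:
        # trivial rule standing in for real analysis or simulation
        if self.state.get("temperature", 0.0) > 25.0:
            return {"temperature_delta": -1.0}
        return {}

def bidirectional_connection(physical: PhysicalPart, virtual: VirtualPart) -> None:
    """Connection part: data flows physical -> virtual, feedback flows virtual -> physical."""
    virtual.update(physical.read_sensors())
    physical.apply_command(virtual.feedback())

if __name__ == "__main__":
    machine, twin = PhysicalPart(temperature=27.0), VirtualPart()
    bidirectional_connection(machine, twin)
    print(twin.state, machine.temperature)  # {'temperature': 27.0} 26.0
```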

Based on the three-dimension model of DT, Tao, et al., 2019a created the five-dimension model of DT, which adds two more parts: services and connection. All these parts depend on each other. DT services are crucial for providing services to the systems in order to increase their performance and efficiency. DT can also provide other, third-party services such as resource services, algorithm services, knowledge services, etc. (Tao, et al., 2019d). The connection part is the part that ties all the other parts together; it enables the interconnection and coevolution among them through a wireless or wired connection (Tao, et al., 2019a).

In terms of levels, DT can be classified into three levels: unit level, system level, and system of systems (SoS) level. The unit level of DT represents a small piece of any manufacturing asset involved in the production process, such as a piece of equipment, material, or device, while the DT system level is composed of different unit levels, for example a production line. These two levels can be represented in the digital world in the form of ultra-high-fidelity models (Tao, et al., 2019b). Finally, the SoS level is recognized based on the unit level and system level, and it is composed of different system levels, such as the shop floor or the whole factory (Qi, et al., 2018), as shown in figure 3.2.


Based on the data integration and connection between the physical and digital worlds, DT can be classified into three categories: digital model, digital shadow, and digital twin. The digital model represents a physical object but does not demand any type of connection to integrate and exchange data with the physical world. In this case, the data of the physical object is exchanged manually to improve the models, and any change in the conditions of the physical system will not impact the digital part, and vice versa; examples of these models are simulation models, mathematical models, etc. In the digital shadow, there is a one-directional flow of data between the physical and digital object; hence any change in the physical object will impact the digital object, but not conversely. Lastly, the digital twin is differentiated from the other two categories by its bidirectional connection, in which data flows are integrated in both directions between the physical and digital object, meaning that there is control and synchronization between the physical and digital object, and any change in the physical object will affect the digital object and vice versa (Kritzinger, et al., 2018). Figure 3.3 shows the data flow in the three categories of DT.

Figure 3.3: Data flow in the three categories (Kritzinger, et al., 2018).
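The difference between the three categories comes down to which data flows are automated. The sketch below encodes exactly that rule; the function and enum names are hypothetical and are only meant to restate Kritzinger et al.'s classification in code.

```python
from enum import Enum

class Category(Enum):
    DIGITAL_MODEL = "digital model"    # no automated data flow in either direction
    DIGITAL_SHADOW = "digital shadow"  # automated flow from physical to digital only
    DIGITAL_TWIN = "digital twin"      # automated flow in both directions

def classify(physical_to_digital_automated: bool, digital_to_physical_automated: bool) -> Category:
    """Classify a digital representation by which of its data flows are automated."""
    if physical_to_digital_automated and digital_to_physical_automated:
        return Category.DIGITAL_TWIN
    if physical_to_digital_automated:
        return Category.DIGITAL_SHADOW
    return Category.DIGITAL_MODEL

if __name__ == "__main__":
    print(classify(False, False))  # a manually updated simulation model -> digital model
    print(classify(True, False))   # live sensor feed, no feedback        -> digital shadow
    print(classify(True, True))    # bidirectional synchronization        -> digital twin
```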

According to Leskovský, et al., 2020, the evolution of DT has passed through four stages:

• First stage: including only the physical space.
• Second stage: establishing a digital copy to integrate with the physical data.
• Third stage: the connection between both physical and digital parts.
• Fourth stage: more connection and coevolution between the physical and digital parts.

3.2 The origin of DT, history, and definitions

The DT concept is relatively new and has a short and unclear history due to the technology limitations at the time it was established (Tao, et al., 2019c). The term "twins" was used in the Apollo program at NASA, when at least two identical space vehicles were created. The vehicle that stayed on Earth was exploited to simulate the flying vehicle's operating conditions during the flight and provide potential solutions in complicated situations (Rosen, et al., 2015).

The theoretical evolution of DT passed through three phases: formulation, incubation, and development. The period from 2003 to 2011 is considered the formulation period of the DT concept. At that time, the concept was immature (Jones, et al., 2020), and the technology had not evolved enough to adapt and apply a concept like DT. In 2003, the DT concept was presented by Professor Michael Grieves in a presentation for one of his courses related to product lifecycle management, titled ‘‘Conceptual Ideal for PLM’’, and he constructed the three-dimension model of DT shown in figure 3.1 (Grieves & Vickers, 2017). In the formulation period, publications related to DT were few, and the first journal article was released in 2011 by Tuegel, et al., 2011, who considered DT as a method that helps in predicting the useful lifetime of an aircraft by simulating its behaviour on a digital model (Liu, et al., 2019).

The period between 2012 and 2014 is considered the incubation period of the DT evolution. In 2012, the National Aeronautics and Space Administration (NASA) captured its possible application in the aerospace industry and formulated the first specific definition of DT: ‘‘an integrated multi-physics, multi-scale, probabilistic simulation of a vehicle or system that uses the best available physical models, sensor updates, fleet history, etc., to mirror the life of its flying twin’’. After that, significant efforts were devoted to DT research, and several definitions and publications appeared (Barricelli, et al., 2019 ; Lu, et al., 2020 ; Tao, et al., 2019d). In 2014, the first white paper was released by Grieves to show the transformation of DT from an idea into an attractive technology that can be applied in practice.

After 2015, the spread of many related technologies such as IoT, machine learning, data acquisition, sensors, networking, data processing, etc. led to the further evolution of the DT concept, and it has been applied in some industries (Lu, et al., 2020); for instance, DT was applied in an Industry 4.0 context by Siemens in 2016 (Qi, et al., 2019). According to global research, DT has been classified as one of the ten most promising technologies between 2017 and 2019 (Gartner, 2017-2019). According to a Grand View Research prediction, the DT market will reach nearly 27.06 billion dollars by 2025. Further, it is expected that DT will play a big role in the defense and aerospace industries in the future, and it has been classified as one of the six technologies set to assist the defense industry (Lim, et al., 2019 ; Qi, et al., 2018). Table 3.1 shows some definitions of DT that appeared in the paper collection.

Expression: Integrated system
Definition: “An ultra-realistic integrated multi-physics, multi scales and probabilistic simulation consist of five parts: physical part, virtual part, data, services and the connection between them. All parts are connected with each other.” (Tao, et al., 2018 ; Tao & Zhang, 2017). “A thorough description of the physical entity and its functional and operation data.” (Boschert, et al., 2018).

Expression: Counterpart
Definition: A digital counterpart of the physical assets (Negri, et al., 2017 ; Vachálek, et al., 2017).

Expression: Connection
Definition: Information and data work as a linkage to connect both the virtual space and the real space together (Grieves, 2014).

Expression: Simulation
Definition: Simulating the physical object by means of digital models to forecast its behaviors (Grieves & Vickers, 2017 ; Qi & Tao, 2018).

Expression: Virtual copy
Definition: A replica or dynamic virtual model of the real physical object (e.g. real product, real system, etc.) which is represented in the virtual world (Madni, et al., 2019 ; Talkhestania & Jazdib, 2018).

Table 3.1: Definitions of the DT concept in the scientific literature.

3.3 Smart factory and industry 4.0 concept

The evolution and implementation of various key technologies have pushed many countries to create strategies to enable the smart factory by facilitating the integration between the real and cyber worlds, for instance: “Industry 4.0”, “Made in China 2025”, “cloud manufacturing”, etc. (Tao & Zhang, 2017). Smart manufacturing has been defined by the National Institute of Standards and Technology (NIST) as “fully-integrated, collaborative manufacturing systems that respond in real-time to meet changing demands and conditions in the factory, in the supply network, and customer needs”. The emergence of DT can affect smart manufacturing from different aspects. By representing the manufacturing systems in the digital world with digital models, manufacturers can optimize their production systems and increase efficiency and productivity, because they will have all the information regarding the working conditions and performance of the systems. More than that, DT can be exploited to represent workers' personal information and their working status in manufacturing by using digital models, in order to increase manufacturing efficiency, workers' safety, and their health (Lu, et al., 2020).

The manufacturing industry has recently experienced significant development in digital technologies worldwide, such as IIoT, cloud computing, etc. These advanced technologies help seamlessly integrate interrelated components, which is referred to as Industry 4.0 or the fourth industrial revolution. Industry 4.0 represents the upcoming digitization stage motivated by massive data, increased computation capability, connectivity, the interaction between humans and machines, and the integration between physical and digital spaces. It aims to develop the production systems and environment by making products and processes smart, integrated, and connected (Redelinghuys, et al., 2020). Arguably, the origin of Industry 4.0 goes back to IoT technology, a network of connected devices such as hardware, software, and sensors that can gather and exchange data using the internet. To conclude, the emergence of Industry 4.0 enabling technologies contributes to facilitating the implementation of the DT by enabling real-time synchronization and data and information exchange between the physical and digital worlds (Negri et al., 2017; Kritzinger et al., 2018).

3.4 Overview of some enabling technologies for DT implementation

To build and improve a DT, there must be comprehensive knowledge about the various types of technologies that enable the interaction and connection between the physical entity and its counterpart in the virtual space (Lim, et al., 2019). As mentioned earlier, many enabling technologies are involved in the implementation of DT, such as IIoT, big data analytics, cloud computing, AI, machine learning, etc. Together with these technologies' rapid development, DT has developed and become more specialized (Tao, et al., 2019d ; Tao, et al., 2019b). However, these technologies differ in concept, configuration, implementation, and utilization (Lu, et al., 2020). Accordingly, the following subsections present an overview of some enabling technologies that can be involved in the implementation of DT technology.


3.4.1 Industrial Internet of Things (IIoT) and CPS

IIoT is a set of smart things connected to form a comprehensive computing network; these things can automatically regulate themselves, transfer data and information, interact, and adapt to different changes in the real environment (Madakam, et al., 2015). IIoT is essentially IoT utilized within the industrial domain. IIoT technology is growing and developing considerably, and it is one of the key enabling technologies for implementing and empowering DT. IIoT is an infrastructure that enables the interconnection and data exchange between the physical systems and the digital world by means of sensors and different types of communication protocols (Park, et al., 2020 ; Lu 2, et al., 2020). It is expected that more than 20 billion physical devices will be connected to the internet during the year 2020, specifically in the industrial domain. As a result, an enormous amount of data, which can exceed 40 zettabytes, is generated. Such data is gathered by utilizing RFID, sensors, gateways, or IIoT devices (Qi & Tao, 2018). Thanks to the wired and wireless connection between the smart objects in the real entities, the services presented to the real world will increase (Madakam, et al., 2015). Arguably, the emergence of IoT infrastructure together with sensor technology enables the interchange of massive data from various sources (Barricelli, et al., 2019).

IIoT mainly consists of a four-layer architecture: the sensing layer, which senses data coming from the physical world; the networking layer, which provides the wireless or wired connection between the physical and digital worlds in order to interchange data and services; the service layer, which offers and organizes services to cover users' requirements; and the interface layer, which facilitates the connection between humans and applications (Park, et al., 2020). Currently, the IoT cloud system can play a crucial role in DT platforms (Radchenko, et al., 2018).
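As a rough illustration of how the four layers described by Park et al. could hand data to one another, the following hypothetical Python sketch keeps the layer responsibilities from the text but assumes all class names, method names, and threshold values.

```python
class SensingLayer:
    """Senses raw data coming from the physical world (e.g. sensor readings)."""
    def sense(self) -> dict:
        return {"vibration_mm_s": 0.02, "temperature_c": 41.5}

class NetworkingLayer:
    """Moves data between the physical and digital worlds over a wired or wireless link."""
    def transmit(self, payload: dict) -> dict:
        return dict(payload)  # stand-in for a real protocol such as MQTT or OPC UA

class ServiceLayer:
    """Offers and organizes services that cover user requirements (here a simple check)."""
    def evaluate(self, data: dict) -> str:
        return "overheating" if data.get("temperature_c", 0.0) > 40.0 else "normal"

class InterfaceLayer:
    """Facilitates the connection between humans or applications and the services."""
    def present(self, status: str) -> None:
        print(f"Machine status: {status}")

if __name__ == "__main__":
    data = NetworkingLayer().transmit(SensingLayer().sense())
    InterfaceLayer().present(ServiceLayer().evaluate(data))  # prints "Machine status: overheating"
```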

CPS is a set of hardware objects, such as machines, devices, and equipment, and software programs, such as big data analytics. CPS is a paradigm that enables the convergence and real-time interaction between the dynamic physical world and the cyber world over wireless and wired communication networks, and it controls the physical systems and increases their efficiency and robustness (Redelinghuys, et al., 2020). CPS can be seen as the next wave of IoT, and it has two main features. The first is the high-level capability to gather real-time data from the real entities or the real-world environment and transmit it to the digital world by utilizing networking technologies and sensors that sense the real world's changes. The second feature is the high computational capability to process and analyze these data in the digital world by using digital models. The digital world is the main center for processing data and sending it back to the real world as information or instructions for the recommended changes (Masudul A. & El Saddik, 2015).

DT technology is related to the CPS concept. Both of them help enable the smart factory due to their capability to enable real-time synchronization and cyber-physical integration, as shown in figure 3.4. Both have gained great attention from academia and industry due to their crucial role in increasing the effectiveness and quality of production systems. The two concepts have physical and digital parts. The physical part comprises everything involved in the production processes, such as operators, production systems, environment, and material, whereas the cyber or digital part includes the different applications, models, and mathematical operations that are processed in order to increase and improve productivity (Tao, et al., 2019b). Digital models work as a complement that realizes the CPS and constructs its basis and function.


DT is considered the core part and a precondition for improving CPS (Masudul A. & El Saddik, 2015), due to its capability of tracking the physical systems, controlling, forecasting, and analysing their performance, and supporting decision making without stopping production (Shao & Kibira, 2018). The integration between DT and CPS in manufacturing will make management more accurate and effective. It is worth mentioning that CPS has the same hierarchical classification, structure, and function that DT has, and it can be classified into three levels: unit level, system level, and SoS level (Tao, et al., 2019b).

Both CPS and DT emerged in the same period, but DT did not get the same interest from academia and industry as CPS did until NASA and the US Air Force began to exploit DT for aerospace issues; since then, DT has been realized. Although DT and CPS have many similar characteristics and enable common objectives, such as cyber/digital and physical/real integration, connectivity, and monitoring of the physical systems, they also differ in several respects. The first crucial difference is that CPS is considered a multidisciplinary and integrated system that comprises all the real entities, virtual models, hardware devices, connection means, and software programs and depends on the combined technologies of computation, communication, and control (Redelinghuys, et al., 2020), whilst DT is a virtual model that mirrors the real-world entities. DT can depend on CPS to achieve its goals, and not the opposite (Zheng & Sivabalan, 2020). In this way, CPS focuses on control more than on making replicated virtual models of the physical world as in the DT. Furthermore, DT is closer to an engineering classification, while CPS is closer to a scientific classification. Moreover, sensors and actuators play a crucial role in implementing both CPS and DT and facilitate physical and digital integration. However, the virtual models are the essence of DT, due to their importance in describing and forecasting the behavior of the physical world, while sensors and actuators are the essence of enabling CPS (Tao, et al., 2019b).

It is worth mentioning that the feedback loops in DT and CPS and the bidirectional connection are crucial for enabling real-time interaction and collaboration between the physical and digital worlds and for controlling the physical entities (Tao, et al., 2019b). To conclude, realizing the basis and application of CPS is crucial in creating a DT (Redelinghuys, et al., 2020).

Figure 3.4: The integration between physical and cyber worlds in CPS and DT (Tao, et al., 2019b).

3.4.2 Cloud computing, AI, machine learning, and big data analytics

Cloud computing is one of the IT-infrastructure technologies that provides an opportunity to collect, store, and manage massive IT resources and data efficiently and make them accessible from anywhere and at any time using the internet. Hence, the data stored in the cloud can be accessed and utilized by users more accurately and efficiently (Fu, et al., 2018). Using cloud computing for DT will help in improving and adapting the virtual entity entirely in the cloud and will enable the manufacturers, designers, and everyone involved in manufacturing production to access the virtual entity and get the information they need to make significant decisions over the internet, regardless of where they are situated. Therefore, it is crucial to have connected devices provided with a high security level to protect the systems and enable connection and data transmission (Tao, et al., 2019c).

AI is a collection of algorithms that can sense and understand data and enable machines to learn from experience to anticipate and respond to different situations; it is the intelligence shown by devices and programs (Madakam, et al., 2015). A DT enhanced with AI becomes intelligent, being active in the digital world and taking autonomous decisions depending on the data analyzed (Autiosalo, et al., 2019). Machine learning is an aspect of AI. It is a set of algorithms and models that can process the massive data produced so that AI or a user can extract valuable information and make suitable decisions. It makes systems learn and improve their performance and accuracy through the huge amounts of data processed (Ahuett-Garza & Kurfess, 2018).

Data is an indispensable part of enabling the smart factory (Qi & Tao, 2018) and is considered a key enabling technology for building the DT. DT works with massive data generated from different sources, such as machinery, historical data, real-world data, virtual-world data, products, environment, equipment, etc. Hence, it needs different techniques to clean, filter, transmit, and process such data (Tao, et al., 2019c). Real-time data can be gathered from the real equipment using IIoT devices, sensor readings, radio frequency identification devices (RFID), and application programming interfaces (APIs). Such data provide information about system efficiency, operational conditions, environment, production planning, user data, etc. (Qi & Tao, 2018). Gathering and transmitting data from the physical space to the virtual space makes DT aware of what is happening in the real entities so that it can react more speedily and give feedback about updates and changes to the physical space (Rosen, et al., 2015). Therefore, it is important to have different platforms, communication protocols, and data processing methods to enable the connection, data exchange, and management (Lim, et al., 2019). The processed data can provide designers or manufacturers with the information they need to help them make more informed and significant decisions (Khajavi, et al., 2019).

Big data analytics is a way of managing data such that it can be exploited in detecting failures of the production systems and help in proposing solutions. Big data analytics works for DT in terms of data, and it is considered a complementary part of DT. The integration between these two technologies plays a crucial role in promoting the smart factory in different areas, and they can work in parallel to cut the barriers during the different phases of the product lifecycle. Big data can be classified into three sorts: structured data, which can include digits, characters, tables, etc.; semi-structured data, which includes trees or graphs; and unstructured data, which includes records, photos, documents, videos, etc. Furthermore, big data comprises many characteristics: the volume of data, the variety that includes different types of data from different sources, the velocity of processing data, the value of data, etc. (Barricelli, et al., 2019).


3.5 DT modelling and simulation

DT models and simulation are the main pillars that boost the DT and bring it into practice. A DT can be considered part of a specific simulation, or an environment that supports creating a special simulation model to achieve a specific goal. Simulation is a tool that can be used for optimizing production planning and production processes throughout their entire lifecycle (Barricelli, et al., 2019 ; Cimino, et al., 2019 ; Kritzinger, et al., 2018 ; Pileggi, et al., 2019). DT technology can represent the next generation in terms of simulation, modeling, and improvement (Rosen, et al., 2015), as shown in figure 3.5.

Figure 3.5 : Simulation milestones (Rosen, et al., 2015).

The physical world object's characteristics in terms of behaviors, geometry, physics, etc. are significant and fundamental aspects for building the virtual models of the real-world object. Further, evaluating these models is crucial in confirming the high fidelity and robustness of the physical and virtual worlds (Qi & Tao, 2018).

Arguably, DT is much more than just a part of simulation; it consists of ultra-high-fidelity models that mirror the real object's working conditions dynamically within the real-time synchronization between the real and virtual worlds, which means that DT reflects what is happening in the real world presently and predicts what may happen in the future, while simulation reflects what can happen in the future by creating what-if scenarios (Schleich, et al., 2017). Furthermore, DT needs the physical world continuously, whereas simulation can work without the physical world (Khajavi, et al., 2019). Hence, DT can control, understand, recommend changes to, and improve the functionality of all the physical assets, thanks to the real-time data synchronization and exchange between the real and virtual worlds (Saddik, 2018).

3.6 The main requirements for building a reference model for DT

According to Lu, et al., 2020, building a fundamental reference model for the DT concept demands three basic requirements: (1) information models that represent the characteristics of the real object; (2) connection methods that enable the interconnection and interchange of data between the physical and digital worlds; and (3) the processing of data from different sources to build a real-time representation of the real object. These requirements are dependent on each other, and if any one of them is missing, DT will lose its value and essence as a concept. All three requirements are clarified in figure 3.6.

Figure 3.6: DT reference model (Lu, et al., 2020).

To fulfill the requirements of this DT reference model, the following must be in place:

1. Connection devices that enable the direct or indirect connection and interchange of data between the real world and the digital world, and between a DT and other DTs in the same environment. Further, they enable the connection between the DT and field specialists so that the DT can be operated by means of interaction interfaces.

2. Different ontologies that are capable of managing the massive data generated from different sources and converting them into an understandable form that can be shared and exchanged between humans and various machines.

3. High-fidelity models in the virtual space that can constantly receive real-time data from the real entities in order to be able to understand, analyze, forecast, and suggest improvements by sending commands to these entities; meanwhile, the real assets should react to and perform these commands. This closed loop of interaction between the physical and digital worlds will enable the DT to optimize the entire production process (Barricelli, et al., 2019).
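One way to read the three requirements of the reference model is as three interfaces that any concrete DT implementation must provide. The Python sketch below is only an illustration of that reading; the class and method names are assumptions and are not taken from Lu et al.

```python
from abc import ABC, abstractmethod

class InformationModel(ABC):
    """Requirement 1: an information model describing the characteristics of the real object."""
    @abstractmethod
    def describe(self) -> dict: ...

class Connection(ABC):
    """Requirement 2: a communication mechanism for exchanging data with the real object."""
    @abstractmethod
    def receive(self) -> dict: ...

    @abstractmethod
    def send(self, command: dict) -> None: ...

class DataProcessor(ABC):
    """Requirement 3: processing of multi-source data into a real-time representation."""
    @abstractmethod
    def process(self, raw: dict) -> dict: ...

class ReferenceDigitalTwin:
    """Only when all three requirements are supplied does the object qualify as a DT."""
    def __init__(self, model: InformationModel, link: Connection, processor: DataProcessor):
        self.model, self.link, self.processor = model, link, processor

    def refresh(self) -> dict:
        """Pull data over the connection, process it, and merge it with the information model."""
        snapshot = self.processor.process(self.link.receive())
        return {**self.model.describe(), **snapshot}
```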

3.7 Main parts of DT

Based on the five-dimension model constructed by Tao, et al., 2019d, a DT consists of five parts. All these parts depend on each other and have an equal role in constructing a full DT. These five main parts are described as follows:

1. Physical space: the physical environment where the physical entities exist and the manufacturing activities are performed. It is considered the main container of all the physical manufacturing resources. This space includes, for instance, different smart devices, machines, materials, sensors, robots, human-machine interfaces (HMI), etc. (Zheng & Sivabalan, 2020 ; Tao, et al., 2019b).

The various types of physical entities in the physical space are described in detail as follows:


• Physical environment: also referred to as the physical world or physical space where the physical entity lies. It covers any measurable parameter that impacts the environment and the entities and can be provided to the digital environment (Jones, et al., 2020).

• Physical component: an individual part of a piece of equipment or a machine, etc. (Leskovský, et al., 2020).

• Physical object: a product or smart manufacturing device that includes several connected components (Leskovský, et al., 2020). This object cannot interact or execute anything in the real environment (Josifovska, et al., 2019).

• Physical process: the activities performed in the physical world (Leskovský, et al., 2020).

• Physical system: a combination of many objects, components, etc. (Leskovský, et al., 2020).

• Physical node: the part that can recognize the physical object, collect its data, operational conditions, and behavioral status, and provide them to the digital world. Nodes can also interact with other nodes in order to execute a task in the physical environment. Examples of such nodes are different types of sensors, smart gateways, actuators, interfaces, etc. (Josifovska, et al., 2019 ; Parrott & Warshaw, 2017).

• Human: humans can be recognized by the physical nodes. However, there is no evidence from the literature that humans are included in the DT building process, although humans can have a role in the main production process (Josifovska, et al., 2019).

2. Digital space: it is the space where the high fidelity models of DT are created. It contains all the manufacturing resources in the virtual environment. Further, this space contains different applications and services related to data management, analytics, and computing such as AI, machine learning, cloud computing, etc. (Jones, et al., 2020).

The modeling process is the basis for making a factual representation of the physical world in the digital world based on real-time data and historical data that help provide information about different aspects of the physical entities and optimize its performance (Tao, et al., 2019b).

3. Data: DT works with massive data that comes from different sources with multiple dimensions and scales. Therefore, data is an indispensable enabling part of building the digital models of DT. Data sources can be machinery, historical data, real-world data, virtual-world data, products, environment, equipment, services, etc. (Qi, et al., 2019).

4. Services: basically integrated software platforms that include different applications for providing services such as management, improvement, monitoring, and prognostics, and supplying them to the physical or digital world (Barricelli, et al., 2019 ; Qi, et al., 2019). Services can facilitate the application of DT, especially in product design. Other services are called third-party services, which can be achieved through DT (e.g. data services, equipment services, algorithm services, knowledge services, simulation services, etc.) (Josifovska, et al., 2019 ; Tao, et al., 2019d).


5. Connection: this part has a significant role in enabling the connection among the physical space, digital space, data, and services; it enables the real-time data transmission between the physical and digital parts. The connection between all DT parts can be classified into six types: physical-digital models connection, physical-data connection, physical-services connection, digital models-data connection, digital models-services connection, and data-services connection (Qi 2, et al., 2019).
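A compact way to see how the five parts fit together is the following illustrative Python sketch, in which the connection part is what wires the other four together. The structure and all names are hypothetical and are not prescribed by the cited works.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class FiveDimensionDT:
    """Illustrative container for the five parts: physical entity, virtual models, data, services, connection."""
    physical_entity: str                                        # identifier of the real asset
    virtual_models: List[str] = field(default_factory=list)     # models mirroring the asset
    data: Dict[str, list] = field(default_factory=dict)         # multi-source data keyed by source
    services: Dict[str, Callable[[dict], str]] = field(default_factory=dict)  # offered services

    def connect(self, source: str, record: dict) -> None:
        """Connection part: route incoming data from any source into the data store."""
        self.data.setdefault(source, []).append(record)

    def request_service(self, name: str) -> str:
        """Run a registered service on the latest record from every source."""
        latest = {src: records[-1] for src, records in self.data.items() if records}
        return self.services[name](latest)

if __name__ == "__main__":
    dt = FiveDimensionDT("milling_machine_01", virtual_models=["geometry", "behavior"])
    dt.services["monitoring"] = lambda d: "alarm" if d.get("sensor", {}).get("temp_c", 0) > 80 else "ok"
    dt.connect("sensor", {"temp_c": 85})
    print(dt.request_service("monitoring"))  # prints "alarm"
```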

3.8 The fundamental requirements for enabling each part of DT

Building a DT is a complicated and long process that requires a diverse mix of technologies and tools for enabling all its parts, i.e. the physical space, digital models, data, services, and the connection between them. Each part of DT is enabled by a related tool or technology that is significant for facilitating the respective part's role in the DT (Qi, et al., 2019). This section presents the main requirements, together with some examples of recommended tools and technologies needed to enable each part of DT, as follows:

1. Physical world requirements: the physical world entities must be smart in order to enable data collection and to integrate and coevolve with the digital world. Besides, it is necessary to use different enabling tools and technologies to recognize and control the physical environments, collect their data, and enable their entities to work effectively and safely. Such technologies can be, for instance, RFID, sensors, etc. Using these technologies, the physical world (i.e., entities and processes) is mapped to the digital world to create models with high fidelity to reality (Qi, et al., 2019).

2. Virtual world requirements: to build ultra-high fidelity models, the physical world data must be recognized, managed, and visualized to be exploited in the modeling process. By using different sophisticated technologies such as machine learning, artificial intelligence, cloud computing, etc., virtual models will be created to mirror the real world's current behaviors and operating conditions. As the real world is always changing, the virtual models must be updated according to the new changes using related technologies (Qi, et al., 2019).

The modelling process comprises various types of semantic models (a minimal code sketch showing how they can be composed is given after this list of requirements); these types are:

Geometric Models: describe and represent the geometric features of the physical

objects, for instance: volume, shape, structure, etc. Tools that can be used in these models, for instance: 3D MAX, AutoCAD, etc.

Physical Models: represent the physical characteristics of the physical entities such as

functionality, distortion, deformation, cracking, corrosion, etc. by using different methods such as the Finite Element Model (FEM) (Qi, et al., 2019 ; Tao & Zhang, 2017).

Behavioral Models: describe the behavior of the physical entities, its reaction to the

changes that could happen in the real environment, and its efficiency by using different methods such as Finite state machine, neural network, etc. (Qi, et al., 2019 ; Tao & Zhang, 2017).

Rule Model: this type of model represents the rules and limitations of the equipment

and provides DT with the ability to judge, assess, improve, and predict. Rules can be derived from historical data or expert’s knowledge, For instance, using data mining algorithms (Qi, et al., 2019 ;Tao & Zhang, 2017).

(26)

Process Model: this model explains the main process that the physical entity is

involved in (Josifovska, et al., 2019).
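As a minimal sketch of a behavioral model, the finite state machine below captures three hypothetical operating states of a machine (idle, running, fault) and the events that move it between them. The state and event names are assumptions made purely for illustration and are not taken from any cited framework.

# A minimal finite-state-machine sketch of a machine's behavior.
# States and events are hypothetical examples.
TRANSITIONS = {
    ("idle", "start"): "running",
    ("running", "stop"): "idle",
    ("running", "failure"): "fault",
    ("fault", "reset"): "idle",
}

class MachineBehaviourModel:
    def __init__(self, state: str = "idle") -> None:
        self.state = state

    def on_event(self, event: str) -> str:
        """Move to the next state if the event is valid, otherwise stay put."""
        self.state = TRANSITIONS.get((self.state, event), self.state)
        return self.state

if __name__ == "__main__":
    model = MachineBehaviourModel()
    for event in ["start", "failure", "reset", "start", "stop"]:
        print(event, "->", model.on_event(event))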

3. Data: data is considered a prerequisite for building new knowledge (Tao, et al., 2019d). The DT is a system that deals with large amounts of data generated from different sources; therefore, data sits at the center of the DT. Using different analytics and fusion technologies, these data are gathered, transmitted, processed, analyzed, integrated, visualized, and stored in order to extract valuable information and reveal otherwise invisible patterns. All the stages that manage the data represent its lifecycle. Arguably, any change in the operating conditions and the physical environment will affect and change the data. Some examples of data sources, and how they are gathered, can be summarized as follows (a short sketch after this list illustrates how such heterogeneous records might be brought into a common form):

 Hardware data: this type of data can be gathered from hardware systems by using different technologies such as sensors, IIoT technologies, barcodes, QR codes, RFID, etc. (Qi, et al., 2019).

 Software data: this type of data is gathered by using different application programming interfaces (APIs).

 Network data: this type can be gathered through the internet by using different search engines or APIs.

 Historical data: collected from the physical entity, these data help in understanding the current and previous situation of the physical entities and in predicting their future state (Leskovský, et al., 2020).

 Material and product data: data related to materials and products, such as performance, stock, etc., are gathered through service systems or from the product or material itself (Qi & Tao, 2018).

 Environmental data: this type includes the changes that affect the physical environment, for instance temperature, humidity, vibration, etc. (Qi & Tao, 2018).

 Management data: this type comprises data related to the manufacturing information systems and computer-aided systems. It could be related to design schemes, production planning, etc. (Qi & Tao, 2018).

 Customer data: these data relate to customer feedback and comments collected from e-commerce platforms and online websites, for instance Amazon, or from social media platforms such as Facebook, LinkedIn, YouTube, etc. (Qi & Tao, 2018).
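The following sketch illustrates one simple way such heterogeneous records (here, a hypothetical sensor reading, an environmental measurement, and an entry from a management system) could be mapped onto a common schema before further processing. All field names are illustrative assumptions rather than a prescribed data model.

from datetime import datetime, timezone

def normalize(source: str, raw: dict) -> dict:
    """Map a raw record from one of several (hypothetical) sources
    onto a single common schema used by the digital twin."""
    return {
        "source": source,
        "asset_id": raw.get("machine_id") or raw.get("asset") or "unknown",
        "timestamp": raw.get("timestamp", datetime.now(timezone.utc).isoformat()),
        "values": {k: v for k, v in raw.items()
                   if k not in ("machine_id", "asset", "timestamp")},
    }

if __name__ == "__main__":
    records = [
        normalize("hardware", {"machine_id": "milling-01", "temperature_c": 71.3}),
        normalize("environment", {"asset": "hall-A", "humidity_pct": 43}),
        normalize("management", {"asset": "milling-01", "planned_batch": "B-114"}),
    ]
    for record in records:
        print(record)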

4. Services: the DT requires different services that consist of various software applications and platforms. Services are crucial for providing the real system and the virtual space with appropriate services and solutions based on their requirements (Barricelli, et al., 2019 ; Qi, et al., 2019). A minimal sketch of one such service is given after this list.

5. Connection: different tools and technologies, such as networking, communication protocols and standards, interface technologies, sensors, etc., are needed to enable the two-way communication and data exchange between the physical and virtual worlds (Barricelli, et al., 2019 ; Qi, et al., 2019).
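To make the notion of a service slightly more concrete, the sketch below exposes the latest (here hard-coded) reading of a hypothetical machine through a small HTTP endpoint using Flask. The route name, port, and payload are assumptions made for illustration only; a production DT platform would of course offer far richer monitoring, prognosis, and management services.

from flask import Flask, jsonify

app = Flask(__name__)

# Hypothetical latest state of a monitored machine; in a real DT this
# would be fed continuously from the data and connection layers.
latest_reading = {
    "machine_id": "milling-01",
    "temperature_c": 71.3,
    "status": "running",
}

@app.route("/machines/milling-01/state")
def machine_state():
    """Minimal monitoring service: return the latest known machine state."""
    return jsonify(latest_reading)

if __name__ == "__main__":
    app.run(port=5000)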

3.9 Data lifecycle management

Data passes through several stages taken in sequence, which together represent its lifecycle, as shown in figure 3.7. The data lifecycle gives a clear picture of how different data are collected and transformed into valuable and usable information to be used for future decisions. Managing the data at each stage demands a specific enabling technology or tool. As data is increasing all the time, data enabling technologies must be developed to deal with such large volumes of data (Qi & Tao, 2018). The data lifecycle is represented by the following steps:

 Data collection: this is the first and most important stage in the data lifecycle, where data about the physical entities are collected from different sources in the physical world by using different enabling technologies (e.g., sensors, RFID, IoT devices, etc.) (Qi, et al., 2019 ; Tao, et al., 2019c).

 Data transmission: data can be transmitted using wireless and wired connection standards and protocols, such as 5G (Qi, et al., 2019).

 Data processing: in this stage, real-time or offline data are analyzed and processed in the digital world using different analytics techniques to extract valuable information from these massive data (Qi & Tao, 2018 ; Qi, et al., 2019). The data are identified, cleaned, compressed, and transformed, and duplications are removed; a minimal cleaning sketch is given after this list. Some examples of technologies that can be used in this stage are neural networks, deep learning, statistical methods, edge and cloud computing, AI, machine learning, etc. (Qi, et al., 2019).

 Data fusion and integration: as the data sources are diverse, in this stage the data are synthesized, combined, filtered, and optimized to discover invisible patterns and to be prepared for planning, judgment, etc. (Qi & Tao, 2018). One of the recommended technologies for data fusion is AI. It is worth mentioning that advanced AI techniques can be used to boost the DT's realization capability and help give recommendations and solutions automatically (Tao, et al., 2019c).

 Data storage: in this stage, data are stored so that they are ready for future utilization, using different technologies such as NewSQL and cloud computing (Qi, et al., 2019).

 Data visualization: this is the final step, in which the outcomes of the analyzed data and information are displayed clearly to show all the rules, conditions, logic, etc. The outcomes can be displayed in the form of schemes, histograms, line charts, tree charts, audio, video, etc., so that the user can interact with the data. Data visualization technologies change according to the applications used (Qi, et al., 2019).
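As a minimal sketch of the processing stage, the snippet below removes duplicate rows and fills a missing value in a small set of made-up sensor readings using pandas. The column names and cleaning choices are illustrative assumptions, not a prescribed pipeline.

import pandas as pd

# A few made-up raw readings, including a duplicated row and a missing value.
raw = pd.DataFrame({
    "timestamp": ["t1", "t2", "t2", "t3"],
    "temperature_c": [70.1, 71.4, 71.4, None],
    "vibration_mm_s": [1.2, 1.3, 1.3, 1.5],
})

clean = (
    raw.drop_duplicates()  # remove repeated transmissions of the same reading
       .ffill()            # carry the last known value forward for missing entries
)
print(clean)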


3.10 Architectures for building the DT

3.10.1 Four-layered CPS architecture for building the DT

Zheng & Sivabalan, 2020 have proposed this architecture, which is based on the CPS concept. In this architecture, the DT must have two modes of functionality that operate simultaneously: a monitoring mode and a control mode. In the monitoring mode, the DT acts as a digital shadow of the real object and only monitors it, while in the control mode it acts as a control center with full control over the real object (a simplified sketch of this distinction is given at the end of this subsection). The architecture considers the hardware and software sides and the key technologies that help improve the DT, so that it can be exploited as a reference model for the future development of product-level DTs. It consists of four layers, as follows:

1. Physical layer: this layer comprises all the involved physical entities such as machines, sensors, HMIs, etc. Each entity uses suitable communication protocols and interfaces for data transmission. To make these systems observable in the digital layer, all data must be accessed and managed throughout their lifecycle using different data acquisition techniques. All the data produced by the physical layer are encoded according to the communication protocol in use, following the OSI model.

2. Data extraction and consolidation layer: this layer works as the data connection between the physical and digital layers. It collects the data coming from the physical layer and transmits them to the digital layer. In this layer, the data are converted and integrated into a machine-readable form, such as .txt or .csv files, and are then uploaded to the cloud (a rough sketch of this step is given after this list).

3. Digital layer: in this layer, the DT is created on the cloud using different models to mirror the physical system. Three types of models are created to develop the product-level DT: a digital model, a computational model, and a graph-based model. The models are interconnected and managed through APIs related to the DT.

4. Interaction layer: this is the last layer in the architecture, and it enables the interaction between humans and real-world systems through the DT, which resides in the cloud. Using a highly protected network gateway, the DT can be accessed from any place and from any device that can display information, such as smartphones and tablets, as shown in figure 3.8.
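As a rough sketch of what the data extraction and consolidation layer might do, the snippet below writes a handful of simulated machine readings to a CSV file of the kind that could then be uploaded to the cloud. The file name, column names, and values are assumptions made for illustration.

import csv

# Hypothetical readings handed over from the physical layer.
readings = [
    {"timestamp": "2020-09-01T10:00:00Z", "machine_id": "milling-01", "temperature_c": 70.1},
    {"timestamp": "2020-09-01T10:00:01Z", "machine_id": "milling-01", "temperature_c": 70.4},
]

with open("machine_readings.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["timestamp", "machine_id", "temperature_c"])
    writer.writeheader()
    writer.writerows(readings)
    # The resulting file would then be uploaded to the cloud repository.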


Figure 3.8: Four-layered CPS architecture for building the DT (Zheng & Sivabalan, 2020).
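The sketch below is one possible, deliberately simplified reading of the two modes described above: in monitoring mode the twin only records observations, while in control mode it is also allowed to send commands back to the physical object. The class, method, and command names are hypothetical and are not the reference implementation of Zheng & Sivabalan, 2020.

class ProductDigitalTwin:
    """Toy illustration of the monitoring/control distinction."""

    def __init__(self, mode: str = "monitoring") -> None:
        self.mode = mode          # "monitoring" or "control"
        self.observations = []

    def observe(self, reading: dict) -> None:
        """Both modes mirror the physical object by recording its state."""
        self.observations.append(reading)

    def send_command(self, command: str) -> bool:
        """Commands reach the physical object only in control mode."""
        if self.mode != "control":
            return False
        print(f"sending command to physical object: {command}")
        return True

if __name__ == "__main__":
    twin = ProductDigitalTwin(mode="monitoring")
    twin.observe({"temperature_c": 70.2})
    print(twin.send_command("stop"))   # False: monitoring mode cannot actuate
    twin.mode = "control"
    print(twin.send_command("stop"))   # True: control mode may actuate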

3.10.2 Six-layered architecture for building the DT

This architecture has been established by Redelinghuys, et al., 2020, and it is based on the 5-level CPS structure. It enables data and information exchange and coevolution between the physical world and DT and between DT and the external world. It consists of six layers:

Layer 1 consists of different devices, such as sensors or actuators, that can provide or consume a signal to be exchanged with a controller or data acquisition device lying in Layer 2. Accordingly, Layer 2 is a data acquisition device and is considered a source of data in the physical twin, as shown in figure 3.9.

Layer 3 is mainly connected with Layer 4; it transmits data to Layer 5 and occasionally interacts with Layer 6. The tool used in Layer 3 is an Open Platform Communications Unified Architecture (OPC UA) server. It helps to exchange and protect data and information and enables interaction among different devices (a minimal sketch of reading a value from such a server is given at the end of this subsection). Layer 4 is where data and information are converted and managed; it is also called the IoT gateway. It manages the data coming from Layer 3 and makes them usable for the higher layers. It connects Layer 3 and Layer 5 by converting the data coming from Layer 2 into information and then delivering it to the cloud repository in Layer 5. Layer 5 is the information repository that stores all the information coming from Layer 4 in the cloud and makes it accessible to Layer 6. This information can be used to evaluate the physical twin's current behavior and can thereby be used for decision making. Finally, Layer 6 is the last layer, and it is linked with Layers 3, 4, and 5. This layer is supplied with simulation software and works like a user interface that enables the user to interact with the twin through real-time or historical data about the physical twin.

Figure 3.9: Six-layered architecture for building the digital twin (Redelinghuys, et al., 2020).

This architecture gives a clear picture of the flow of data and information between the physical twin and the DT, using different layers to foster its intelligence. In this architecture, Redelinghuys, et al., 2020 have utilized OPC UA servers and cloud-based database services to reduce the level of developer expertise required.
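To give a feel for how Layer 4 (the IoT gateway) might pull values from the OPC UA server in Layer 3, the sketch below uses the open-source python-opcua package to read a single node. The endpoint URL and node identifier are placeholders, and the architecture of Redelinghuys, et al., 2020 is not tied to this particular library or code.

from opcua import Client  # python-opcua (freeopcua) package, assumed installed

# Placeholder endpoint and node id; replace with the real server's values.
ENDPOINT = "opc.tcp://localhost:4840/freeopcua/server/"
NODE_ID = "ns=2;s=Machine1.Temperature"

client = Client(ENDPOINT)
client.connect()
try:
    node = client.get_node(NODE_ID)
    value = node.get_value()  # one read of the current sensor value
    print("current value:", value)
finally:
    client.disconnect()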

3.11 Digital twin shop-floor (DTS)

The shop floor is an example of a system of systems (SoS level). The digital twin shop-floor (DTS) concept has been created based on the DT concept to achieve integration and interaction between the physical and digital worlds on the shop floor. Tao & Zhang, 2017 have proposed the DTS, and it includes four elements: the physical shop floor, the virtual shop floor, an integrated service platform, and the fused data of the three parts, as shown in figure 3.10. The physical shop floor comprises all the physical entities, humans, equipment, etc. The virtual shop floor includes high-fidelity virtual models representing the physical shop floor, and these models are based on four levels: geometry, physics, behavior, and rule. All these models are incorporated, in terms of function and structure, to construct the virtual representation of the physical equipment. Further, verification and validation of these models are crucial to assure their reliability.

