
Linköpings universitet SE–581 83 Linköping

Linköping University | Department of Computer and Information Science

Master’s thesis, 30 ECTS | Information Technology

2021 | LIU-IDA/LITH-EX-A--2021/030--SE

The umbrella term Industry 4.0 and the digitization of the industry

A qualitative study on the progress of Swedish companies

Tim Hellberg

Filip Ström

Supervisor: Patrick Lambrix
Examiner: Olaf Hartig


Upphovsrätt

Detta dokument hålls tillgängligt på Internet - eller dess framtida ersättare - under 25 år från publiceringsdatum under förutsättning att inga extraordinära omständigheter uppstår.

Tillgång till dokumentet innebär tillstånd för var och en att läsa, ladda ner, skriva ut enstaka kopior för enskilt bruk och att använda det oförändrat för ickekommersiell forskning och för undervisning. Överföring av upphovsrätten vid en senare tidpunkt kan inte upphäva detta tillstånd. All annan användning av dokumentet kräver upphovsmannens medgivande. För att garantera äktheten, säkerheten och tillgängligheten finns lösningar av teknisk och administrativ art.

Upphovsmannens ideella rätt innefattar rätt att bli nämnd som upphovsman i den omfattning som god sed kräver vid användning av dokumentet på ovan beskrivna sätt samt skydd mot att dokumentet ändras eller presenteras i sådan form eller i sådant sammanhang som är kränkande för upphovsmannens litterära eller konstnärliga anseende eller egenart.

För ytterligare information om Linköping University Electronic Press se förlagets hemsida http://www.ep.liu.se/.

Copyright

The publishers will keep this document online on the Internet - or its possible replacement - for a period of 25 years starting from the date of publication barring exceptional circumstances.

The online availability of the document implies permanent permission for anyone to read, to download, or to print out single copies for his/hers own use and to use it unchanged for non-commercial research and educational purpose. Subsequent transfers of copyright cannot revoke this permission. All other uses of the document are conditional upon the consent of the copyright owner. The publisher has taken technical and administrative measures to assure authenticity, security and accessibility.

According to intellectual property law the author has the right to be mentioned when his/her work is accessed as described above and to be protected against infringement.

For additional information about the Linköping University Electronic Press and its procedures for publication and for assurance of document integrity, please refer to its www home page: http://www.ep.liu.se/.


Abstract

The umbrella term Industry 4.0 stands for the ongoing integration of modern smart technologies in industrial settings. The implementation of the technologies it covers has been an ongoing process for a decade. However, there is a huge variation in companies' adoption of these technologies and progress is slow. This master's thesis is conducted in collaboration with Cybercom Group and aims to investigate what is keeping the adoption of Industry 4.0 technologies from taking off. The study was mainly tackled through qualitative interviews with nine different Swedish industry companies. In these interviews, their opinions and thoughts on the current state of their factories in contrast to the Industry 4.0 vision were discussed. The study reveals that limitations such as the high cost of IoT sensors, uncertain return on investment, battery constraints on IoT devices, unstable availability of IoT devices connected to the cloud, and a general lack of knowledge hamper the adoption of Industry 4.0 technologies. Finally, we provide a price approximation for the monthly cost of streaming data to the three biggest cloud providers (AWS, Microsoft Azure and Google Cloud) for a use case of 8 sensors monitoring a blasting machine. This case shows that the price models of AWS and Google Cloud are better suited for smaller cases than Azure's, while both AWS's and Azure's price models outperform Google Cloud's at larger scales.


Acknowledgments

We would like to thank Olaf, Oscar and Niklas for supporting us with various issues throughout the project and cheering us on. We also want to thank everyone at Cybercom. We want to direct a special thank you to all the companies and persons we have interviewed. Finally, we would like to thank our opponents Nils Tyrén and Måns Fredriksson for providing constructive feedback at the opposition seminar.


Contents

Abstract
Acknowledgments
Contents
List of Figures
1 Introduction
  1.1 Motivation
  1.2 Aim
  1.3 Research Questions
  1.4 Delimitations
2 Theory
  2.1 Related Work
  2.2 Industry 4.0
  2.3 Internet of Things
  2.4 Wireless Technology
  2.5 Cloud Computing and Cloud Services
  2.6 Big Data Analytics
3 Method
  3.1 Pre-study and Interviews
  3.2 Analysing the Data
  3.3 Comparing the Price of Cloud Services
4 Results
  4.1 Interview Results
  4.2 Interview Summary
  4.3 Price Calculation Results
5 Discussion
  5.1 Results
  5.2 Method
  5.3 The Work in a Wider Context
  5.4 Future Work
6 Conclusion


List of Figures

3.1 Blasting machine use case
3.2 Illustration of batching
3.3 Price table for Cloud IoT Core


1 Introduction

1.1 Motivation

The digitization of the industry has been an ongoing process since the 1990s, when the adoption of personal computers enabled the automation of machines. This is commonly referred to as the third industrial revolution. Now we are in the midst of the fourth industrial revolution, Industry 4.0. Originally a German initiative, Industry 4.0 is the ongoing integration of modern smart technologies in industrial settings: technologies such as the Internet of Things (IoT), the cloud and the fourth/fifth generation of telecommunication (4G/5G), to name a few. Through a pre-study consisting of a literature review and interviews with companies, we have observed an abundance of high-level presentations of the Industry 4.0 concept. Since we are still in the middle of the revolution, many technologies are also not yet available or fully developed, e.g. 5G. Identifying which of the Industry 4.0 technologies are used in industry today, how they work and how they can be improved is therefore of high interest. The bundle of technologies we will investigate further in this thesis is the Connectivity bundle, which, as described by Nilsen and Nyberg [21], encapsulates the aforementioned technologies (IoT, machine-to-machine interfaces, cloud, and big data analytics).

1.2 Aim

The aim of this thesis is to provide insight into the connectivity bundle of Industry 4.0 technology and to identify which factors are relevant for creating value when implementing these technologies.

1.3 Research Questions

1. What factors are relevant for a factory interested in implementing IoT with cloud?
2. What are good metrics to measure when evaluating these factors?


1.4 Delimitations

The implementation of IoT and cloud technology in an industrial environment is, of course, highly case-specific. The results of this thesis are therefore dependent on the companies interviewed and their experiences.


2 Theory

In the following chapter, previous research and the theory relevant to this thesis are presented. The goal is to provide enough insight into every topic so that the results and discussion chapters are easy to follow.

2.1 Related Work

A multiple case study on the adoption of Industry 4.0 technology in manufacturing has been made by Nilsen and Nyberg [21]. Their thesis had a qualitative approach, conducting semi-structured interviews with five different industries: the automation, heavy equipment, aerospace, electronics and motor vehicle industries. The thesis was done in collaboration with a tool company that had customers within these types of industries. The purpose of the thesis was to categorize Industry 4.0 technologies based on their application in industrial settings. The attempted categorization resulted in a suggestion of two so-called technology bundles, namely the Human-machine-interface (HMI) bundle and the Connectivity bundle. The HMI bundle consists of technologies that focus on improving the work of operators, i.e. employees that repair and monitor the machines in the factory. An example of this is augmented reality used to guide operators making repairs or replacing parts. This bundle was observed to occur more often in industries with flexible production logistics such as car manufacturing. The Connectivity bundle consists of technologies such as IoT, machine-to-machine interfaces, cloud and big data analytics. The main goal of this bundle is to enable more extensive data analysis, with the aim that this can create optimisations in the production processes. The Connectivity bundle definition is adopted in this thesis as it accurately describes the collection of technologies that are our main focus.

With Industry 4.0, data analytics has become a hot topic, especially predictive analytics. Building a good data model could aid in answering questions about what could happen in the future. Sivri and Oztaysi have surveyed a number of papers with different predictive methods and different applications [30]. Some of the most used models are the neural network and the decision tree, according to their study. The literature review looked at several different types of applications, e.g. quality prediction, power consumption, anomaly detection and machine efficiency, which all used different machine learning methods. A model is essentially a function that takes some input variables and from them generates a result that, hopefully, represents the real outcome well enough. Sivri and Oztaysi also made their own real-life test case, where they put a few predictive methods to the test at a shoe production company. The models were supposed to predict the production time. Sivri and Oztaysi collected the data, created the models, trained them and ran the tests. The tested models were CHAID, which is a decision tree model, a k-nearest neighbour model and a neural network model, of which the decision tree performed best in the comparison. The importance of big data analytics and data-driven decisions has been very clear both from the literature and the interviews. In our research we considered making a similar small-scale experiment where we implemented a data model and analysed data in an industrial context. However, due to limited time it was out of scope for this thesis.

There are many factors that come into play when implementing Industry 4.0. Raj et al. have put together a list of different barriers that could potentially hamper the adoption of Industry 4.0 in the industries [27]. First, a literature review was done in which the authors created a list of 23 different barriers that were later presented to six experts in the field. Two experts were from France and four were from India, which was important for the purpose of this research since they wanted to establish differences between developed and developing countries. In our research only Swedish companies are considered and that aspect is therefore ignored. After consulting the experts, only 15 barriers were left, as the rest were found to be less relevant to the implementation of Industry 4.0. Relations between the barriers were then identified in order to find their importance relative to other barriers. The barriers were sorted into three different categories, the first category being the prominent barriers. These barriers have a great impact on other barriers, meaning that they are quite important to solve before working on other barriers. The most common factor hindering Industry 4.0 was "lack of a digital strategy alongside resource scarcity", and solving this should be a priority for most companies. The second category is the influencing barriers, meaning the ones that have the most crucial effect on Industry 4.0. The most important in India was "lack of standards, regulations, and forms of certification", while "low maturity level of the desired technology" was the most important in France. The final category is the resulting barriers, meaning the barriers that were affected the most by others. The top factor in India was "high investment in Industry 4.0 implementation" and in France it was "ineffective change management". The major takeaway from this research seems to be that managers have to enhance internal capabilities as well as formulate strategies and road maps in order to be successful. In our research, where we wanted to discover the important and valuable factors when implementing Industry 4.0, we also found factors that currently hamper the adoption. However, we focused more on the value Industry 4.0 creates and why Industry 4.0 will be part of factories in the future, rather than on why Industry 4.0 is something factories still struggle with.

P. Raptis et al. highlight the crucial role data plays in the realization of Industry 4.0. Because the traditional centralized point-to-point control and communication in modern factories is not suitable for the increasingly challenging requirements of networked environments, it is argued that it will be decades rather than years before the Industry 4.0 vision is state of the art. In order to provide information on the current status of research in the data management and communication area of industrial applications, the authors have performed a survey on "data enabling industrial technologies and data centric industrial services from the point of view of data management as it applies to networked industrial environments" [28] over the period 2015-2018. The authors provide a rich and detailed description of the current Information and Communication Technologies as they are applied in industrial systems. We argue that these technologies are the very same as the ones covered by the adopted term "Connectivity bundle". However, this article walks through the actual technologies and protocols covered by the term in great detail. One example of this is that they have, through their survey, found that Zigbee, WirelessHART, ISA100.11a and WIA-PA are the four mainly used wireless standards in industrial environments. Further, they have through their extensive literature review identified open research challenges on data management in industrial networked environments. One of these is engineering energy-efficient data delivery solutions with low latency. This is necessary because of the increasing number of battery-powered devices such as Industrial IoT devices, which can make industrial networks consume high amounts of energy. Another aspect of the energy consumption is the battery running dry on the sensor, something that has been observed as an issue for a company interviewed in this thesis. The other open research challenges were data distribution in local and mobile clouds, distributed real-time data security for industrial robots and assembly lines, and convergence between the industrial/automation/manufacturing field and the communication/networking/computation field. We would argue that their work is similar to ours in terms of which subject in Industry 4.0 it focuses on, which is data analytics and communications. However, their work covers more of the academic side, while we investigate the more practical side by surveying companies invested in the subject.

The focus of the work by E. Vedin was to establish how Industry 4.0 affects the operators in the factory as well as to establish their feelings towards it. The interviews were performed with two factories in different stages of their digitization, one in an early and one in a later stage. From these factories came eight operators, five from factory A and three from factory B. Factory A was in the beginning of its implementation of Industry 4.0 and factory B was in a later stage. From the interviews, six main categories with 14 sub-categories were created. The first category was "New technology relieves employees", which means that the operators have become more efficient and have to do less physical work, but also that there are fewer employees and therefore fewer people to socialize with. The second category was "There are requirements for established functioning communication". Often when a new technology is introduced, the requirements on the system and the employees are not entirely clear, which causes problems in the beginning. The other major issue here is the abstraction of the system; the operators do not know exactly what the system does, which makes it hard to solve some of the problems that occur. The third category was "Employees need to feel important"; automation has meant that operators sometimes feel a little bored when they monitor the systems, while others have expressed that they feel important because someone has to do their work. The fourth category was "New tasks & routines must not be overwhelming". The operators want to be involved in the development of the system, both to feel involved and because it would make it easier to get to know the new systems. The operators also felt that new changes should be implemented gradually in order not to overwhelm the employees with new systems. The final category was "What kind of experience you have presents different challenges", which means that different employees have different experience and the challenges posed by the system might differ a lot depending on who uses it. The core category was "mutual understanding", which summarises the most important concept from the interviews and all the other categories: all parts of the system, both those who develop the systems and those who use them in their daily work, need to be involved when implementing Industry 4.0 solutions [36]. Like in this thesis, E. Vedin conducted her work through qualitative interviews with different industry companies. However, our focus lies on the value of Industry 4.0 technologies, while Vedin covers an important aspect which is absent from our work: how the implementation of Industry 4.0 technologies affects the employees of the factories.


2.2 Industry 4.0

Industry 4.0, or the fourth industrial revolution as it has also been called, was introduced back in 2011. The idea behind Industry 4.0 is to create cyber-physical systems (CPS), where virtual space and physical space are integrated with each other. To enable Industry 4.0, Information and Communication Technologies (ICT) are important and will build the infrastructure for the industries of the future. Da Xu et al. list a few technologies in the article "Industry 4.0: state of the art and future trends" [39] that will make it possible to create the next generation of industries. Some of those technologies are IoT and the technologies related to IoT, cloud computing, CPS and industrial integration. IoT involves a lot of other technologies, but the main idea behind it is to attach sensors to machines, which makes it possible to connect the machines to each other. This would be one step towards creating a CPS, where the machines represent the physical world and the connection to other machines and computers creates a virtual world. Connecting lots of machines will require great computational power to process all data in real time. This is why, according to the authors, cloud computing is such an important part: cloud computing offers great storage space and computational power to make the real-time decisions that are needed. Another step the industries have to take to create a CPS is integrating these systems into their architecture and business. Da Xu et al. propose that there are a few challenges left to solve. These are technological challenges such as the current information and communication infrastructure not yet being compatible with Industry 4.0 technology, scalability, real-time data analysis and other IoT-related technologies. Another challenge is the lack of standardization, which they say is a key aspect of making Industry 4.0 a success. The last important aspect of Industry 4.0 is security and privacy, which is quite hard and complex to implement correctly [39]. This is because information streaming out from factories has traditionally been heavily limited due to the sensitivity of the data, something that will change radically when implementing solutions that utilize public clouds.

2.3 Internet of Things

The Internet of Things is, generally speaking, devices communicating with each other over the internet. This could be done using sensors, different software and of course some communication protocol. According to Gilchrist [12], there are four different focuses in the field of IoT: the commercial, consumer, enterprise and industrial focus. The commercial focuses on things such as banking and financial services, the consumer focuses on things like smart homes and the enterprise focuses on businesses. The fourth, and the one in focus in this thesis, is the industrial IoT, which focuses on things like manufacturing and logistics in an industrial environment [12].

Industrial Internet of Things

Lee and Lee have defined three IoT categories applicable to enterprises that can be used to enhance customer value, which in turn could improve revenue [19]. The first category is monitoring and control. By collecting data on how equipment performs, how much energy it uses and on environmental conditions, it is possible to identify areas of potential improvement and as a result optimize operations. This could potentially both lower costs and achieve higher productivity depending on the application. The second category is big data and business analytics. Here the customer value comes from using analytic tools to process the large amounts of data generated by IoT devices. The example Lee and Lee use is the Oral-B Pro 5000 interactive toothbrush, which is able to provide users with information about their oral routine. The toothbrush does this based on how the user brushes their teeth, with the goal of improving their dental care. The third category is information sharing and collaboration.


An example of this could be sensors placed in the lock mechanism of a door to a house. The homeowner could then be alerted through their mobile device if the door is not locked. The core of this idea is defining events that can be caught by sensors. The next step is to define what the sensors catching the event should do with that information, if it is to be shared with other IoT units, with people or both [19].

Furthermore, Lee and Lee claim that a potentially appropriate method for assessing risk and reward when investing in IoT technology is real option valuation [19]. The definition of a real option according to Adam Hayes is "A real option is an economically valuable right to make or else abandon some choice that is available to the managers of a company" [15]. Real option valuation takes into consideration the steps of a process and not just the final result, allowing investors to jump ship or change course at each step. The interested reader can watch "The Essence of Real Options" by Damodaran for a thorough explanation of real options [9]. According to Lee and Lee, a real option valuation approach fits the high uncertainty and risk of the information technology field [19].

2.4 Wireless Technology

Based on the IEEE 802.11 standard, Wi-fi was commercially released in 1997 [11]. The industry was an early adopter of this novel technology, recognizing the great potential of wireless in certain industrial use cases. For example, the implementation of communication in factories and warehouses could be drastically improved, because the traditional solutions with cables and wires were suboptimal due to the vast areas of the shop floors [12]. Presently there are many other alternatives to Wi-fi, such as Bluetooth (IEEE 802.15.1), Cellular (3G, 4G, 5G) and Zigbee (IEEE 802.15.4), to name a few. These are all wireless technologies used in IoT and have their strengths and weaknesses, which will be covered in the following sections. The three main traits to consider when comparing wireless technologies for IoT are bandwidth, coverage range and power consumption [8].

• Bandwidth in this context is the rate at which the network can receive and process data. Since IoT devices can consume a lot of data, the bandwidth of the network can be of high importance.

• Coverage range is the physical distance over which the signal can travel without losing connection. The importance of this trait is dependent on how far apart the devices will be and whether they will be stationary or mobile.

• Power consumption is the amount of battery the network drains from the device. In every case where the IoT device runs on battery this is an important factor.

Wi-fi

Wi-fi is the common name for wireless technologies based on the IEEE 802.11 networking standard, which uses radio waves on the 2.4 GHz and 5 GHz frequencies to transmit information between devices. When using Wi-fi on IoT devices you use a microchip and possibly some firmware to protect the device. The device sends and receives data from Wi-fi access points that are up to approximately 46 meters away. This range is usually much more limited in industrial environments due to thick walls, heavily plated machines etc. The power consumption of Wi-fi devices is on the higher end, especially compared to e.g. Bluetooth [2]. With coverage range also being limited, the main strength of Wi-fi is the bandwidth, with 802.11ac and 802.11ad producing theoretical bit rates of 800 Mbps and 6 Gbps respectively [12].


Bluetooth

Bluetooth was invented by Ericsson at the initiative of Nils Rydbeck with the purpose of developing wireless headsets [20]. It is based on the IEEE 802.15.1 standard, which covers wireless connectivity with portable and mobile devices within personal operating space [8]. The Bluetooth standard was designed for portable equipment and is, through its use of frequency hopping at frequencies of 2.4 and 2.485 GHz, less susceptible to interference. Possibly more suited for IoT is Bluetooth Low Energy, which as the name suggests is designed for low power consumption [18]. However, up until recently the focus of Bluetooth has not been industrial applications but rather human-machine interfaces such as headphones and speakers. Researchers have started to suggest improvements and features to the existing protocol in order to make it better suited for the requirements of industrial applications [8]. In terms of bandwidth, Bluetooth has a rate of up to 3 Mbps depending on the configuration [18]. The coverage range is typically up to 10 meters but depends on obstacles and the density of walls between the connected devices. The power consumption can be very low depending on the use case, but is regardless, as mentioned earlier, lower than that of Wi-fi [18].

Zigbee

Zigbee is based on the IEEE 802.15.4 standard and works using a mesh-network structure [8]. It has low power usage, a coverage range of 10-20 meters, and low bandwidth (around 250 Kbps [33]). According to Basri Celebi et al., Zigbee is a technology that fits most of the industrial applications that use wireless technology [8].

Cellular

The wireless technology used by smartphones (at least when there is no available Wi-fi connection nearby) is 4G. 4G stands for fourth generation, as in the fourth generation of cellular connectivity, succeeding the third generation, 3G. The 4G standard was introduced in 2008 by the International Telecommunications Union (ITU) [4]. 4G has a high coverage range (16-24 km [33]) and relatively high bandwidth, with the standard stipulating data rates of 100 Mbps. The power consumption is relatively high, higher than for Wi-fi and Bluetooth [33]. A fairly novel group of cellular technologies that trade the high bandwidth of 4G to combat the issue of high energy consumption is Low Power Wide Area Networks (LPWAN). Members of this group are e.g. LTE-M, Narrowband-IoT and LoRa.

The next generation of cellular connectivity and communication is the fifth generation, or 5G for short. ITU defined back in 2017 the standards that 5G should fulfill to be called 5G. According to the article "What Will 5G Bring?", there are three major use cases that 5G will bring. The first one is the increased mobile broadband, which brings faster download and upload speeds. The second is the ultra-reliable and low-latency communication (URLLC) that 5G offers. The third and last one is the massive machine-type communication (mMTC), which means that more devices within a small area can communicate without losing performance [7]. The requirements that ITU published in their report [34], with respect to the previously mentioned major use cases, are as follows:

• Peak upload speed of 10 Gbps and download speed of 20 Gbps.
• Latency of less than 1 millisecond.
• Up to 1 million simultaneously connected devices per km².

To enable fast decisions and communication between machines, the low latency will be important, since some machines have to make a decision within milliseconds. The fact that you can connect more than one million devices per square kilometer is also a big deal for enabling Industry 4.0. Factories usually have a lot of machines close to each other. These machines could have multiple IoT devices that want to communicate, which means that the number of connected devices will grow fast. According to Allen, 5G will enable roughly 1 000 more devices per square meter compared to 4G [1]. This will enable the factories to connect more machines, and if they want to add more devices in the future there will be space to add those to the same network.

This was an overview of some of the wireless technologies used in IoT, with the aim of giving the reader insight into some of the technologies that are presently used in the industry and their pros and cons. The curious reader is referred to Chapter 2 in the book "Industrial IoT" edited by Ismail Butun (chapter written by Hasan Basri Celebi, Antonios Pitarokoilis, and Mikael Skoglund), where details on more relevant technologies (White-fi, 6LoWPAN, Sigfox etc.) are provided [8].

2.5 Cloud Computing and Cloud Services

That cloud computing is a necessary technology to use for IoT applications seems fairly undisputed [39, 19, 16]. So what is cloud computing? According to Azure, which is Microsoft's cloud platform, it is defined as:

"cloud computing is the delivery of computing services—including servers, stor-age, databases, networking, software, analytics, and intelligence—over the Inter-net (“the cloud”) to offer faster innovation, flexible resources, and economies of scale." [3].

According to Gilchrist, this modern idea of cloud computing came about with the launch of Amazon Web Services in 2006 [12]. Amazon was struggling to cope with the growing scaling requirements of their web services. To solve the issue they built large data centers and infrastructure to handle database, compute and storage. Amazon then started to rent out spare computing capacity on an on-use basis, which is what we know today as Infrastructure as a Service (IaaS). Other companies followed Amazon's lead, among them Google (Google Cloud Platform) and Microsoft (Microsoft Azure), which together with Amazon are the three largest cloud service providers as of 2021 [13]. As mentioned, there are other public clouds available which together make up 35% of the market [13]. Among these are Alibaba Cloud (which is close to Google Cloud Platform in market share), Oracle Cloud and IBM Cloud, to name a few. Besides Infrastructure as a Service there are two other categories of service, Platform as a Service (PaaS) and Software as a Service (SaaS).

• Infrastructure as a Service (IaaS) is, as touched upon earlier, the rental of spare computing capacity. What this means in practice is that instead of buying and managing hardware, one can rent compute, storage and network from a cloud service provider. The user then only pays for what is used and accesses the rented hardware through a terminal on their local computer or through the cloud provider's web service.

• Platform as a Service (PaaS) is essentially Infrastructure as a Service but with the addition of a running operating system and access to software development languages, libraries, APIs, and microservices. The idea is to provide everything a developer may need to develop applications on the cloud.

• Software as a Service (SaaS) is something many people are probably familiar with and have used without knowing it is called SaaS in the cloud world. It is web-server based software accessed through a web browser. In contrast to applications running locally on the computer, the user does not need to update the application, since it is managed by the provider and delivered on-demand through the web. Real-world examples of Software as a Service platforms are Dropbox, Microsoft 365, Slack, Google Drive and Salesforce.

Because of the pay-for-what-you-use model and the scalable capacity of the cloud, it is a useful tool for IoT implementations. It can be appealing, especially to small and medium-sized businesses, to utilize this flexibility instead of acquiring and managing hardware internally. Managing hardware internally can be a challenge because IoT devices generally generate a lot of data, and depending on the use case the computational power required can vary, which makes it hard to predict how much hardware might be necessary. However, using a public cloud service is no silver bullet, and it has its downsides. One of these downsides is latency, which is the time it takes for data to be transmitted from a device to the cloud, processed, and possibly sent back. There are many time-critical operations in industry where the machines make decisions at millisecond speeds. The latency implied by using public cloud services would be too high for these operations. Cloud services are therefore not (yet) a replacement for existing infrastructures in the industry but rather a complement.

2.6 Big Data Analytics

One of the major factors in Industry 4.0 is the use of data analytics [38]. You want to be able to make smart decisions from the data that the machines collect. This includes, for example, running simulations of processes, predicting when a machine needs maintenance and fine-tuning the machine for optimal performance. The problem lies in the amount of data you have to process. When it comes to industries, the data itself can be extremely large and hard to manage, even for a small factory. To be able to manage the data and make well-informed decisions, you need two things. The first is knowledge about the machines, the process and the industry as a whole. This knowledge can be used to work out exactly what the collected data means and why it might look the way it does. The second is experience with big data analytics: how one can analyse large quantities of data in an efficient and reliable way. By combining the knowledge about the data with the skill to handle the data efficiently, the possibilities to make well-informed decisions naturally increase.

What is Big Data?

When the amount of data is too large for traditional databases and software to handle in an efficient way, it is called big data [30, 12]. Big data is usually defined by the four Vs: volume, veracity, velocity and variety. Volume refers to the size of the data, where the size usually is in the scale of many terabytes or larger. In an industrial environment there are usually lots of machines and therefore lots of data points where measurements can be made, which means that the industry will have access to a large dataset. Veracity refers to how accurate the data is and how much you can actually trust that the data is correct. For example, in an industrial context veracity could refer to the data collected by sensors: to what degree can we trust that they collect the data correctly? If you change a sensor, can you guarantee that the new one collects the data exactly like the previous one? Velocity refers to how much the data grows and how fast it grows. As stated earlier, there will probably be lots of data points in an industrial environment, which means that the size of the data can grow rapidly. You might also add new sensors later, which means the data will grow even faster. Variety refers to the different forms the data can take; in an industrial environment, where there usually are lots of different machines from different manufacturers, each data point could look different. All these factors make it harder to handle the data, both on the software and the hardware side.


Data Driven Decisions

One of the main parts of big data analytics is the predictive part. Predicting what lies in the future, and why the future looks like that, is important. When should a machine be maintained, when is it time to switch machines and what can you expect from the machine? These questions can be answered with smart machine learning algorithms and modelling. By using historical data and human knowledge about the machines, clever models can be built. These models can then be used to make smart and data-driven decisions and make the industry more efficient. Another aspect of the data analysis is that you can fine-tune parameters and settings on the machines, making them more efficient.

Artificial Intelligence and Machine Learning

The problem today is not that you do not have enough data; you can always collect more data, which most of the time is preferable. You can add more data points, collect data more frequently or both, just to be able to make well-informed decisions. The problem is then how you make those well-informed decisions. The data is too large for a human, or even several, to comprehend, which means that they would probably miss important details and patterns. This can be avoided with smart AI or machine learning algorithms, since computers are much faster at handling data.

Humans are quite good at finding patterns and analysing data, but we are not that good at repetitive tasks. Humans get bored easily and are not always that fast when it comes to data analytics on large datasets. These types of tasks suit AI well. A well-trained AI can perform repetitive tasks and can analyse large datasets much faster and for longer periods of time than a human. The problem is that it is not easy to program an AI to do exactly what you want, which is an important part, because a normal AI only does exactly what we humans have taught it to do.

Machine learning is a field within the AI scope and is therefore also a form of AI. Machine learning algorithms take a more statistical approach and use historical data to predict the future. By analysing earlier inputs and outputs, different variables and generally how the system has worked, you can create a model that will be able to predict the future. The model will have varying results because it works by trying to map the current input to earlier input, and the result will depend on how well it can do this mapping.

In an industrial environment, where you want your machines to run as much as possible, fault detection is a major problem [31]. You want to detect a fault as fast as possible, to make sure you correct it. This is something an AI can be well suited for, and if it is trained to detect anomalies it can rule out normal outliers and only show significant faults. The AI could potentially find faults before they happen and alert someone so that they can be taken care of preemptively. Anomaly detection can also be used on training data for a machine learning or AI training set. Looking for outliers and anomalies and removing them from the training set can give a higher accuracy for the models you try to train.
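To illustrate the idea of cleaning a training set, a minimal Python sketch is given below. It removes obvious outliers using the median absolute deviation; the example readings, the 3.5 threshold and the function name are illustrative assumptions and not anything used in this thesis.

    # Minimal sketch: drop readings whose modified z-score (based on the median
    # absolute deviation) exceeds a threshold, before the data is used for training.
    import statistics

    def remove_outliers(readings, threshold=3.5):
        med = statistics.median(readings)
        mad = statistics.median(abs(x - med) for x in readings)
        if mad == 0:
            return list(readings)  # no spread to judge outliers by
        return [x for x in readings if 0.6745 * abs(x - med) / mad <= threshold]

    # Hypothetical vibration readings with one obvious spike at 7.30:
    raw = [0.51, 0.49, 0.52, 0.48, 0.50, 7.30, 0.47, 0.53, 0.50, 0.49]
    print(remove_outliers(raw))  # the spike is removed from the training set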


3 Method

In this chapter the methodology for answering the research questions is described. The methodology for answering the first two research questions is a literature review as well as interviews with companies connected to the industry. These are covered in Sections 3.1-3.2. The methodology for answering the third question is a price calculation of the cloud service expense for a use case of IoT sensors and is covered in Section 3.3.

3.1 Pre-study and Interviews

To get a grasp of the field, a literature review was done before the interviews took place, in order to have some knowledge to base the interviews on. Between and after the interviews further reading was conducted, both to try to find solutions to the problems in the industry and to get more knowledge about the field. To find good scientific articles we used a set of keywords and the databases Google Scholar, the DiVA portal and the Linköping University Library. The keywords were "Industrial Internet of Things", "Internet of Things", "Industry 4.0", "5G", "Big Data Analytics" and "Cloud". The interviews were conducted over Microsoft Teams with multiple companies. The companies were mainly contacted directly via email or through contact forms on their respective websites. In total we contacted 30 companies, with 9 companies responding and agreeing to an interview. Some responded and said no, but the majority did not respond at all. The contacted companies either had a factory, and therefore had an interest in or had been thinking about connecting their machines to a cloud service, or had implemented a similar solution in their own industry. The industry the factory operates in was not considered, since the important part was that they had a factory environment. The size of the factory was considered, where we wanted input from small, medium and large factories. The interviews were semi-structured qualitative interviews as described in [17]. Each interview was conducted with one person or one company at a time, and the focus of the questions was to open up a discussion with follow-up questions. When conducting the interviews, one of us took notes while the other person asked most questions, both the prepared and the follow-up questions. The questions asked during the interviews can be viewed in Section ?? and the reasoning behind them can be viewed below. If possible, the discussion was recorded via the built-in recording tool in Teams, which made it possible for us to listen to it at a later time and transcribe the interviews [17].


The Interview Questions

During the pre-study we had three informal interviews with different companies in order to gain a solid foundation and introduction to the subject. The interviews covered the Industry 4.0 topic as a whole but mainly focused on IoT and cloud solutions, since that is what this thesis is focused on. From these interviews we saw that the subject seemed interesting and relevant, and we formulated questions that would allow us to dive further into the subject with other companies. The questions were formulated so that there were no "yes/no" answers; rather, they open up for discussion. The answers to these types of questions are more unpredictable and take longer to digest than quantitative questions, making the approach more time consuming. However, due to the non-quantitative nature of the research questions, this type of approach felt necessary.

3.2 Analysing the Data

We used an inductive approach when analysing the data from the interviews. This means that the analysis was conducted with minimal predetermined theory and that the data itself determined how the analysis was structured. Specifically, thematic content analysis was used, where common themes were found and then presented with, preferably, multiple examples from the interviews [6].

3.3 Comparing the Price of Cloud Services

One value metric that became apparent when interviewing the companies (Section 4.1) was the expense of using public cloud services. In order to provide an approximation of the magnitude of this expense, we looked into a specific use case of IoT sensors where public cloud services are used. The price of using the three (to date) biggest cloud services, Amazon Web Services (AWS), Microsoft Azure and Google Cloud [13], was compared for this use case by applying their price models to it [25, 14, 26]. The following sections provide a detailed description of the use case and the calculations.

Use Case: Blasting Machine

The use case studied for this problem is the monitoring of a large blasting machine used to hone iron beams. In this machine it is of interest to monitor the temperature and vibrations of the axes in the motors. The monitoring is conducted with VVB001 sensors from IFM [37]. Two axes per motor are measured, and with four motors in one machine this becomes a total of eight sensors. From each sensor, 4 data values of 16 bits each are output with a sampling frequency of 1 Hz to a gateway, which then forwards the data to the cloud using the MQTT protocol. The MQTT protocol introduces an overhead, whose size we have approximated using Vasil Sarafov's work [29]. The author made a comparison of IoT data protocol overheads and provides a lower bound of 342 bits for the MQTT protocol's overhead, which is the size we will use for these calculations. This results in a transmission rate of 854 bit/s, or 106.75 bytes/s, from the gateway to the cloud. An illustration of the case can be seen in Figure 3.1.
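To make the arithmetic explicit, the per-message size can be restated in a few lines of Python; the figures are those of the use case above, while the variable names are chosen for illustration only.

    # Per-second message size for the blasting machine use case:
    # 8 sensors * 4 values * 16 bits of payload, plus a 342-bit MQTT overhead.
    SENSORS = 8
    VALUES_PER_SENSOR = 4
    BITS_PER_VALUE = 16
    MQTT_OVERHEAD_BITS = 342  # lower bound from Sarafov [29]

    payload_bits = SENSORS * VALUES_PER_SENSOR * BITS_PER_VALUE  # 512 bits
    message_bits = payload_bits + MQTT_OVERHEAD_BITS             # 854 bits
    message_bytes = message_bits / 8                             # 106.75 bytes

    print(message_bits, "bit/s,", message_bytes, "bytes/s")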

Price Calculations

We have identified Amazons "IoT-Core", Azures "Iot-Hub" and Googles "Cloud IoT Core" to be the baseline IoT-function services available from each platform and similar enough for a comparison. These services enables IoT devices to get their telemetry data to the cloud environment. When the data arrives at the cloud there are additional functions that might be attractive or essential for specific use cases, some of these will be discussed in chapter 5.


Figure 3.1: Illustration of the blasting machine use case. There are four motors with two sensors each sending data to a gateway, which forwards it to the cloud using the MQTT protocol.

However, due to the large number of possible setups of functions and services, providing an approximate price for these is out of scope for this thesis. Instead we provide a price approximation for getting data from device to cloud. The aforementioned IoT services all use monthly payment models based on the size and number of messages that are transmitted to (and from, but that is not relevant in our case) the cloud. From our transmission rate of 106.75 bytes/s we therefore calculate the monthly usage as:

3600 seconds/hour × 24 hours × 30 days = 2 592 000 messages

2 592 000 messages × 106.75 bytes = 276 696 000 bytes = 276.696 megabytes (MB)

Since AWS and Azure both calculate their message size in kilobytes (kB), we have to convert the bytes to kB. The following calculation makes each message 0.104248046875 kB:

106.75 bytes × (1/1024) kB per byte = 0.104248046875 kB

These values are then applied to the different pricing guides of the cloud services.
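The monthly figures can be reproduced with a few lines of Python; this is only a restatement of the calculation above, using the thesis convention of decimal megabytes and 1024-byte kilobytes, with variable names of our own choosing.

    # Monthly message count and data volume at one message per second.
    SECONDS_PER_MONTH = 3600 * 24 * 30        # 2 592 000 messages per month
    MESSAGE_BYTES = 106.75                    # 854 bits per message

    bytes_per_month = SECONDS_PER_MONTH * MESSAGE_BYTES
    megabytes_per_month = bytes_per_month / 1_000_000   # 276.696 MB (decimal MB)
    message_kb = MESSAGE_BYTES / 1024                    # 0.104248046875 kB for AWS/Azure

    print(SECONDS_PER_MONTH, megabytes_per_month, message_kb)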

AWS IoT Core

The information about the prices to implement and use AWS IoT Core was gathered from AWS's official price calculator [25], which estimates the price. There are a few things to consider when calculating the price of AWS IoT Core. The first one is the location; the closest location to Linköping is EU (Stockholm). The next is the number of devices you want to connect, which in our case is just one (the gateway). Next, we wanted to know for how long we want to be connected, which is all day, every day, meaning 43 800 minutes each month according to their calculator. The last question is how many messages we want to send each month. A message in AWS can be as large as 5 kB, which is much larger than our messages. That means that we send 86 400 messages each day, making it 2 592 000 messages per month. AWS has three different pricing tiers where each million messages costs either $1.20, $0.96 or $0.84 depending on how many messages you have sent. It costs $1.20 per million messages for up to one billion messages, which is much more than our use case.
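Using only the first tier quoted above, the monthly messaging cost can be sketched as follows. The boundaries of the cheaper tiers and the connectivity-minute charges are not given here, so this is an illustration of the messaging cost alone rather than the full AWS bill.

    # AWS IoT Core messaging cost at the first pricing tier.
    MESSAGES_PER_MONTH = 2_592_000
    PRICE_PER_MILLION = 1.20  # USD per million messages, up to one billion messages

    messaging_cost = MESSAGES_PER_MONTH / 1_000_000 * PRICE_PER_MILLION
    print(f"~${messaging_cost:.2f} per month")  # roughly $3.11 for messaging alone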

Azure IoT Hub

The information about the prices to implement and use Azure IoT Hub was gathered from Azure's official price calculator [26], which estimates the price. When implementing Azure IoT Hub you have multiple things to consider. The first thing is the location, where we chose the closest one to Linköping, in this case Norway East (Oslo). The next thing we had to consider was the tier: the basic tier or the standard tier. Basic allows only communication to the cloud, while the standard tier allows bi-directional communication. In our case we want bi-directional communication, meaning we had to choose the standard tier, which is a bit more expensive. If we only wanted to send messages to the cloud but nothing from the cloud to the devices, the basic tier would have sufficed. Next we had to consider how many messages we would send each day. The machines send information every second, and there are 86 400 seconds in a day, meaning that we send information 86 400 times to the cloud each day. A message according to Azure can be as large as 4 kB, and each additional 4 kB is considered a new message. The data we want to send is much less than 4 kB, meaning that we send 86 400 messages each day. The lowest messaging tier that Azure provides allows for 400 000 messages per day.
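As a quick check of the quota logic above (a sketch only; the tier's actual price is not restated here):

    # Daily Azure IoT Hub message count versus the lowest standard-tier quota.
    MESSAGES_PER_DAY = 24 * 3600   # one sub-4 kB message per second = 86 400
    LOWEST_TIER_QUOTA = 400_000    # messages per day, as stated above

    print(MESSAGES_PER_DAY, "messages/day; fits in the lowest tier:",
          MESSAGES_PER_DAY <= LOWEST_TIER_QUOTA)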

Cloud IoT Core

The Cloud IoT Core price model is based on the monthly data volume sent to the cloud. They charge per MB and the price decreases in incremental steps with increasing volumes. The volumes and prices can be seen in Figure 3.3. We can also see that there is a minimum charge of 1024 bytes. This means that any message to the cloud which is lower than 1024 bytes will be charged as if it were 1024 bytes. Because of this, the previous calculation of how many bytes are sent per month instead becomes:

2 592 000 messages × 1024 bytes = 2 654 208 000 bytes = 2654.208 megabytes (MB)

The general pricing formula for Cloud IoT Core is:

price = data volume × price per MB

which, if the same amount of data is transmitted in each message, becomes:

price = (monthly data volume - 250 MB) × price per MB

Here the free 250 MB per month is taken into account. From this formula it is then easy to plug in our 2654.208 Megabytes per month to get a price estimation.
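The formula can be written as a short function. The per-MB rate itself comes from the price table in Figure 3.3 and is passed in as a parameter here; the 0.0045 USD/MB used in the example call is only an assumed illustration, not a figure taken from the table.

    # Sketch of the Cloud IoT Core pricing formula used above.
    FREE_MB = 250

    def monthly_price(volume_mb, price_per_mb):
        # price = (monthly data volume - 250 MB free allowance) * price per MB
        return max(volume_mb - FREE_MB, 0.0) * price_per_mb

    # Our use case: 2 592 000 messages, each charged at the 1024-byte minimum.
    volume_mb = 2_592_000 * 1024 / 1_000_000   # 2654.208 MB
    print(monthly_price(volume_mb, price_per_mb=0.0045))  # illustrative rate only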

With our case being relatively small, we were also interested in seeing how this would scale. To do this we created a script in Matlab. The script essentially does the same calculation as above but with the addition of batched messages. What this means is that we fit several telemetry transmissions from the sensors into one message to optimise cost, see Figure 3.2. With our use case of 8 sensors we for example get:

(1024 bytes × 8 bits per byte - 342 bits) / 512 bits = 15.33203125 transmissions

which means that we can cut our number of messages, and therefore also our monthly MB, by a factor of 15. The same method can be applied to all the cloud models, and the calculations for the other two are covered in Section 4.3.
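The batching calculation can be restated in a few lines of Python (our Matlab script is not included here; the snippet below only restates the single formula above, not the full scaling script).

    # How many 512-bit sensor transmissions fit into one 1024-byte message,
    # given a single 342-bit MQTT overhead per message.
    import math

    MESSAGE_LIMIT_BITS = 1024 * 8   # minimum charged message size, in bits
    MQTT_OVERHEAD_BITS = 342
    PAYLOAD_BITS = 8 * 4 * 16       # 8 sensors * 4 values * 16 bits = 512 bits

    per_message = (MESSAGE_LIMIT_BITS - MQTT_OVERHEAD_BITS) / PAYLOAD_BITS
    print(per_message)              # 15.33203125
    print(math.floor(per_message))  # only whole transmissions fit: a factor of 15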


Figure 3.2: Illustrates how we can decrease the number of messages sent to the cloud by batching our transmissions up to the maximum message size.

Figure 3.3: The price table for Cloud IoT Core provided by Google, last updated 2021-05-07 [24]. The price per MB decreases with increased data volumes, and the minimum charge when sending data is 1024 bytes.


4 Results

In this chapter the results from the interviews are presented as well as the results from the price comparison.

4.1 Interview Results

Nine interviews were conducted in total, with a wide range of different companies. The following section is structured as follows: a short introduction of each company along with the main relevant points from the interviews will be presented. The section will then end with a presentation of common themes that we have derived from these interviews.

For the reader's convenience, the research questions the interviews attempt to answer are repeated here:

• What factors are relevant for a factory interested in implementing IoT with cloud?
• What are good metrics to measure when evaluating the factors mentioned above?

SKF

SKF, or Svenska kullagerfabriken, is one of Sweden's largest companies and one of the world's largest bearing manufacturers [32]. SKF also constructs and develops gaskets and lubrication systems [23]. In this interview we talked to two persons from the company who together have more than 50 years of experience in the field.

When asked about their adoption of cloud technologies, the two employees told us that SKF has been talking about implementing some kind of hybrid variant of a cloud solution. This hybrid would use both the cloud and some type of edge or fog solution.

"Men nu senaste året, ett och ett halvt året kan man väl säga, så har man ju börjat prata om en slags hybridvariant. Och nu pratar jag inte om hybridvariant som att du har olika moln eller så. Utan vi pratar om en slags hybridvariant där vi kör, förlänger molnet ner till edgen och vise versa."


English translation: "But now the last year, one and a half years, you could say, you have started talking about a kind of hybrid variant. And now I’m not talking about hybrid variant like you have different clouds or so. But we’re talking about a kind of hybrid variant where we run, the cloud extends down to the edge and vice versa."

With cloud-based solutions come great challenges. One of these challenges is how one can access the data. Some of the manufacturers want to be able to collect data from their machines to know how they are running. This might even be the expected behaviour and something they demand. The problem is that if the manufacturers can connect to the machines, so could someone else.

"Hur accessar vi fabriken utifrån, för våra maskintillverkare kan behöva komma in o kolla, för att det är ju en förväntan att man ska kunna göra det, men om de kan göra det, då kan någon annan göra det. Det blir väldigt jobbigt efter ett tag" English translation: "How do we access the factory from the outside, because our machine manufacturers may need to come in and check, because it is an expecta-tion that you should be able to do it, but if they can do it, then someone else can do it. It will be very difficult after a while"

Another aspect of connecting machines to the cloud is that updates could be deployed from a central location. The staff do not need to physically go to the machine, which could save time and make the process of updating more efficient. However, you have to know what you are doing in order not to interfere with any currently running processes, which makes the centralization of these processes a lot harder.

"Vi måste kunna ha en förmåga att kunna göra saker centralt."

English translation: "We must be able to have the ability to do things centrally."

Data collection today can generate enormous amounts of data, too much for any individual human to process and make sense of. Before, humans could understand the data and know what everything they collected meant. Today you have to use computers and advanced analytical tools such as AI and machine learning algorithms to make sense of the data. If these tools are built correctly, the understanding of the system can grow.

"Förr så kunde vi ha människor som hade koll på det dära, men idag så är det så otroligt mycket data som kommer in, det är så otroligt mycket brus, så vi be-höver ha då statistiska modeller, om vi förenklar machine learning teknologin till statistiska modeller, som är kapabla till o ta in obegripligt många parametrar som vi kan tycka är fullständigt meningslösa men som skapar förståelse för det här systemet och därigenom kan hantera det hära."

English translation: "Before we could have people who kept track of that, but today there is so much data coming in, there is so much noise, so we need to have statistical models, if we simplify machine learning technology to statistical models, which are capable of taking in incomprehensibly many parameters that we may find completely meaningless but which create an understanding of this system and thereby can handle it."
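As a hedged illustration of this point, the sketch below shows how a statistical model can ingest many parameters at once and flag deviations that no individual human could spot in the noise. The data is simulated and the choice of model, an isolation forest from scikit-learn, is our own example rather than a description of SKF's tooling.

```python
# A minimal sketch, assuming simulated data: a statistical model ingests many
# parameters and flags samples that deviate from normal behaviour.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(seed=0)
normal = rng.normal(loc=0.0, scale=1.0, size=(5000, 40))   # 40 seemingly "meaningless" parameters
drifted = rng.normal(loc=3.0, scale=1.0, size=(50, 40))    # anomalous behaviour
readings = np.vstack([normal, drifted])

model = IsolationForest(contamination=0.01, random_state=0).fit(normal)
flags = model.predict(readings)        # +1 = looks normal, -1 = flagged as anomaly
print("flagged samples:", int((flags == -1).sum()))
```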

However, you cannot run everything in the cloud; some things have to be built into the logic of the machine. For example, if you are about to saw yourself in the leg, you do not want to wait for a response from the cloud before the machine stops. The need for fast response times applies to other processes too, since many processes cannot afford to wait for slow responses. They might need to make a decision within milliseconds, and that is something the cloud will probably never be capable of due to latency and unreliable connections.

"Om du har en motorsåg med någon IoT, så vill du ju inte att den ska skicka upp till molnet o vända för o ta reda på att du håller på o såga dig i benet"

English translation: "If you have a chainsaw with an IoT, you do not want it to send up to the cloud and turn around and find out that you are about to saw yourself in the leg"
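The chainsaw example can be made concrete with the following sketch, in which the stop decision is taken entirely on the device within a millisecond budget and the cloud is only notified afterwards. The threshold, budget and function names are hypothetical.

```python
# A minimal sketch, assuming hypothetical thresholds: safety-critical logic runs
# locally on the device, and the cloud is only informed after the fact.
import time

VIBRATION_LIMIT = 8.0          # hypothetical level indicating a dangerous situation
DECISION_BUDGET_MS = 5.0       # the machine cannot wait longer than this


def emergency_stop() -> None:
    print("blade stopped locally")


def notify_cloud(event: dict) -> None:
    # Non-critical: can be slow, retried or batched. Placeholder only.
    print("reporting event to cloud:", event)


def control_loop(samples) -> None:
    for value in samples:
        start = time.perf_counter()
        if value > VIBRATION_LIMIT:
            emergency_stop()                       # decided entirely on the device
            elapsed_ms = (time.perf_counter() - start) * 1000
            assert elapsed_ms < DECISION_BUDGET_MS
            notify_cloud({"type": "emergency_stop", "value": value})
            return


control_loop([1.2, 1.5, 9.3])
```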

SKF wants to work with more open technologies where they can change system or supplier easily. Technology vendors usually want you to commit for many years, but if the technology is outdated within a couple of years, you no longer want to use it. If you work with open technologies, you have more freedom to change system or supplier as often as you want.

"Vi vet ju att om vi blir inlåsta i en nån specifik typ av kontrollmekanism, o det visar sig att den är undermålig om fyra år. Då har vi ett väldigt, väldigt stort problem när vi ska migrera ut ur den."

English translation: "We know that if we are locked into a specific type of control mechanism, and it turns out that it is substandard in four years. Then we have a very, very big problem when we migrate out of it."

With large amounts of data come many possibilities. One of these is the ability to use the data to answer questions. When you ask questions of the data, you get a better understanding of it and can therefore make well-informed and intelligent decisions.

"Ett viktig värde för tillverkningen iallafall om vi pratar om de, det är ju när vi har mycket data, så kan vi också ställa frågor till datat som hjälper oss att fatta intelligenta o kloka beslut."

English translation: "An important value for manufacturing, at least if we talk about that, is that when we have a lot of data, we can also ask questions of the data that help us make intelligent and wise decisions."
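As a small, hypothetical illustration of "asking questions to the data": once measurements are stored in a queryable form, a decision can be grounded in a one-line question such as the one below. The column names and values are invented for illustration.

```python
# A minimal sketch, assuming invented measurement data stored in a queryable table.
import pandas as pd

measurements = pd.DataFrame({
    "machine": ["grinder-1", "grinder-1", "lathe-3", "lathe-3"],
    "vibration_rms": [0.8, 2.9, 0.5, 0.6],
    "temperature_c": [41, 67, 35, 36],
})

# "Which machines ran hot and vibrated heavily at the same time?"
suspects = measurements.query("vibration_rms > 2.0 and temperature_c > 60")
print(suspects["machine"].unique())
```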

One thing that the R&D department at SKF has observed is that sensors are not exactly the same. Each has its own individual fingerprint, which means they do not always behave in the same way. This can produce interesting effects if you, for example, apply some type of machine learning algorithm that learns from one sensor. If you then have to replace the sensor, the machine learning might not work as intended, since it has been learning from another sensor and the new sensor might measure things a little differently.

"Säg att jag har en IoT device där ute som fångar till exempel vibrationsdata, o den har gjort det ett tag. O så av någon rackarns anledning så måste jag byta sensorn som mäter vibration. Då är det nämligen så att, det fingeravtrycket av den gamla o den nya sensorn är olika. Då kan du få ganska intressanta effekter."

English translation: "Say I have an IoT device out there that captures vibration data, for example, and it has been doing so for a while. And for some reason I have to change the sensor that measures vibration. Then it is namely so that the fingerprints of the old and the new sensor are different. Then you can get quite interesting effects."
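The effect can be sketched as follows: an alarm threshold learned from one vibration sensor starts misfiring when a replacement sensor has a slightly different bias and gain, until the baseline is re-learned on the new sensor. All numbers are invented for illustration.

```python
# A minimal sketch, assuming invented numbers: a threshold learned from one sensor
# misbehaves on a replacement sensor with a different "fingerprint".
import numpy as np

rng = np.random.default_rng(seed=1)
old_sensor = rng.normal(loc=5.0, scale=0.5, size=10_000)      # training data
threshold = old_sensor.mean() + 3 * old_sensor.std()          # "learned" alarm level

# The replacement sensor measures the same machine, but with its own bias and gain:
new_sensor = 1.08 * rng.normal(loc=5.0, scale=0.5, size=10_000) + 0.4

print("false alarm rate, old sensor:", float((old_sensor > threshold).mean()))
print("false alarm rate, new sensor:", float((new_sensor > threshold).mean()))

# One common remedy is to re-learn the baseline (mean and spread) on the new sensor:
recalibrated = new_sensor.mean() + 3 * new_sensor.std()
print("false alarm rate after re-baselining:", float((new_sensor > recalibrated).mean()))
```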

Södra Cell

Södra Cell is a company that has three facilities in Sweden that mainly produce paper pulp for the manufacturing of paper and textile products. Besides paper pulp they produce green electricity, district heating and by-products such as green methanol. In this interview we talked to Södra Cell's Head of Technology Development, who onward will be referred to as HTD.

Södra Cell's usage of IoT fits into the earlier described connectivity bundle. They have sensors which are connected through the internet for data analysis in the cloud. HTD walked us through two different scenarios of this application.

The first scenario is when they outsource the entire service to a sensor supplier. The example given was that they use motor sensors supplied by ABB which are connected to ABB's infrastructure. The analysis of the data is therefore made at ABB, who produce reports on the analysis and inform Södra Cell if anything is wrong with the machines monitored by the sensors.

The second scenario is when the sensors are connected to Södra Cell's own servers. It may still be the sensor supplier's analysis software that is used, but Södra Cell does the analysis themselves. This is a common scenario when they replace previously manual measurements with sensors.

The reason behind these sensors is to get more data, with the ambition that this data can lead to information that will ultimately decrease the downtime of the factory. In HTD's own words:

"mycket liksom blir ju ett underhålls relaterat perspektiv o få in mer information för att få upp tillgängligheten. mycket handlar om att mer data i ett första läge och se till att lättare se trender och tendenser tidigt. Sen kan man ju tänka sig framöver om man får upp det här i Cloud miljö att man kan länka ihop det här med annan processdata och därifrån lära sig liksom och både förutse men även få en förståelse varför olika haverier inträffar."

English translation: "Much of it becomes a maintenance-related perspective, getting in more information to increase availability. Much is about getting more data at an early stage and making it easier to see trends and tendencies early on. Then you can imagine in the future, if you get this into a cloud environment, that you can link this with other process data and from there learn to both predict but also get an understanding of why different breakdowns occur."

HTD continued by explaining that these two scenarios do not cover all sensors in the factory, and are therefore not the only source of information; they rather represent their use cases for IoT sensors. They have, for example, vibration sensors on their so-called A-class machines, which are bigger machines with high priority. On these machines they have many sensors on a small surface transmitting measurements every second. These are connected to the internet for online analysis through conventional wired techniques. Their use of IoT was then made clear by this statement:

"Så IoT för oss det blir ju på kanske mer B-klassade då dem som är lite lägre prio på men att vi har mer spridda över hela fabriken och det är enskilda mätpunkter eller fåtal mätpunkter på respektive objekt. Det kan vara en pump eller nån fläkt eller dem typerna av utrustningar. Då mäter vi ju inte så ofta då kan vi mäta en gång, på vissa objekt en gång i timmen och vissa en gång om dygnet, per dygn eller något sånt där."

English translation: "So IoT for us, it will perhaps be on the more B-rated ones, those that have a little lower priority but that we have spread more across the factory, and there are individual measuring points or a few measuring points on each object. It could be a pump or a fan or those types of equipment. Then we do not measure as often, then we can measure once, on some objects once an hour and on some once a day, per day or something like that."
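One way to read this division of labour is as a simple monitoring configuration: wired, high-frequency sampling for A-class machines and battery-friendly, low-frequency IoT sampling for B-class equipment. The sketch below is our own illustration; the intervals and field names are assumptions, not Södra Cell's actual configuration.

```python
# A minimal sketch, assuming hypothetical intervals and field names, of the
# A-class vs. B-class monitoring strategy described in the interview.
MONITORING_PLAN = {
    "A-class": {   # high-priority machines
        "transport": "wired",
        "sample_interval_s": 1,          # one measurement per second
        "use": "process-critical and maintenance monitoring",
    },
    "B-class": {   # pumps, fans and other lower-priority equipment
        "transport": "4G IoT sensor",
        "sample_interval_s": 3600,       # hourly, or 86400 for once per day
        "use": "maintenance trending only",
    },
}


def samples_per_day(asset_class: str) -> int:
    """Number of measurements per day implied by the configured interval."""
    return 86_400 // MONITORING_PLAN[asset_class]["sample_interval_s"]


print(samples_per_day("A-class"), samples_per_day("B-class"))
```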

The major strength of IoT, its wireless nature, was then revealed to also introduce several limitations. The first limitation is battery life. Measuring every second, as they do on A-class machines, would not be feasible because the battery of the device would run dry too fast. As mentioned in the statement above, they have to settle for once-per-hour or once-per-day measurements on their B-class machines. A second limitation is the wireless range. Their shop floor is very large, and there are many concrete walls and machines that interfere with the wireless signals. For this reason Södra uses sensors based on 4G technology rather than technologies such as Bluetooth and WiFi, due to 4G's superior range. This somewhat forced choice then further feeds into the battery life issue, since 4G is less energy efficient than, for example, Bluetooth. A third limitation is the availability of the IoT sensors. When measuring their availability, they found that 95% of the transmissions are successful, which gives a failure rate of 5%. This limits the IoT sensors to maintenance monitoring, since a loss of 5% of the data would not be acceptable for process-critical measurements. The fourth and final limitation that was mentioned is the price and supply of sensors. According to HTD, most of the available sensors use Bluetooth or WiFi, while the supply of 4G sensors is very limited. This is then mirrored in the pricing due to supply and demand.

"Jag menar oftast pratas det om i allmänna ordalag att en sensor kommer kosta 50 dollar men dem ligger ju på snara 500 dollar i dagsläget, det är ju 5000 spänn för en sensor."

English translation: "I mean, it is often talked about in general terms that a sensor will cost 50 dollars, but they are at almost 500 dollars at the moment, it is 5000 bucks for a sensor."

Due to the number of sensors required in their factory, this lack of supply and high price is a big limitation to IoT usage. But thanks to the strengths of wireless technology, they still have use cases for these sensors where the limitations are manageable.
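A back-of-the-envelope sketch of two of these limitations is given below: the expected number of lost transmissions per year at a 95% success rate, and how the reporting interval stretches a fixed battery budget. The energy figures are illustrative assumptions, not values from Södra Cell.

```python
# A rough sketch, assuming a 95% transmission success rate and hypothetical
# energy figures, of the availability and battery-life limitations above.
SUCCESS_RATE = 0.95

for interval_s, label in [(3600, "hourly"), (86_400, "daily")]:
    attempts_per_year = 365 * 86_400 // interval_s
    lost = attempts_per_year * (1 - SUCCESS_RATE)
    print(f"{label}: ~{attempts_per_year} transmissions/year, ~{lost:.0f} expected losses")

# Hypothetical energy budget: a 2.4 Wh cell and a 4G report costing 0.5 mWh.
BATTERY_WH = 2.4
ENERGY_PER_REPORT_WH = 0.0005
reports_total = BATTERY_WH / ENERGY_PER_REPORT_WH
print("hourly reporting lasts ~%.1f years" % (reports_total / (24 * 365)))
print("per-second reporting lasts ~%.1f hours" % (reports_total / 3600))
```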

Toyota Material Handling

Toyota Material Handling manufactures and distributes different types of forklifts. In this interview we talked to the product manager of 'I-site', who onward will be referred to as PM. I-site is a solution Toyota Material Handling has developed that allows their forklift customers to monitor their forklifts. Through a web interface, the customer can see how much a forklift is used, how the drivers behave, how the battery charges, etc.

The monitored data is gathered by a data handling unit installed on the trucks, which collects data through sensors and transmits it over GPRS, 3G and 4G. The value creation for both the customer and the Toyota factory comes from the information extracted from the raw data, as PM said:

"Väldigt mycket handlar om att man med hjälp av den data vi samlar in kan jobba mer effektivt."

English translation: "It is very much about being able to work more efficiently with the help of the data we collect."

When this statement was elaborated on, the proposed increase in productivity for the forklift customer was said to come mainly from a proven decreased rate of accidents and improved battery management. The decreased rate of accidents is made possible by two things. The first is alerts and notifications being sent to managers when something serious happens. This enables a more consistent follow-up of accidents, since sensors decide when an accident is serious enough, rather than an employee who might try to cover it up or not think it is too serious. The goal of this is not to find a scapegoat but rather to aid and educate employees, making them better drivers. The second is the ability to control who is able to access the machines. Through I-Site one can create a list of drivers and assign which truck or which truck types the respective drivers are allowed to drive. This information is then transmitted to the data handling unit on the trucks, which will automatically restrict access for unauthorized drivers. This feature is useful due to the large number of accidents caused by such drivers, in PM's own words:

"Det här är väldigt mycket säkerhetsfrämjande, om man tittar mycket på oly-cksstatistik så är det väldigt mycket som beror på att det är outbildade förare som kör fel truck så att säga."

English translation: "This is very much safety-promoting; if you look a lot at accident statistics, it is very much due to the fact that it is untrained drivers who drive the wrong truck, so to speak."
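A minimal sketch of this access-control idea is given below: a list of drivers and the truck types they are certified for is checked before a truck can be unlocked. The identifiers and truck types are hypothetical and do not reflect I-Site's actual data model.

```python
# A minimal sketch, assuming hypothetical driver IDs and truck types, of checking
# a driver list on the truck's data handling unit before unlocking the truck.
ALLOWED_TRUCK_TYPES = {
    "driver-001": {"reach truck", "pallet truck"},
    "driver-002": {"pallet truck"},
}


def may_operate(driver_id: str, truck_type: str) -> bool:
    """Return True only if the driver is on the list for this truck type."""
    return truck_type in ALLOWED_TRUCK_TYPES.get(driver_id, set())


print(may_operate("driver-002", "reach truck"))   # False -> truck stays locked
print(may_operate("driver-001", "reach truck"))   # True  -> truck unlocks
```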

The improved battery management comes from the ability to monitor battery behaviour, which helps customers take action early when non-optimal charging behaviour is detected. This is valuable because non-optimal charging reduces the lifetime of the battery, which essentially means reducing the lifetime of the forklift.

"Batterierna är ju en relativt stor del av kostnaden för våra relativt små truckar. De här minsta truckarna är ju i stort sett bara en stor klump med ett batteri i och två gafflar."

English translation: "The batteries are a relatively large part of the cost of our relatively small trucks. These smallest trucks are basically just a big lump with a battery in it and two forks."

The I-site solution does not only benefit the customer. The data gathered also benefits Toyota and increases their productivity. Approximately 60% of the trucks they distribute are rental trucks. Being able to monitor the condition of the trucks allows Toyota to know when these trucks will need service. Before solutions like this, an operator had to physically go to the trucks and do meter readings, etc., to get a grasp of their condition.

When asked about challenges and possible improvements of the product, one limitation mentioned was the cost of using 3G/4G networks. Because of this cost, the data handling unit on the trucks only transmits data once every hour. PM mentioned that some functionalities customers desire would need real-time communication, which with the current networking solution would be too expensive. According to PM, 5G might solve the cost issue and enable real-time functionalities.
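To illustrate why batching matters, the rough calculation below compares the monthly cellular data cost of hourly reports with that of near real-time reporting for a hypothetical fleet. The payload size, price per megabyte and fleet size are our own assumptions, not Toyota's figures.

```python
# A rough sketch, assuming a hypothetical payload size, data price and fleet size,
# of why hourly batching is cheaper than near real-time reporting over 3G/4G.
PAYLOAD_BYTES = 2_000            # one status report, assumed
COST_PER_MB = 0.10               # assumed cellular data cost in EUR per MB


def monthly_cost(reports_per_hour: float, fleet_size: int = 1000) -> float:
    """Estimated cellular cost per month for the whole fleet."""
    mb_per_truck = reports_per_hour * 24 * 30 * PAYLOAD_BYTES / 1e6
    return mb_per_truck * COST_PER_MB * fleet_size


print("hourly batching:  %.0f EUR/month" % monthly_cost(1))
print("every 10 seconds: %.0f EUR/month" % monthly_cost(360))
```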

Holmen

Holmen is a Swedish company within the forest industry. The person we talked to was the manager of the UX and Innovation department at Holmen, who onwards will be called UXI.

UXI talked about having known about Industry 4.0 for some time now, but the lack of knowledge in the rest of the company has meant that more or less nothing has happened yet. The management does not know about the possibilities, but if they made the decision to start working on Industry 4.0, the company as a whole could work towards it. As it currently is, the UX and Innovation department is too small to make any real impact.
