
Design and development of intelligent apartments


Academic year: 2021


Design and development of intelligent apartments

by

Author: Zahraa Shahid

Advisor: Maurizio Di-Rocco

Degree of Master of Arts/Science in Robotics and Intelligent Systems, 120 credits

School of Science and Technology

Centre for Applied Autonomous Sensor Systems (AASS)


Abstract

Developments in the fields of Ambient Intelligent Environments (AmIE) and Ubiquitous Computing have provided the opportunity to develop assistive technologies that help individuals who need medical care on a daily basis; the integration of robots, smart actuators and sensors can ease the execution of several domestic tasks. Since building a real physical IE is costly, simulation opens the door to various possibilities to design, test and explore different scenarios of virtual IEs. This thesis project is aimed at developing an agile prototyping toolkit that provides the means for users to simulate smart environments for the robotics, AmIE and Ubiquitous Computing fields. The toolkit leverages well-known software middlewares: the Robot Operating System (ROS) and Gazebo. The former is a middleware providing access to several state-of-the-art techniques used in robotics as well as to a variety of hardware platforms. The latter is a simulator that reproduces 3D environments as well as robotic platforms. Besides reproducing realistic scenarios, the proposed framework easily scales beyond a single apartment: in particular, a double-floor building is taken into account in this thesis. Performance results for this scenario and for the integration with existing middlewares are also provided.


Acknowledgements

I would like to thank my supervisor Maurizio Di-Rocco for his guidance and advice throughout this thesis, and I take this opportunity to express my sincere gratitude to my family for their support and encouragement.


Contents

Abstract i

Acknowledgements ii

List of Figures vi

1 Introduction, motivation and objectives 1

1.1 Introduction. . . 1

1.1.1 Motivation for using simulation . . . 2

1.2 Desired features. . . 3

1.3 Objectives . . . 3

1.4 Related work . . . 5

1.5 Thesis structure: overview of the chapters . . . 8

2 Building intelligent environments 10

2.1 Introduction of simulation platform . . . 10

2.2 Architecture of Gazebo simulator . . . 11

2.2.1 Open dynamics engine (ODE) . . . 13

2.2.2 Visualization . . . 13

2.3 Building physical models . . . 14

2.3.1 Simulation description format (SDF) . . . 14

2.3.2 Links . . . 14

2.3.3 Joints and their safety limits . . . 15

2.4 Proportional-Integral-Derivative (PID) controller . . . 16

2.5 Interface with a robotic middleware . . . 18

2.5.1 ROS: robot operating system. . . 18

2.5.2 Plugin interface . . . 18

2.5.3 Tcp communication or Gazebo communication . . . 19

2.6 Gazebo-ROS packages . . . 19

3 The developed Angen simulator 20

3.1 System and package overview . . . 20

3.2 Design of the apartment environment. . . 22

3.2.1 Implementation. . . 23

3.3 Actuators . . . 23

3.3.1 Design of the door model . . . 23


3.3.1.1 Implementation of the door model . . . 24

3.3.1.2 How the simulated door works . . . 26

3.3.2 Design of the elevator model . . . 27

3.3.2.1 Implementation of the elevator model . . . 28

3.3.2.2 How the simulated elevator works . . . 30

3.3.3 Design of the light model . . . 31

3.3.3.1 Implementation of the light model . . . 31

3.3.3.2 How the simulated light works . . . 32

3.4 Turtlebot-Robot . . . 32

3.4.1 Implementation of the Turtlebot-robot . . . 33

3.4.2 How the simulated Turtlebot robot works . . . 34

3.5 Sensors . . . 35

3.5.1 Camera sensor . . . 35

3.5.2 Luminosity sensor . . . 35

3.5.2.1 Design of the simulated luminosity sensor . . . 36

3.5.2.2 Implementation of the simulated luminosity sensor . . . 37

3.5.3 Passive infrared sensor (PIR) . . . 39

3.5.3.1 Design of the simulated PIR sensor . . . 40

3.5.3.2 Implementation of the simulated PIR sensor . . . 42

3.5.4 Optical switch sensor . . . 45

3.5.4.1 Design of the simulated optical switch sensor . . . 45

3.5.4.2 Implementation of the simulated optical switch sensor . . . 46

3.5.5 Pressure or contact sensor . . . 48

3.5.5.1 Design of the simulated contact sensor. . . 48

3.5.5.2 Implementation of the simulated contact sensor . . . 49

4 Simulation results 52

4.1 Introduction . . . 52

4.1.1 Angen apartment scenario . . . 52

4.2 Components Communication . . . 54

4.3 Tests of Angen simulator and results . . . 57

4.3.1 Performance Results Analysis . . . 64

4.3.2 Desired features after the tests . . . 66

4.4 Simulation of a robotic task . . . 66

5 Discussion and conclusion, limitations and contributions 70

5.1 Discussion . . . 70

5.2 Conclusion . . . 73

A Appendix Guide to install ROS and Gazebo simulator 75

A.1 Starting Gazebo simulation: . . . 76

A.1.1 Spawn and delete sensors after starting Gazebo: . . . 77

B Building the simulation environment: 78

B.1 Adding constructed models to model database: . . . 78


B.1.1 Create an SDF elevator model . . . 78

B.1.2 Create a model of a two-floor apartment with SDF . . . 79

B.1.3 Create doors with SDF: . . . 80

C Writing actuators’ plug-ins 82

C.0.4 Elevator and lights ROS enabled model plugin . . . 82

C.0.5 Elevator controller: . . . 83

C.0.6 Light controller: . . . 83

C.0.7 Door ROS world enabled plugin . . . 84

C.0.8 Turtlebot ROS world enabled plugin . . . 85

C.1 Launch File.world: . . . 86

D Sensors 87

D.0.1 Contacts sensor . . . 87

D.0.2 Passive Infrared (PIR) sensor:. . . 87

D.0.3 Luminosity sensor: . . . 88

D.0.4 Optical switch . . . 88

D.0.5 Camera sensor . . . 88


List of Figures

1.1 The constructed intelligent environment is composed of two apartments, several actuators and sensors . . . 4

2.1 Gazebo distributed architecture, broken down from the top into libraries for physics simulation, rendering, user interface, communication, and sensor generation. Three different processes are provided: physics simulation, sensor generation, the GUI, and a master for coordination . . . 12

2.2 Two joints of a robot model connecting different parts of rigid bodies. . . 16

2.3 A block diagram of a PID controller in a feedback loop [81] . . . 17

3.1 Angen package hierarchy in terms of folders . . . 21

3.2 Door model of the kitchen room when it is opened. . . 25

3.3 Door model of the livingroom when it is closed. . . 26

3.4 Complete PID [83] . . . 27

3.5 PID with P term [83]. . . 29

3.6 Distance with P and I term [83] . . . 29

3.7 Elevator model with a door on one side and open on the other side . . . 29

3.8 Elevator Model . . . 30

3.9 The green rays represent a light point rendered in the scene; the light sensor model is represented by a white geometrical model . . . 32

3.10 Turtlebot robot is moving in the livingroom. . . 33

3.11 The figure describes the communication methodology between ROS and the actuators. . . 34

3.12 A camera type of sensor embedded in a light sensor model which is located high up in each room to detect the luminance of each wall. . . 38

3.13 PIR sensor at the entrance of the ground floor apartment with its visualized blue rays in the environment, detecting the motion of dynamic objects within a 6 m range . . . 44

3.14 PIR is moving up and down in the livingroom and posed up in one corner of the room. . . 44

3.15 An optical switch model posed on top of a door as a laser type of sensor, represented by two objects distanced at 0.5 m with its visualized blue rays; its range is cut off when the attached green strip of the door comes between the emitter and the detector, detecting the event . . . 47

3.16 An optical switch model posed on top of a door as a laser type of sensor, represented by two white strips distanced at 0.5 m with its visualized blue rays; its range is a full ray when the attached green card of the door is moved away, detecting the door as opened . . . 47


4.1 This figure depicts how Gazebo organizes the models and plugins to be placed into the Angen world . . . 54

4.2 This figure describes the relationships between the developed components in the Angen simulator and the proposed middlewares ROS and the Gazebo ROS Packages . . . 55

4.3 CPU% of added PIR objects . . . 60

4.4 CPU% of deploying PIR sensors during simulation . . . 60

4.5 CPU% of added luminosity sensors . . . 62

4.6 CPU% of deploying luminosity sensor objects . . . 62

4.7 CPU% of added optical switch objects . . . 63

4.8 CPU% of deploying optical switch objects . . . 63

4.9 CPU% of deploying contact sensors . . . 64

4.10 The Angen world . . . 67

4.11 The elevator’s door as closed . . . 67

4.12 The elevator’s door as opened . . . 67

4.13 This figure describes the PIR sensors’ performance in terms of responsiveness; the robot’s position in each room is shown along its trajectory in the first floor apartment, showing the delay between detected values (red and green points) . . . 68

4.14 This figure describes the performance of the PIR and optical switch sensors in terms of detection; the robot’s position in each room is shown along its trajectory in the second floor apartment, showing the delay between detected values (yellow points) . . . 69


Chapter 1

Introduction, motivation and objectives

1.1 Introduction

Ambient Intelligent Environments (AmIE) is a term that was coined by Eli Zelkha, Brian Epstein and Simon Birrell in 1998 [1]. The first concept, ambient, refers to an environment with certain requirements such as ubiquitousness [2], while the term Intelligent Environment (IE) reflects the idea that modeling IEs via agents should be consistent with modeling and simulating the other entities in the environment [3]. Applications of AmIEs are found in a number of settings, such as smart houses, offices and ambient health care [4]. Another term used frequently in the field of intelligent environments in recent years is the Ubiquitous Computing environment, and it is necessary to maintain a distinction between the two terms. Mark Weiser (1993) defined Ubiquitous Computing as computers scattered throughout the physical environment so that they can be used anywhere, while remaining effectively invisible to users. Sakamura (1987), on the other hand, defined the term as information and computing resources that can be accessed by users at any time and anywhere, such as smart homes and offices, in order to provide the best services to their inhabitants [6].

As can be seen from the definitions of both terms, AmIE and Ubiquitous Computing environments are distinct in that the former considers social and human factors, while the latter is more technology focused and its applications require advanced prototyping [7,8]. Our aim is to develop a distributed heterogeneous system that considers the social and human factors besides the technology. It can therefore be said that the project developed in this thesis contributes to the AmIE research field. Intelligent Environments


(IE) are of particular importance for people who are in need of medical care. Integrating technological solutions to aid this group of people, carrying out the needed tasks in the environment intelligently via controllers, and sensing the condition of the environment by means of sensors were part of the project’s goals. In the IE field, multiple middlewares have been used to develop IEs and facilitate the communication among the different physical objects in the environment. After reviewing several projects conducted within the AmIE and Ubiquitous Computing fields, it was found that a considerable number of related projects have favored the ROS (Robot Operating System)/Gazebo middleware to implement objectives similar to ours. In this project, the developments are conducted using the latest standalone version of Gazebo, 1.9.1, for simulation and visualization purposes, and the Gazebo ROS Packages as a middleware for inter-communication, controlling all of the developed components via plugins to construct the final product, the Angen simulator.

1.1.1 Motivation for using simulation

The development of autonomous mobile robotics requires closer attention to experimental methodologies; a number of projects [9,10] and workshop series [11,12] have been created for this purpose. The idea of simulating experiments has therefore grown, and simulations are now used as an alternative to experiments on real robots [13,14], in particular as alternatives to real experiments that are impractical to implement in reality [15]. Experiments conducted in simulation can avoid the high costs of deriving new algorithms and models [21], avoid injuries, and reduce the time needed for running experiments as well as the effort of writing software, debugging it, and displaying the results [88], compared to scenarios conducted on real robots. Besides, experiments in the real physical world would also require the designers to have appropriate knowledge of composing a real physical environment [88]. Repeatability is another significant reason to use simulations in the fields of mobile robotics [22] and Ubiquitous Computing, since reaching the same conclusions when repeating the same experiment several times during testing is necessary [23,24]. From my own experience of working on a previous project in a real physical environment, hardware shortcomings were the most difficult constraints of working in a real-world system. On the other hand, simulations have been criticized by many scholars for not producing realistic results that can be transferred to the real physical world; this criticism is justified when the performance of the simulation is not validated against the performance of experiments on real robots [16].


Simulation was defined by [17] as the reproduction of a process by another process in which the state of the system or object changes over time; it has also been defined as a medium between theoretical and empirical methods [19], or as a method of carrying out experiments on computers [20]. In the field of robotics, simulations have opened the door to various possibilities to design, test and explore different scenarios, with, of course, the possibility of success or failure of the system. The system generated in our simulation is a mobile robot which interacts with the environment; the uncertainties of its actions can be dealt with by the physics engine in the simulation models [88].

1.2 Desired features

The desired features of smart environments in general, and of our developed Angen simulator in particular, require the following components in each room: two movement detector sensors to detect the movement of people and mobile robots around the scene; one luminosity sensor to detect the luminance of the room; a contact sensor for each chair, table and elevator in the environment to detect activities that take place when inhabitants sit on a chair or use a table or the elevator; and one optical switch sensor for each door to monitor whether the door is closed or opened. Detecting the luminance of a room, for instance, makes it feasible to minimize energy consumption.
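As a concrete illustration, the per-room sensor complement described above can be written down as a small inventory structure. This is only a sketch: the room names and the exact number of chairs and tables (and hence contact sensors) per room are made-up assumptions, while the sensor types follow the list above.

```python
# Hypothetical per-room sensor inventory for the simulated apartment.
# Sensor types follow the desired-features list; room names and the
# number of chairs/tables per room are illustrative assumptions.
ROOM_SENSORS = {
    "livingroom": {"pir": 2, "luminosity": 1, "optical_switch": 1, "contact": 3},
    "kitchen":    {"pir": 2, "luminosity": 1, "optical_switch": 1, "contact": 2},
}

def total_sensors(rooms):
    """Total number of sensor instances that would be spawned in the world."""
    return sum(sum(counts.values()) for counts in rooms.values())
```

With the two example rooms above, `total_sensors(ROOM_SENSORS)` gives the total number of sensor instances to deploy, which is the quantity the performance tests in chapter 4 vary.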

1.3 Objectives

According to Gazebo’s founders, Nathan Koenig and Andrew Howard [52], planned features for future versions of Gazebo include programmable objects (doors, elevators, lights) to extend the realms in which Gazebo can be applied. As we write this thesis, these features are still not implemented, which has motivated us to develop those actuators and devices in this thesis project.

The environment considered here is modeled after the Angen facility in Orebro for developing housing and assistive technologies for elderly care. The work is conducted within the Centre for Applied Autonomous Sensor Systems (AASS) at Orebro University. The Angen facility is equipped with a number of different sensors, such as contact sensors, optical switches and Radio-Frequency Identification (RFID) tags to track and localize people, as well as a number of actuators and devices. The purpose of this thesis is to develop an agile prototyping toolkit that provides the means for users to simulate smart environments for the robotics, AmIE and Ubiquitous Computing fields, to obtain a simulator that is a fairly realistic rendition of the actual Angen facility, and to simulate some of the real-world tasks.


Specifications:

• develop an Angen environment corresponding to the real Angen facility,

• design and simulate models like doors, elevators and lights,

• simulate real-world tasks such as switching lights on or off, closing or opening doors, and moving the elevator up or down using the Gazebo ROS middleware.

Also, in this thesis we develop a set of tools that allows users to create sensors such as passive infrared motion detectors, optical switches, contact sensors and luminosity sensors, which can be used in any smart environment in general, and in our developed Angen simulated world in particular. We use the ROS middleware to interface with the sensors’ data, to retrieve information and react accordingly. The original structure of the tools implementing the sensors is already available in Gazebo, but we made some modifications intended to simplify them and enhance their performance. In this thesis, we propose the multi-robot Gazebo simulator 1.9.1 for the simulation and visualization of a double-floor apartment, actuators, devices and sensors, and we propose the Gazebo ROS Packages and ROS Groovy as middlewares for communication and for developing the actuation and sensing modules.

Figure 1.1: The constructed intelligent environment is composed of two apartments, several actuators and sensors

In this thesis we also evaluate the performance of the developed simulator, testing the computational load imposed by the Gazebo server and client processes when deploying


sensors. The research questions set in this thesis are thus to determine whether it is feasible to build an intelligent environment simulator with the proposed ROS/Gazebo middlewares, and to what extent the developed framework can be used in research activities given the computational constraints imposed by those middlewares.
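As a rough illustration of such a computational-load measurement, the CPU utilisation of a process can be estimated by comparing the CPU time it consumed against elapsed wall-clock time. The helper below is only a sketch of the measurement idea, not the tooling actually used in the thesis (which monitors the separate Gazebo server and client processes externally); it measures the current Python process only.

```python
import time

def cpu_percent(workload):
    """Estimate CPU utilisation (%) of this process while `workload` runs,
    as consumed CPU time divided by elapsed wall-clock time."""
    wall0, cpu0 = time.monotonic(), time.process_time()
    workload()
    wall = time.monotonic() - wall0
    cpu = time.process_time() - cpu0
    return 100.0 * cpu / wall if wall > 0 else 0.0

busy = cpu_percent(lambda: sum(i * i for i in range(500_000)))  # CPU-bound work
idle = cpu_percent(lambda: time.sleep(0.1))                     # mostly waiting
```

A CPU-bound workload reports a high utilisation while a sleeping one stays near 0%, which is the same contrast the chapter 4 experiments look for between a loaded simulation server and a lightly loaded client.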

1.4 Related work

Developing simulations of IEs improves the opportunities to efficiently design, develop and implement IE experiments in the real physical world. Previous studies (Kranz et al., 2007, 2010) and (L. Roalter, M. Kranz, and A. Möller, 2010) were found to focus more on simulating cognitive objects using ROS/Gazebo as a middleware, and were extended to include kitchen and office spaces, while a dearth of relevant work was found in the context of developing virtual intelligent environments using a stand-alone Gazebo simulator and its meta-package for ROS integration (gazebo_ros_pkgs) for communication. In this thesis, our focus during the literature study is on simulations of IEs that use ROS/Gazebo as middleware, but some approaches using different middlewares for simulation are also included.

In the Master thesis ”Embedded sensing and actuation integration von eingebetteten Systemen in Alltagsgegenstände” [28], conducted within the activity recognition field, the researchers integrated actuators and sensors into everyday objects, for example supplying a cup with sensors, LEDs, and an RFID tag in its base. The cup was used as an object for activity recognition purposes, such as publishing its location, what it contains, what its temperature is, and how frequently it is used. Other cognitive objects, such as a fan and a plant, were produced by Kranz et al. [29]. The created objects retrieve data from the environment and utilize it to provide services to users through social networking such as Twitter, using a PCA (perception-cognition-action) loop controller approach. One of the most established live-in apartments built as an intelligent environment is PlaceLab, founded in the USA (2004), which is primarily designed for Ubiquitous Computing research. Sensors were integrated into every cabinet in every room to sense and record the activities around the place and to use the data for further research [30]. The AwareKitchen project presented by Kranz et al. and Beetz et al. [31,32] included many smart objects, such as doors with appended RFID tags and attached magnetic sensors to detect when the door is opened and closed. They also developed a smart knife and cutting board equipped with sensors to measure forces and torques and retrieve data when interacting with food; the prototype employed a Player/Stage/Gazebo platform [33].


A project related to danger prevention and home safety was based on an RFID system releasing warnings to caregivers in elderly home care; the project was developed with fuzzy rules [34]. Another project targeted individuals with special needs who cannot use their hands to move a computer mouse: the researchers developed a camera-based mouse called Camera Canvas, which can be controlled by the user moving their head in front of the camera [35].

The Distributed Embedded Intelligence Room (DEIR) is a 3D intelligent environment developed by [37], who used games and multi-agent system (MAS) technologies as an approach to implement an IE using 3ds Max [36] and Ogre3D. According to the project’s developers, with the MAS technique experiments can be carried out without constructing a physical environment. DEIR was embedded with several sensors and actuators connected through RS-485, LonWorks, IP and ZigBee networks, which are capable of performing real-time activities from a distance.

An inhabited office IE was developed by [38] and designed as a static entity embedded with different sensors, e.g. PIR sensors for movement detection using Arduino boards, contact sensors on the doors using Phidgets Interface Kit boards (products for USB sensing and control), light and temperature sensors over ZigBee, traffic and weather information via RSS news feeds, and moisture, temperature and light sensors for office plants via Twitter. Within the same project, a Cognitive Cup was developed and enhanced with an accelerometer to capture its orientation by infrared LEDs and to observe the fluid level and its temperature. It established a communication channel via RFID and ZigBee to grasp the cup when it is full and bring a new cup when the coffee is cold [38].

A Load Wooden Table object was also developed as part of the project to provide information about objects placed on it (e.g. their weight, position, etc.) [39]. The project extended its applications to mobile interaction with the MagicPhone as a remote control for other devices in the environment (e.g. changing TV channels) after detecting the direction in which the phone is pointing [40].

In the ”Robots, Objects, Humans: Towards Seamless Interaction in Intelligent Environments” project, the developers implemented the MagicPhone scenario in which a person holds the mobile phone and interacts, using a PC keyboard and a Microsoft Kinect camera to detect objects by viewing their images [41]. Furthermore, they employed GPS or the WLAN Service Set Identifier (SSID) to extend the project’s functionality outside the cognitive office, detecting persons approaching the campus and sending notifications to the MagicPhone device; upon that, the office’s PC shows a calendar with an email application launched via Wake-on-LAN to notify the user once the person reaches the office. A traffic service was also implemented by searching for the best route


to go home and suggesting the right time to do so to the user; a Personal Robot 2 (PR2) was also used in the office, controlled by ROS [41].

In [42] the developers created several types of sensors: a light sensor using a camera and calculating the image intensity, a proximity sensor based on a laser scanner, and a battery status sensor via a battery model. The lights were dynamically emitted and displayed through an implemented dynamic texture, with ROS/Gazebo as middleware.

An IE of a simulated apartment with rooms and furniture was developed by [43]. The environment was aimed at using an autonomous mobile robot, ARTOS, to search for and look after humans, especially elderly people; when a situation required help, ARTOS provided a platform mediating between the persons and the caregivers. An animated human character was also simulated to make things appear more realistic, using the SimVis3D simulator to simulate and visualize the apartment with different sensors, actuators and the human character.

For human-robot interaction purposes, [45] built walls and furniture with different sensors, such as cameras, microphones, distance sensors, and RFID tags embedded in them; a humanoid robot with actuated head, arm and body and facial expressions was also developed to interact with the ROMAN robot. The sensor inputs were intended to be used by the humanoid robot for its control system, which is based on the Human Animation standard, and the simulation was executed in real time.

The Sim3 simulator was chosen for a Master thesis project by Erik Westholm [46] to simulate a human character that moves around the PEIS environment, which is embedded with sensors triggered when objects are used, for further activity recognition research.

An intelligent wall was developed by Subrt and Pechac [48] for indoor environments, equipped with an active frequency-selective surface, sensors, a cognitive engine and a cognitive wireless network. According to the results of the implemented project, the performance of the overall system improved by up to 80% when intelligent walls were utilized; the wall basically controls radio coverage, responds to the wireless system, observes the position of users, and publishes information to the cognitive engine to optimize decision-making.

Another toolkit [49] for developing IEs used ROS as a middleware and the Gazebo simulator to simulate real-world events triggered from a mobile phone. The toolkit used Sweet Home 3D, a CAD software, for modeling the objects. The authors created a room with dynamic objects such as doors, chairs, windows and lights, controllers to turn the lights on/off, and a smart phone cooperating with ROS. Those actuators were fixed elements in the simulated world, except for the light objects. Sensors


were also embedded in the simulated world, such as temperature sensors, motion detectors, light sensors, and contacts for windows and doors; a jQuery Mobile GUI was also used as a remote control for certain activities.

Berardina De Carolis and Giovanni Cozzolongo [50] developed a project within the IE field consisting of two phases, simulation and control, for implementing experiments by designers and a user interface enabling house inhabitants to interact easily with the devices in the environment, directly by utilizing voice commands and touch-screens, or indirectly by assigning tasks to a robot [50]. The first phase of the project employed 3D graphics and 3D Studio Max to export the models in VRML (Virtual Reality Modeling Language) for simulation. The control aspect took place by publishing ACL messages from an XML-formatted message describing the situation in the room (e.g. a stereo is playing music), to be received by a Java class that parses it and visualizes the message at the interface level. ”Homes for Robots: A Rapid Prototyping Toolkit for Robotics and Intelligent Environments” is a Master thesis conducted by [51] at the University of München to model different objects such as doors, walls, lights, windows, etc., using a CAD software called Sweet Home 3D, which supports several formats (OBJ, DAE, 3DS and LWS), and to export them to ROS/Gazebo systems.

1.5 Thesis structure: overview of the chapters

Chapters:

• Chapter 1: Introduction to the thesis topic, motivation, objectives, and related work in the field of intelligent environments.

• Chapter 2: Illustrates the simulation medium, the Gazebo architecture and other important concepts.

• Chapter 3: Introduction to the developed Angen simulator, its architecture, and details about the methods and implementation phases of the developed environment, actuators, devices and sensors.

• Chapter 4: Describes the simulation results related to the computational costs and the system’s performance throughout the process.

• Chapter 5: Discussion and conclusion, limitations and areas for further development and research.

Appendices: documentation of the project work, with details about the practical implementation phase of the developed framework.


• Appendix A Guide to Install ROS And Gazebo Simulator.

• Appendix B Building the Environment.

• Appendix C Writing Actuators Plug-ins.


Chapter 2

Building intelligent environments

In this chapter we introduce a number of general properties of the proposed middlewares: the Gazebo simulator and its architecture, and an overview of the robotic middleware used to interface our code and build the actuation and sensing modules of our developed Angen simulator. In the section on building physical models, we illustrate how we constructed the physical entities of the environment, actuators and devices, with all the involved elements and their properties, to construct the Angen world. In the PID controller section, we illustrate the controller that was used to develop the actuation module for the elevator and doors.
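Since a PID controller drives the door and elevator joints later on, a minimal discrete-time version is sketched below. This is an illustrative implementation only, not the thesis’s actual plugin code; the gains, the unit-mass elevator model, and the 3.0 m target height are made-up example values.

```python
class PID:
    """Minimal discrete PID controller: u = Kp*e + Ki*sum(e*dt) + Kd*de/dt."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


# Toy example: drive a unit-mass elevator car toward a 3.0 m target height.
pid = PID(kp=2.0, ki=0.1, kd=0.5)
height, velocity, dt = 0.0, 0.0, 0.01
for _ in range(2000):                       # 20 s of simulated time
    force = pid.update(3.0 - height, dt)    # error = setpoint - measurement
    velocity += force * dt                  # unit mass, gravity ignored
    height += velocity * dt
```

The proportional term reacts to the current error, the integral term removes steady-state offset, and the derivative term damps oscillation; in a Gazebo plugin the computed force would be applied to the corresponding joint on each physics iteration.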

2.1 Introduction of simulation platform

The Gazebo simulator was founded by Nathan Koenig and Andrew Howard in 2002 at the University of Southern California, and in 2009 ROS and the PR2 were integrated into Gazebo by Willow Garage [52]. It is a multi-robot simulator for 3-dimensional worlds, dedicated to outdoor environments, even though most of its users use the simulator for indoor environments [53]. Gazebo is capable of simulating a number of robots, sensors and actuators together with ROS. It provides a network to control the robots, also referred to as models, and maintains a straightforward API with the necessary hooks to connect with the robot controller programs. Simulators are based on different types of physics engines, for example NGD (2008), ODE (1994) and SD/FAST (1994); all of them provide libraries to simulate dynamic models in the environment. According to the simulator specifications on its official website, Gazebo provides sensor feedback and physical interactions between models by employing the open-source Object-Oriented Graphics Rendering Engine (OGRE) [54] and the Open Dynamics Engine (ODE) [55]. Also, with a


rigid body dynamics library [56] it enables users to simulate the physical behavior of objects [57], and it also supports the Bullet physics engine [58]. All the above-mentioned libraries are aimed at facilitating the interactions between models and their surrounding environment. Gazebo’s capabilities have grown rapidly: it supports designing and developing actuation and sensing modules with different interfaces (e.g. ROS, SolidWorks, and animated humans) to carry out user-instructed tasks on the models (robots), for example via actuators and devices, and to process information, for example via sensors. It supports the animation of models, which is recommended for models that are meant to be static. In addition to the ODE rigid body dynamics engine, a collision detection engine provides information about the shape of each body and the positions of the contacts between bodies, informing the user about colliding objects in the world. Gazebo is considered to have higher fidelity than Stage [59]. Using Gazebo requires the user to have a fair knowledge of the Linux (Ubuntu) operating system, C++, and ROS for integration purposes [60].

2.2

Architecture of Gazebo simulator

To emphasize Gazebo as an effective tool for simulating and visualizing indoor environments, it is worth listing the features that the Gazebo architecture provides for developers and users, features which have evolved over multiple releases. Gazebo supports dynamic simulation using different physics engines, such as ODE and Bullet, providing access to the physical and dynamical properties of each model (robot), and presents advanced 3D graphics with the support of OGRE for rendering environments and improving realism. The diversity of the supported sensors, such as lasers, 2D cameras, RFID, and contact sensors, contributes to the list of its powerful features [64]. It provides various robot models, such as the Turtlebot mobile robot which we used in our developed Angen simulator, the PR2, arms, grippers etc. Also, it has a powerful GUI giving access to simulation parameters such as the simulation time, the real time, the step time used to advance the physics engine by one iteration and, finally, the real time factor, which displays the performance of the simulation in terms of time compared to real time [64]; we used this tool to evaluate the performance of our system, as elaborated in chapter 4. Plugins are another important feature that enables developers to control almost every aspect of the simulation: the physics engine (ODE), the rendering system, and sensor generation. In addition, Gazebo supports importing meshes from any source in the COLLADA and STL formats [64].


Figure 2.1: Gazebo distributed architecture, broken from the top into libraries for physics simulation, rendering, user interface, communication, and sensor generation. Three different processes are provided: physics simulation, sensor generation, the GUI, and a master for coordination.

The above figure describes the architecture of the Gazebo simulator, starting from the server, which runs the physics engine and generates sensor data by loading and updating sensors with the support of the physics, sensor, rendering and transport libraries [91]. The Gazebo client uses the transport and rendering libraries, the latter also being used by the sensor library to generate data for sensors like cameras. The client is based on the Qt framework and allows accessing a running simulation to interact with, visualize and save simulations [64]. In order to create a realistic simulated world, we needed to use most of the available resources that Gazebo provides. All of these resources reside in the simulated world, which consists of models, robots, physics properties, scene properties, static and dynamic objects, sensors and worldplugin instances, including the core functions (e.g. physics update and message processing) that are controlled in the world. In our Angen simulator project we have managed to include all of these aspects to produce a simulated world that is close to a real world. In order to synthesize those components in the world, certain processes have to run for the simulation to work. The first is the Gazebo server, which can run on a separate machine and instantiates a master that is used by clients to locate the server and the worlds within the server. Its location can be specified using the GAZEBO_MASTER_URI environment variable. The second process is the client process, which visualizes the world. It can run on a different computer and connects to the server that has already been located [65]. The third process passes messages over sockets for inter-communication between the server and the client; it supports many types of messages, and new messages can be created based on the Google Protobuf syntax [95].
Finally, the parameters that set up the communication between the server and the client and help finding resources are all defined in the environment variables bash script that is installed with Gazebo. All of


these parameters are very important to find resources and establish the communication: the hostname and port of the master, the path to the resource files, the path to Gazebo plugins, the path to models, the path to Ogre plugins, and additional IP addresses and hostnames in case users want to publish their own addresses [66].
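As an illustration, a typical Gazebo environment setup of the kind sourced by the script mentioned above could look as follows (the variable names are Gazebo's own; the paths shown here are hypothetical examples):

```shell
# Hostname and port of the Gazebo master
export GAZEBO_MASTER_URI=http://localhost:11345
# Paths to resource files (worlds, media) and to models
export GAZEBO_RESOURCE_PATH=/usr/share/gazebo:$HOME/angen/worlds
export GAZEBO_MODEL_PATH=$HOME/angen/gazebo_models:$GAZEBO_MODEL_PATH
# Paths to Gazebo plugins (shared libraries) and to Ogre plugins
export GAZEBO_PLUGIN_PATH=$HOME/angen/lib:$GAZEBO_PLUGIN_PATH
export OGRE_RESOURCE_PATH=/usr/lib/OGRE
```

Appending such lines to '.bashrc' makes the settings available in every new terminal.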

2.2.1 Open dynamics engine (ODE)

The physics engine of Gazebo is ODE. It was developed by Russell Smith together with other contributors in 2001, and is extensively used by games, by virtual world platforms such as Gazebo, OpenSimulator and Webots, and by other research projects [69]. It is written in C/C++ and designed around two essential components: a rigid body dynamics simulation engine, which simulates the dynamics and kinematics of interactions between rigid bodies in the environment and supports various geometries such as boxes, spheres, cylinders and heightmaps; and a collision detection engine, which has some drawbacks such as approximating friction and poor support for joint damping [70].

2.2.2 Visualization

Gazebo utilizes OGRE as a flexible 3-dimensional (3D) rendering engine [91]. OGRE is written in C++ and designed to simplify the use of hardware-accelerated 3D graphics by developers [54]. Via OGRE, Gazebo uses the open standard Open Graphics Library (OpenGL) [71] and, by default, the OpenGL Utility Toolkit (GLUT) [72] for the Graphical User Interface (GUI) [92]. OpenGL was developed by Silicon Graphics Inc. (SGI) in 1991 for rendering 2D and 3D graphics in Computer-Aided Design (CAD), virtual reality, simulation tools and video game platforms, and was made available on most operating systems including Windows, Mac OS X and Linux [73]. OpenGL's dominance was due to its performance, since many applications were originally written in the Integrated Raster Imaging System Graphics Library (IRIS GL) for high-end SGI workstations, which were far more capable, both graphically and in raw CPU power, than the PCs of the time [73]. Besides, GLUT was used to develop applications in OpenGL, providing a portable API with which a single OpenGL program can be written that works across different computers and operating systems [72].


2.3

Building physical models

2.3.1 Simulation description format (SDF)

Models, or robots, are the objects that we integrated in the developed intelligent environment Angen, such as doors, an elevator, furniture, etc. They are composed of links, collisions, joints, sensors and plugins. Models are described and stored in an XML file format called SDF, which can be used to depict a variety of intelligent models [75] in the simulated world XML file. Gazebo supports the SDF syntax since it covers all the features that the Unified Robot Description Format (URDF) lacks. It wraps up all of the important information for a robotic simulator that can be used to characterize a model's features, from its physical properties to its visual properties. These specifications may include scene features such as ambient lighting and shadows, and physics engine properties such as the real time update rate, the real time factor and so on. Kinematic and dynamic properties are described within the branches of a model element; the former cover motion without forces, while the latter examine motion together with the forces that produce it. Lights can be described as individual entities: a point, a spot, or a directional light. Additionally, the structure of SDF allows the user to directly import COLLADA and STL meshes to meet developers' needs for building realistic models. In our Angen simulator, we developed all of the models using SDF features with their powerful geometrical shapes, and we made use of COLLADA and STL meshes to build the elevator, the furniture and the door handles. They were all imported from the 3D Warehouse repository, to overcome the issue of not being able to design the needed intelligent model with the available SDF features, or of not finding the desired model among the existing models in the Gazebo model database. The imported models were designed with Google's SketchUp graphics editing software and made available through the Google 3D Warehouse [67] in formats that are supported by Gazebo. It is a large online library of 3D models, many of them designed by professional graphics artists and students. Meshes can also be created with any 3D modeling tool, like Blender, Cinema4D or 3ds Max, or exported from a 3D repository.
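As a minimal illustration of the SDF structure described above (the model name and dimensions here are hypothetical, not taken from the Angen package), a simple static box-shaped model could be declared as:

```xml
<sdf version="1.4">
  <model name="example_box">
    <static>true</static>
    <pose>0 0 0.5 0 0 0</pose>
    <link name="body">
      <collision name="collision">
        <geometry><box><size>1 1 1</size></box></geometry>
      </collision>
      <visual name="visual">
        <geometry><box><size>1 1 1</size></box></geometry>
      </visual>
    </link>
  </model>
</sdf>
```

A mesh-based model would replace the box geometry with a mesh element pointing at a COLLADA or STL file.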

2.3.2 Links

Every model must have at least one link, forming the rigid body, depending on the model the user wants to build. The links have various collision and visual geometries that form the model: empty (for empty geometries), box, cylinder, heightmap, image, mesh, plane and sphere. Each link has a kinematic state, a pose, an inertial and a collision object, which should be kept as simple as possible to reduce the computational time when


examining an event of two collided bodies. In addition, the visual properties include features of the body's visual appearance, like the material texture and the desired shape. Also, sensors are loaded within links, and each sensor inherits other properties for further specification. In our Angen simulator, every model we developed consists of essential graphics and collisions that can differ from the visual models. These primitive links can be connected by various joints, such as prismatic and revolute joints, that keep the degrees of freedom needed to allow the preferred movements. Consequently, these joints and links react to physical collisions in a rational manner. We also assigned appropriate mass, inertia and friction values to the models to simulate their behavior through ODE, or simply used the physics defaults that are described in the SDF documentation.

2.3.3 Joints and their safety limits

Joints are used to connect rigid bodies together to establish kinematic and dynamic relationships, as shown in figure 2.2. In Gazebo, the SDF syntax supports different types of joints for different purposes depending on the application. They are used either for connecting two bodies or for controlling them; they connect two bodies with kinematic and dynamic properties to move the component in 3D space [75]. In our developed Angen simulator we have used revolute (rotational) joints for the door hinges to support single-axis (uni-axial) rotation, which is commonly used in robotics [74]. We also used a prismatic (translational) joint, also known as a sliding joint, for the developed elevator, sliding along one axis.

Assigning wrong parameter values to a joint element can cause wrong and unstable motion behavior. The joint attributes include: the names of the parent and child rigid bodies that are to be connected, the pose, the joint's axis in the parent model frame represented by its x, y, z components, and its dynamics specification, which is addressed by the damping and friction properties. Upper and lower limits are available to constrain the joint's motion. Additionally, some other attributes were used to produce a stable controller, such as limits on effort and velocity. Setting a bound on the effort prevents the controller from enforcing the joint's motion beyond the specified effort or velocity. We used the defaults for the other attributes, since they do not affect our controllers' stability.

The k_velocity term determines the scale of the effort bound: for a velocity v, a velocity limit v+ and a k_velocity gain k_v, the upper bound on the effort is u+ = -k_v (v - v+).


Figure 2.2: Two joints of a robot model connecting different parts of rigid bodies.

2.4

Proportional-Integral-Derivative (PID) controller

Development of actuation modules was needed to be able to control the doors and the elevator in our developed Angen simulator, commanding them to behave intelligently in the Angen world. In this thesis project, we use Proportional-Integral-Derivative (PID) feedback control, which uses a closed loop to control the states (outputs) of the joint positions of our dynamical objects, the doors and the elevator, giving "feedback" as an input to the process and thereby closing the loop [77]. PID is considered the most common closed-loop controller architecture: more than 95% of control loops are of PID type, and they can be found in all domains of control theory, particularly in robotics [77]. The PID computes the error, which represents the difference between the desired input value (r) and the actual output (y). This error signal (e) is sent to the PID controller, which calculates both the derivative and the integral of the error signal, producing a control signal and minimizing the error [93]. Stability can be achieved with the proportional term, at the cost of a control offset, while the integral term repeats the action of the proportional band every integral time constant. This is done to recover from a disturbance in conditions, and the derivative term is used to provide damping: it predicts within the process and takes quicker action than the integral term to


correct it [78]. With the output of the P term alone, the system will in general not reach the reference exactly; the I term guarantees that the process output agrees with the reference in steady state, while D can improve the stability of the closed-loop system [78]. Adding up the three terms results in the control signal [77]:

u(t) = MV(t) = Kp e(t) + Ki ∫₀ᵗ e(τ) dτ + Kd de(t)/dt

Figure 2.3: A block diagram of a PID controller in a feedback loop [81]

A software loop implementing a PID algorithm [79]:

Pre-compute controller coefficients
previous_error = 0
integral = 0
loop:
    error[t]      = setpoint - actual_position
    integral[t]   = integral[t-1] + error[t]
    derivative[t] = error[t] - error[t-1]
    action = Kp * error[t] + Ki * integral[t] * dt + Kd * derivative[t] / dt
    wait(dt)
end loop

At the beginning of the loop, the two state variables are initialized to zero. Then the current error is calculated by subtracting the actual position (the process variable, PV) from the current setpoint (SP). Next, the integral and derivative values are calculated, and the three preset gain terms, the proportional gain, the integral gain and the derivative gain, are combined with them to derive an output value [80]. The current error is stored and used in the next differentiation; after waiting for a duration of dt seconds, a new loop iteration starts,


reading in new values for the PV and the setpoint to calculate a new value for the error[80].

2.5

Interface with a robotic middleware

Here we describe the in-process and inter-process mechanisms that we used to interface with our code and construct the controllers that control our models in the developed Angen simulator. We considered three methods to interact with the simulation in the Angen simulated world.

2.5.1 ROS: robot operating system

The Robot Operating System (ROS) was presented by Morgan Quigley as an open-source framework designed to accommodate a wide range of robotic systems; it supports many high-level algorithms, such as object recognition and navigation in cluttered, dynamic environments, designed for the Turtlebot and PR2 mobile robots. It handles technical issues such as portability, efficiency, and scale. It uses publish/subscribe messaging mechanisms for inter-communication, which are asynchronous and loosely coupled from the user's point of view. It provides a flexible and reliable way to generate, name and remap the message formats at run time [68]. ROS supports command line tools that allow users to build packages, check dependencies, and perform other tasks; tools such as rostopic pub and rostopic echo were used in this project [68].

2.5.2 Plugin interface

A plugin is described in Gazebo as code written in C++ with access to all the Gazebo classes that are documented in the public API [62]. It is linked against the Gazebo libraries and roscpp and loaded at run time. Each plugin is registered with the Gazebo simulator as one of the plugin types (model, world, system, sensor). The result of the compilation process is a shared library, e.g. libplugin_model.so, which is loaded into Gazebo either through the world files or through the model files. All plugins live in the gazebo namespace and each one must be derived from a plugin type. Utilizing plugins extends the capabilities of the world with facilities such as dynamic loading of models and exploitation of sensors. The plugin has to be declared in an SDF file, depending on the type of plugin being loaded. When Gazebo starts, it parses the SDF file, locates the plugin, and loads its code [90]. Finally, the plugin must be registered with Gazebo using the


GZ_REGISTER_TYPE_PLUGIN macro with the name of the plugin class, where TYPE is the plugin type; for example, the registration macro for a model plugin is GZ_REGISTER_MODEL_PLUGIN [62].
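For instance, a compiled model plugin can be attached to a model in its SDF file with a plugin element (the model, plugin and library names here are hypothetical):

```xml
<model name="example_model">
  <!-- links, joints, sensors of the model go here -->
  <plugin name="example_controller" filename="libexample_controller.so"/>
</model>
```

A world plugin is attached in the same way, but inside the world element of the world file instead of a model element.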

2.5.3 Tcp communication or Gazebo communication

The Transmission Control Protocol (TCP) communication is used for interaction with the simulation by subscribing and publishing messages (data) via Gazebo topics; the transport process uses boost::asio and Protobuf for serialization [63].

2.6

Gazebo-ROS packages

The Gazebo-ROS packages stack is another way to connect to ROS as a middleware, establishing a communication process and achieving a better, improved ROS integration with a stand-alone Gazebo. This set of ROS packages evolved in the fall of 2013 to integrate with the latest stand-alone Gazebo version, 1.9. The new packages give developers the opportunity to use all of the plugins that were implemented in the outdated simulator_gazebo stack, which was impossible when using only the stand-alone Gazebo version. Under the new, upgraded system, we were able to achieve our goals related to the sensors' work. A detailed description of the instructions regarding the installation, requirements and running of the new system is given in the appendices section.


Chapter 3

The developed Angen simulator

In this chapter we explore the methods that we used to develop the Angen simulator. In the first section we give a high-level overview of the developed architecture and of how the Angen package is organized in terms of folders and tools. Next, the construction of the environment is described. The actuators section describes the door, elevator and light components as they are utilized in the Angen environment. The section on actuator communication with ROS introduces the processes that were used to interface with the ROS system and how the communication is implemented. Finally, the sensors section describes all integrated and independent sensors that are used in the developed Angen simulator and how they are instructed and implemented.

3.1

System and package overview

The Angen package can be considered a specialized Gazebo-ROS stack developed at the AASS Research Center at Örebro University. It provides tools for developing virtual intelligent environments for simulation in the stand-alone Gazebo version and its accompanying Gazebo-ROS packages middleware. The core of the package is a set of shared libraries ('libplugin_model.so' files) resulting from the compilation of the written plugins. All the libraries are collected in the 'lib' folder and can be attached directly to the models or to the world SDF files, depending on the plugin type. The package has two main stacks, the actuators and the sensors. SDF models are collected in the Gazebo models folder, and it is recommended that the models be added to the Gazebo model database. The tree of the developed package hierarchy is shown in figure 3.1.

The actuators stack was developed before the Gazebo-ROS packages emerged; it contains source files, shared libraries, executable sh files and a README file which documents


Figure 3.1: Angen package hierarchy in terms of folders.

the stack hierarchy. The shared libraries for the elevator and the doors were implemented using the built-in functions of the PID class in the Gazebo API to control the doors and the elevator. The lights controller was implemented as part of the elevator plugin, since both of them are treated as model plugins. The Turtlebot robot is part of the actuators stack and is controlled by the users via the keyboard arrows. All of the controllers in the actuators stack were implemented using the methodology described in section 3.4,


describing how to communicate with ROS. The sensors package is composed of four sensor plugins: PIR, optical switch, luminosity sensor and contact sensor.

The sensors were compiled in the same way as in the Gazebo-ROS packages, to interface with ROS and develop our sensing modules. We also added the setup scripts to the '.bashrc' file, to fulfill the requirement of exporting the plugin and media environment paths every time the simulation is started from a terminal. Additionally, we combined the actuation modules with the sensing modules to integrate the actuators and the sensors together in the Angen world. The sensors stack includes the launch and world files that initialize our simulation world when starting Gazebo, in addition to a README file and other configuration files. It also contains 3D meshes and materials (textures and scripts) used by the sensor and actuator models. The source folder 'src' includes three sub-folders, each of which contains the plugin of one sensor, in addition to a custom messages folder used by the sensor plugins.

The PIR sensor is implemented using a range laser that was derived from the gazebo_ros_block_laser plugin and extended to serve the purpose of the project. The range values of the laser are gathered and stored in a matrix, and movement is detected when the calculated difference with respect to previous scans is greater than a threshold. The plugin update rate is throttled to 10 Hz, and the topic is published at the same rate.

The optical switch sensor works in the same way as the PIR sensor. However, this sensor needs a small strip attached to the top of each door, so that when the strip cuts the range of the ray, the event is triggered. The range values of the two simulated rays are gathered and stored in a matrix; the movement of the door is detected when the calculated difference with respect to previous scans is greater than a threshold. Its plugin and custom message are published at a 1 Hz rate.

The luminosity sensor is a camera sensor that computes a one-pixel image and the RGB color intensities; the plugin publishes its topic at a 1 Hz rate.

The contact sensor is integrated into the models (table and elevator) to give feedback via a contacts message when two models collide. The plugin publishes its topic at a 1 Hz rate.

3.2

Design of the apartment environment

In our Angen simulator we designed a static physical apartment comprising five rooms: kitchen, living room, bedroom, workroom and bathroom. The apartment is composed of several physical bodies with inertia, collision, and visual properties.


Specifically, it has 50 physical links, each of which represents either a wall or a floor. All of the walls and floors have a box geometrical shape with different sizes, to avoid complex geometries that are computationally expensive. We chose the appearance of the walls as "White" and of the floors as "Wood" to appear more realistic, and the material scripts were imported from the Gazebo model database. The apartment has no joints, since it was built as a static model.

3.2.1 Implementation

The apartment has initially been integrated in an empty world, and to start the simulation two ways are available: spawning the apartment through the Angen world file, or through the command line. An example of the Meta-Tags used in our Angen world file:

<include>
  <uri>model://apartment_floor0</uri>
  <name>apartment0</name>
  <pose>-11 11 0 0 0 80</pose>
</include>

3.3

Actuators

The actuators are components used in our simulator to carry out tasks, for example by applying forces on joints to move models, and to control the elements and models that change the condition of the simulated environment. The Angen simulator is composed of several actuators, such as doors, an elevator, lights and a mobile robot. The actuators are spawned into the Angen world through a Gazebo world file using Meta-Tags blocks.

3.3.1 Design of the door model

The door model is a non-static model, because we want to use the physics engine to control the door through a PID controller. It is composed of two physical links, both of which are modeled as box geometrical shapes; the collision and visual properties of the model are the same. The designed door has revolute joints that rotate along a single axis with either a fixed or continuous range of motion, so that the door can be controlled for open and close events. The first link is positioned at the bottom center and was added to ease the positioning of the door. The second link is the actual door block, where the hinge of the optical switch strip is attached. We also wanted fixed joints in the doors to support the door handle links; fixed joints are not supported by SDF, but to achieve our objective we


used revolute joints with a 0 axis for those joints. The actual door block has a revolute joint that rotates along the Z axis with a fixed range of motion; it is linked to the world as a parent and to the door block link as a child, and its axis is specified in the parent model frame. Three more links, for the front and back door handles and a strip object, are built and their joints attached. The third link is modeled as a green strip attached to the top of the door, which is important for the optical switch sensor to work properly. In the Angen simulator we only designed six door models, regardless of how many apartments we have; they are added in the world description file as Meta-Tags, similarly to the apartment models, to reduce the storage size. In this way, users can add as many controllable doors as they want to the Angen simulator using only six door models. For example, we can add 18 doors for three apartments using the six designed doors, without the need to design additional doors in SDF. When designing the door we also considered the safe limits of the joints to be controlled, so as not to exceed the upper and lower limits when the force is applied, which would generate the dynamics caused by the physical friction of the joint between the contacting bodies.

3.3.1.1 Implementation of the door model

The door controller is built using the physics engine with a world plugin type, to be able to close or open multiple doors at a time with the help of the Gazebo built-in PID class. The flow of the program starts by loading the joints of each door and computing the step time for the PID, then computing the error between the current and target position, which is zero since the pose of the door is not changing while the angle of the door is. The radian values of the upper and lower limits of the revolute joints are loaded from the SDF of each model to constrain the forces applied on the joints within these limits. The PID gains and integral term limits (proportional gain, integral gain, derivative gain, integral upper limit, integral lower limit, and the maximum and minimum values of the command) are initialized each time we publish the message to the ROS topic. The gains and the limit parameters of each door were tuned to reach a certain angle when opening and closing the door (lower -1.570000 and upper 0.001000). We set the P gain to 70 to reach the desired point, but at that value the system was unstable; therefore we increased the I gain to 200 to stop the oscillations, while the D gain was kept relatively small, at 25, because it is sensitive to noise and setting it to large values would produce an unstable system. After the compilation process, the resulting plugin (shared library) is attached to the Angen world file that we use when starting Gazebo. The world plugin is used to control multiple doors in the world with only one plugin, and it is loaded in the world file. The code below is an SDF snippet of the bathroom door model


showing the revolute joints of two bathroom doors in the double-floor Angen apartments; the joints are controlled by the joint position controller in the plugin:

<joint name="entrance_f1_door_joint2" type="revolute">
  <child>entrance_f1_door_block</child>
  <parent>world</parent>
  <pose>-9.371533 6.509166 2.2 0 0 0</pose>
  <axis>
    <xyz>0.000000 0.000000 1.000000</xyz>
    <limit>
      <lower>-1.570000</lower>
      <upper>0.001000</upper>
      <!-- <effort>10.000000</effort>
           <velocity>5.000000</velocity> -->
    </limit>
    <dynamics/>
  </axis>
</joint>

<!-- bathroom1 joint -->
<joint name="entrance_f1_door_joint8" type="revolute">
  <child>entrance_f1_door_block</child>
  <parent>world</parent>
  <pose>-9.371533 6.509166 2.2 0 0 0</pose>
  <axis>
    <xyz>0.000000 0.000000 1.000000</xyz>
    <limit>
      <lower>-1.570000</lower>
      <upper>0.001000</upper>
      <!-- <effort>10.000000</effort>
           <velocity>5.000000</velocity> -->
    </limit>
    <dynamics/>
  </axis>
</joint>

Figure 3.2: Door model of the kitchen room when it is opened.

The door controller message published to the door world plugin is of type std_msgs/String:

    string data

The message in the plugin carries string commands containing the name of each door model in the environment to be closed or opened.


Figure 3.3: Door model of the livingroom when it is closed.

3.3.1.2 How the simulated door works

The developed plugin for the doors was written as a custom ROS-enabled world plugin; therefore it is important to highlight how we interface with the ROS system and pass messages. To start the communication with the ROS master node, we have to construct a NodeHandle in each plugin. The first NodeHandle constructed initializes the node, and the last one destructed finalizes and shuts down the resources that the controller was using. We subscribe to a ROS topic to receive the string messages described above. This issues a call to the ROS master node, which keeps a registry of who is publishing and who is subscribing [97]. The string messages are passed to callback functions; in those callbacks, we initialize the PID gains and integral term limits for the joints. ROS calls the callback functions whenever a new message with a matching string arrives. Within the subscription we can also define the size of the message queue; we used 1, so that when the queue holds one message the old message is thrown away as a new one arrives. The subscriber object is maintained until it is destructed, upon which it automatically unsubscribes from the topic. This communication on topics only happens when the same type of message is published and received on the same ROS topic. We use the ROS command line tool rostopic pub to publish our string messages in 'once' mode ('-1'), which keeps the message latched for 3 seconds; rostopic then quits automatically, without the process having to be ended manually with Ctrl-C. Finally, in order to process the callback functions and pass the messages using the ROS middleware, we have to spin ROS once, which does not consume the CPU if there are no callbacks or services to check. The plugin is compiled with the rosbuild system, which contains scripts for managing the CMake-based build system for ROS.

Example of a command sent to the bathroom door on the ground floor to close it:

rostopic pub -1 /gazebo/Door_controller std_msgs/String "Close, bathroom0"


Example of a command sent to the bathroom door on the second floor to open it:

rostopic pub -1 /gazebo/Door_controller std_msgs/String "Open, bathroom1"

On the other hand, the Gazebo simulator is launched using the gazebo command, which starts the Gazebo server and client all at once. So when we pass messages over ROS topics to close or open the doors in the Angen world, the Gazebo server, which employs the physics engine libraries, checks whether the door's angle has reached its desired angular position.

The door is integrated either through the Angen world file or from the command line. Example of the tags used in our Angen world file:

<include>
  <uri>model://bath_door0</uri>
  <name>bath_door0</name>
  <pose>-9.371533 6.509166 0.1 0 0 -1.7</pose>
</include>

3.3.2 Design of the elevator model

The elevator model in the Angen simulator is comprised of one physical link representing the rigid body. We use a COLLADA mesh to build the elevator model and lay out its basic shape, and the model is non-static so that the physics engine can properly move all of its components. We set the collision geometry to be smaller than the visual geometry, because using the triangle mesh itself for collision would slow down the system: collision dynamics calculations on a mesh are computationally expensive. The model also has a prismatic joint that slides along the Z axis, with an upper limit of 4 meters and a lower limit of 0. This joint connects the rigid body to the world and is also used by the PID controller to control the elevator. The following is a snippet of the SDF code of the elevator model:

Figure 3.4: Complete PID [83]

<model name="my_elevator">
  <static>false</static>
  <pose>0 0.0 0.0 0 0 0</pose>
  <link name="body1">


      <geometry>
        <mesh>
          <uri>file://meshes/elevator3.dae</uri>
          <scale>.50 .50 .40</scale>
        </mesh>
      </geometry>
    </collision>
    <visual name="visual">
      <cast_shadows>false</cast_shadows>
      <geometry>
        <mesh>
          <uri>file://meshes/elevator3.dae</uri>
          <scale>.60 .60 .77</scale>
        </mesh>
      </geometry>
    </visual>
  </link>

3.3.2.1 Implementation of the elevator model

In the Angen simulator we used a PID controller for the elevator model, exploiting the physics engine to move the elevator up and down. The plugin we wrote is of the model type, so it can be attached to the elevator model and control it during simulation. The plugin starts by loading the model and the joint of the elevator and computing the step time for the PID controller. The error is computed between the current and target positions (4 meters height), and a force is then applied on the joint by the physics engine. In the following steps, we describe how the elevator PID controller works:

When the model is loaded, the PID constants are initialized to zero. We then compute the proportional (P) term from the error, increasing the P gain to 200 to approach the desired position (4 m height). Since the elevator starts at 0 meters and should reach 4 meters, the difference between its current position and the desired position is calculated, representing the error. As the elevator gets closer to the desired position, it slows down. However, because of its resistance to a change in motion (inertia), it overshoots and falls back toward the start, and the P term reacts to this by pushing the elevator up again, which in turn causes the elevator to oscillate around the desired position, as shown in figure 3.4.

Since we do not want the elevator to oscillate, we also compute the integral (I) term, which adds up the error at each loop iteration, as shown in figure 3.5. The I term affects how fast the elevator approaches its desired position, and it should be equal to or even higher than P when another force pushes against the elevator, so we set this gain to 200 as well.


Figure 3.5: PID with P term [83]

Figure 3.6: Distance with P and I term [83]

The derivative (D) term makes the elevator steadier and more stable, and stops it close to, or exactly at, the desired position, as shown in figure 3.6. We set it relatively small, to 25, because this term is sensitive to noise, and setting it to large values would affect the stability of the elevator. Finally, all three terms are summed up and the resulting force is sent to the joint of the elevator.

Figure 3.7: Elevator model with a door on one side and open on the other side.

The elevator controller message published to the elevator model plugin is a std_msgs/String, whose definition is:

string data


Figure 3.8: Elevator Model

Commands can be sent from each apartment to take the elevator model up to the second floor or down to the ground-floor apartment.

3.3.2.2 How the simulated elevator works

The developed elevator plugin is written as a custom ROS-enabled model plugin, so we interface with the ROS middleware and pass messages with the same methodology we used for the doors. Example of a command sent to the elevator to take it up to the second floor:

rostopic pub -1 /gazebo/Elevator_controller std_msgs/String "Floor1"

Example of a command sent to Gazebo to take the elevator down to the ground floor:

rostopic pub -1 /gazebo/Elevator_controller std_msgs/String "Floor0"

On the other hand, the Gazebo simulator is launched using the gazebo command, which starts the Gazebo server and client all at once. So when we pass messages over ROS topics to move the elevator up or down in the Angen world, the Gazebo server, which employs the physics engine libraries, checks whether the elevator has reached its target position.
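Since the prismatic joint moves between 0 m and 4 m, the callback only has to translate the received floor label into a target position for the PID controller. A minimal sketch of that mapping follows; the function name and the fallback behaviour for unknown labels are illustrative assumptions:

```cpp
#include <string>

// Map an elevator command ("Floor0"/"Floor1") to the target height in
// metres of the prismatic joint (lower limit 0 m, upper limit 4 m).
// Hypothetical helper: unrecognised labels keep the current target,
// so the elevator simply stays where it is.
double floorCommandToTarget(const std::string& msg, double currentTarget)
{
    if (msg == "Floor0") return 0.0;  // ground floor
    if (msg == "Floor1") return 4.0;  // second floor
    return currentTarget;             // ignore unrecognised commands
}
```

The returned height becomes the set-point against which the PID error is computed at every simulation step.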

The elevator is integrated either through the Angen world file or from the command line. Example of the tags used in our Angen world file:

<include>
  <uri>model://my_elevator</uri>
  <pose>1.352694 1.202178 0.0 0 0 -0.1400000</pose>
  <plugin filename="librosplugin.so" name="rosplugin"/>
</include>
