
Institutionen för systemteknik

Department of Electrical Engineering

Master's thesis

A method for collision handling for industrial robots

Master's thesis carried out in Automatic Control
at the Institute of Technology at Linköping University

by

Fredrik Danielsson and Anders Lindgren
LITH-ISY-EX--08/4105--SE

Linköping 2008

Department of Electrical Engineering
Linköpings tekniska högskola
Linköpings universitet


A method for collision handling for industrial robots

Master's thesis carried out in Automatic Control
at the Institute of Technology at Linköping University

by

Fredrik Danielsson and Anders Lindgren
LITH-ISY-EX--08/4105--SE

Supervisor: Christian Lyzell

isy, Linköpings universitet

Richard Warldén

Motoman Robotics

Examiner: Svante Gunnarsson

isy, Linköpings universitet


Avdelning, Institution / Division, Department

Division of Automatic Control
Department of Electrical Engineering
Linköpings universitet
SE-581 83 Linköping, Sweden

Datum / Date: 2008-05-16
Språk / Language: English
Rapporttyp / Report category: Examensarbete (Master's thesis)
ISRN: LITH-ISY-EX--08/4105--SE
URL för elektronisk version:
http://www.control.isy.liu.se
http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-ZZZZ

Titel / Title:
Metod för kollisionshantering av industrirobotar
A method for collision handling for industrial robots

Författare / Author:
Fredrik Danielsson and Anders Lindgren



Abstract

This master's thesis presents the development of a collision handling function for Motoman industrial robots and investigates further use of the developed software. When a collision occurs the arm is to be retracted to a safe home location and the job is to be restarted to resume production. The retraction can be done manually, but this requires the operator to have good knowledge of robot handling and it can be a time-consuming task. To minimise the time for restarting the job after a collision, and to allow employees with limited knowledge of robot handling to retract and restart the job, Motoman provides an automatic retraction function. However, this retraction function may cause further collisions when used, and therefore a new function for retracting the arm is needed. The new function is based on recording the motion of the robot by sampling the servo values, which are stored in a buffer. A job file is automatically created and loaded into the control system, and the position variables of the job file are updated using the contents of the buffer. This ensures a safe retraction of the arm as long as the environment surrounding the robot remains the same.

The developed software made it possible to control the robot in real time by changing the buffer information, which has led to a cognitive system called the Pathfinder. By initiating the Pathfinder function with at least a start and an end point, the function generates a collision-free path between the two points. A pilot study has also been made into integrating a vision system with the Pathfinder to improve the decision handling of the function.

Sammanfattning

Detta examensarbete behandlar i första hand en metod för kollisionshantering av Motomans industrirobotar men undersöker även ytterligare användningsområden för den utarbetade metoden. När en kollision har inträffat och man vill dra tillbaka armen till ett säkert hemmaläge för att återställa systemet och återuppta produktionen kan man i dagens system antingen köra tillbaka armen manuellt eller använda en automatisk funktion. Den manuella körningen kräver att operatören har god kännedom om roboten samt att det oftast är tidskrävande, vilket är något man vill minimera när det gäller att få igång produktionen efter ett stopp. Den nuvarande automatiska funktionen minimerar tiden och underlättar för operatören att återställa jobbet och gör det lättare att återuppta produktionen. Denna funktion har dock sina begränsningar, vilket kan leda till ytterligare kollisioner när man drar tillbaka armen. En ny metod har utvecklats och implementerats för att förbättra och underlätta tillbakadragningen för operatören. Den nya metoden bygger på att man samplar robotens rörelse och lagrar den i en buffert. Bufferten används sedan för att återskapa rörelsen genom att ett jobb skapas automatiskt och positionsvariabler uppdateras från bufferten. På detta sätt kan en säker väg till hemmaläget garanteras så länge den omgivande världen förblir konstant.

Den utvecklade mjukvaran har även gjort det möjligt att kontrollera roboten i realtid genom att ändra innehållet i bufferten, vilket har lett vidare till ett kognitivt system, Pathfinder, som kan lära sig av sina misstag och rätta till existerande jobb. Genom att initiera Pathfindern med minst en startpunkt och en slutpunkt genererar den en kollisionsfri väg mellan dessa båda punkter som sedan laddas upp som ett vanligt robotjobb. Det har även gjorts en förstudie i att komplettera Pathfinderfunktionen med ett kamerabaserat visionsystem för att utvidga beslutsfattandet för funktionen.


Acknowledgments

We would like to thank the employees at Motoman Robotics AB, Kalmar, for never getting tired of our foolish questions and a special thanks to Richard Warldén who made this master’s thesis possible.

Thanks to Svante Gunnarsson and Christian Lyzell for commenting on the report and helping us throughout the process. Finally we would like to thank Per Lindgren, who gave us a roof over our heads during our first month in Kalmar, and all our friends and family who have shown their support.


Contents

1 Introduction
   1.1 Background
       1.1.1 Research and Development
       1.1.2 Motoman Robotics AB
   1.2 Purpose
   1.3 Thesis outline

Part I  Motion Control

2 Robotics
   2.1 Industrial robots in general
   2.2 Sensors on the robot
   2.3 Kinematics of the robot
   2.4 System Overview
   2.5 Robot Control
       2.5.1 NX100 control system
       2.5.2 NX100 Parameters
       2.5.3 Safe home location
   2.6 Kinematics
       2.6.1 Rigid motion
       2.6.2 Forward Position Kinematics
       2.6.3 Inverse Position Kinematics
       2.6.4 Forward Velocity Kinematics

3 Backtracking
   3.1 Problem Overview
   3.2 How to retrieve a safe way home
       3.2.1 Using external sensors
       3.2.2 Path recording
       3.2.3 Test bench
   3.3 How to get a steady motion
   3.4 Environmental change
   3.5 Restore the monitored job
   3.6 INI-file
   3.7 Limitations

4 Future use of the backtracking software
   4.1 Real-time motion control
   4.2 Pathfinder
       4.2.1 Basic-Pathfinder
       4.2.2 Path processing
       4.2.3 Object Estimation

5 Test Results on Part I
   5.1 Backtrack
   5.2 Basic-Pathfinder

Part II  Integration of a Vision System

6 Theory
   6.1 The Stereo Problem
       6.1.1 System Overview
   6.2 Stereo Algorithms
       6.2.1 Block Matching
       6.2.2 Local Phase Estimation
   6.3 Image processing
       6.3.1 Object segmentation using histogram
       6.3.2 Otsu's method
       6.3.3 Morphological operations

7 Vision System
   7.1 Problem Discussion
   7.2 The Developing Procedure
   7.3 Image processing
   7.4 Find the solution

8 Test Results on Part II

9 Discussion

10 Conclusions And Future Work
   10.1 Conclusions
   10.2 Future Work

Bibliography

A Technical Specifications

B Kinematics
   B.1 Forward Position Kinematics


Chapter 1

Introduction

This chapter will first give an overview of the background to this master’s thesis and an introduction to Motoman Robotics Europe AB. Then the purpose of this master’s thesis is explained and finally the outline of this thesis is presented.

1.1 Background

This master’s thesis was carried out at Motoman Robotics AB in Kalmar and it covers a collision handling system for their industrial robots.

Today, when a collision occurs the arm has to be retrieved, the cause of the collision needs to be corrected and the programmed job has to be restarted. This process has to be performed quickly to maintain production. The arm can be retrieved manually or automatically by pressing a button. If the robot is retrieved manually, especially in tight spaces, it will be a time-consuming task, and the robot operator has to have good knowledge of how to control the robot or a new collision might take place. To improve the process of restarting the job after a collision, and to allow employees with limited knowledge of robot handling to retract and restart the job, Motoman provides an automatic retraction function. However, problems might occur when using Motoman's current retraction solution, because it only executes the programmed job backwards. This may cause problems when executing multi-task jobs, which will be explained in Chapter 3.

The problem description for this thesis was developed in collaboration between Richard Warldén at Motoman and us. It was not fully specified at the start of the project, since all the problems had not yet been identified, and it therefore evolved during the background research.


1.1.1 Research and Development

During the research for Part I it was hard to find previously published articles about the collision problem, because the solution had to be developed in-house and tailored to Motoman's NX100 system; the reason for this is described in Chapter 3. Almost all the information was therefore collected from the Motoman system manuals and documents, and we also had great help from the employees at Motoman. To be able to control and program the robot we took a course in robot handling at Motoman. After this we could generate jobs on our own and start to study the motion and the limits of the robot. All our tests were performed on an HP3L, but the results can be applied to almost all the products in Motoman's robot park with only small modifications in the software setup.

To increase our knowledge of robot motion we read several papers on robot kinematics and motion control.

To find information for the pilot study of a vision system in Part II, we read literature and papers on image processing and had email contact with the image processing group at the Department of Electrical Engineering at Linköping University.

The software solutions for this project were implemented using C/C++ and VxWorks for the robot handling, and Matlab was used for the vision system.

In the early stages of the development process for the robot handling function, a test bench was developed for easier error diagnosis. The test bench was developed in C#, and a communication protocol was developed in C++ to handle the communication between the test bench and the robot. The main structure of the software was developed and tested in this test bench.

1.1.2 Motoman Robotics AB

Motoman Robotics Europe AB was founded 30 years ago as a manufacturer and supplier of welding machinery, primarily for the automotive industry. The company became part of the Japanese electronics group Yaskawa Electric Corporation in the mid-1990s.

Motoman products and services are well established in Europe as well as in other parts of the world, with more than 150,000 units installed and in full production (more than 15,000 units in Europe).

Motoman Robotics Europe AB's head office for the European organisation is located in Kalmar, Sweden. Development and manufacturing mainly take place in production units in Sweden and Germany. Since Motoman started in 1976 the company has steadily developed as a robot supplier. As early as 1996 they introduced a solution for controlling several robots with a single control system, which helped them achieve their current position as one of the world's leading robot manufacturers.


1.2 Purpose

The main purpose of this master's thesis was to design and develop a software solution for a safe retraction of the robot arm. It should also investigate further use of the software, in applications like real-time motion control and an autonomous pathfinder with the integration of a vision system.

1.3 Thesis outline

This master's thesis is divided into two main parts. Part I describes the development of the backtracking function and its spin-off functions, and Part II is a pilot study of the integration of a vision system as a complement to the spin-off functions. The contents of the following chapters are:

Part I - Motion Control

Chapter 2 - This chapter presents the basics of an industrial robot and the NX100 control system. It also contains the kinematics theory that describes the motion of the robot.

Chapter 3 - A more detailed description of the problem is given and the solution methods for a safe retraction are presented.

Chapter 4 - This chapter presents the possibility of using the backtracking function in applications like real-time motion control and an autonomous pathfinder.

Chapter 5 - This chapter presents the test results from the backtracking solution method and the autonomous pathfinder.

Part II - Integration of a Vision System

Chapter 6 - The theory for the chosen vision system is presented.

Chapter 7 - How the vision system was developed is explained and the problems are discussed.

Chapter 8 - This chapter presents the test results from the vision system.

Chapter 9 - In this chapter the solution methods are discussed along with the working process.

Chapter 10 - Presents the conclusions from the discussion in Chapter 9 and discusses further work on the software.


Part I

Motion Control


Chapter 2

Robotics

Figure 2.1. A Motoman Robot

The word robot has been used to describe a wide range of different items, from things in our homes such as autonomous vacuum cleaners and lawn mowers to autonomous submarines and missiles. Almost anything that operates with some degree of autonomy, usually under computer control, has at some point been called a robot.

2.1 Industrial robots in general

In 1956 the company Unimation, founded by George Devol and Joseph F. Engelberger, produced the first industrial robot, [1]. It was mainly used to transfer objects from one point to another, less than a dozen feet apart. Today, industrial robots are used for a wide range of applications including welding, painting, ironing, assembly, pick and place, packaging and palletizing, product inspection, and testing. The size of an industrial robot varies depending on the application and the amount of load it is designed to carry.


An industrial robot consists of a moving mechanical arm, see Figure 2.1. The arm consists of a sequence of rigid bodies connected by revolute or prismatic joints, also called axes. The most common industrial robot on the market today has six axes: three axes are required to reach a point in space, and three more are required to fully control the orientation of the end of the arm. An electrical motor, a servo, controls each axis and a computer controls the movements of the system.

The robot uses pre-programmed jobs to perform a variety of tasks. The setup or programming of motions and sequences for an industrial robot is typically done by moving the robot to the desired location and storing the position together with the desired motion type. This can be very time-consuming, but in the last few years the industrial robotics business has provided enough speed, accuracy and "easy to use" interfaces for most applications.

2.2 Sensors on the robot

The use of sensors on industrial robots is of vital significance for achieving the high-performance robotic systems that exist today. There are various types of sensors available, often divided into sensors that measure the internal state of the robot (proprioceptive sensors) and sensors that provide knowledge about the surrounding environment (heteroceptive sensors), [14].

Examples of proprioceptive sensors are encoders and resolvers for joint position measurements and tachometers for joint velocity measurements. Heteroceptive sensors include, for example, force sensors for end effector force measurements and vision sensors for object image measurements when the manipulator interacts with the environment.

Due to cost and reliability requirements, the sensors in a standard robot used today are usually proprioceptive sensors, which only measure the servo angular positions. In some special applications heteroceptive sensors are used: for example, a vision system can be used where an object is to be picked from a conveyor belt, and force sensors are used on welding robots, which need contact with the welding objects but cannot push too hard.

2.3 Kinematics of the robot

The standard Motoman robot on the market today consists of six axes, S, L, U, R, B and T, see Figure 2.2, and is a chain of serially linked axes. There is a lot of resemblance between the robot arm and a human arm.

The robot arm can be divided, like the human arm, into two main parts, an "arm part" and a "hand part", where the wrist is at servo B. The S, L and U axes control the "arm part" and generate the big movements. The "hand part" is controlled by R, B and T, and it is used for the precision movements of the tool center point, TCP. The R servo is placed before the wrist, but it has the same effect on the TCP as twisting the forearm has on the hand.

In this thesis we have defined three reference frames to make it easier for the reader when we refer to them throughout the text. They are

• The Base Frame
• The Wrist Frame
• The Tool Frame

The robot’s base position is when the axes are aligned as shown in Figure 2.2. In this position all the servo values are zero and the kinematics calculations in this thesis are based on this definition.

The Base Frame

The base frame has its origin defined in the center point of the robot base as shown in Figure 2.2. The S servo has its rotation ω_S around the ẑ-axis of the base frame.

The Wrist Frame

The wrist frame has its origin defined in point P, see Figure 2.2, on the robot. The relation between the base frame and the wrist frame depends only on the first three servos, S, L and U, which, as described above, control the big movements.

The Tool Frame

The tool frame has its origin defined in the TCP of the robot, see Figure 2.2. The relation between the wrist frame and the tool frame depends only on the servos R, B and T.

2.4 System Overview

The robot cell consists of a robot arm, an NX100 control box and a programming pendant box, see Figure 2.3, where the NX100 control box generates the commands that the robot arm executes. The programming pendant box is the user interface toward the NX100, where the robot can be programmed.

2.5 Robot Control

The robot is controlled by programmed jobs; see Table 2.1 and Figure 2.4 for a short example of a job. A job consists of job lines, and on every job line there is an instruction where a position, a motion type and a velocity have been given. The three main motion types are


Figure 2.2. The dynamics of the robot

Figure 2.3. System Overview

• MOVL - Moves the TCP, see Section 2.3, along a straight line toward the desired coordinate.
• MOVC - Makes circles. At least three coordinates have to be used to define the border of a circle. The MOVC instructions will then generate a circular motion.
• MOVJ - Moves the robot arm in a way that is optimal for the servos.

The velocities can be defined either as joint speed or as tool speed. The joint speed is set between 0-100% and specifies the maximum allowed speed for all the servos during the execution of an instruction; see Appendix A for maximum servo speed definitions. The tool speed is set in the range 0-9000 cm/min and specifies the maximum allowed speed for the TCP during the execution of an instruction.


0 NOP
1 MOVL P000 V=500.0
2 MOVJ P001 VJ=0.25
3 MOVC P001 V=500.0
4 MOVC P002 V=500.0
5 MOVC P003 V=500.0
6 MOVJ P003 VJ=0.25
7 MOVJ P004 VJ=0.25
8 MOVL P000 V=500.0
9 END

Table 2.1. Short example of a job where P00i are the positions.

Figure 2.4. The result from the job example in Table 2.1.

2.5.1 NX100 control system

The NX100 control system controls the arm by generating pulses for the servos. The NX100 makes it possible to control up to 36 axes, with synchronized control of four robots and external axes. The programming capacity is 60,000 steps, 10,000 instructions and 40 input and 40 output signals (which can be extended to 1024/1024). Because of the control system's PC architecture it is possible to communicate with other systems via a communication protocol.

2.5.2 NX100 Parameters

A robot can be seen as a closed box that is manipulated by programmed jobs executed in the NX100 control system. One way to generate these jobs is to change to TEACH mode, which is one of the three modes on the robot. In TEACH mode the robot can be programmed from the programming pendant box by moving the arm through the desired route.

The other two modes are PLAY and REMOTE. PLAY mode is activated when a pre-programmed job should be executed autonomously. REMOTE mode is a monitor mode, which can be used with a simulation program called MotoSim.

It is possible to upload information to some registers in the NX100 by using I/Os on the robot or by using functions in the NxAddon programming library. The uploaded information can then be used in the pre-programmed jobs.

There are three kinds of variables that are extra significant for this thesis, especially for Chapter 3. These are

• C-variables
• D-variables
• P-variables

C-variables

A job line always consists of a motion type, a position variable and a motion velocity, as described in Section 2.5. When a job is programmed using the pendant box, the position variable is not visible on the job line; only the motion type and the velocity are visible, while the position is stored in a C-variable.

The C-variables only exist inside the NX100 and cannot be modified from the outside while the job is running. They can only be modified before the job is loaded, because all C-variables have to be pre-defined.

D-variables

D-variables are of the type double in the NX100 system, and they can be changed from the outside while a job is executing.

P-variables

The P-variables are, like C-variables, position variables. The difference between the two is that a P-variable can be changed during the execution of the job. They cannot be changed directly, but by using the SETE instruction in the NX100 a P-variable can be given a D-variable value.

2.5.3 Safe home location

The definition of a safe home location (SHL) is a place that the arm can retract to. There can be several SHLs in a cell, and an SHL can be specified using the existing cubes in the NX100. The cubes are specified by a height, a width, a depth and their location in the robot cell. Another way to specify an SHL is to insert a SET instruction in the programmed job file, which sets a predefined IO port and defines where in the job file the SHL is located.
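Checking whether a point such as the TCP lies inside an SHL cube amounts to a simple axis-aligned containment test. The sketch below is illustrative only: the struct layout and function names are assumptions, not the NX100's actual implementation.

```cpp
#include <array>

// An axis-aligned cube defining a safe home location (SHL):
// its position in the robot cell plus width (x), depth (y) and height (z).
struct ShlCube {
    std::array<double, 3> origin;  // lower corner of the cube in the cell
    double width, depth, height;
};

// True if the point p (e.g. the TCP position) lies inside the cube.
bool insideShl(const ShlCube& c, const std::array<double, 3>& p) {
    return p[0] >= c.origin[0] && p[0] <= c.origin[0] + c.width &&
           p[1] >= c.origin[1] && p[1] <= c.origin[1] + c.depth &&
           p[2] >= c.origin[2] && p[2] <= c.origin[2] + c.height;
}
```

A retraction routine could stop stepping backwards as soon as this test reports that the arm has re-entered the last passed SHL.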


2.6 Kinematics

The theory in this chapter is primarily taken from [7], [4], [8] and [14].

2.6.1 Rigid motion

Figure 2.5. Two reference frames

The relation between two frames (x0, y0, z0) and (x1, y1, z1), seen in Figure 2.5, can be described by a combination of a pure rotation and a pure translation, called a rigid motion. The position of a point p_1 in frame 1 with respect to frame 0 is given by

p_0 = R^1_0 p_1 + d^1_0    (2.1)

where R^1_0 is the rotation of frame 1 relative to the reference frame 0 and d^1_0 is the translation of frame 1 with respect to the reference frame 0.

Homogeneous transformation

For an easier representation, the relation between the two frames can be described as a homogeneous transformation given by

[p_0; 1] = T^1_0 [p_1; 1] = [R^1_0  d^1_0; 0  1] [p_1; 1]    (2.2)

where T^1_0 includes all the necessary information about the position and rotation of frame 1 with respect to the reference frame 0. The inverse transformation T^0_1 is given by

T^0_1 = [R^0_1  −R^0_1 d^1_0; 0  1]    (2.3)

Equation (2.2) can easily be extended to describe more complex relations between frames. If there are three frames, the mapping from frame 2 to frame 0 is accomplished by a multiplication of the individual homogeneous transformations.
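The application, composition and inversion of homogeneous transformations in (2.1)-(2.3) can be sketched in C++. The struct layout and function names below are illustrative, not code from the thesis.

```cpp
#include <array>
#include <cmath>

// A homogeneous transformation T = [R d; 0 1] as in equation (2.2):
// a 3x3 rotation matrix R and a translation vector d.
struct Transform {
    std::array<std::array<double, 3>, 3> R;
    std::array<double, 3> d;
};

// Apply T to a point: p0 = R * p1 + d, as in equation (2.1).
std::array<double, 3> apply(const Transform& T, const std::array<double, 3>& p) {
    std::array<double, 3> out = T.d;
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j)
            out[i] += T.R[i][j] * p[j];
    return out;
}

// Chain two transforms: mapping frame 2 -> frame 0 through frame 1 is the
// product of the individual transforms, as noted below equation (2.3).
Transform compose(const Transform& A, const Transform& B) {
    Transform C{};  // value-initialised: R and d start at zero
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j)
            for (int k = 0; k < 3; ++k)
                C.R[i][j] += A.R[i][k] * B.R[k][j];
    C.d = apply(A, B.d);
    return C;
}

// Invert a transform as in equation (2.3): R' = R^T and d' = -R^T d.
Transform inverse(const Transform& T) {
    Transform inv{};
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j)
            inv.R[i][j] = T.R[j][i];
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j)
            inv.d[i] -= T.R[j][i] * T.d[j];
    return inv;
}
```

Composing a transform with its own inverse yields the identity transform, which is a convenient sanity check for an implementation like this.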


2.6.2 Forward Position Kinematics

Forward position kinematics is a method for finding the position of the TCP expressed in the base frame when the servo angles and the lengths of the axes are known. The robot can be seen as a set of rigid links connected by various joints.

The transformation between the end points of an axis is based on the angle of rotation and the amount of linear displacement.

The joint angles are

q = [q_S, q_L, q_U, q_R, q_B, q_T]^T    (2.5)

To express the position of the TCP in the base frame, the homogeneous transformation matrix between the tool frame and the base frame is needed. To calculate this transformation matrix, the homogeneous transformation between the base frame and the wrist frame is first calculated. The complete transformation from a point defined in the wrist frame to the base frame is given by the following equations

p_bs = d^S_bs + R^S_bs p_S    (2.6)
p_S = d^L_S + R^L_S p_L    (2.7)
p_L = d^U_L + R^U_L p_U    (2.8)
p_U = d^wr_U + R^wr_U p_wr    (2.9)

The rotation of the wrist frame in relation to the base frame and the position of the wrist frame center point with respect to the base frame follow from (2.6), (2.7), (2.8) and (2.9):

R^wr_bs = R^S_bs R^L_S R^U_L R^wr_U    (2.10)
d^wr_bs = d^S_bs + R^S_bs d^L_S + R^S_bs R^L_S d^U_L + R^S_bs R^L_S R^U_L d^wr_U    (2.11)

The position of the wrist frame center point with respect to the base frame can be reduced to

d^wr_bs = [d_h cos q_S,  d_h sin q_S,  d_v]^T    (2.12)

where d_h and d_v are the horizontal and vertical distances, given by

d_h = l^L_S + l^U_L sin(q_L) + l^B_U cos(q_U − q_L)    (2.13)
d_v = l^U_L cos(q_L) + l^B_U sin(q_U − q_L)    (2.14)

The complete homogeneous transformation between the base frame and the wrist frame is given by

[p_bs; 1] = T^wr_bs [p_wr; 1] = [R^wr_bs  d^wr_bs; 0  1] [p_wr; 1]    (2.15)

where R^wr_U = I because the relation between frame U and the wrist frame is only a translation along the positive z-axis of frame U.
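Equations (2.12)-(2.14) can be checked numerically with a few lines of C++. The link lengths and the function name below are illustrative assumptions, not the HP3L's actual dimensions.

```cpp
#include <array>
#include <cmath>

// Illustrative link lengths in metres (not the HP3L's real dimensions).
const double lSL = 0.15;  // horizontal offset from S to L
const double lLU = 0.60;  // lower arm, L to U
const double lUB = 0.55;  // upper arm, U to the wrist point P

// Wrist-frame center point in the base frame, following (2.12)-(2.14):
// the S servo only rotates the (d_h, d_v) plane about the base z-axis.
std::array<double, 3> wristPosition(double qS, double qL, double qU) {
    double dh = lSL + lLU * std::sin(qL) + lUB * std::cos(qU - qL);  // (2.13)
    double dv = lLU * std::cos(qL) + lUB * std::sin(qU - qL);        // (2.14)
    return {dh * std::cos(qS), dh * std::sin(qS), dv};               // (2.12)
}
```

With all angles zero the arm is in the base position of Figure 2.2, so d_h reduces to lSL + lUB and d_v to lLU, which gives a quick way to verify an implementation against the robot's drawings.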


The relation between the wrist frame and the tool frame is given by the following equations

p_wr = d^R_wr + R^R_wr p_R    (2.16)
p_R = d^B_R + R^B_R p_B    (2.17)
p_B = d^T_B + R^T_B p_T    (2.18)

The rotation of the tool frame in relation to the base frame and the position of the tool frame center point with respect to the base frame follow from (2.16), (2.17) and (2.18):

R^tool_wr = R^R_wr R^B_R R^T_B    (2.19)
d^tool_wr = R^R_wr R^B_R d^T_B    (2.20)

The complete homogeneous transformation between the wrist frame and the tool frame is given by

[p_wr; 1] = T^tool_wr [p_tool; 1] = [R^tool_wr  d^tool_wr; 0  1] [p_tool; 1]    (2.21)

Finally, the position of the TCP expressed in the base frame follows from (2.15) and (2.21):

[p_bs; 1] = T^wr_bs T^tool_wr [p_tool; 1]    (2.22)

or, equivalently,

p_bs = R^tool_bs p_tool + d^tool_bs    (2.23)
R^tool_bs = R^wr_bs R^tool_wr    (2.24)
d^tool_bs = d^wr_bs + R^wr_bs d^tool_wr    (2.25)

In Appendix B.1 the transformations are explained more thoroughly.

2.6.3 Inverse Position Kinematics

The inverse kinematics problem is, given a position and orientation of the TCP, to compute the corresponding joint angles. Compared to forward kinematics, where a unique closed-form solution always exists, this is a much more complex problem.

It is not certain that a solution to the inverse kinematics problem exists, for example if the desired position is beyond the reach limit of the robot, that is to say outside its workspace volume. If a valid solution exists it is not certain that it is unique. An example of multiple solutions can be seen in Figure 2.6, where there are four different configurations that reach the same position for the wrist frame center point, obtained by combining the binary decisions "forward/backward" and "elbow up/elbow down".

Figure 2.6. Multiple solutions

If the R, B and T servos are included there is an infinite number of solutions for reaching the desired position with the TCP. A simple way to solve this problem and minimize the calculations is to lock the tool frame at a certain angle by predefining the rotation matrix R^tool_bs, see Section 2.6.2. In that case the position of the wrist frame center point expressed in the base frame is given by

d^wr_bs = d^tool_bs − R^tool_bs d^T_B    (2.26)

The S joint angle q_S will then be

q_S = arctan(d^wr_{bs,x}, ±d^wr_{bs,y})    (2.27)

A positive d^wr_{bs,y} corresponds to the robot facing "forward" and a negative one to "backward". The horizontal and vertical distances d_h and d_v from the wrist frame center point to the center point of frame S are given by

d_h = sqrt((d^wr_{bs,x})^2 + (±d^wr_{bs,y})^2),    d_v = d^wr_{bs,z}    (2.28)

The cosine rule gives the angle q_U:

q_U = ±arccos( (d_h^2 + d_v^2 − (d^U_L)^2 − (d^wr_U)^2) / (2 d^U_L d^wr_U) )    (2.29)

A positive value of q_U gives the "elbow up" configuration shown in Figure 2.6 and a negative value gives an "elbow down" configuration. The last servo angle, q_L, is given by

q_L = arctan(d_h / d_v) − arctan( d^wr_U sin(q_U) / (d^U_L + d^wr_U cos(q_U)) )    (2.30)

The inverse position for the wrist uses R^tool_wr as an input. It is straightforwardly derived from the known rotation matrix R^tool_bs and the rotation matrix R^wr_bs, which can be calculated as soon as q_S, q_L and q_U are known:

R^tool_wr = R^bs_wr R^tool_bs    (2.31)

Two solutions exist, one when q_B is positive, called "non-flip", and one when q_B is negative, called "flip".
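The structure of (2.27)-(2.30) can be illustrated with a simplified planar inverse-kinematics sketch for the S, L and U joints. This is a self-consistent sketch under its own stated convention (angles measured from the vertical, zero S-L offset), so its sign conventions may differ from the thesis's; the link lengths and names are illustrative.

```cpp
#include <array>
#include <cmath>

// Simplified two-link inverse kinematics for the S, L and U joints.
// Angles are measured from the vertical and the S-L offset is zero,
// so this mirrors the structure of (2.27)-(2.30) but not necessarily
// the thesis's exact sign conventions. Link lengths are illustrative.
struct Joints { double qS, qL, qU; };

const double dLU = 0.6;  // L-to-U link length
const double dUW = 0.4;  // U-to-wrist link length

// Forward model used below for a round-trip check.
std::array<double, 3> forward(const Joints& q) {
    double dh = dLU * std::sin(q.qL) + dUW * std::sin(q.qL + q.qU);
    double dv = dLU * std::cos(q.qL) + dUW * std::cos(q.qL + q.qU);
    return {dh * std::cos(q.qS), dh * std::sin(q.qS), dv};
}

// Inverse: cosine rule for qU (cf. (2.29)), then qL (cf. (2.30)).
// elbowUp selects between the two mirrored solutions of Figure 2.6.
Joints inverse(const std::array<double, 3>& p, bool elbowUp) {
    Joints q{};
    q.qS = std::atan2(p[1], p[0]);                   // cf. (2.27)
    double dh = std::hypot(p[0], p[1]), dv = p[2];   // cf. (2.28)
    double c = (dh * dh + dv * dv - dLU * dLU - dUW * dUW) / (2.0 * dLU * dUW);
    q.qU = (elbowUp ? 1.0 : -1.0) * std::acos(c);    // cf. (2.29)
    q.qL = std::atan2(dh, dv)
         - std::atan2(dUW * std::sin(q.qU), dLU + dUW * std::cos(q.qU)); // cf. (2.30)
    return q;
}
```

Running the inverse solution back through the forward model and checking that the original joint angles are recovered is a practical way to catch a wrong branch or sign choice.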


2.6.4 Forward Velocity Kinematics

This problem is similar to the forward position kinematics problem, but the result is the velocity of the tool reference frame. To calculate the velocity, the joint velocities and the facts from Section 2.6.2 have to be known.

Given the joint angles q from equation (2.5), the joint velocities are

\dot{q} = [\dot{q}_S, \dot{q}_L, \dot{q}_U, \dot{q}_R, \dot{q}_B, \dot{q}_T]^T    (2.32)

The relation between the joint positions q and the position of the TCP is nonlinear, which can be seen in the transformation matrix T, but the relationship between the joint velocities \dot{q} and the TCP velocity in the base frame, v_bs, is linear: if one servo drives twice as fast, v_bs will be twice as high.

If the length of the axis, l, the rotation speed of the axis in point A, \dot{q}_A, and the velocity in the end point A, v_A, are known, it is possible to calculate the velocity in the other end point B, v_B, with

v_B = \dot{q}_A \times \vec{AB} + v_A    (2.33)

where

\|\vec{AB}\| = l    (2.34)

With the results from the forward position kinematics in Section 2.6.2, where the positions and transformations of the joints have been calculated, the velocity v_bs of the TCP in the base reference frame is found by recursion from the base frame to the wrist frame, knowing the joint velocities \dot{q} and the positions of the joints.
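The single propagation step in this recursion, equation (2.33), can be sketched as follows. Vectors are plain 3-tuples here for illustration; in practice a linear algebra library would be used.

```python
def propagate_velocity(omega_A, AB, v_A):
    """Eq. (2.33): velocity at link end point B given the angular velocity
    vector omega_A at point A, the link vector AB (with |AB| = l, eq. 2.34)
    and the linear velocity v_A at point A. A minimal sketch."""
    # cross product omega_A x AB
    cross = (omega_A[1] * AB[2] - omega_A[2] * AB[1],
             omega_A[2] * AB[0] - omega_A[0] * AB[2],
             omega_A[0] * AB[1] - omega_A[1] * AB[0])
    # v_B = omega_A x AB + v_A
    return tuple(c + v for c, v in zip(cross, v_A))
```

Applying this step joint by joint, from the base out to the wrist, accumulates the TCP velocity v_bs.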


Chapter 3

Backtracking

This chapter will present a more detailed problem description and discuss different solution methods for a safe retraction of the arm.

3.1 Problem Overview

The concept Backtracking is meant to be used in tight spaces or when the robot is controlled by a person without any knowledge of robot handling. The function has to be active at all times; it cannot depend on specific servo or alarm situations and it also has to work after an emergency brake. Whenever the robot has to be stopped during production, it must be possible to return the robot to the last passed safe home location by only pushing a button. When the retraction has been completed, the monitored job has to be reset to start on the job line where the retraction ends, so that the job can safely be restarted by pressing the play button on the PP. The function has to work on the existing NX100 control system with a minimum amount of outside modifications of the robot.

When using robots in the industry there is always a possibility that a collision may occur. The collision can be the result of careless programming or a change in the environment surrounding the robot. When a collision occurs on a Motoman robot, the emergency brakes are activated and the current to the servos is turned off to prevent the robot from damaging itself and the surrounding objects even more. When the collision alarm has been reset, the robot has to be retrieved from the collision area. This can be done manually or, as in most cases, using the existing retrieving function in the NX100. The existing retrieving function can be activated by pressing a back button on the programming pendant box, but this does not always ensure a safe retraction of the robot arm.

An industrial robot usually has several different tasks to perform, which can be totally independent of each other. For example, it lifts a box from a feeder band, puts some metal in a lathe and lifts another box off another feeder band, see Figure 3.2.


All these tasks are written in the same job file, named with labels, and do not have to be executed in chronological order. They might instead be controlled by external sensors that signal when a task is ready to be executed.

When a collision has occurred and the existing retraction function is activated, the system will execute the job in the reversed order by stepping line by line, see Figure 3.1. If the job is divided into small tasks, as described in the example above, and a collision occurs when performing the motions specified by task 2, the arm will be retracted correctly until it reaches the end of task 2. When it reaches the end of task 2 it will execute the motions in task 1 and this may cause a new collision, especially when it works in narrow spaces.

A Motoman robot consists of at least six axes, but it usually has some external axes that make it move along a working line or hold on to its work piece. To ensure a safe retraction, the arm and the external axes must be retracted in a synchronized way to avoid further collisions.

Figure 3.1. Using the back button

In some cases there are several robots that are synchronized and working together, which means that they move around each other while performing different tasks. If one of the robots collides, with an object or another robot, all the robots have to be retracted in a synchronized way or a new collision may occur.

3.2 How to retrieve a safe way home

The first thought about how to solve this problem was to use external sensors and a sensor fusion algorithm to guide the robot back, as described in [16] and [5], but


Figure 3.2. Retracting arm

during the work we realized that it was a complex problem to create a general solution using only sensors, as further explained below.

3.2.1 Using external sensors

An industrial robot has a specific motion pattern compared to a mobile robot, because it is always connected to the ground. This means that when the head has passed an object the danger is not over, because no point along the six axes may pass through the collision area.

The conclusion was that the whole arm had to be covered by a sensor-shield to guarantee total collision avoidance when the arm was retracted.

One method is to let a vision system monitor the arm and control the movement. This approach demands that the vision system monitors the whole arm at all times, which might not be possible if the arm is working inside an object or in tight spaces.

Another method is to use sensors mounted on the robot arm, covering the surfaces of all axes, to avoid collisions when retracting the arm. This would however mean major modifications of the robot and could be an expensive task, which would make it hard to promote on the market as a selling product.

Because of the reasons mentioned above, and because Motoman wished to have a simple solution that would not demand major modifications of the robot, none of these methods seemed suitable for solving the retraction problem.

Instead the method described in the next section was chosen. It is a more general solution and works on all NX100 systems without any modifications of the robot.


3.2.2 Path recording

By sampling the motion of the robot, a collision free path can be retrieved. This ensures a safe retraction of the arm and does not demand any modifications outside the NX100 control system.

To ensure a collision free path, the sample rate has to be high enough; otherwise the arm might collide with additional objects when retracted, because it starts to diverge too much from the original path, see Figure 3.3. If the sample rate is too high when calling for pulse information, the fetching will take too much of the CPU in the NX100, so the rest of the system will fail and the system will be stopped.

Figure 3.3. Bad Sampling

To be able to store a large amount of information from the sampled robot motion, a circular buffer was introduced. The advantage with a circular buffer, instead of a dynamic buffer, is that if the robot keeps on going, a dynamic buffer would keep on growing and would cause a memory overflow after a while. This problem never occurs with a circular buffer, because when it is full it simply overwrites the oldest data. The disadvantage is that if the circular buffer is too small it cannot bring the robot all the way back to the safe home location, but a safety measure has been implemented: when a pulse is loaded into the buffer it is also written to a text file. The text file allocates less memory than the buffer to store the same amount of data and can be used to retract the arm if the buffer has been overwritten. This also makes it possible to use the software for executing text files created on an external unit.
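The recording scheme described above can be sketched as follows. This is a minimal illustration of the circular buffer with a text-file safety copy; the class and method names are assumptions, not the actual NX100 module API.

```python
from collections import deque

class PulseRecorder:
    """Sketch of the recording buffer: a fixed-size circular buffer of
    sampled servo pulses. When full, the oldest sample is overwritten,
    so retraction is always possible along the most recent part of the
    path. Names and the spill-to-file behaviour are assumptions."""

    def __init__(self, size, logfile=None):
        self.buffer = deque(maxlen=size)  # deque drops the oldest item when full
        self.logfile = logfile

    def record(self, pulses):
        if self.logfile is not None:      # safety copy, survives buffer overwrite
            self.logfile.write(",".join(str(p) for p in pulses) + "\n")
        self.buffer.append(tuple(pulses))

    def retraction_path(self):
        """Samples in reverse order: last recorded position first."""
        return list(reversed(self.buffer))
```

With a buffer of size 3, recording four samples keeps only the last three, and `retraction_path()` returns them newest first.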

Another possibility is to send the data to a memory located on an external unit instead of to a text file. This increases the amount of data that can be stored, so a data overflow will never occur. Storing data on an external unit has not been tested thoroughly and is more of a future work; the preparations have been done and it would not require much work to implement.

3.2.3 Test bench

Our first idea was to use an existing function in the API for the NX100 to generate the motion of the arm. By sending the sampled pulses to the function, the desired retraction path would be executed. It is hard to make error diagnosis on the NX100 control system once the software is loaded, so a test bench needed to be developed and placed on an external unit. In the test bench the backtracking software could be tested and an error diagnosis of the created software became possible.

Figure 3.4. System overview of the test bench

To be able to load pulses from the robot to the buffer and evaluate the solution method, a communication protocol was needed. In the beginning a manual was used which described the communication between the NX100 and an external unit. This manual did not however contain all the necessary information for developing our own communication with the robot. To retrieve the necessary information, the communication between the NX100 and the program MotoSim was monitored using a software called SNIFFER. When the information had been retrieved, the communication protocol was developed. In the resulting test bench it is possible to perform motions of the robot that have been stored in the buffer. As seen in Figure 3.5, the test bench consists of the following main functions:

• Start: The pulses from the robot are retrieved and stored in the buffer.
• Stop: Stops the sampling of the motion.
• Read: Displays the content of the buffer.
• Execute Forward: Executes the motion specified by the pulses stored in the buffer. The motion is executed from the first sampled pulses up to the last sampled pulses.
• Execute Backward: Executes the motion specified by the pulses stored in the buffer. The motion is executed from the last sampled pulses up to the first sampled pulses.

Figure 3.5. The test bench

When we started to execute the motion stored in the buffer, we discovered that the motion became very unsteady: the arm moved towards the point specified by the first loaded pulse, stopped, then started to move towards the point specified by the second loaded pulse, and so on. Because the sampled path consists of a large number of pulses, the solution of using the existing function in the API to perform the motion did not give a satisfactory result. Instead another approach was used, which is described in Section 3.3.


3.3 How to get a steady motion

There is an existing API in the NX100 which provides an interface for sending a move instruction to the robot from a module inside the NX100 or from an external unit. The problem is that it is only possible to send one instruction at a time, which means that the robot will execute one instruction, then stop and wait. The result is an unsteady motion when trying to execute several move instructions.

To be able to generate a smooth motion for the robot, the NX100 system reads 4 instructions ahead in a job file so it can optimize its motion, see Chapter 2. By letting the software create a job file and load it into the NX100, this advantage could be used.

Two different solutions that generate a steady motion have been developed. One is to let the software construct a job file which uses the P-variables, see Chapter 2, to execute the motion, and then load the buffered data into the P-variables. Because of limitations in the NX100, the P-variables can only be modified before the robot initiates the job. After the initiation, the only way to modify the P-variables is to update the D-variables from the buffer and use the SETE instruction. By continuously updating the D-variables from the buffer in every job cycle, the robot will modify the P-variables on its own.

A crucial point in this solution is to know when to update the D-variables from the buffer. If the updating is too slow, the same set of D-variables will be executed again and the same motion will be repeated. If this occurs, the arm will move the shortest distance from the last to the first position specified by the D-variables. This may cause an additional collision, see Figure 3.6.

Figure 3.6. The effect of the path when the update is too slow

If the D-variables instead are updated too fast, there will be a gap in the motion information and the arm will skitter, see Figure 3.7. This may also cause an additional collision.


Figure 3.7. The effect of the retraction path when the update is too fast

To solve this problem, communication between the executing job and the module was established. By using IO outputs on the robot, which both the module and the job can write to, the problem of updating the D-variables could be solved. This however led to some additional problems regarding the steady motion.

When the updating of the new position variables takes longer than the time for the job to complete its motion instructions, the job will stop and wait for the data transfer to be completed. The only way to avoid this stop is to increase the time for the job to execute its motion instructions. There are two ways to accomplish that: either decrease the motion speed or decrease the sample rate. The problem with decreasing the motion speed is that it does not guarantee that the stop will not occur, but by creating a dynamic sampling algorithm the problem could be solved.

The algorithm is based on knowledge of the current executing job speed and the speed of the future retraction, see equations (3.1)–(3.6). Knowing these, it is possible to make sure that the time to execute the retraction instructions exceeds the time for uploading the new position variables.

The dynamic sampling algorithm also gives the benefit of optimizing the sample rate. A slower speed in the monitored job does not require the same sample rate to guarantee a safe retraction of the arm, so by decreasing the sample rate when possible, memory can be saved.

The distance between two sample points with sampling time t in the monitored job is given by X[t]. The value of X[t] is predicted from the information at t − 1, given by

X[t] = T_s[t − 1] V_o[t − 1]    (3.1)

where T_s is the sample rate and V_o is the velocity of the TCP. It is only possible to receive the joint velocities from the NX100, so V_o had to be calculated with forward velocity kinematics, see Section 2.6.4.

The equation for the distance between the two sample points in the created job is given by

X[t] = T_move[t − 1] V_b    (3.2)

where T_move is the time it takes to move from point i to point i + 1 and V_b (constant) is the velocity between the points.

The time T_load, which is the time it takes to update the new positions from the buffer to the D-variables, cannot exceed T_move or a stop will occur. T_load was measured to 10 ms/pulse. This gives the following condition

T_move[t] > T_load    (3.3)

where T_load is given by

T_load = 10 N_NrOfAxis    (3.4)

By using equations (3.1), (3.2), (3.3) and (3.4), the minimum sample time is given by

T_s[t] > \frac{10 N_{NrOfAxis} V_b}{V_o[t]}    (3.5)

The equation used in the software is

T_s[t] = \frac{10 N_{NrOfAxis} V_b}{V_o[t]} + K    (3.6)

where a safety margin K has been added. The sample rate is thus updated in every sampling cycle by calculating V_o and using equation (3.6).

The other solution to the problem of steady motion is to store all the sampled data from the buffer in the C-variables, see Chapter 2, on the NX100. The C-variables are initiated before the job is executed, and this guarantees an entirely smooth motion with none of the disturbances mentioned above.

It is possible to store the sampled data in up to 15000 C-variables, depending on how much of the memory is occupied. This means that it is more than unlikely to run out of C-variables.

This solution does however not support some of the features described in Section 4.1, like the possibility of real-time modification of the position variables.

3.4 Environmental change

When retracting the arm, environmental changes must also be considered. If the robot picks up an object and a collision occurs while moving the object, the stored path from the collision point back to the picking point will be safe. However, if the object is not released at the picking point, the path from the picking point to the safe home location may not be safe, because the object that is still attached to the robot may cause a collision. To solve this problem, the values of the IO-ports that decide the different environmental settings have to be saved as well. If the IO-ports that control the environmental changes are known, they can also be saved in the buffer. The stored environmental changes are analyzed before the job file is created. When a change in the environment occurred between two stored positions in the buffer, a job instruction called DOUT is inserted along with the address of the IO-port and the new binary value.
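The analysis step can be sketched as a scan over the recorded samples. This is an illustration only: the samples are assumed to be (pulses, IO-state) pairs, and the emitted text lines mimic, but are not, the exact INFORM job syntax.

```python
def insert_dout_instructions(samples):
    """Sketch of the environment handling: given recorded samples as
    (pulses, io_state) pairs in recording order, emit a reversed
    retraction listing where a DOUT line restores an IO-port whenever
    its recorded state changed between two stored positions."""
    lines = []
    rev = list(reversed(samples))          # retraction runs newest-to-oldest
    for i, (pulses, io_state) in enumerate(rev):
        lines.append("MOVJ %s" % (pulses,))
        if i + 1 < len(rev):
            older_io = rev[i + 1][1]       # IO state at the next (older) sample
            for port, value in older_io.items():
                if io_state.get(port) != value:
                    # restore the older IO state before moving on
                    lines.append("DOUT OT#(%d) %d" % (port, value))
    return lines
```

For a gripper port that went from 0 to 1 between two samples, the reversed listing gets a DOUT setting it back to 0 at the corresponding point of the retraction.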

This environmental change handling is only possible when the Backtrack module uses the C-variable method, because the solution for the other method became too complex.

3.5 Restore the monitored job

When the function is activated, the current job information of the monitored job is retrieved and stored. The information consists of the job name and the current job line being executed. When the retraction has been completed, the monitored job is reloaded and reset to the job line where the retraction ends, so that the job can safely be restarted by pressing the play button on the Programming Pendant box. If this is not done and the operator presses the play button, the arm will travel the shortest distance to the motion specified by the job line where the collision occurred, and this may cause further collisions.

The solution method using D-variables allows the operator to stop the retraction of the arm by pressing the hold button on the Programming Pendant box. When the hold button is pressed, the software retracts the arm to the closest job line in the monitored job and reloads the job. This is accomplished by connecting the loaded pulse in the buffer to the corresponding job line in the monitored job. When the software detects that the hold button has been pressed, pulses are loaded into the executed job file until it reaches the closest job line.


3.6 INI-file

The INI-file makes it possible to modify the backtracking software based on the application and the model of the robot. When the NX100 control system is activated, it sets the parameters of the software based on the information provided in the INI-file, see Figure 3.8. The INI-file provides the following information:

• Buffer: Specifies the size of the buffer.
• Address for the triggering IO-port: Specifies the IO-port which will be used to trigger the retraction.
• Address for the sampling triggering: Specifies the IO-port which will be used to trigger the sampling of the motion.
• Control parameters: Specifies the IO-ports which control different environmental settings.
• Control group: Specifies the robot and the external axes that will be used in the application.
• Robot information: Length of axes and servo limits.

Figure 3.8. System Overview of the Backtrack module

3.7 Limitations

A safe retraction of the arm cannot always be guaranteed. For example, if an object has been picked from an area and a unit that the NX100 does not control places a new object in that area, the arm will collide with the new object when it tries to put back the object it carries. If the backtracking function is to be used, it is up to the operator who creates the job file to decide how far the arm is to be retracted when a collision occurs.


Chapter 4

Future use of the backtracking software

The creation of the backtracking software gave new possibilities for manipulating the motions of the robot. We investigated the possibilities of using the software for more than just retracting the arm to a safe zone. This chapter presents the results of this investigation.

4.1 Real-time motion control

This section describes a method for controlling a Motoman robot in real-time by combining the backtracking function with an external unit. This new possibility to control the motion became the foundation for the cognitive system in Section 4.2.

One of the motion functions created in the backtracking software has opened the possibility to create real-time motion control. The main approach of the function is that sets of X path points, see Figure 4.1, where one path point consists of one pulse for each axis of the robot, are loaded from the buffer into a created job on the NX100 in every cycle, see Section 3.3.

Figure 4.1. How the path consists of pulse sets

If the buffer consists of pulses that generate route A, it is possible to replace pulses in the buffer so that the new pulses generate route B instead, see Figure 4.2. By doing this, a form of real-time motion control is achieved.

To avoid disturbances in the motion, the time to execute the set of path points in every cycle has to be longer than the upload time for the next cycle's set of path points. If this is not the case, the robot will stop and wait until the path points have been uploaded. The distance the arm has to travel before the change of the path is executed is determined by the sample distance between the path points representing the current path and the number of path points used in each set, see Figure 4.3.

Figure 4.2. Real-time control

If the sampling distance between the path points is small and only one point is used to represent each set, the distance traveled before the path is changed will be shorter. However, to avoid the disturbance mentioned above, the speed V_s of the executing job has to be limited to fulfill (4.1), which leads to a slower motion of the arm. If a higher speed is wanted, the distance between the sample points has to be enlarged, at the cost of a longer motion distance before the path is changed. There will also be a small disturbance due to the restart of every cycle. This will always occur, and the only way to minimize its appearance when executing a path is to reduce the number of cycles needed to reach the end point of the path. This can be done by using a larger sample distance between the path points, reducing the total number of points in the path, and/or using more path points in each set. This gives a smoother motion but a loss in motion control.

T_{TimeOfMotion} > N_{NrOfAxis} N_{NumberOfPathPoints} [ms]    (4.1)

T_{TimeOfMotion} = \frac{X}{V_s}    (4.2)

X is the traveled distance when the set n, shown in Figure 4.4, is executed.

Figure 4.3. The distance traveled before the path is changed depending on the number of path points used in each set.

Figure 4.4. A set consisting of one path point

If the goal is only an end point, without a desired path of how to reach it, there is another solution which gives a more direct real-time motion control. If the original path only consists of a start point and an end point, the path can be changed by just updating the end point. The job is designed with only one motion instruction with a constraint which is controlled by an IO-signal.

When the software receives a new end point it updates the D-variables, and when the update is completed the IO-signal goes high. The current move instruction is aborted, the job starts over in a new cycle and the arm starts to move towards the new end point, see Figure 4.5. This leads to no limitations of the speed between the points, because the path is changed as soon as the update is completed.


Figure 4.5. Change of end point

4.2 Pathfinder

By using the backtracking software to create a buffer and a job file, which is updated using D-variables as described in Section 3.3, the motion of the arm can be changed during the execution of the job by loading new data into the buffer. The new data is loaded into the D-variables and the new motion is executed in the next job cycle. This makes it possible to develop software solutions for applications that demand an autonomous change of the arm's motion while the job is being executed.

This section describes a method for how the robot can learn from its mistakes and generate a collision free path using the backtracking software. By using a vision system, a more advanced decision method can be used to find a path around the collision area, which is discussed in Chapter 7.

4.2.1 Basic-Pathfinder

The basic pathfinder function is used to generate a collision free path from point A to point B and back to point A, which can be loaded into the NX100 as a job file. The assumption is that the robot is positioned and moved as shown in Figure 4.6, and that a safe way can be retrieved by moving over the objects. By adding more points [A, B, C, D, . . . ] when initiating the pathfinder, constraints can be given as certain points that have to be passed when traveling from point A to D.


Figure 4.6. Collision free path

A linear path consisting of S_{A−B} samples is generated between point A and point B. The linear path is then stored in the buffer, shown as the black line in Figure 4.7, and the backtracking software creates a job file, consisting of D-variables and MOVL instructions, and loads it into the NX100. The generated job file will read the stored data in the D-variables as the Cartesian coordinates and the rotation of the TCP, expressed in the base frame, instead of pulses. A part of the path in the buffer is loaded into the D-variables and the robot starts to move along the path between the two points A and B.

When a collision occurs, the current position of the TCP, expressed as the Cartesian coordinates and the rotation of the TCP, is retrieved from the NX100. The arm is then retracted by loading a part of the path, in reversed order, from the buffer into the D-variables and executing the job. When the arm has been retracted, the path from the current position to point B is abandoned. Instead an alternative path from the current position to point B is calculated, using the following two steps.

• The path from the current position to the collision point is calculated by moving the collision point in the positive z-direction of the base frame and generating a linear path between the two points, consisting of S_{R−C} samples, which are stored in the buffer, shown as the green line in Figure 4.7.
• The path from the collision point to the end point B is a generated linear path consisting of S_{C−B} samples between these two points, which are stored in the buffer, shown as the blue line in Figure 4.7.

This procedure is repeated until the arm reaches the end point B.

To ensure that the arm does not get stuck in an endless loop when trying to reach the modified collision point, the number of path points used to retract the arm is based on the location of the collision point. If the retraction point is located as shown in Figure 4.8, the robot will not be able to move over the object unless the arm is retracted to the second retraction point. The retraction condition is given by

\sqrt{(x_c − x_r)^2 + (y_c − y_r)^2} > D    (4.3)

Figure 4.7. Modified path

where D is a pre-defined minimum distance and (x_c, y_c) and (x_r, y_r) are the coordinates of the collision point and the retraction point. If the condition is not satisfied for the chosen retraction point, a new retraction point is chosen until the condition is fulfilled. When the condition is fulfilled, the path from the collision point to the chosen retraction point is loaded into the D-variables and the job is executed.

Figure 4.8. Retraction based on the distance to the collision point

However, problems may occur when retracting the arm from the collision point. If the arm has collided with a flexible object, some of the passed sample points may not be valid as part of the collision free path, see Figure 4.9. The arm may have managed to pass the object without triggering the collision alarm, but when another collision occurs it might collide with the object when the arm is being retracted, see Figure 4.9. The sensitivity of the collision sensors can be modified, but if they are made too sensitive the collision sensor may be triggered simply by the motion of the arm. Another solution to this problem is to insert a new retraction point on the line between the two collision points. The point is then loaded into the D-variables and the arm is moved to the new point; Figure 4.9 shows the modified path. The insertion of the new point is only done if there is no sample point between the two collision points. If the collision occurs


when the arm is being retracted to a second retraction point, as shown in Figure 4.10, the arm will retract to the first retraction point and modify the second retraction point. When the arm has successfully reached the modified second retraction point, the new path to the original collision point is generated and executed.

Figure 4.9. Insertion of retraction point

When the arm has reached the end point B, the path may still cause collisions when it is executed from point B to point A. If the arm has collided with a flexible object and managed to pass it without triggering the collision sensors, and none of the above situations have occurred, the path may cause a collision when moving from point B to point A, see Figure 4.11. Therefore the path is executed from point B to point A, and if a collision occurs the arm is retracted to the closest passed sample point and the next path point in the motion direction is modified, as shown in Figure 4.11. If the modified path point is the point A, another point has to be inserted at the original location of point A, see Figure 4.12, to ensure that the arm ends its motion at the desired start point. When the safe path has been generated, it is processed to remove unnecessary points before it is used as a job. The path processing is described in Section 4.2.2. After the path has been processed, the remaining path points are used to create a job file consisting of C-variables, as described in Section 3.3. However, the created job file for this application will interpret the stored data in the C-variables as Cartesian coordinates and rotation of the TCP instead of pulses. The job is then loaded onto the NX100 and can be used as any other manually created job.

Figure 4.10. Modified second retraction point

Figure 4.11. Correction of generated path

In the basic pathfinder the arm may collide with the same object several times due to the simple path planning. However, if there were some constraints on the environment, a more advanced path planning could be used, or the basic pathfinder could be used to locate and estimate the size of the objects in the environment. This is discussed in Section 4.2.3.

4.2.2 Path processing

When the safe way around the objects has been generated, the path consists of a number of points. When creating a job file consisting of MOVL instructions, the path points that describe a change in the motion direction are the only ones needed for the future job, see Figure 4.13. By looking at how the gradient changes between the points in the path, it can be determined where in the path the information is redundant and can be removed.

Figure 4.12. Modified A point

The algorithm is

\nabla g_{norm} = \frac{(x_2 − x_1, y_2 − y_1, z_2 − z_1)}{\|(x_2 − x_1, y_2 − y_1, z_2 − z_1)\|}, \quad \nabla f_{norm} = \frac{(x_3 − x_2, y_3 − y_2, z_3 − z_2)}{\|(x_3 − x_2, y_3 − y_2, z_3 − z_2)\|}    (4.4)

with the constraint

if (\nabla g_{norm} = \nabla f_{norm}) remove point 2    (4.5)

where \nabla g_{norm} and \nabla f_{norm} are the normalized gradients between the points. By stepping through the path points and removing points, a robot job without unnecessary information can be generated.
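The algorithm above can be sketched as follows. A numerical tolerance replaces the exact equality in (4.5), since the normalized gradients of sampled points rarely match bit-for-bit; the function name and tolerance value are assumptions.

```python
import math

def simplify_path(points, tol=1e-9):
    """Sketch of eqs. (4.4)-(4.5): drop every path point whose normalized
    direction to the next point equals (within tol) the normalized
    direction from the previous one, keeping only the points where the
    motion direction changes. Points are (x, y, z) tuples."""
    def unit(a, b):
        d = tuple(bi - ai for ai, bi in zip(a, b))
        n = math.sqrt(sum(c * c for c in d))
        return tuple(c / n for c in d)

    if len(points) < 3:
        return list(points)
    kept = [points[0]]
    for prev, cur, nxt in zip(points, points[1:], points[2:]):
        g, f = unit(prev, cur), unit(cur, nxt)        # eq. (4.4)
        if any(abs(gi - fi) > tol for gi, fi in zip(g, f)):
            kept.append(cur)                          # direction change: keep point 2
    kept.append(points[-1])
    return kept
```

A straight run of samples collapses to its two end points, while every corner point survives, which is exactly what the MOVL job needs.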

Figure 4.13. Resampling

4.2.3 Object Estimation

When a collision occurs using the basic pathfinder, the collision point is retrieved and stored, and the robot tries to move over the object by planning a linear path. If the objects are pre-defined as spheres, the collision points can be used to estimate the location and size of the object to improve the path planning. When the arm
