DOCTORAL THESIS

Department of Computer Science and Electrical Engineering
EISLAB

Laser On Kinetic Operator

Håkan Fredriksson

ISSN: 1402-1544
ISBN: 978-91-7439-157-2

Luleå University of Technology 2010


Laser On Kinetic Operator

Håkan Fredriksson

EISLAB

Dept. of Computer Science and Electrical Engineering
Luleå University of Technology

Luleå, Sweden

Supervisors:

Prof. Kalevi Hyyppä and Dr. Jan van Deventer

European Union Structural Funds


Printed by Universitetstryckeriet, Luleå 2010
ISSN: 1402-1544
ISBN: 978-91-7439-157-2
Luleå 2010

www.ltu.se


To my family


Abstract

This thesis highlights a variety of aspects within the field of mobile robots. The hinge is the system architecture for mobile robots: for a mobile robot to be put into operation, all necessary systems have to be functioning and control mechanisms have to be established.

The robot may be set to operate autonomously as well as remotely controlled. Either way, different sensors and algorithms may be needed for navigation, obstacle detection, and other functionality. In this thesis, a methodology to design a flash for illuminating retroreflective beacons is proposed. The flash is used in a navigation system based on CMOS cameras. Furthermore, a method to create a 3D environment model from 2D range data is presented. The result is used in a vehicle dynamics simulation software.

Two different methods for a vehicle to find its way in an environment are proposed. The first method is intended for use in a winter environment. The idea is to make the vehicle drive between the snow banks on the sides of a road, created when the road is cleared with a snow plough. The snow banks are detected using an algorithm working on data from a range-measuring laser. The second method finds the free space where the vehicle may move without running into any obstacle.

Vehicle modelling and control are also discussed. The problem of steering a tractor that is to operate on a rough and slippery surface is studied, and a method to compensate for the slippage of the front wheels is presented. An implemented and tested kinematic model for a teleoperated track loader is also presented. Finally, a method for smooth path-following is suggested. The method utilises motion control to force the vehicle to behave smoothly, combined with continual re-routing to, if possible, let the vehicle "cut corners" and take the shortest possible path.

The overall goal, to put a mobile robot into operation, may be reached by combining the different aspects discussed in this thesis.


Contents

Part I

Chapter 1 – Introduction
    1.1 Outline
    1.2 Overview

Chapter 2 – Laser On
    2.1 Sensors
    2.2 Algorithms

Chapter 3 – Kinetic Operator
    3.1 Modelling
    3.2 Control

Chapter 4 – Summary
    4.1 Papers
    4.2 Discussion

References

Part II

Paper A – Multi source flash system for retroreflective beacon detection in CMOS cameras
    1 Introduction
    2 Flash system equations
    3 Implementation
    4 Conclusions and discussion
    5 Future work

Paper B – Range Data in Vehicle Dynamic Simulation
    1 Introduction
    2 Approach
    3 Results
    4 Conclusion

Paper C – snowBOTs: A Mobile Robot on Snow Covered Ice
    1 Introduction
    2 The snow edge detection method
    3 Test area and equipment
    4 Closing the loop
    5 Laser measurements during snowfall
    6 Conclusion and discussion
    7 Future work
    8 Acknowledgement

Paper D – Circle Sector Expansions for On-Line Exploration
    1 Introduction
    2 The Circle Sector Expansion Method
    3 Implementations and Tests
    4 Conclusions

Paper E – Gyro feedback of a hydraulic steering system
    1 Introduction
    2 Method
    3 Experimental results
    4 Conclusion
    5 Future work

Paper F – Track loader kinematics
    1 Introduction
    2 Track loader
    3 Kinematics
    4 Tests
    5 Ongoing and Future work

Paper G – Smooth Path-Following for Autonomous Vehicles
    1 Introduction
    2 The Method
    3 Motion Constraints
    4 Path-following controller
    5 Simulation Results
    6 Conclusion
    A Side Acceleration
    B Stability analysis

Acknowledgments

Many are the people I would like to thank for making my work with this thesis possible. I start with my supervisors, Prof. Kalevi Hyyppä and Dr. Jan van Deventer, for support and guidance, and for hiring me in 2005 after my Master thesis work. I wish to thank all my co-authors; without you the included papers would not be what they are. Though the discussions have sometimes been painfully long, especially when close to deadlines, the results have always been great. Thanks: Kalevi Hyyppä, Ulf Andersson, Tomas Berglund, Sven Rönnbäck, Mikael Nybacka, Fredrik Broström, and Åke Wernersson. Many thanks to my brother-in-law Brian Cournoyer for the ideas and discussion regarding the title of the thesis. It was a huge relief when I finally realised how all my work could be structured to fit the title. Now it is obvious; there could be no other way. Thanks also to all my friends and colleagues at the university. It has been a pleasure discussing both work and non-work related matters. Finally, I would like to thank my family: my wife Karolina and my kids Cornelia and Elvira.

The work in this thesis was funded by ProcessIT Innovations, Centre for Automotive Systems Technologies and Testing (CASTT), and Centrum för Medicinsk Teknik och Fysik (CMTF).


Part I


Chapter 1 – Introduction

Figure 1.1: Laser On Kinetic Operator.

1.1 Outline

This thesis summarises my experience in the wide area of mobile robotics. Figure 1.1 outlines the content, Laser On Kinetic Operator: a laser based measurement system, a SICK LMS111, facing a Kinetic Operator, a Caterpillar 973c track loader.


The first part of the title, Laser On, is interpreted as perception of the environment. That is, to assign a meaning to what you sense. To sense, you need a sensor. To assign a meaning, you need algorithms. The Laser sensor may either be looking On the Kinetic Operator, or be mounted On board it.

A metaphrase of Kinetic Operator is a worker with motion. In this thesis, the worker is a robot that moves in an environment: a Mobile Robot. A mobile robot is useless without control, and control is most often easier if the robot is properly modelled.

1.2 Overview

The thesis is divided into two parts. Part I gives an introduction to my research and the included papers. Part II consists of seven scientific papers. Papers A–F are published, and Paper G will be published.

In Part I, I first connect Papers A–D to Perception (Laser On). After that, I relate Papers E–G to the area of Mobile Robots (Kinetic Operator). At the end, I summarise and give my personal reflections on each paper, followed by a discussion on interesting future areas of research.


Chapter 2 – Laser On: Perception

Sensors and Algorithms

Laser On; Perceive the environment

2.1 Sensors

This thesis distinguishes and gives an introduction to two different types of perception sensors: beacon sensors and ranging sensors. In Figure 2.1 both types of sensors can be seen mounted on the MICA wheelchair [1, 2]. On top of the pillar in the rear part of the wheelchair, two different beacon sensor systems are mounted. On the table in the front of the wheelchair, a ranging sensor can be seen. All these sensors are described in more detail later in this chapter.

2.1.1 Beacon Sensor

A beacon sensor is designed to detect only a specific type of object (beacons) in the environment. Depending on the area of use, the system (beacons and sensor) is designed in such a way that the beacons can easily be detected while the risk of confusing beacons with other objects is reduced. The raw sensor output from a beacon sensor is the angle and/or distance to one or more beacons in the environment. A common use of beacon sensors is navigation, i.e. to find out where you are.

Retroreflective Beacons

One navigation system that uses artificial beacons as references is the NDC8 system, also known as LazerWay. This system was invented at Luleå University of Technology [3] and is today produced by Kollmorgen [4]. The navigation system is used in a variety of industrial applications, including the automated LHD (Load Haul Dump) vehicles in the LKAB underground mine in Kiruna, Sweden.


Figure 2.1: MICA – a computerised wheelchair with self-driving capability. In the rear end of the wheelchair, on top of the pillar, two beacon sensor systems are mounted. In the front, on the table, a ranging sensor is mounted.

To detect the artificial beacons, the NDC8 system uses a laser scanner. The scanner is visible on top of the wheelchair in Figure 2.1. When a sufficient number of beacons are detected, the vehicle position and heading can be found by comparing the detected beacon positions to the known beacon map. The NDC8 system can estimate the position of a vehicle with an uncertainty of a few centimetres.
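The map-matching step can be made concrete with a small example. Once angles and distances to a few identified beacons are known, the beacon positions can be expressed in the vehicle frame, and comparing them with the beacon map reduces to a least-squares rigid alignment. The Python fragment below is a minimal sketch of that idea under idealised assumptions (beacon identities known, no outliers); it is not the NDC8 algorithm, and all names are illustrative.

```python
import numpy as np

def pose_from_beacons(obs_xy, map_xy):
    """Least-squares rigid alignment of observed beacons to a beacon map.

    obs_xy: (n, 2) beacon positions measured in the vehicle frame.
    map_xy: (n, 2) the same beacons' known positions in the map frame.
    Returns the vehicle pose (x, y, heading) in the map frame.
    Needs at least two matched beacons.
    """
    obs_c = obs_xy - obs_xy.mean(axis=0)        # centre both point sets
    map_c = map_xy - map_xy.mean(axis=0)
    # SVD of the 2x2 cross-covariance gives the optimal rotation (Kabsch).
    u, _, vt = np.linalg.svd(obs_c.T @ map_c)
    d = np.sign(np.linalg.det(vt.T @ u.T))      # guard against reflections
    rot = vt.T @ np.diag([1.0, d]) @ u.T
    t = map_xy.mean(axis=0) - rot @ obs_xy.mean(axis=0)
    return t[0], t[1], np.arctan2(rot[1, 0], rot[0, 0])
```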

A CMOS camera based navigation system using the same retroreflective beacons as the NDC8 system has been presented at Luleå University of Technology [5]. The system is intended for short range (less than 20 m) indoor use, and consists of four individual camera modules mounted perpendicular to each other. This system is shown in Figure 2.1, on top of the pillar in the rear part of the wheelchair. Detection of beacons is done in hardware in each camera module. The outputs are angles and distances to the detected beacons. Paper A describes the flash constructed to improve the detection of beacons in the system.

Since a navigation system using retroreflective beacons requires infrastructure to be installed in the environment, it is only suitable for use in known environments. The system cannot detect obstacles, nor can it traverse areas without retroreflective beacons. If the system is used in an environment where the driving surface is rough, i.e. where the vehicle and hence also the sensor wiggles, it may be hard to detect beacons at large distances. This is, however, less of a problem for a camera based navigation system, since the cameras have a wide vertical field of view.

Satellite Beacons

The most common Global Navigation Satellite System (GNSS) is probably NAVSTAR GPS, developed and maintained in the USA. This system uses satellites as references (beacons) in the navigation process. Galileo is another GNSS, under development by the European Union (EU). Furthermore, Russia has a system under restoration called GLONASS.

A NAVSTAR GPS receiver estimates its position by measuring distances to four or more medium Earth orbit satellites [6]. The uncertainty of the position estimate is, for a standard civilian GPS receiver, on the order of ±5 m. It is possible to increase the accuracy of the position estimate in several different ways. One way is to use differential GPS [7]. Another is to fuse GPS data with data from an Inertial Measurement Unit (IMU) [8].
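As a side note on why at least four satellites are needed: the receiver solves for three position coordinates plus its own clock bias. The sketch below is a minimal iterative least-squares fix under idealised assumptions (pseudoranges already corrected for satellite clock and atmospheric errors); it is illustrative, not a description of any particular receiver.

```python
import numpy as np

def gps_fix(sat_pos, pseudoranges, iters=10):
    """Iterative least-squares position fix from satellite pseudoranges.

    Four unknowns are solved for: receiver x, y, z and the clock-bias
    term b (expressed in metres), which is why at least four satellites
    are needed. sat_pos is (n, 3) in an Earth-fixed frame, n >= 4.
    """
    x = np.zeros(4)                                # [x, y, z, b]
    for _ in range(iters):
        diff = sat_pos - x[:3]                     # receiver -> satellite
        rho = np.linalg.norm(diff, axis=1)         # geometric ranges
        predicted = rho + x[3]
        # Jacobian: negated unit line-of-sight vectors, d(pred)/db = 1.
        h = np.hstack([-diff / rho[:, None], np.ones((len(rho), 1))])
        dx, *_ = np.linalg.lstsq(h, pseudoranges - predicted, rcond=None)
        x += dx
    return x[:3], x[3]
```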

In Paper B a NovAtel GPS/IMU system is used, in combination with a range-measuring laser, to measure the 3D environment of a go-kart track. The 3D environment shown in Figure 2.3 was created using this measurement system.

2.1.2 Ranging Sensor

A ranging sensor measures distance and angle to obstacles in the environment. Hence, it may also be used as a type of beacon sensor, if the beacon is designed in such a way that it may be recognised and distinguished in the environment. Depending on the type of sensor, some materials may be problematic to detect. As an example, an optical sensor may have problems detecting glass windows and mirrors.

A ranging sensor is very versatile since it may be used for several different applications at the same time, for instance mapping, obstacle avoidance, and navigation. However, in order to produce something useful out of the raw data, some algorithms have to be implemented.

A laser range finder, also called a laser scanner, is a ranging sensor. The laser measures distances to objects in its environment. A common laser scanner is the SICK LMS200 [9], seen mounted in the front of the wheelchair in Figure 2.1. This laser scanner measures distances (up to 80 m) to objects in a plane with a 180° field of view. Another example of a similar type of laser scanner is the SICK LMS111 seen in Figure 1.1. That laser has a wider field of view (270°) but a shorter distance interval (up to 20 m).

2.2 Algorithms

Raw sensor data need to be processed in order to provide useful information. The work presented in Papers B, C, and D concerns algorithms working on raw sensor data. This thesis touches on two areas of information processing: first, environmental mapping by combining data from different sensors; secondly, two different methods for a mobile robot to find its way.

2.2.1 Environmental Mapping

It is possible to produce 3D environment models from 2D laser measurements by moving or rotating the laser. We have performed tests with laser scanners mounted on a roof rack on a car. When the laser is mounted with a tilt angle, it is possible to recreate a 3D environment model as the vehicle moves. Two different examples of 3D environment models created this way are shown in Figures 2.2 and 2.3.

Indoor Mapping

The 3D image in Figure 2.2 shows one of the tunnels in the LKAB underground mine in Kiruna, Sweden. The 3D environment model was created by fusing data from two LMS200 range-measuring lasers and one NDC8 navigation system. The laser data were rotated and translated into a global coordinate frame and plotted in Matlab as a surface. Related environmental mappings are done in [10] and [11] using a different navigation approach.

During our test in the underground mine, we only had information about the vehicle position and heading in 2D. Hence, the vehicle was assumed to always be at the same height. We also assumed that the pitch and roll angles of the laser were constant. In reality they are not, due to motions in the vehicle while driving. Still, this kind of 3D representation of the collected data provides a powerful visualisation of the environment.
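The core of this kind of mapping is a chain of coordinate transforms: each 2D range sample is expressed in the (tilted) laser frame and then rotated and translated into the world frame using the vehicle pose at the moment the sample was taken. A minimal sketch, assuming a planar 2D pose (x, y, heading) and a fixed laser tilt, as in the mine test; the mounting parameters are illustrative:

```python
import numpy as np

def scan_to_world(ranges, bearings, pose_xy_heading, tilt, laser_height):
    """Project one 2D laser scan into 3D world coordinates.

    ranges, bearings: polar samples from the scanner (metres, radians).
    pose_xy_heading: (x, y, heading) of the vehicle during the scan.
    tilt: fixed pitch of the laser (radians, positive looking down).
    laser_height: mounting height of the scanner above the ground.
    Returns a (3, n) array of world points.
    """
    x, y, heading = pose_xy_heading
    # Points in the laser's own scan plane (z = 0 in the laser frame).
    pts = np.stack([ranges * np.cos(bearings),
                    ranges * np.sin(bearings),
                    np.zeros_like(ranges)])
    # Tilt the scan plane: rotation about the laser's y-axis.
    ct, st = np.cos(tilt), np.sin(tilt)
    r_tilt = np.array([[ct, 0.0, st], [0.0, 1.0, 0.0], [-st, 0.0, ct]])
    # Vehicle pose: rotation about the vertical axis plus translation.
    ch, sh = np.cos(heading), np.sin(heading)
    r_head = np.array([[ch, -sh, 0.0], [sh, ch, 0.0], [0.0, 0.0, 1.0]])
    world = r_head @ (r_tilt @ pts)
    return world + np.array([[x], [y], [laser_height]])
```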

Outdoor Mapping

Figure 2.3 shows an outdoor 3D environment model, created using range data from an LMS200 combined with position and orientation data from a NovAtel GPS/IMU navigation system. The environment is plotted in Agency9's [12] software 3DMaps. This software allows the user to "fly" around in the environment model and easily visualise the data from different perspectives. The same technique for collecting outdoor 3D environmental data is also used in Paper B, where the resulting environment model is converted and used in a vehicle dynamics simulation software.


Figure 2.2: A 3D representation of a tunnel in a mine (above), plotted using Matlab, and the corresponding real world photograph (below). The 3D environment model is created by combining information from two ranging sensors (SICK LMS200) and a navigation system using retroreflective beacons (NDC8). A number of beacons are visible in the photograph of the tunnel.


Figure 2.3: A 3D environment model, plotted using Agency9 3DMaps software (above), and the corresponding photograph (below). The 3D environment model is created by combining information from a ranging sensor (SICK LMS200) and a navigation sensor (NovAtel GPS/IMU).


2.2.2 Find a Path

Two different ways for a mobile robot to find a possible path in an environment are presented in Papers C and D respectively.

The snow-bank-following method presented in Paper C was part of the work of making the tractor IceMaker I, seen in Figure 3.2, autonomous. A similar approach on a different vehicle is presented in a master thesis [13].

The Circle Sector Expansion (CSE) method, presented in Paper D, may be used for exploration of an unknown environment. The tree structure created by the CSE method provides a path that a vehicle may use to traverse an environment without running into obstacles. The method is used in [14] for oncoming car detection.
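The free-space intuition can be shown with a toy version of the circle-growing step: at the current position, grow a circle until it touches the nearest scan point, step to the circle rim in a favourable direction, and repeat. The Python sketch below illustrates only this intuition; it is not the CSE algorithm of Paper D, and every parameter value is illustrative.

```python
import numpy as np

def free_space_walk(scan_pts, start, goal_dir, steps=20, margin=0.3):
    """Toy free-space walk inspired by the circle expansion idea.

    scan_pts: (n, 2) obstacle points (world frame); start: (2,) position;
    goal_dir: preferred heading in radians. At each step the largest
    obstacle-free circle around the current position is found and the
    walker moves to the point on its rim with the best clearance,
    searched in a window around goal_dir. Returns visited positions.
    """
    pos = np.asarray(start, dtype=float)
    path = [pos.copy()]
    for _ in range(steps):
        radius = np.min(np.linalg.norm(scan_pts - pos, axis=1)) - margin
        if radius <= 0.0:                     # boxed in, no free circle
            break
        candidates = goal_dir + np.linspace(-np.pi / 2, np.pi / 2, 31)

        def clearance(a):
            nxt = pos + radius * np.array([np.cos(a), np.sin(a)])
            return np.min(np.linalg.norm(scan_pts - nxt, axis=1))

        best = max(candidates, key=clearance)
        pos = pos + radius * np.array([np.cos(best), np.sin(best)])
        path.append(pos.copy())
    return path
```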



Chapter 3 – Kinetic Operator: Mobile Robot

Modelling and Control

Kinetic Operator; A worker with motion

3.1 Modelling

A kinematic model describes the motion of a vehicle without reference to the forces acting on the vehicle or the vehicle mass [15, 16]. This is in contrast to a dynamic model, which includes the vehicle mass and the forces that influence the vehicle motion [17]. In Paper E a kinematic model is used to estimate and control the effective steering angle of a John Deere 4720 tractor. The result is a, to some extent, yaw-stabilised [18] tractor that is to operate on a slippery surface. Paper F tests a similar kinematic model on a skid-steered vehicle, a Caterpillar 973c track loader. Related kinematic modelling of skid-steered vehicles can be found in the literature [19, 20, 21].
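As a concrete example of what a kinematic model looks like, the kinematic single-track ("bicycle") model reduces a front-steered vehicle to one steered front wheel and one rear axle, and propagates the pose from speed and steering angle alone; no masses or forces appear. A minimal sketch (the wheelbase and step size are illustrative, not the John Deere 4720 values):

```python
import numpy as np

def bicycle_step(x, y, heading, speed, steer, wheelbase=2.2, dt=0.05):
    """One Euler step of a kinematic single-track model.

    speed: rear-axle speed (m/s); steer: effective front-wheel steering
    angle (rad). Only geometry enters: no mass, no tyre forces.
    """
    x += speed * np.cos(heading) * dt
    y += speed * np.sin(heading) * dt
    heading += speed / wheelbase * np.tan(steer) * dt
    return x, y, heading
```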

3.1.1 Dead reckoning

Knowing the vehicle heading and speed, it is possible to estimate the vehicle position based on the previously determined position. This kind of navigation is called dead reckoning. Dead reckoning is often used in mobile robot applications, as it is very useful for short-term navigation as well as a complement to other navigation principles.

On a differentially driven vehicle, such as the MICA wheelchair, it is possible to monitor how the vehicle moves, i.e. perform dead reckoning, using odometers (that measure distance travelled) on the drive wheels. If slip occurs between the tires and the ground, which more or less always is the case [22], an error is introduced into the estimate of the vehicle motion. To reduce the influence of tire slip, an Inertial Measurement Unit (IMU) can be used together with the dead reckoning system [23]. An IMU may consist of one or more rate gyros (that measure rotation speed) and/or one or more accelerometers (that measure acceleration). In some cases, when a vehicle is to operate on a flat surface, the dead reckoning system may be improved using a rate gyro in combination with odometers [24]. The rate gyro improves the estimate of the vehicle heading, which in turn has a great impact on the resulting estimated position. Another possibility to improve the dead reckoning is to use redundant odometry information [25].

Paper F shows results from testing dead reckoning on the Caterpillar 973c track loader shown in Figure 3.1. Two cases are studied: dead reckoning using solely track speed sensors, and dead reckoning using track speed sensors in combination with a rate gyro. The results are compared with the position information from a GPS system.
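The dead reckoning update itself is small: integrate the speed along the current heading, where the heading rate comes either from the left/right speed difference or, preferably, from the rate gyro. A minimal sketch of both variants for a tracked or differentially driven vehicle (the track gauge value is illustrative, not the 973c's):

```python
import numpy as np

def dead_reckon_step(pose, v_left, v_right, dt, gyro_rate=None, gauge=1.8):
    """Advance pose = (x, y, heading) one time step.

    v_left, v_right: left/right track (or wheel) speeds in m/s.
    gyro_rate: measured yaw rate in rad/s; if given it replaces the
    heading rate derived from the speed difference, which suffers
    badly from track slip.
    """
    x, y, heading = pose
    v = 0.5 * (v_left + v_right)                   # forward speed
    w = gyro_rate if gyro_rate is not None else (v_right - v_left) / gauge
    x += v * np.cos(heading) * dt
    y += v * np.sin(heading) * dt
    heading += w * dt
    return x, y, heading
```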

Figure 3.1: Caterpillar 973c. A computerised, remote controlled, and partly autonomous track loader.

3.2 Control

One way to step away from manual on-board control of a vehicle is to add equipment for computer control. Using a computer to manipulate the control signals, it is possible to enable both remote control and automatic control, while still keeping the possibility of manual on-board control.


Figure 3.2: IceMaker I. A computerised and remotely controlled John Deere 4720 tractor.

The remotely controlled Caterpillar 973c, seen in Figure 3.1, uses GIMnet [26] as software infrastructure. GIMnet utilises Ethernet for communication between software modules. Hence, if a wireless network (WLAN) is used, wireless remote control is established. A video that shows the first test of the remotely controlled Caterpillar is found at [27].

Today, three large manufacturers of Load Haul Dump vehicles offer computer controlled automation systems for use in underground mines. Caterpillar [28] has a system called MineGem, Sandvik [29] offers a system called AutoMine, and Atlas Copco [30] has their Scooptram Automation system. These systems are partly described in the literature [31, 32, 33, 34, 35]. The common navigation and control approach in these automation systems is to use a ranging sensor (laser scanner) in combination with dead reckoning. For localisation, the range information is used to identify where in the mine the vehicle is. Hence, no extra infrastructure, such as beacons, needs to be installed in the mine for navigation purposes. However, to strengthen the localisation, additional beacons are introduced in some of these systems at intersections, dump places, and other important locations.

Another automation system for LHDs is the SALT4 system in use in the LKAB mine in Kiruna, Sweden. This system differs from the three other systems mentioned since it uses a beacon navigation system and no ranging sensor. Hence, it requires infrastructure (reflective tape) to be installed in the area of the mine where the LHD is to operate.


The benefit of this approach is that the LHD may be localised at all locations in the operational area, as long as there is a sufficient number of visible beacons. Beacons may disappear, for example when blasting.

The tractor IceMaker I, seen in Figure 3.2, was remotely controlled on an ice covered lake in the northern part of Sweden [36]. The idea with that tractor was early snow removal on thin lake ice. Paper E describes the feedback control system that controls the steering angle of the front wheels when the tractor is remotely operated. Other snow removal applications described in the literature include gang ploughing using differential GPS [37], and other computer guidance systems [38, 39, 40] that assist the driver.

Feedback control, in the sense of path-following for mobile robots, has been reported in the literature [41, 42]. The dog-rabbit principle used in Paper G is sometimes referred to as the virtual vehicle approach [43, 44]. This principle can be applied to most wheeled and tracked vehicles by adding a "virtual" steered front wheel at a given position in front of the driving axle. Hence, the method described in Paper G can be used on several different types of vehicles.
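The dog-rabbit principle fits in a few lines: a "rabbit" point is kept on the path ahead of the vehicle, and the steering command simply chases it. The sketch below is a pure-pursuit-style variant of that chase law, not the exact controller of Paper G; the lookahead and wheelbase values are illustrative:

```python
import numpy as np

def chase_rabbit(pose, path_pts, lookahead=2.0, wheelbase=2.2):
    """Steering command from the dog-rabbit (virtual vehicle) idea.

    pose: (x, y, heading) of the vehicle; path_pts: (n, 2) way-points.
    The rabbit is the first way-point at least `lookahead` metres from
    the vehicle (or the last point); the returned steering angle makes
    the (possibly virtual) front wheel arc towards it.
    """
    x, y, heading = pose
    dists = np.linalg.norm(path_pts - np.array([x, y]), axis=1)
    ahead = np.nonzero(dists >= lookahead)[0]
    rabbit = path_pts[ahead[0]] if ahead.size else path_pts[-1]
    dist = np.linalg.norm(rabbit - np.array([x, y]))
    # Bearing to the rabbit relative to the vehicle heading.
    alpha = np.arctan2(rabbit[1] - y, rabbit[0] - x) - heading
    # Pure-pursuit law: steer onto the circular arc through the rabbit.
    return float(np.arctan2(2.0 * wheelbase * np.sin(alpha), dist))
```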


Chapter 4 – Summary

LOKO-motion; the act of guiding a Kinetic Operator using a Laser

4.1 Papers

The papers included in Part II, and my personal contribution to each one of them, are described in this section.

Paper A: Multi source flash system for retroreflective beacon detection in CMOS cameras

Authors: Håkan Fredriksson and Kalevi Hyyppä

Summary

We present a method for improving a flash system for retroreflective beacon detection in CMOS cameras. Generally, flash systems are designed in a manner that makes them suited for beacon detection in a small range interval. We strive to increase the flash system range interval by exploiting the directional properties of the retroreflector. Due to these properties, light sources placed relatively far away from the optical axis of the camera will contribute only when the retroreflector is far away. This fact can be used to compensate for the one over distance squared dependency of optical power. Underlying theory and formulae are presented. A flash system consisting of several light emitting diodes was designed according to the presented method. Simulations show that the usable flash range of the improved system can be almost doubled compared to a general flash system. Tests were performed, indicating that the presented method works according to theory and simulations.

References

[45, 46, 47, 48, 49, 50, 51, 52]


Personal Reflection

The main work behind this paper was done by me, with support from Kalevi Hyyppä. I analysed the problem, implemented the ideas, and tested the system. The work with the flash started as a project to test and possibly improve the CMOS camera based navigation system seen in Figure 2.1. To improve the beacon detection, I also redesigned the software in the camera system in conjunction with the improved flash system.

The CMOS camera based navigation system has the potential to become a relatively cheap replacement for the NDC8 navigation system. The CMOS system can become useful in small indoor environments like homes and offices. Cheap, since it basically uses standard CMOS cameras and no rotating mechanics. It has a shorter range, due to both lower light sensitivity in the camera chip and lower radiant intensity from the light source. Interesting future work would be to systematically test the flash system and beacon detection algorithm by comparing the system with the NDC8 navigation system.

Paper B: Range Data in Vehicle Dynamic Simulation

Authors: Håkan Fredriksson, Mikael Nybacka, and Kalevi Hyyppä

Summary

This paper presents a way to merge range data into the vehicle dynamic simulation software CarSim 7.1. The range data consist of measurements describing the surface of a road, and thus create a close-to-real-life 3D simulation environment. This reduces the discrepancy between real life tests and simulation of vehicle suspension systems, dampers, springs, etc. It is important for the vehicle industry to represent a real life environment in the simulation software in order to increase the validity of the simulations and to study the effects that uneven roads have on the systems. Furthermore, a 3D environment based on real life data is also useful in driving simulators, when, for example, analysing driver behaviour, testing driver response, and training for various driving conditions. To measure and collect data, a car was equipped with instruments and a computer. On top of the car, a SICK LMS200 2D lidar was mounted tilted downwards, facing the road in front of the car. To create the 3D environment, all the individual measurements were transformed to a global coordinate system using the pose (position and orientation) information from a high-class navigation system. The pose information made it possible to compensate for the vehicle motion during data collection. The navigation system consisted of a GPS/IMU system from NovAtel. To reach high navigation performance, the raw GPS/IMU data were post-processed and fused with data from three different fixed GPS base stations. The range data were modified with a Matlab script in order to parse the data into a file that could be read by the CarSim software. This created the 3D road used in the vehicle dynamic simulations. The measurements were collected at a go-kart track in Luleå, Sweden. Finally, tests have been performed to compare simulation results between using a 2D surface (i.e. flat) and a 3D surface (close to real life). The simulation results using the 2D surface are clearly different from those of the 3D surface simulation.


References

[53, 54, 55, 56, 57]

Personal Reflection

The processing of the raw range measurements was done by me, whilst the vehicle dynamic simulations were done by Mikael Nybacka. All authors contributed to the writing.

One area of future work would be to test and compare simulated vehicle data with real measurements from the test car. This would, however, require further studies and an expansion of the test system. One problem to solve is the connection of the real world vehicle position to the simulated vehicle position. If this is not done accurately, the simulation results may not be comparable with real world measurements.

Paper C: snowBOTs: A Mobile Robot on Snow Covered Ice

Authors: Håkan Fredriksson, Sven Rönnbäck, Tomas Berglund, Åke Wernersson, and Kalevi Hyyppä

Summary

We introduce snowBOTs as a generic name for robots working in snow. This paper is a study on using scanning range-measuring lasers towards an autonomous snow-cleaning robot working in an environment consisting almost entirely of snow and ice. The problem addressed here is using lasers for detecting the edges generated by "the snow meeting the road". First, the laser data were filtered using histogram/median filtering to discriminate against falling snowflakes and small objects. Then the road surface was extracted using the range-weighted Hough/Radon transform. Finally, the left and right edges of the road were detected by thresholding. Tests have been made with a laser on top of a car driven on an automobile test range just south of the Arctic Circle. Moreover, in the campus area, the algorithms were tested in closed loop with the laser on board a robotized wheelchair.
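The road-extraction step is a Hough-style vote: every scan point votes for the lines it could lie on, and weighting the votes by range counteracts the higher sample density near the vehicle. Below is a minimal sketch of a range-weighted Hough transform over (angle, offset) bins, assuming the weight is simply the range; the exact weighting and bin counts of Paper C may differ:

```python
import numpy as np

def range_weighted_hough(ranges, bearings, n_theta=180, n_rho=200,
                         rho_max=40.0):
    """Accumulate a range-weighted Hough transform of one laser scan.

    Each sample (r, b) becomes a point (x, y) = (r cos b, r sin b) and
    votes, with weight r, for every line x cos(theta) + y sin(theta) =
    rho passing through it. Returns the accumulator and best (theta, rho).
    """
    xs = ranges * np.cos(bearings)
    ys = ranges * np.sin(bearings)
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    acc = np.zeros((n_theta, n_rho))
    for x, y, w in zip(xs, ys, ranges):
        rho = x * np.cos(thetas) + y * np.sin(thetas)  # one rho per theta
        idx = np.round((rho + rho_max) / (2 * rho_max)
                       * (n_rho - 1)).astype(int)
        ok = (idx >= 0) & (idx < n_rho)
        acc[np.arange(n_theta)[ok], idx[ok]] += w      # range-weighted vote
    ti, ri = np.unravel_index(np.argmax(acc), acc.shape)
    return acc, thetas[ti], ri * 2 * rho_max / (n_rho - 1) - rho_max
```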

References

[58, 59, 60, 61, 62, 63, 64, 65]

Personal Reflection

The work was performed as a part of the project with the tractor IceMaker I. All authors contributed to the general ideas behind the paper. I did the implementation, made the tests, and wrote the main part of the paper. The filtering of laser data was further analysed in [66].

Paper D: Circle Sector Expansions for On-Line Exploration

Authors: Sven Rönnbäck, Tomas Berglund, Håkan Fredriksson, and Kalevi Hyyppä


Summary

A novel and effective method, denoted circle sector expansion (CSE), is presented that can be used to generate reduced Voronoi diagrams. It is intuitive and can be used to efficiently compute possible paths for a vehicle. The idea is to model the free space instead of the features in the environment. It is easy to implement and can be used while a vehicle moves and collects new data of its surroundings. The method is directly applicable and has properties for fast computation of safety margins while at the same time having low complexity. We have successfully implemented the algorithm and its methods and performed real-life tests using an autonomous wheelchair equipped with a range-scanning laser, a rate gyro, and wheel encoders. Tests showed good results supporting the use of CSE. The results are applicable, for example, to improving assistive technology for wheelchair users.

References

[67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84]

Personal Reflection

My contribution to this paper is discussions and writing. I have used the method in my research with the wheelchair MICA. The ideas for Paper G arose from using the CSE method to explore an unknown indoor environment. A path created by the CSE method is quite jagged and, if not handled with care, will make the vehicle wiggle and, from an outside observer's point of view, behave somewhat unpredictably.

Paper E: Gyro feedback of a hydraulic steering system

Authors: Håkan Fredriksson, Ulf Andersson, and Kalevi Hyyppä

Summary

We study the problem of steering, teleoperated as well as autonomously, a tractor that is to operate on a rough and slippery surface. The main idea is to control the effective steering angle of the vehicle. A benefit of this approach is that the vehicle will strive to move in a predefined way irrespective of the surface conditions. To test whether the approach is usable, we implemented a straightforward P-controller in combination with an estimator of the effective steering angle of the tractor. Our tests indicate that the approach can be useful. Future work would include studies on more advanced control strategies and estimators to reduce some of the problems that we address.
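The control idea can be stated compactly: estimate the steering angle the vehicle is actually achieving from its yaw rate and speed, and close a proportional loop around the difference to the commanded angle. Below is a minimal sketch under the kinematic single-track assumption; the gain and wheelbase are illustrative, and this is not the tuned controller of Paper E:

```python
import numpy as np

def steering_correction(steer_cmd, yaw_rate, speed, wheelbase=2.2, kp=0.8):
    """One step of a P-controller on the effective steering angle.

    The effective angle is estimated by inverting the kinematic
    single-track relation yaw_rate = speed / wheelbase * tan(steer);
    on a slippery surface it differs from the commanded wheel angle.
    """
    if abs(speed) < 0.1:                 # estimate undefined at standstill
        return steer_cmd
    steer_eff = np.arctan(yaw_rate * wheelbase / speed)
    return steer_cmd + kp * (steer_cmd - steer_eff)
```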

References

[85, 86, 87, 88, 22, 18, 89, 36]

Personal Reflection

I was responsible for the implementation of the control system, while all authors contributed in discussions and writing. With me as supervisor, the control system implementation was first done as a student project course. Parts were later refined as a master thesis [90]. The origin of this paper was presented at a conference [91].

It would be interesting to look further into the problems related to autonomous snow removal. Driving on a rough and slippery winter surface does introduce problems in the handling of the vehicle. Adding either a snow blower or a snow plough makes it even more challenging. A skilled human driver can handle these situations. Can it all be done autonomously?

Paper F: Track Loader Kinematics

Authors: Håkan Fredriksson, Ulf Andersson, and Kalevi Hyyppä

Summary

We study the problem of teleoperating a Caterpillar 973c track loader. The track loader and a system for teleoperation are described. A tested and working kinematic model used for dead reckoning is also presented. Track speed sensors combined with a rate gyro are used as input to the model. The model shows good results when tested and compared to a GPS navigation system.

References

[26, 19, 20, 27]

Personal Reflection

I designed the control system, while all authors contributed in discussions and writing. Bo Dahlgren contributed with his great knowledge of how the machine works and his exceptional mechanical assembly skills.

Ongoing and future work includes the implementation of autonomous functionality on the track loader. Implementation of a version of the path-following algorithm presented in Paper G has made the track loader self-driving.

Paper G: Smooth Path-Following for Autonomous Vehicles

Authors: Håkan Fredriksson, Ulf Andersson, Fredrik Broström, and Kalevi Hyyppä

Summary

This paper studies the problem of smooth path-following for autonomous vehicles. A smooth path-following method and a stability analysis are presented. The problem is addressed by combining continual re-routing with vehicle motion control. Allowing the vehicle to deviate from the path, whilst constraining the vehicle motion, produces a smooth route for the vehicle. Performance of the method is shown by simulations.

References


[43, 44, 41, 42, 92, 93, 94, 2, 95, 96]

Personal Reflection

The basic idea, to smooth the path-following by pushing the rabbit as far away as possible, was first implemented on the MICA wheelchair by Fredrik Broström. This was not sufficient, however, since sporadic switching of the aiming point causes the vehicle to wiggle and behave unpredictably. To improve the vehicle behaviour, I formalised and analysed the problem thoroughly. I introduced constraints on vehicle motion and unravelled how changes in the steering angle, as well as changes in the velocity of the steered front wheel, propagate into vehicle motion. With reasonable constraints on the vehicle motion, it is possible to control the velocity of the vehicle as a function of the distance to the rabbit. The stability analysis of the control problem was done by Ulf Andersson. All authors contributed to the writing.

The presented method may be used as described in this paper. It may also be slightly modified to fit the needs of a specific area of use. As an example, if more precise path-following is required, one can limit how far the rabbit is pushed away. In the case of precise path-following, it is necessary to take into account that the method will strive to make the steered front wheel follow the path. The rear end of the vehicle will take a shortcut during turns. Hence, it may hit obstacles during sharp turns, especially when using a "virtual" front wheel. The problem may be solved by taking the vehicle kinematics into account in the path-planning phase.

4.2 Discussion

This thesis summarises my work in the field of mobile robots. Every part mentioned in this thesis has a purpose and a goal. The backbone is to put a mobile robot into operation. During my work, I have touched on sensor system design and algorithm development, as well as mobile robot modelling and control. All of these are important aspects of fulfilling the higher goal: make the robot operate. The depth of the thesis is given by seven scientific papers that each give a thorough analysis of the stated problem.

From my point of view, the really interesting work starts when these different problems are combined with the aim of solving a larger problem. The included papers are spread out over several different research areas. However, the hinge is to reach the higher goal: make a mobile robot operate. One approach for a vehicle to drive along a winter road is presented in Paper C. The CSE method, presented in Paper D, may be used to find a possible path for a vehicle to traverse an unknown environment without running into obstacles. To get a vehicle to follow a calculated path, a path-following method like the one presented in Paper G may be used. That path-following method requires that the vehicle is properly modelled. Two different vehicle models are presented in Papers E and F; the first includes stability control of the vehicle heading, while the latter presents solely one tested vehicle model. To determine the vehicle position, a navigation system has to be introduced. Paper A presents a small part of such a system. Sometimes it might be of interest to simulate the vehicle behaviour instead of doing practical tests. Paper B shows some results using this approach.

One interesting future area of research would be sparse navigation, combining beacon navigation, dead reckoning, and ranging information. A combination of several different sensors and navigation techniques has the potential to increase the navigation capability of the mobile robot and, hence, expand the work area and the usage of the robot.


References

[1] H. Fredriksson and K. Hyyppä, "GIMnet on the MICA wheelchair," in GIMnet 2010 Symposium, Helsinki, Finland, 2010.

[2] R. Almqvist, F. Broström, and J. Brynolf, "Laser guided vehicles: implementing and testing a GIMnet based software platform on the electric wheelchair MICA," Master's thesis, Luleå University of Technology, Sweden, 2009.

[3] K. Hyyppä, "On a laser anglemeter for mobile robot navigation," Ph.D. dissertation, Luleå University of Technology, Sweden, Apr 1993.

[4] Kollmorgen, http://www.kollmorgen.com, June 2010.

[5] M. Evensson, A. Marklund, K. Kozmin, and K. Åhsberg, "Ett kamerabaserat navigeringssystem," Master's thesis, Luleå University of Technology, Sweden, 2002.

[6] P. Misra and P. Enge, Global Positioning System, 2nd ed. Ganga-Jamuna Press, 2001.

[7] G. Morgan-Owen and G. Johnston, "Differential GPS positioning," Electronics & Communication Engineering Journal, vol. 7, Feb 1995.

[8] S. Sukkarieh, E. Nebot, and H. Durrant-Whyte, "A high integrity IMU/GPS navigation loop for autonomous land vehicle applications," IEEE Transactions on Robotics and Automation, vol. 15, June 1999.

[9] SICK, "LMS200 laser measurement system," http://www.sick.com, Dec 2003.

[10] S. Thrun, D. Hähnel, D. I. Ferguson, M. Montemerlo, R. Triebel, W. Burgard, C. R. Baker, Z. Omohundro, S. Thayer, and W. Whittaker, "A system for volumetric robotic mapping of abandoned mines," in ICRA, 2003, pp. 4270–4275.

[11] D. Silver, D. Ferguson, A. Morris, and S. Thayer, "Topological exploration of subterranean environments," Journal of Field Robotics, vol. 23, no. 6-7, pp. 395–415, 2006.

[12] Agency9, http://www.agency9.se, June 2010.

[13] N. Karvonen, "Time-efficient algorithms for laser guided autonomous driving," Master's thesis, Luleå University of Technology, Sweden, 2009.

[14] S. Rönnbäck, "Circle sectors for detection of oncoming cars in range data," vol. 2, Sep. 2008, pp. 17-40–17-45.

[15] http://en.wikipedia.org/wiki/kinematics, October 2010.

[16] R. Rajagopalan, "A generic kinematic formulation for wheeled mobile robots," Journal of Robotic Systems, vol. 14, no. 2, 1997.

[17] F. Lei and H. Yong, "Study on dynamic model of tractor system for automated navigation applications," Journal of Zhejiang University - Science A, vol. 6, pp. 270–275, 2005. doi: 10.1007/BF02842055.

[18] J. Ackermann, "Robust car steering by yaw rate control," in IEEE 29th Conference on Decision and Control, 1990.

[19] J. L. Martínez, A. Mandow, J. Morales, S. Pedraza, and A. García-Cerezo, "Approximating kinematics for tracked mobile robots," I. J. Robotic Res., vol. 24, no. 10, pp. 867–878, 2005.

[20] A. Mandow, J. L. Martínez, J. Morales, J.-L. Blanco, A. García-Cerezo, and J. González, "Experimental kinematics for wheeled skid-steer mobile robots," in IROS, 2007, pp. 1222–1227.

[21] J. Yi, H. Wang, J. Zhang, D. Song, S. Jayasuriya, and J. Liu, "Kinematic modeling and analysis of skid-steered mobile robots with applications to low-cost inertial-measurement-unit-based motion estimation," Trans. Rob., vol. 25, no. 5, pp. 1087–1097, 2009.

[22] J. Markdahl, "Traction control for off-road articulated vehicles," Master's thesis, KTH Royal Institute of Technology, 2010.

[23] K. Park, H. Chung, J. Choi, and J. G. Lee, "Dead reckoning navigation for an autonomous mobile robot using a differential encoder and a gyroscope," ICAR'97, pp. 441–446, 1997.

[24] H. Chung, L. Ojeda, and J. Borenstein, "Accurate mobile robot dead-reckoning with a precision-calibrated fiber-optic gyroscope," IEEE Transactions on Robotics and Automation, vol. 17, no. 1, pp. 80–84, Feb. 2001.

[25] D. Xu, M. Tan, and G. Chen, "An improved dead reckoning method for mobile robot with redundant odometry information," ICARCV'02, Dec 2002.

[26] J. Saarinen, A. Maula, R. Nissinen, H. Kukkonen, J. Suomela, and A. Halme, "GIMnet - infrastructure for distributed control of generic intelligent machines," in Robotics and Applications and Telematics, Würzburg, Germany, 2007.

[27] H. Fredriksson, "Remote controlled Caterpillar 973c," http://www.youtube.com/watch?v=DjpRvBn6CBk, July 2010.

[28] Caterpillar, http://www.caterpillar.com, September 2010.

[29] Sandvik Mining and Construction, http://www.miningandconstruction.sandvik.com, September 2010.

[30] Atlas Copco, http://www.atlascopco.com, September 2010.

[31] R. Madhavan, M. Dissanayake, and H. Durrant-Whyte, "Autonomous underground navigation of an LHD using a combined ICP-EKF approach," in IEEE International Conference on Robotics and Automation, vol. 4, 1998, pp. 3703–3708.

[32] J. Larsson, "Reactive navigation of an autonomous vehicle in underground mines," Licentiate thesis, Örebro Universitet, Sweden, 2007.

[33] E. S. Duff, J. M. Roberts, and P. I. Corke, "Automation of an underground mining vehicle using reactive navigation and opportunistic localization," in Australasian Conference on Robotics and Automation, 2002.

[34] B. J. Dragt, F. R. Camisani-Calzolari, and I. K. Craig, "An overview of the automation of load-haul-dump vehicles in an underground mining environment," in Proceedings of the 16th IFAC World Congress, 2005.

[35] H. Mäkelä, "Overview of LHD navigation without artificial beacons," Robotics and Autonomous Systems, vol. 36, no. 1, pp. 21–35, 2001.

[36] H. Fredriksson, "Teleoperated tractor - IceMaker I," http://www.youtube.com/watch?v=BvuyEAh3m8U, June 2010.

[37] L. Alexander, A. Gorjestani, and C. Shankwitz, "DGPS-based gang plowing," University of Minnesota, Tech. Rep., April 2005.

[38] H.-S. Tan, B. Bougler, and P. Kretz, "A steering guidance system for snowplow - an interesting control problem," vol. 5, 1999, pp. 5114–5119.

[39] H.-S. Tan, F. Bu, and D. Nelson, "Application of vehicle lateral control - automated snowblower," Jun. 2006.

[40] A. Gorjestani, L. Alexander, B. Newstrom, P.-M. Cheng, M. Sergi, C. Shankwitz, and M. Donath, "Driver assistive systems for snowplows," University of Minnesota, Tech. Rep., March 2003.

[41] L. Aguilar M., P. Soueres, M. Courdesses, and S. Fleury, "Robust path-following control with exponential stability for mobile robots," vol. 4, May 1998, pp. 3279–3284.

[42] R. Solea and U. Nunes, "Trajectory planning with velocity planner for fully-automated passenger vehicles," Sep. 2006, pp. 474–480.

[43] M. Egerstedt, X. Hu, and A. Stotsky, "Control of mobile platforms using a virtual vehicle approach," IEEE Transactions on Automatic Control, vol. 46, no. 11, pp. 1777–1782, Nov. 2001.

[44] K. Macek, R. Philippsen, and R. Siegwart, "Path following for autonomous vehicle navigation with inherent safety and dynamics margin," Jun. 2008, pp. 108–113.

[45] S. K. Nayar, K. Ikeuchi, and T. Kanade, "Surface reflections: Physical and geometrical perspectives," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. PAMI-13, no. 7, pp. 611–634, 1991.

[46] V. V. Barun, "Imaging of retroreflective objects under highly nonuniform illumination," Optical Engineering 35(07), 1996.

[47] J. Rennilson, "Specialized optical systems for measurement of retroreflective materials," in Proc. SPIE vol. 3140, pp. 48–57, Photometric Engineering of Sources and Systems, 1997.

[48] B. So, Y. Jung, and D. Lee, "Shape design of efficient retroreflective articles," Materials Processing Technology, 130-131, 2002.

[49] V. V. Barun, "Estimations for optimal angular retroreflectance scale of road-object retroreflective markers," in Proc. SPIE vol. 3207, pp. 118–125, Intelligent Transportation Systems, 1998.

[50] K. Hyyppä, "On a laser anglemeter for mobile robot navigation," Ph.D. dissertation, Luleå University of Technology, Sweden, Apr 1993.

[51] R. McCluney, Introduction to Radiometry and Photometry. Artech House, 1994.

[52] B. K. P. Horn, Robot Vision. McGraw-Hill Book Company, 1986.

[53] T. Murano, T. Yonekawa, M. Aga, and S. Nagiri, "Development of high-performance driving simulator," in SAE 2009 World Congress, April 2009.

[54] CarSim, http://www.carsim.com, May 2009.

[55] B. Schick, S. Witschass, and D. Legrand, "3D-track - give the simulation the chance for a better work!" in SAE Technical publication, 2006.

[56] NovAtel, http://www.novatel.com, May 2009.

[57] E. W. Weisstein, "Rotation matrix," MathWorld - A Wolfram Web Resource, http://mathworld.wolfram.com/RotationMatrix.html, May 2009.

[58] N. Vandapel, S. Moorehead, W. Whittaker, R. Chatila, and R. Murrieta-Cid, "Preliminary results on the use of stereo, color cameras and laser sensors in Antarctica," in International Symposium on Experimental Robotics, March 1999.

[59] W. Wijesoma, K. Kodagoda, and A. Balasuriya, "Laser and vision sensing for road detection and reconstruction," in Proceedings of the IEEE 5th International Conference on Intelligent Transportation Systems, Singapore, 2002.

[60] Z. Xu, "Laser rangefinder based road following," in Mechatronics and Automation, 2005 IEEE International Conference, Niagara Falls, Canada, July 2005.

[61] S. Moorehead, G. R. Simmons, D. Apostolopolous, and W. Whittaker, "Autonomous navigation field results of a planetary analog robot in Antarctica," in ESA SP-440: Artificial Intelligence, Robotics and Automation in Space, Aug 1999.

[62] Luleå University of Technology, "Mobile robots," http://www.csee.ltu.se/~hf/mobilerobots, Aug 2007.

[63] J. Craig, Introduction to Robotics: Mechanics and Control. Boston, MA, USA: Addison-Wesley Longman Publishing Co., Inc., 1989.

[64] J. Forsberg, U. Larsson, and Å. Wernersson, "Mobile robot navigation using the range-weighted Hough transform," IEEE Robotics and Automation Magazine, 1995.

[65] S. Rönnbäck, T. Berglund, H. Fredriksson, and K. Hyyppä, "On-line exploration by circle sector expansion," in IEEE International Conference on Robotics and Biomimetics - ROBIO 2006, 2006.

[66] S. Rönnbäck and A. Wernersson, "On filtering of laser range data in snowfall," vol. 2, Sep. 2008, pp. 17-33–17-39.

[67] S. Fortune, "A sweepline algorithm for Voronoi diagrams," Algorithmica, 2(2), pp. 153–174, 1987.

[68] P. Blaer, "Robot path planning using generalized Voronoi diagrams," http://www.cs.columbia.edu/~pblaer/projects/path_planner/, Feb 2006.

[69] C. M. Gold and J. Snoeyink, "A one-step crust and skeleton extraction algorithm," Algorithmica, vol. 30, pp. 144–163, 2001.

[70] R. Ogniewicz and M. Ilg, "Voronoi skeletons: theory and applications," in IEEE Conference on Computer Vision and Pattern Recognition CVPR '92, June 1992, pp. 63–69.

[71] N. Rao, "Robot navigation in unknown generalized polygonal terrains using vision sensors," IEEE Transactions on Systems, Man and Cybernetics, vol. 25, no. 6, pp. 947–962, June 1995.

[72] R. Mahkovic and T. Slivnik, "Constructing the generalized local Voronoi diagram from laser range scanner data," IEEE Transactions on Systems, Man and Cybernetics, Part A, 2000, pp. 710–719.

[73] T. Pendragon and L. While, "Path-planning by tessellation of obstacles," in Proceedings of Conferences in Research and Practice in Information Technology (ACSC'03), M. Oudshoorn, Ed., vol. 16, Adelaide, Australia, 2003.

[74] P. Beeson, N. K. Jong, and B. Kuipers, "Towards autonomous topological place detection using the extended Voronoi graph," in IEEE International Conference on Robotics and Automation (ICRA'05), 2005.

[75] E. Acar, H. Choset, and J. Lee, "Sensor-based coverage with extended range detectors," IEEE Transactions on Robotics and Automation, vol. 22, no. 1, Feb 2006.

[76] O. Takahashi and R. Schilling, "Motion planning in a plane using generalized Voronoi diagrams," IEEE Transactions on Robotics and Automation, vol. 5, no. 2, pp. 143–150, April 1989.

[77] G. Sakellariou, M. Shanahan, and B. Kuipers, "Skeletonisation as mobile robot navigation," in Towards Autonomic Robotic Systems (TAROS-04), 2004.

[78] S. M. LaValle, Planning Algorithms. Cambridge University Press, 2006.

[79] J. Borenstein and Y. Koren, "Real-time obstacle avoidance for fast mobile robots in cluttered environments," in IEEE International Conference on Robotics and Automation, 1990.

[80] E. Haines, Graphics Gems IV, P. Heckbert, Ed. Academic Press, 1994.

[81] S. Rönnbäck, D. Rosendahl, and K. Hyyppä, "A MATLAB/Java interface to the MICA wheelchair," The 1st IFAC Symposium on Telematics Applications in Automation and Robotics, July 2004.

[82] J. A. da Cruz Pinto Gaspar, "Omnidirectional vision for mobile robot navigation," Ph.D. dissertation, Universidade Técnica de Lisboa, Instituto Superior Técnico, Dec 2002.

[83] C. Canudas de Wit, H. Khennouf, C. Samson, and O. J. Sordalen, "Nonlinear control design for mobile robots," in Nonlinear Control for Mobile Robots, World Scientific Series in Robotics and Intelligent Systems, 1993, ch. 5.

[84] OSG Community, "Open Scene Graph," www.openscenegraph.org, Dec 2005.

[85] CASTT, http://www.ltu.se/castt, June 2010.

[86] A. F. Andreev, V. Kabanau, and V. V. Vantsevich, Driveline Systems of Ground Vehicles: Theory and Design. CRC Press, 2010.

[87] T. D. Gillespie, Fundamentals of Vehicle Dynamics. Society of Automotive Engineers Inc., 1992.

[88] U. Kiencke and L. Nielsen, Automotive Control Systems: For Engine, Driveline, and Vehicle, 2nd ed. Springer Verlag, 2005.

[89] T. M. Hunt and N. Vaughan, The Hydraulic Handbook. Elsevier Science, 1996.

[90] P. Danielsson, "Teleoperated tractor: development of a graphical user interface," Master's thesis, Luleå University of Technology, Sweden, 2008.

[91] H. Fredriksson, P. Danielsson, S. Rönnbäck, and K. Hyyppä, "snowBOTs: Gyro guided steering of a teleoperated tractor during winter conditions," in International Workshop on Research and Education in Mechatronics, Bergamo, Italy, 2008.

[92] M. Bak, N. Poulsen, and O. Rawn, "Path following mobile robot in the presence of velocity constraints," Technical University of Denmark, Department of Automation, Tech. Rep., 2001.

[93] J.-C. Latombe, Robot Motion Planning. Norwell, MA, USA: Kluwer Academic Publishers, 1991.

[94] U. Wiklund, U. Andersson, and K. Hyyppä, "AGV navigation by angle measurements," in Proc. 6th Int. Conf. Automated Guided Vehicle Systems, IFS Ltd, Oct 1988, pp. 199–212.

[95] G. Franklin, J. Powell, and A. Emami-Naeini, Feedback Control of Dynamic Systems. Prentice Hall, 2006.

[96] K. Åström and B. Wittenmark, Computer Controlled Systems: Theory and Design. Prentice Hall, 1997.

Part II


Paper A – Multi source flash system for retroreflective beacon detection in CMOS cameras

Authors:

Håkan Fredriksson, Kalevi Hyyppä

Reformatted version of paper originally published in:

Society of Photo-Optical Instrumentation Engineers, Optical Engineering

© 2008, SPIE


Multi source flash system for retroreflective beacon detection in CMOS cameras

Håkan Fredriksson, Kalevi Hyyppä

Abstract

We present a method for improving a flash system for retroreflective beacon detection in CMOS cameras. Generally, flash systems are designed in a manner that makes them suited for beacon detection in a small range interval. We strive to increase the flash system range interval by exploiting the directional properties of the retroreflector. Due to these properties, light sources placed relatively far away from the optical axis of the camera will contribute only when the retroreflector is far away. This fact can be used to compensate for the one over distance squared dependency of optical power. Underlying theory and formulae are presented. A flash system consisting of several light emitting diodes was designed according to the presented method. Simulations show that the usable flash range of the improved system can be almost doubled compared to a general flash system. Tests were performed, indicating that the presented method works according to theory and simulations.

1 Introduction

In this paper we propose a practical method for designing a flash system intended to illuminate retroreflective beacons for detection in a CMOS camera. We start by giving a brief description of the optical properties of retroreflectors. We then introduce the underlying theory and formulae. Furthermore, we present an improved flash system, simulated and built according to the ideas presented in this article.

The general idea behind the paper is to design a flash that can compensate for the normal one over distance squared drop-off in optical power. This is done by taking advantage of the directional properties of the retroreflector. Such a flash can improve the beacon detection distance for any camera system with a limited dynamic range.

1.1 Properties of retroreflective surfaces

A retroreflective surface has the property that it reflects most of the incoming light within a very narrow angle right back to the source, with only a small dependency on incident angle. This property is in contrast to an ordinary bright surface like a paper sheet, which has a very diffuse reflection, or a mirror, which has a specular reflection [1]. In the area of road safety and road markers, work has been done to calculate, simulate, and measure the retroreflective properties of different retroreflective materials [2, 3, 4, 5].


Retroreflective materials will appear much brighter than non-retroreflective surfaces when illuminated by a light source close to the optical axis of the camera, see FIG. 1. How much brighter depends on the properties of the retroreflective material and the power of the surrounding light sources. Due to the directional properties of the retroreflective material, light sources placed relatively far away from the optical axis of the camera will contribute only when the retroreflector is far away. This fact can be used to compensate for the 1/distance² dependency of optical power.

Figure 1: Two retroreflective beacons at approximately 3 m and 6 m distance are clearly visible, while the background is dim. This figure shows the effect of taking a photograph of a retroreflective surface with a strong flash close to the camera. The picture is taken with an ordinary digital camera with a built-in flash.

1.2 Detecting retroreflective surfaces

When designing a flash system for retroreflective beacon detection with a CMOS camera, there are mainly two criteria that have to be fulfilled. The first criterion is the requirement to detect beacons at large distances. To improve the detection distance one needs to increase the power of the flash. The second criterion is to avoid blooming around bright objects. If the flash is too strong, the beacons will appear too bright for the camera chip and cause the individual pixels to saturate and bleed into surrounding pixels. With the camera and lens parameters fixed, the only way to avoid blooming is to limit the power of the flash.

One way to both increase the detection distance and avoid the blooming problem is to use a (more expensive) wide dynamic range (WDR) camera. Such a camera can be adjusted so that blooming does not occur while still giving acceptable low-light performance. Though a WDR camera can improve the performance of the system, it might not always be enough. Also, in some cases it may be preferable to use a cheap CMOS camera with low dynamic range and still get reasonable dynamic performance. Therefore, we introduce the flash design described in this paper.

1.3 General Background

The camera/flash system referred to in this paper is part of the prototype navigation system seen in FIG. 2. The system uses retroreflective beacons as fixed reference points.

Figure 2: An ordinary LED flash consisting of 16 light emitting diodes mounted in a circle around the camera lens. All LEDs are placed at equal distance from the optical axis of the camera. The flash is mounted on a prototype of the CMOS-camera-based navigation system.

In FIG. 1 two of these beacons are shown at different distances. The camera/flash system estimates the distances and headings to the beacons and uses that information as input to a navigation process. Navigation systems that use this type of beacon as reference have been on the market for several years; NDC8, also known as LazerWay, developed by one of the authors [6] and today produced by Danaher Motion, is one such system. Common to the present systems is that they use a scanning laser for the detection of beacons. We are working on a camera-based system with no moving parts.

2 Flash system equations

In this section we present the equations necessary for designing an LED flash system for retroreflective beacon detection with a CMOS camera. All calculations and descriptions are based on the assumption that the camera and the flash are mounted with the optical axis in the horizontal plane. The flash is supposed to illuminate beacons such as the ones shown in FIG. 1. The beacons are assumed to be in the vertical middle of the image, and they may be found anywhere in the horizontal plane. Hence, the important field of view lies in the horizontal plane. Though the vertical field of view is not forgotten, it is not of great importance in our application.

At the end of this section we present a formula for calculating the maximum optical power received by a receiver situated close to the light source, as a function of the distance and angle to the beacon. In our system the receiver is one single pixel in the CMOS camera chip. The formula takes into account each individual LED's relationship to the receiver.

2.1 Optical power from Light Emitting Diodes

The radiant intensity $I_S$ from a single LED depends on the emitting angle $\theta$, where $\theta = 0$ is along the optical axis of the LED. This dependency is assumed to be symmetric around the optical axis, and is in these calculations approximated with a Gaussian distribution,

$$I_S(\theta) = I_{S0}\, e^{-2(\theta/\theta_{S0})^2}. \tag{1}$$

The parameter $\theta_{S0}$ is the angle where the intensity has dropped to $I_{S0}/e^2$, and $I_{S0}$ is the on-axis radiant intensity produced by the diode.

To improve the horizontal emitting angle of a complete flash with $n$ diodes, the individual diodes can be slightly tilted in the horizontal plane by the parameter $\theta_{Si}$, and hence the total radiant intensity in the horizontal plane $I_{Stot}(\theta)$ for the whole flash becomes

$$I_{Stot}(\theta) = I_{S1}\, e^{-2\left(\frac{\theta - \theta_{S1}}{\theta_{S10}}\right)^2} + \ldots + I_{Sn}\, e^{-2\left(\frac{\theta - \theta_{Sn}}{\theta_{Sn0}}\right)^2}, \tag{2}$$

where $I_{Si}$ and $\theta_{Si0}$ are the parameters for diode $i$. This expression is valid under the assumption that $r$ is much smaller than $R$, see FIG. 3. If the diodes are placed symmetrically around the receiver, the optical axis of the complete flash can be considered to be the same as that of the receiver.
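As an illustration of Eq. (2), the following Python sketch sums the Gaussian lobes of the individual diodes. The function name and the values in the usage example are our own inventions for the sake of the example, not taken from the actual prototype.

    import numpy as np

    def radiant_intensity_total(theta, I_S, theta_tilt, theta_0):
        """Total horizontal radiant intensity of an n-diode flash, Eq. (2).
        theta      -- viewing angle in the horizontal plane [rad]
        I_S        -- on-axis radiant intensity of each diode [W/sr]
        theta_tilt -- horizontal tilt of each diode's optical axis [rad]
        theta_0    -- e^-2 half-width of each diode's lobe [rad]
        """
        I_S, theta_tilt, theta_0 = map(np.asarray, (I_S, theta_tilt, theta_0))
        # Sum the individual Gaussian lobes of the (tilted) diodes.
        return np.sum(I_S * np.exp(-2.0 * ((theta - theta_tilt) / theta_0) ** 2))

    # Example (invented values): four diodes of 0.2 W/sr, fanned out +/-15 deg.
    I = radiant_intensity_total(np.radians(5.0),
                                [0.2] * 4,
                                np.radians([-15.0, -5.0, 5.0, 15.0]),
                                [np.radians(10.0)] * 4)

Tilting the diodes widens the usable horizontal field of view at the cost of on-axis intensity, which is the trade-off that Eq. (2) makes explicit.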


Figure 3: Source, reflective beacon, and receiver constellation with multiple sources. The distance $R$ is actually much greater than $r$, but the figure is scaled to enhance the angle $\alpha$. In a real application the diodes are placed symmetrically around the receiver, so that the optical axis of the flash and that of the receiver can be considered to be the same, though the optical axes of the individual LEDs are not.

2.2 BRDF of retroreflective beacons

The irradiance $E_B$ received by the beacon can, according to [7], be expressed as

$$E_B = \frac{I_S}{R^2}, \tag{3}$$

where $R$ is the distance between the source and the beacon. This expression is valid under the assumption that the incoming radiant intensity $I_S$ is constant over the whole reflector, and that the source can be considered a point source.

The Bidirectional Reflectance Distribution Function (BRDF) is used to calculate the reflected radiance from the beacon towards the receiver. The BRDF is defined as the ratio of differential radiance to differential irradiance,

$$f_B(\theta_i, \phi_i; \theta_e, \phi_e) = \frac{\delta L_B(\theta_e, \phi_e)}{\delta E_B(\theta_i, \phi_i)}, \tag{4}$$

where $(\theta_i, \phi_i)$ and $(\theta_e, \phi_e)$ are the directions of the incoming and exiting light, respectively [8].

When the source is considered to be a point source, the reflected radiance $L_B$ from the beacon towards the receiver can, according to [6], be written in the form

$$L_B = E_B\, f_B(\theta_i, \phi_i; \theta_e, \phi_e). \tag{5}$$

We ignore the small influence of the incident angle on the BRDF, since it mainly gives a damping of the reflected radiance. We also assume that the BRDF of a retroreflective beacon is symmetric around the incident angle, and dependent only on the angle $\alpha$ between the source and detector directions, see FIG. 3.


In [6] the BRDF is assumed to have a Gaussian distribution and the expression

$$f_B = \frac{2\eta_B}{\pi \alpha_{B0}^2}\, e^{-2(\alpha/\alpha_{B0})^2} \tag{6}$$

is presented. The beacon-specific parameters $\eta_B$ and $\alpha_{B0}$ represent the efficiency and the distribution of the reflected light, respectively.

An approximation of the angle $\alpha$ between the source LED and the receiver optics, seen from the beacon, can be calculated with

$$\alpha = \arctan\!\left(\frac{r \cos\theta}{R}\right) \approx \frac{r \cos\theta}{R}, \tag{7}$$

where $r$ is the small distance between the source and the receiver and $R$ is the distance to the beacon, see FIG. 3. This expression is valid when the LEDs are situated in the horizontal plane of the receiver. If the LEDs are placed elsewhere, the impact of the factor $\cos\theta$ is reduced.
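As a numerical illustration, with values chosen for the example rather than taken from the prototype: an LED offset $r = 0.1$ m seen on axis ($\theta = 0$) from a beacon at $R = 10$ m subtends $\alpha = \arctan(0.01) \approx 10$ mrad, whereas at $R = 2$ m the same offset gives $\alpha \approx 50$ mrad. An off-axis diode therefore falls well outside a narrow retroreflective lobe at short range but moves into it as the beacon recedes, which is exactly the compensation effect exploited in this paper.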

When combining (6) and (7) it becomes clear that the BRDF depends on the distance $r$ between the source and the receiver, and on the distance $R$ to the beacon. Since the distance $r$ is fixed in a given flash constellation, the BRDF can be seen as a function of $R$ and $\theta$, $f_B(R, \theta)$. On the receiver optical axis, i.e. for $\theta = 0$, the BRDF is a function of $R$ only, $f_B(R)$.

In a multiple-source configuration, see FIG. 3, where the distance $r_i$ between the source and the receiver differs between sources, the BRDF has to be calculated for every unique source,

$$f_{Bi} = \frac{2\eta_B}{\pi \alpha_{B0}^2}\, e^{-2\left(\frac{r_i}{\alpha_{B0} R}\right)^2}. \tag{8}$$
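The per-source BRDF of Eq. (8) is straightforward to evaluate numerically. The sketch below is a minimal Python version, with parameter names of our own choosing.

    import math

    def brdf_source(R, r_i, eta_B, alpha_B0):
        """On-axis BRDF seen from source i, Eq. (8) (theta = 0, so alpha = r_i/R).
        R        -- distance to the beacon [m]
        r_i      -- offset between source i and the receiver [m]
        eta_B    -- efficiency of the retroreflector
        alpha_B0 -- e^-2 half-width of the reflected lobe [rad]
        """
        peak = 2.0 * eta_B / (math.pi * alpha_B0 ** 2)
        return peak * math.exp(-2.0 * (r_i / (alpha_B0 * R)) ** 2)

Note how the exponent grows as $R$ shrinks: a diode with a large offset $r_i$ is strongly suppressed at short range, which is what keeps the close-range image from blooming.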

2.3 Optical power at the receiver

The optical receiver in our system is a CMOS camera chip with a lens system. Since we are interested in avoiding blooming in the camera chip, we have to calculate the maximum optical power received in each pixel of the chip. The calculation is done under the assumption that the beacon is close enough to the camera that its image covers at least one whole pixel.

The optical power $\phi_D$ reaching the detector, i.e. one pixel in the camera chip, can according to [7] be calculated with

$$\phi_D = K_C L_B, \tag{9}$$

where $K_C$ contains the parameters of the detector and the lens. The necessary parameters are the area of the detector $A_D$, the transmittance of the lens $T_L$, the diameter of the lens $D_L$, and finally the focal distance $f_L$. To calculate $K_C$ we use the formula

$$K_C = \pi A_D T_L\, \mathrm{NA}^2, \tag{10}$$
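Putting Eqs. (3), (5) and (8) to (10) together gives the maximum pixel power as a function of beacon distance. The Python sketch below does this for a set of diodes on the receiver axis ($\theta = 0$). Two assumptions on our part: the quantity NA is not defined before the text breaks off, so we take the common thin-lens relation $\mathrm{NA} \approx D_L/(2 f_L)$, and we let the contributions from the individual diodes simply add, in line with Eq. (2). All names and values are ours.

    import math

    def pixel_power(R, leds, eta_B, alpha_B0, A_D, T_L, D_L, f_L):
        """Maximum optical power [W] in one pixel for a beacon at distance R [m].
        leds -- list of (I_S0, r_i): on-axis intensity [W/sr] and offset [m]
                of each diode; theta = 0 is assumed throughout.
        """
        NA = D_L / (2.0 * f_L)                 # assumed thin-lens approximation
        K_C = math.pi * A_D * T_L * NA ** 2    # Eq. (10)
        L_B = 0.0
        for I_S0, r_i in leds:
            E_B = I_S0 / R ** 2                # Eq. (3): irradiance at the beacon
            f_B = (2.0 * eta_B / (math.pi * alpha_B0 ** 2)) \
                  * math.exp(-2.0 * (r_i / (alpha_B0 * R)) ** 2)  # Eq. (8)
            L_B += E_B * f_B                   # Eq. (5), summed per source
        return K_C * L_B                       # Eq. (9)

Sweeping $R$ with a mix of near-axis and far-from-axis diodes shows the intended behaviour: the near-axis diodes dominate at short range, while the outer diodes switch on as $R$ grows, so over the working interval the received power falls off more slowly than $1/R^2$.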
