
DEGREE PROJECT IN TECHNOLOGY, FIRST CYCLE, 15 CREDITS
STOCKHOLM, SWEDEN 2020

FraMe

Design and construction of an automatically framing camera mount

GRAHN, ANTON

THÅLIN, ADAM


FraMe

Design and construction of an automatically framing camera mount

GRAHN, ANTON & THÅLIN, ADAM

Bachelor's Thesis at ITM

Supervisor: Nihad Subasic

Examiner: Nihad Subasic

TRITA-ITM-EX 2020:42


Abstract

The scope of this bachelor's thesis was to investigate the use of infrared light to track an object in an image. The goal of the report was to build a full-scale prototype of a camera mount to understand what type of setup is ideal for delivering good tracking performance with infrared (IR) light, an IR camera and a reflector. The tracker used IR lights that shine on the reflector, making it the brightest spot in the image. Visible light was removed with an analog filter in front of the camera. A microcomputer, in this case a Raspberry Pi 3B+, was used to process the image from the camera, find the brightest spot and then turn the camera with two servo motors. This resulted in a two-axis motion that ensured the brightest spot always stayed in the middle of the frame. The testing of the system was done in two steps. First, five different shapes of reflector were tested to establish which shape ensures the best tracking performance in all lighting conditions. The results from the testing were then compared to other vision-based tracking methods covered in other bachelor's theses at KTH. The results showed that IR tracking performs well in conditions with low ambient light, while other vision-based tracking methods, like color tracking, work better in conditions with lots of light.

Keywords: Mechatronics, Raspberry Pi, Camera, Infrared, Video tracking


Sammanfattning

Design and construction of an automatic camera mount

This report investigates the use of infrared light to track an object in an image. The goal of the report was to build a full-scale prototype of a camera mount to investigate what type of setup could deliver good image-tracking performance using infrared (IR) light, an IR camera and a reflector. For the tracking, IR lamps were used to illuminate the reflector so that it became the brightest spot in the image. Visible light was filtered out with an analog filter placed in front of the camera. A microcomputer in the form of a Raspberry Pi 3B+ was used to process the image from the camera, identify the brightest spot and then rotate the camera mount with two servo motors. The rotation took place around two axes to ensure that the reflector was always centered in the image. Testing of the setup was done in two steps. First, five different reflector shapes were analysed to investigate which shape best ensured that the reflector was always the brightest spot in the image, even under different lighting conditions. The results from the second test could then be compared with earlier bachelor's theses written at KTH. It was found that IR tracking gives better performance in low-light conditions, while tracking based on color recognition gives better performance in conditions with a lot of light.

Keywords: Mechatronics, Raspberry Pi, Camera, IR, Video tracking


Acknowledgements

We would like to thank Nihad Subasic for support during the entire project. We would also like to thank Seshagopalan Thorapalli Muralidharan and Staffan Qvarnström for helping us with parts and construction ideas.

Grahn & Thålin, Stockholm, May 2020


Contents

1 Introduction
  1.1 Background
  1.2 Purpose
  1.3 Scope
  1.4 Method
2 Theory
  2.1 Controller
  2.2 Vision Based Tracking Methods
  2.3 Infrared Camera
  2.4 Servo Motors
  2.5 Light Emitting Diode
  2.6 OpenCV
  2.7 Control Theory
3 Method
  3.1 Hardware
    3.1.1 Ideas
    3.1.2 Modeling
    3.1.3 Manufacturing
  3.2 Software
    3.2.1 Modeling
    3.2.2 Image processing
    3.2.3 Tuning
    3.2.4 Multiprocessing
4 Results
  4.1 Finished Prototype
  4.2 IR Reflector Study
    4.2.1 Disturbance test
    4.2.2 Performance Study
5 Conclusion and Discussion
  5.1 Discussion
  5.2 Conclusion
  5.3 Recommendations for future work
Bibliography
Appendices
  A Specifications for the servo motors
  B Specifications for the Raspberry Pi 3B+
  C Specifications for the PiNoIR v2 camera
  D Python code
    D.0.1 FraMe.py
    D.0.2 brightKlass.py
    D.0.3 pid.py


List of Figures

2.1 Graph over the visual spectrum received by a regular digital camera, x-axis in nm [10].
2.2 Graph over the visual spectrum received by a camera sensor modified for the NIR band, x-axis in nm [10].
2.3 Block chart describing a servo controlled motor [11].
3.1 Render of final model, created in Siemens Solid Edge 2019 [21] and KeyShot 9.3 [22].
3.2 Block chart describing the control system, created in Adobe Illustrator 2020 [23].
4.1 Full scale model used for validation and testing, image taken on iPhone 8.
4.2 Reflector test cases, created in Adobe Illustrator 2020 [23].


List of Abbreviations

DC      Direct Current
FIR     Far Infrared
GaAs    Gallium Arsenide
GPIO    General-Purpose In-/Output
IDE     Integrated Development Environment
IR      Infrared
KTH     Kungliga Tekniska Högskolan
LED     Light Emitting Diode
LWIR    Long Wavelength Infrared
MWIR    Mid Wavelength Infrared
NIR     Near Infrared
OpenCV  Open source Computer Vision
PID     Proportional, Integral and Derivative
PLA     Polylactic Acid
PWM     Pulse-Width Modulation
RGB     Red-Green-Blue
SBC     Single Board Computer
SWIR    Short Wavelength Infrared


Chapter 1

Introduction

This is a bachelor's thesis report in mechatronics at Kungliga Tekniska Högskolan (KTH), aiming to investigate infrared tracking techniques by making a camera mount that can track a target and center that target in the camera's view.

1.1 Background

Recording yourself and publishing the result online has become increasingly popular over the last decades, thanks to the internet and social media. All that is needed to upload a video is the video itself and an internet connection. Regardless of genre, people who record videos alone are often doomed to have the camera in a fixed position. For example, a lecturer who wants to show several things that require a lot of space needs to make an awkward pause in order to rearrange the camera, which can distract the viewer. This could be avoided with a target-tracking pan-tilt camera mount.

1.2 Purpose

The purpose of this project is to investigate in what way an infrared camera tracking system can be implemented on a consumer camera. The system should be relatively affordable and easy to use. It should also be compatible with a variety of consumer cameras. The tracking system should be fully contained on the unit itself; no external cables or sensors should be needed. The tracking system is intended to work in normal lighting conditions and have performance comparable to a human operator at a range of 0-10 m. A goal for the camera system is that it will produce smooth videos without stuttering. The main goal of this thesis is to answer the following questions:

• What shape of reflector on the tracked object assures good performance for infrared tracking?


• How does the performance of an infrared tracking system compare to other comparable vision-based tracking systems?

1.3 Scope

To simplify the image processing part, the target will be a light reflector that is illuminated by infrared lights. This assures that it returns infrared light, which makes it easier to identify the brightest spot. All tests were performed indoors and with a mid-size system camera. The project was also on a budget, set to 1000 SEK, and the time frame stretched from January 2020 to May 2020. In order to succeed, the following steps need to be completed.

• Build a biaxial camera mount that can rotate a consumer camera.

• Construct a tracking system utilizing an infrared camera that controls the camera mount to track a target in a smooth way.

• Test a variety of infrared reflectors for use on the tracked object.

The camera mount is not intended as a commercial product; it should be thought of as a proof of concept of the underlying technology.

1.4 Method

This project was divided into a number of different sub-tasks. The work started with a theoretical study to investigate what has been done previously in this domain and what kind of components were necessary. The next task was to build a functioning prototype to test out the system. Based on the learnings from the prototype, a final product was constructed. This product was then used to investigate the main goals of this report.


Chapter 2

Theory

This chapter describes the theoretical background of the sub-systems used in this project.

2.1 Controller

To be able to control the movement of a system, a controller is needed. For this project, a single board computer (SBC) was used as a controller to program and control the movements of the camera mount. There are several controllers on the market to choose from, depending on the performance requirements for the specific type of use. For this project, an Arduino Uno was issued to encourage the use of a microcontroller.

However, an Arduino would not have been able to provide enough working memory or clock speed for tasks like image processing and controlling two motors simultaneously. That is why an SBC, like the Raspberry Pi 3 B+, was a better option. It has a higher clock speed than the Arduino Uno and it can run an operating system, which allows libraries and other software to be installed. Depending on which libraries are installed, it can be programmed in almost any language. [2]

2.2 Vision Based Tracking Methods

Vision based tracking means that a specific target is tracked based on input from a computer vision system, usually an image or a video frame from a camera [7].

In order to track a target, the target needs to be defined. An earlier bachelor's thesis at KTH gives an example of how color tracking can be used to identify the target. Color tracking means that the system analyses a video frame and picks out the spot where the chosen color has the highest intensity [3].
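As a minimal, hedged illustration of that idea (this is not the DesinoBot code; the file name and target hue are made up for the example), one channel of the frame can be compared against a chosen color and the best-matching pixel located:

import cv2
import numpy as np

frame = cv2.imread("frame.png")                   # stand-in for one video frame
hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)      # compare colors by hue
target_hue = 60                                   # illustrative choice: green
# Per-pixel distance to the target hue; a small value means a close color match
reference = np.full_like(hsv[:, :, 0], target_hue)
distance = cv2.absdiff(hsv[:, :, 0], reference)
min_val, max_val, min_loc, max_loc = cv2.minMaxLoc(distance)
spot = min_loc                                    # (x, y) of the strongest color match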

There is also a method that uses image correlation. For target tracking, a template image is needed to define the target. The system then convolves the two images to determine where in the video frame the template image fits the best. [6]

The method used in this project is infrared tracking. It uses almost the same process as color tracking, but instead of visible light it locates the spot where the infrared light has the highest intensity. To make this easier, the target will be a reflector, and one or more infrared lights will be pointed towards it. Depending on which filters are applied to the camera, it is possible to get different image outputs. If infrared light near visible light is let through, it gives a night vision image, often used in security camera systems. If infrared light far from visible light is let through, heat will also be seen in the image. For target tracking, the former is the better choice because it produces fewer disturbances: the target should be the brightest spot, not the warmest spot.

2.3 Infrared Camera

Infrared (IR) light is a part of the electromagnetic spectrum with wavelengths from 0.7 to 300 µm [8]. This spectrum is divided into five bands: Near Infrared (NIR), Short Wavelength Infrared (SWIR), Mid Wavelength Infrared (MWIR), Long Wavelength Infrared (LWIR) and Far Infrared (FIR), where NIR has the shortest wavelengths, which means that it is closest to our visual spectrum. LWIR and FIR are the bands also known as thermal infrared. These cover the frequencies emitted by hot bodies and are the bands received by a thermal imaging camera. NIR and SWIR are known as reflected infrared. Surveillance systems and night vision technology utilize the NIR band, often combined with active illumination, that is, some kind of light source illuminating the scene in the specified frequency band [9].

The NIR band is cheap and practical: a normal camera sensor can be utilized given the right filtration. This brings down cost and makes NIR the most commonly used infrared camera system.

The received spectrum for a regular digital camera is depicted in figure 2.1, where the active color spectrum is framed by the light blue line. A filtration system consisting of two filters makes this layout possible: one filter blocking out ultraviolet light and one filter blocking out infrared light. The black line shows the sensor's sensitivity to different light frequencies.

Figure 2.1: Graph over the visual spectrum received by a regular digital camera, x-axis in nm [10].

If the sensor is modified with a different filtration system, an NIR camera can be created, see figure 2.2. The infrared filter has been removed and replaced with its exact opposite, an infrared longpass filter. This filter lets through light with a longer wavelength than a specified cutoff (in this case 720 nm). This creates a received band in the near infrared range, limited by the longpass filter and the sensor's sensitivity. [10]

Figure 2.2: Graph over the visual spectrum received by a camera sensor modified for the NIR band, x-axis in nm. [10]

The camera that was used was a Raspberry PiNoIR v2. The word Noir in the name means black in French and suggests that it is a camera made for night vision purposes: it has no IR filter that blocks infrared light. The camera could also be equipped with two IR-LEDs at 0.6 A each. The IR-LEDs needed to be powered separately from the camera, and were used as the source of infrared light. The camera could have been any camera without an IR filter, and the IR-LED lamps could have been any IR-LED lamps. However, with this kind of setup available on the market, designed to fit the Raspberry Pi 3 B+, this was the cheapest and simplest option.

2.4 Servo Motors

A servo motor is an electric motor with very precise state feedback, whether it is torque, angle or angular velocity. The servo motor system consists of a number of different parts, see figure 2.3. The interface panel is in this case an SBC serving as a controller. It sends a command signal to the servo's positioning controller. The positioning controller sends a signal to the servo control that supplies the motor with power. To ensure that the motor is turned to the right angle, a positional sensor sends a signal back to the positioning controller in a process called negative feedback. Negative feedback essentially means that the controller continuously adjusts to the output's actual state, not just what it is calculated to be. This is critical for smooth and accurate motor control [11].

In this project, a small and inexpensive type of servo motor commonly referred to as an RC servo was used. This type of servo uses a small brushed Direct Current (DC) motor geared through a number of nylon gears [12]. This enables a smooth motion without stuttering, which is critical for this case. The servo uses a closed loop system with a PID-controller to regulate the positioning of the motor. On an RC servo, the positional angle is read by a potentiometer and fed into the positioning controller to close the loop.


Figure 2.3: Block chart describing a servo controlled motor [11].

In this project a so-called standard-size RC servo was used. The chosen servo is the FeeTech FS5106B. This servo was chosen for its high torque and low cost, see appendix A for full details.

2.5 Light Emitting Diode

A Light Emitting Diode (LED) is a type of illumination device. LEDs are based on semiconductor technology and emit a narrow spectrum of light when voltage is applied [13]. The emitted spectrum can vary; there are LEDs emitting wavelengths from 250 nm and longer. The light spectrum of an LED is determined by choosing a specific semiconductor and mixing impurities into it. This process is called doping.

In this project, LEDs emitting IR light are utilized for illumination. IR-LEDs commonly use a semiconductor material called Gallium Arsenide (GaAs) [14] (pp. 527-529). To precisely modify the emitted spectrum, additional metals are doped in, for example zinc, which enables wavelengths longer than 870 nm.


2.6 OpenCV

Open Source Computer Vision Library (OpenCV) is a library of functions used for computer vision. OpenCV was originally developed as an internal project at Intel in 1999 [18]. It is, however, as the name implies, open source, meaning that anyone can modify the source code and it is available for free for anyone to use. The library is used by a wide range of organizations, from government agencies to companies like Google and IBM [19]. The OpenCV main library contains over two thousand optimized algorithms for different types of image processing: for example, stitching images together to create a panorama, recognizing a human face in an image, or following eye movements. In this project, the main uses of OpenCV are importing, filtering and finding the brightest spot in a given image. The library has interfaces to four different programming languages: C++, Python, Java and MATLAB. In this project, the code is written in Python.

2.7 Control Theory

Control theory is about making a system behave in a desired way. For example, when the throttle on a car is pushed down, the desired outcome is that the engine revs up and when it is released, the engine revs down. In this project, the desired outcome is that the target is centered in the picture. So, when the tracked object is moved from the center, the system should rotate the camera until the object is centered again.

In order to achieve such behavior, there are a few different controllers to choose from. One of the simplest is the PID controller. The letters PID stand for Proportional, Integral and Derivative respectively. PID controllers, which perform the computation

u(t) = g(e(t)) = K_P \cdot e(t) + K_I \int_{t-\Delta t}^{t} e(\tau)\, d\tau + K_D \frac{d}{dt} e(t),   (2.1)

where

e(t) = r(t) - y(t),   (2.2)

have been around since the 18th century, and because of their simplicity and ability to provide good performance they are still by far the most common type of controller used in industrial systems [15].

It adjusts the control signal u(t) depending on the control error e(t) between a reference signal r(t) and the output signal y(t). The proportional term is intuitive.

It says that if there is a positive error, the output signal needs to increase, and vice versa. The integrating term gives information about the past by integrating the error over time. If the error has been negative in the past, it decreases the output signal. The differentiating term does the exact opposite: it gives information about the future. If the error decreases too fast, it slows the system down in order to lessen the potential overshoot. This kind of control (2.1) is called feedback control because it uses the momentary output to compute the control error.

By observing (2.1), one can see that if the error is zero, the output signal will also be zero. Therefore, the error will never be able to stay at zero. This is called a static error and is needed to provide a steady output signal.
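For implementation, (2.1) is discretised. The following is a minimal sketch of such a controller in Python, assuming a fixed sampling interval dt; it illustrates the principle and is not the pid.py class used in the project (see appendix D). For simplicity it accumulates the integral over the whole run rather than over a window Δt.

class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd   # the gains K_P, K_I and K_D
        self.integral = 0.0                      # running approximation of the integral term
        self.prev_error = 0.0                    # previous error, for the derivative term

    def update(self, error, dt=0.05):
        self.integral += error * dt                    # rectangle-rule integration
        derivative = (error - self.prev_error) / dt    # backward-difference derivative
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)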

Tuning is done by choosing the gain constants K_P, K_I and K_D. The gain constants can be computed and fine-tuned to meet certain requirements. But according to Glad & Ljung [15], it is not uncommon that no mathematical model of the system exists, which means that the gain constants must be determined by trial and error.

One method for trial and error is:

• First, set K_I = 0 and K_D = 0, and increase K_P until the system starts to oscillate. K_P should then be set to half of that value.

• Increase K_I until the system settles fast enough, but watch out for instability.

• Increase K_D until the oscillations are small enough, but watch out for instability.

Settling means that the output signal stays within 5% of the reference value. A method like this one gives a good starting point but does not guarantee good performance. The system often needs further adjustments. Then it is helpful to know the effects of the respective gains and how they interact with each other in practice. Such experience cannot be learned from a book [16].

One thing to look out for when tuning a control system is instability, which means that the amplitude of the system's oscillations increases over time instead of decreasing or remaining constant. The system will in such a case never settle, but instead keeps oscillating without control.


Chapter 3

Method

This chapter will cover development and implementation of both hardware and software needed to meet the goals of this bachelor’s thesis.

3.1 Hardware

In this section, idea generation, modeling and manufacturing of the hardware are covered. The goal of the hardware is to create a structure that can pan and tilt a mid-sized consumer camera.

3.1.1 Ideas

Prior to designing the hardware of the camera mount, a state-of-the-art search was conducted. A state-of-the-art search is a search of what products and ideas are leading the development field right now. It includes looking at a number of solutions with similar functionality to determine a design strategy. A number of different solutions were examined: regular motorized camera mounts, self-leveling gimbals and movement solutions for security cameras. Finally, a design similar to a security camera was determined to be the best option in this case. It is built up of a rotating plate that handles the camera's panning motion, that is, the rotation side-to-side. On that rotating plate sits a tilting platform that handles the camera's up and down motion.

3.1.2 Modeling

Modeling of the parts was done in Computer Aided Design (CAD) software, more specifically Solid Edge 2019 [21] by Siemens AG. Solid Edge is a solid modeling software used in engineering applications. The final model can be seen in figure 3.1.


Figure 3.1: Render of final model, created in Siemens Solid Edge 2019 [21] and KeyShot 9.3 [22].

3.1.3 Manufacturing

The manufacturing of parts for this project can be categorized into three main methods: 3D-printing, laser cutting and lathe turning. The majority of manufactured parts were 3D-printed, an additive method where molten plastic is extruded layer by layer to create a 3D shape. This offers fast manufacturing times and high flexibility at the design stage. A 3D model is exported from a CAD software and processed for printing through a specialized software tailored to the 3D-printer. The drawback is a relatively low precision, most noticeably in places like holes for axles. The material used for 3D-printed parts was Polylactic Acid (PLA). It has a good combination of low cost and good 3D-printing characteristics [17].

For some parts, laser cutting was more appropriate. Laser cutting is ideal when large flat surfaces need to be cut in two dimensions. In this case, a two-dimensional draft is exported from the CAD software to the laser cutter. The precision is in general high. The laser cutter used in this project can cut both plastic and wood in thicknesses up to 8 mm. In this case, 4 mm plywood was used for the flat surfaces.

Finally, a small steel axle was manufactured on a manual lathe. Lathe turning is a subtractive manufacturing method where the manufactured part rotates and a stationary tool cuts the surface. This is a great method to ensure high precision on axisymmetric parts, that is, parts that are symmetrical around one axis.


3.2 Software

In this section, modeling and tuning of the software will be covered.

3.2.1 Modeling

The system consists of three sub-systems: the camera, the control system and the servo motors. The whole process can be described as the closed loop system seen in figure 3.2. The camera captures a video frame, which is the input to H. The image processing unit H takes the video frame as input and returns four coordinates: the x- and y-coordinates of the target and of the center of the picture. To drive the error to zero, two PID-controllers, Fx and Fy, are used to regulate the angles of the pan and tilt servo motors respectively. Given an angle in degrees, G transforms the value to a pulse-width length and uses that value as input to the servo motor, which then moves the camera to the given angle. The camera then captures a new video frame from the new position, and thereby the loop is closed.

Figure 3.2: Block chart describing the control system, created in Adobe Illustrator 2020 [23].
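One pass through this loop can be sketched in Python as follows. H, Fx, Fy and G here are simple stand-ins for the blocks in figure 3.2 (the real implementations are in appendix D), and the frame is a dummy array:

import numpy as np

def H(frame):
    # Stand-in image processing: return target and center coordinates
    h, w = frame.shape[:2]
    return (w // 2 + 15, h // 2), (w // 2, h // 2)    # dummy target, true center

class F:
    # Stand-in controller (proportional only) around the 90-degree midpoint
    def __init__(self, kp):
        self.kp = kp
    def update(self, error):
        return 90 + self.kp * error

def G(angle):
    # Angle in degrees -> pulse-width length in microseconds, cf. eq. (3.3)
    return 1000 + angle * 1000 / 180

frame = np.zeros((240, 320, 3), np.uint8)             # dummy video frame
(tx, ty), (cx, cy) = H(frame)
pan_pulse = G(F(0.1).update(tx - cx))                 # error e_x drives the pan servo
tilt_pulse = G(F(0.1).update(ty - cy))                # error e_y drives the tilt servo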

3.2.2 Image processing

The goal of the image processing part was to find the spot in the video frame where the IR intensity is the highest and return its coordinates. It was also made to return the coordinates of the center, as this is the desired location of the target.

This is similar to DesinoBot [3], an earlier bachelor's thesis made at KTH that uses color tracking. But instead of using a digital color filter, IR light was filtered with an analog filter installed in front of the camera lens. The camera would still read the video frame as a Red-Green-Blue (RGB) image. To reduce the image to only one channel, OpenCV was used to transform it into a grey-scale image. A blurring filter was then applied to get rid of potential disturbances. By blurring, the area with the highest light intensity was tracked, not just the brightest pixel. OpenCV has built-in methods for blurring; in this case, a Gaussian blur was applied.
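A minimal sketch of this processing chain with OpenCV (the file name and the blur kernel size are illustrative assumptions, not necessarily the values used in the project):

import cv2

frame = cv2.imread("frame.png")                   # stand-in for one captured video frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)    # RGB image -> single-channel grey-scale
blurred = cv2.GaussianBlur(gray, (41, 41), 0)     # blur so an area, not a pixel, is tracked
# minMaxLoc returns the extreme values and their positions in one pass
min_val, max_val, min_loc, max_loc = cv2.minMaxLoc(blurred)
bright_spot = max_loc                             # (x, y) of the brightest area
h, w = gray.shape
center = (w // 2, h // 2)                         # desired location of the target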

OpenCV was then used to determine the coordinates of the center and of the brightest spot of the image. The distance between the two spots (the control errors e_x and e_y) could then be computed with vector geometry:

e_i = target_i - center_i,   i = x, y   (3.1)

3.2.3 Tuning

Because no mathematical model of the system was made, the method described in section 2.7 was used to get a starting point for the PID-gains. The system was able to settle, but the servo motors jittered a lot. The system was then slowed down by lowering the PID-gains, with the intention of making the servo motors jitter less. This turned out to have no effect on the jittering; the system just took a longer time to settle.

After some troubleshooting, it was decided to test another GPIO Python 3 library, that is, a library that allows the Python 3 program to control the GPIO ports on the Raspberry Pi. Initially, the system used the RPi.GPIO library to send the Pulse-Width Modulation (PWM) signal to the servo motors. This was done by setting a duty cycle

d(\alpha) = \frac{8\alpha}{180} + 3.5,   (3.2)

where α is a value between 0 and 180 representing the angle, as input value. The new GPIO library, pigpio [20], was instead controlled by setting a pulse-width length l(α) [µs],

l(\alpha) = 1000 + \frac{1000\,\alpha}{180},   (3.3)

as input value; note that l(90) = 1500 µs, the neutral position of the servo (see appendix A). This did wonders for the servo motors, as the jittering completely vanished. The PID-controller was then tuned again with the same method and adjusted until the system performed well enough to be used for this project. No specific requirements on the performance were defined, except the one mentioned in section 1.3: "/.../ controls the camera mount to track a target in a smooth way."

In other words, it should perform as a person holding the camera, meaning that a small overshoot is okay but the system must settle, even if the settling time is not the shortest it can be. The PID-controller may be one of the simplest solutions, but it provided good enough performance for this project, so there was no need to look any further for a more advanced controller.
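A minimal sketch of the pigpio call, using the pulse-width mapping in (3.3); GPIO pin 2 is the project's default pan pin, and the pigpio daemon must be running on the Raspberry Pi:

import pigpio

PAN_PIN = 2                                 # default GPIO pin for the pan servo

def l(alpha):
    # Pulse-width length in microseconds for an angle alpha in 0-180, eq. (3.3)
    return 1000 + alpha * 1000 / 180

pi = pigpio.pi()                            # connect to the local pigpio daemon
if pi.connected:
    pi.set_servo_pulsewidth(PAN_PIN, l(90)) # l(90) = 1500 us, the servo's neutral position
    pi.stop()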


3.2.4 Multiprocessing

In the early stages of the development of the control system, a problem occurred. The servo motors could sometimes move too far without the system knowing it, because the camera needed to wait until the servo motors had finished moving before it was able to capture a new frame.

To get rid of such loss of information, it was decided to run several processes simultaneously. This was done using the Python 3 library multiprocessing. The system was divided into four processes that run simultaneously: the camera, a PID-controller for each servo motor, and a function that sends the output to the servo motors. In order for the four processes to share the global variables, process safe variables were created with the multiprocessing library. When one process updates a process safe variable, all other processes that use that variable see the change. In that way, the camera does not need to wait for the servo motors to finish moving before capturing a new frame. Instead, it is the image processing function, H, that limits the update rate of the input. The system could still be fine-tuned by adding a time delay within a process, which was done in the PID processes in order to give the motors some time to apply the change of the input signal before receiving a new one.
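A minimal, self-contained sketch of this mechanism (the process and variable names are illustrative, not the project's): one process keeps writing to a process safe variable while another always reads the latest value.

from multiprocessing import Manager, Process
import time

def camera_proc(objX):
    # Stand-in for the camera process: keeps updating the target coordinate
    for x in range(5):
        objX.value = x
        time.sleep(0.1)

def pid_proc(objX):
    # Stand-in for a PID process: reads the most recent value at its own pace
    for _ in range(5):
        print("latest x:", objX.value)
        time.sleep(0.1)

if __name__ == "__main__":
    manager = Manager()
    objX = manager.Value("i", 0)            # process safe shared integer
    procs = [Process(target=camera_proc, args=(objX,)),
             Process(target=pid_proc, args=(objX,))]
    for p in procs:
        p.start()
    for p in procs:
        p.join()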


Chapter 4

Results

4.1 Finished Prototype

The finished prototype can be seen in figure 4.1. The model is created as described in chapter 3, with the exception of the IR-camera mount, which is built out of LEGO; this is useful for trying out different arrangements of the lights and camera.

The finished model uses two FeeTech FS5106B servos, one PiNoIR v2 IR-Camera, two 3W Raspberry Pi IR-Lights and a Raspberry Pi 3B+ microcomputer.

Figure 4.1: Full scale model used for validation and testing, image taken on iPhone 8.


4.2 IR Reflector Study

In this section, the test results are presented. The testing was done in two steps. The first step was to find a reflector that was easily identified by the system. The two reflectors that performed best then advanced to the next step, where they were used to test the performance of the system while the test person was in motion.

4.2.1 Disturbance test

The disturbance test was done by letting the system settle on the reflector, and then shining a flashlight that emits a small amount of IR light next to it, in order to try to outcompete the reflector as the brightest point. If the reflector is good, the system will not identify the flashlight as the brightest point. In order to compare with earlier vision-based trackers, the reflectors were tested in different light conditions, just like the earlier bachelor's project SIYA [5], but with three grades of performance instead of two (yes/no). The reflector test cases are shown in figure 4.2, and the results from the reflector study can be seen in table 4.1.

• Good = The flashlight does not affect the tracking of the reflector.

• Bad = The flashlight sometimes affects the tracking of the reflector.

• Does not settle = The flashlight always affects the tracking of the reflector.

Figure 4.2: Reflector test cases, created in Adobe Illustrator 2020 [23].


                          Ceiling light on     Ceiling light off

Blinds up      Case 1     Does not settle      Good
               Case 2     Does not settle      Good
               Case 3     Does not settle      Bad
               Case 4     Does not settle      Bad
               Case 5     Does not settle      Good

Blinds down    Case 1     Does not settle      Good
               Case 2     Does not settle      Good
               Case 3     Does not settle      Good
               Case 4     Does not settle      Bad
               Case 5     Does not settle      Good

Table 4.1: Test results from the disturbance test.

4.2.2 Performance Study

A performance study was made with the two best reflectors from the previous test, Case 1 and Case 2, to test the overall performance of the system. It was done by first letting the system settle on the reflector, and then having the test person walk back and forth, and from side to side, three times. This was also tested in different light conditions and graded with the same grades of performance. The results from the performance study can be seen in table 4.2.

                          Ceiling light on     Ceiling light off

Blinds up      Case 1     Does not settle      Good
               Case 2     Does not settle      Good

Blinds down    Case 1     Does not settle      Good
               Case 2     Does not settle      Good

Table 4.2: Test results from the performance study.


Chapter 5

Conclusion and Discussion

In this chapter the results will be discussed and a conclusion will be drawn. It also includes some recommendations for future work.

5.1 Discussion

The results from the IR reflector study show that the reflectors with relatively large surface area (Cases 1, 2 and 5) are the best for tracking. It was also clear that it is better with a reflector that is visible from all angles without being blocked by arms and the like. These criteria led us to pick Case 1 and Case 2 as our best reflectors. These were then compared to other tracking methods.

None of the reflectors works well when the ceiling light is turned on. As long as the ceiling light is out of frame, the tracker worked as intended. But when the camera angles up and gets the light source in frame, it stays locked to that position. The lights used in the study were halogen bulbs, which emit a lot of IR light in comparison to, for example, LED lights.

The performance study clearly showed that FraMe operates best in low light. This is the opposite behavior of color tracking systems like SIYA [5]; in that case, the tracker was not able to operate without ceiling lights. A similar result can be seen in DesinoBot [3], where a stable and good lighting environment was essential for good performance.

5.2 Conclusion

What shape of reflector on the tracked object assures good performance for infrared tracking?

The reflector should meet two criteria:


• A big surface area; this assures that the reflector is the brightest spot and reduces the influence of disturbances.

• A design that is viewable from all angles; this assures that the camera keeps tracking even if the tracked person rotates.

Case 1 and Case 2 meet these criteria and were the best performing arrangements in our testing.

How does the performance of an infrared tracking system compare to other comparable vision-based tracking systems?

The main measurable difference to other vision-based tracking systems is the performance in different lighting conditions. Infrared tracking excels in low-light conditions and struggles in conditions with strong light sources. This is the exact opposite of the other vision-based tracking methods that use color tracking.

5.3 Recommendations for future work

To overcome the previously discussed weaknesses of both color tracking and infrared tracking, it would be interesting to investigate whether it is possible to construct a hybrid solution: a system that implements both strategies and switches seamlessly between the two depending on lighting conditions.

Another thing that would be interesting to explore is in what way a reflector could be more discreetly integrated in clothes, for example whether it is possible to create a shirt with reflective material woven into the fabric in a way that does not disturb its aesthetics.

Finally it would be useful to investigate in what way the movement could be made smoother. The RC-servos used in this report provide excellent torque and speed but they do have a tendency to jerk and shake in some situations. Could this motion be damped or geared? Or should a completely different motor design be implemented for smoother motion?


Bibliography

[1] What is Arduino? arduino.cc. Accessed 2020-02-11: https://www.arduino.cc/en/guide/introduction

[2] Raspberry Pi 3 Model B+. The Raspberry Pi Foundation. Accessed 2020-02-14: https://www.raspberrypi.org/products/raspberry-pi-3-model-b-plus/

[3] Andersson Santiago, Gabriel and Favre, Martin. DesinoBot: The construction of a color tracking turret. Bachelor's thesis in mechanical engineering, ITM, KTH, Stockholm, Sweden, 2015. Available: http://kth.diva-portal.org/smash/record.jsf?pid=diva2%3A915902&dswid=-100

[4] Burman, Gustav and Erlandsson, Simon. ACM 9000: Automated Camera Man. Bachelor's thesis in mechanical engineering, ITM, KTH, Stockholm, Sweden, 2018. Available: http://kth.diva-portal.org/smash/record.jsf?pid=diva2%3A1217345&dswid=2915

[5] Adeeb, Karam and Alveteg, Adam. SIYA - Slide Into Your Albums: Design and construction of a controllable dolly camera with object recognition. Bachelor's thesis in mechanical engineering, ITM, KTH, Stockholm, Sweden, 2019. Available: http://kth.diva-portal.org/smash/record.jsf?pid=diva2%3A1373568&dswid=-5686

[6] K.J. Deopuijari, K.S. Tiwari. Image Correlation Based Tracking in Video Frames. Special Issue of the International Journal of Electrical, Electronics and Computer Systems, ISSN (Print): 2347-2820, V-4 I-2, for the 3rd National Conference on Advancements in Communication, Computing and Electronics Technology [ACCET-2016], held at M.E.S. College of Engineering, Pune, 11-12 February 2016. Available: https://pdfs.semanticscholar.org/e9c1/3fb1fc37abc4c2fb670a3cdb69228928354b.pdf

[7] Sanna Ågren. Object tracking methods and their areas of application: A meta-analysis. Master's thesis in Computing Science, Umeå Universitet, Sweden, 2017. Available: http://www8.cs.umu.se/education/examina/Rapporter/SannaAgrenFinal.pdf

[8] Liew, S. C. Electromagnetic Waves. Centre for Remote Imaging, Sensing and Processing. Accessed 2020-02-11: https://crisp.nus.edu.sg/~research/tutorial/em.htm

[9] Jon L. Grossman. Thermal Infrared vs. Active Infrared: A New Technology Begins to be Commercialized. irinfo.org. Accessed 2020-02-11: https://www.irinfo.org/03-01-2007-grossman/

[10] Basics of Infrared Photography. IR-photo.net. Accessed 2020-02-11: http://ir-photo.net/ir_imaging.html

[11] Servo Control Facts: A Handbook Explaining the Basics of Motion. Baldor Electric Company, Fort Smith, Arkansas. Accessed 2020-01-23: https://www.baldor.com/Shared/manuals/1205-394.pdf

[12] Chuck McManis. R/C Servos 101. Sun Microsystems Inc. Accessed 2020-01-23: http://pages.cs.wisc.edu/~bolo/shipyard/servos101.html

[13] C. Michael Bourget. An Introduction to Light-emitting Diodes. Orbital Technologies Corporation, Electrical Engineering, 2008. Available: https://doi.org/10.21273/HORTSCI.43.7.1944

[14] Safa Kasap and Peter Capper. Springer Handbook of Electronic and Photonic Materials. Springer International Publishing AG, 2017. DOI: 10.1007/978-3-319-48933-9. Available: https://link.springer.com/book/10.1007/978-3-319-48933-9

[15] Torkel Glad & Lennart Ljung, "Vad är reglerteknik?" in Reglerteknik: Grundläggande teori, fjärde upplagan. Linköping, Sweden: Studentlitteratur, 2006, ch. 1, sec. 6, p. 20.

[16] Torkel Glad & Lennart Ljung, "Återkopplade system (Syntes I)" in Reglerteknik: Grundläggande teori, fjärde upplagan. Linköping, Sweden: Studentlitteratur, 2006, ch. 3, sec. 3, pp. 53-58.

[17] PLA filament guide. alla3dskrivare.se. Accessed 2020-03-18: https://alla3dskrivare.se/filament/pla/

[18] Ivan Culjak, David Abram, Tomislav Pribanic, Hrvoje Dzapo, Mario Cifrek. A brief introduction to OpenCV. 2012 Proceedings of the 35th International Convention MIPRO, May 2012, pp. 1725-1730. ISBN: 9781467325776. Available: https://andor.tuni.fi/discovery/fulldisplay/ieee_s6240828/358FIN_TAMPO:VU1?lang=fi

[19] OpenCV team. About. opencv.org. Accessed 2020-03-24: https://opencv.org/about/

[20] PiGPIO team. The pigpio library. abyz.me.uk. Accessed 2020-04-09: http://abyz.me.uk/rpi/pigpio/

[21] Siemens Digital Industry Software. Solid Edge 2019. https://solidedge.siemens.com/en/

[22] Luxion Inc. KeyShot 9.3. https://www.keyshot.com/

[23] Adobe Inc. Adobe Illustrator 2020. https://www.adobe.com/se/products/illustrator.html


Appendix A

Specifications for the servo motors


FEETECH RC Model Co., Ltd.
Specification of Product V1.0
Product Name: 6V 6kg.cm Analog Servo
Model No.: FS5106B

1. Apply Environmental Condition
  1-1 Storage temperature range: -30°C to 80°C
  1-2 Operating temperature range: -15°C to 70°C

2. Standard Test Environment
  2-1 Temperature range: 25°C ± 5°C
  2-2 Humidity range: 65% ± 10%

3. Mechanical Specification
  3-1 Size: A: 40.8 mm, B: 20.1 mm, C: 38 mm, D: 49.5 mm
  3-2 Weight: 40.5 g
  3-3 Gear type: Plastic gear
  3-6 Limit angle: 200° ± 5°
  3-7 Bearing: 2 ball bearings
  3-8 Horn gear spline: 25T
  3-9 Horn type: Plastic, POM
  3-10 Case: Nylon & fiberglass
  3-11 Connector wire: 300 mm ± 5 mm
  3-12 Motor: Metal brush motor
  3-13 Splash water resistance: No

4. Electrical Specification (values at operating voltage 4.8 V / 6 V)
  4-1 Idle current (at stopped): 5 mA / 7 mA
  4-2 No load speed: 0.18 sec/60° / 0.16 sec/60°
  4-3 Running current (at no load): 160 mA / 190 mA
  4-4 Peak stall torque: 5 kg.cm (69.56 oz.in) / 6 kg.cm (83.47 oz.in)
  4-5 Stall current: 980 mA / 1100 mA

5. Control Specification
  5-1 Command signal: Pulse width modulation
  5-2 Amplifier type: Analog comparator
  5-3 Pulse width range: 700-2300 µs
  5-4 Neutral position: 1500 µs
  5-5 Running degree: 180° (± 5°) (when 700-2300 µs)
  5-6 Dead band width: 5 µs
  5-7 Rotating direction: Counterclockwise (when 1000-2000 µs)

6. The Drawings


Appendix B

Specifications for the Raspberry Pi 3B+


Overview

The Raspberry Pi 3 Model B+ is the latest product in the Raspberry Pi 3 range, boasting a 64-bit quad core processor running at 1.4GHz, dual-band 2.4GHz and 5GHz wireless LAN, Bluetooth 4.2/BLE, faster Ethernet, and PoE capability via a separate PoE HAT.

The dual-band wireless LAN comes with modular compliance certification, allowing the board to be designed into end products with significantly reduced wireless LAN compliance testing, improving both cost and time to market.

The Raspberry Pi 3 Model B+ maintains the same mechanical footprint as both the Raspberry Pi 2 Model B and the Raspberry Pi 3 Model B.

Raspberry Pi 3 Model B+

Specifications

Processor: Broadcom BCM2837B0, Cortex-A53 64-bit SoC @ 1.4GHz
Memory: 1GB LPDDR2 SDRAM
Connectivity: 2.4GHz and 5GHz IEEE 802.11.b/g/n/ac wireless LAN, Bluetooth 4.2, BLE; Gigabit Ethernet over USB 2.0 (maximum throughput 300Mbps); 4 × USB 2.0 ports
Access: Extended 40-pin GPIO header
Video & sound: 1 × full size HDMI; MIPI DSI display port; MIPI CSI camera port; 4 pole stereo output and composite video port
Multimedia: H.264, MPEG-4 decode (1080p30); H.264 encode (1080p30); OpenGL ES 1.1, 2.0 graphics
SD card support: Micro SD format for loading operating system and data storage
Input power: 5V/2.5A DC via micro USB connector; 5V DC via GPIO header; Power over Ethernet (PoE) enabled (requires separate PoE HAT)
Environment: Operating temperature, 0-50°C
Compliance: For a full list of local and regional product approvals, please visit www.raspberrypi.org/products/raspberry-pi-3-model-b+
Production lifetime: The Raspberry Pi 3 Model B+ will remain in production until at least January 2023.


Appendix C

Specifications for the PiNoIR v2 camera


Part number: RPI NOIR CAMERA BOARD

8 megapixel camera capable of taking infrared photographs of 3280 x 2464 pixels

Capture video at 1080p30, 720p60 and 640x480p90 resolutions

All software is supported within the latest version of Raspbian Operating System

The Raspberry Pi NoIR Camera Module v2 is a high quality 8 megapixel Sony IMX219 image sensor custom designed add-on board for Raspberry Pi, featuring a fixed focus lens. It's capable of 3280 x 2464 pixel static images, and also supports 1080p30, 720p60 and 640x480p60/90 video. It attaches to Pi by way of one of the small sockets on the board upper surface and uses the dedicated CSi interface, designed especially for interfacing to cameras.


The board itself is tiny, at around 25mm x 23mm x 9mm. It also weighs just over 3g, making it perfect for mobile or other applications where size and weight are important. It connects to Raspberry Pi by way of a short ribbon cable.

The high quality Sony sensor has a native resolution of 8 megapixels and a fixed focus lens on board.

The NoIR Camera has No InfraRed (NoIR) filter on the lens which makes it perfect for doing Infrared photography and taking pictures in low light (twilight) environments.

Applications

- Infrared photography
- Low light photography
- Monitoring plant growth
- CCTV security camera


Appendix D

Python code

D.0.1 FraMe.py

############################################################
##
##  Name:         FraMe.py
##
##  Description:  This is the main script for FraMe. It is
##                run on a Raspberry Pi 3B+ and controls
##                two servo motors that rotate a camera in
##                order to identify a target and keep it
##                centered in the picture.
##
##                The script uses two classes that are put
##                in separate files:
##
##                --"brightKlass.py"--
##                Contains the class that processes an
##                image and returns the coordinates for
##                the brightest spot in the processed
##                image.
##                ****************************************
##                --"pid.py"--
##                Contains the class that defines the
##                PID-controller that is used to control
##                the angles of the servo motors.
##                ****************************************
##
##                No user interface is needed, but the
##                GPIO-pins for each servo motor can be
##                defined via command line arguments. If
##                no arguments are given, default is set
##                to 2 and 22, for the pan- and tilt-servo
##                respectively.
##
##                The stream from the picamera can be
##                displayed locally or forwarded to an
##                external device via an SSH-connection.
##
############################################################
##
##  Authors:      Grahn, Anton & Thalin
##
##  Course:       Bachelor's thesis in Mechatronics,
##                MF133X VT20
##
##  Institute:    Department of Mechatronics, School of
##                Industrial Engineering and Management,
##                Royal Institute of Technology, Stockholm
##
##  Last edited:  2020-05-22
##
############################################################

import os
import cv2
import sys
import time
import pigpio
import signal
import numpy as np
from pid import PID
from picamera import PiCamera
from brightKlass import Bright
from multiprocessing import Manager
from multiprocessing import Process
from imutils.video import VideoStream

############################################################
# Signal handler for CTRL+C: Exits the program in a desired
# way if KeyboardInterrupt is raised through CTRL+C
############################################################

def CTRLC_handler(sig, frame):
    print("CTRL+C pressed")
    setAngles(90, 90, Manager().Value("r", False))
    time.sleep(2)
    cv2.destroyAllWindows()
    sys.exit()

############################################################
# obj_center: Captures a video frame with the camera and
#             updates the 2D-coordinates for the center
#             and the object
#
# IN:  Process safe variables for each coordinate
#
# OUT: Updates the process safe variables based on input
#      from the camera
############################################################

def obj_center(objX, objY, cenX, cenY, radie, servoPin):

    print("obj_center ready to go!")
    time.sleep(1)
    signal.signal(signal.SIGINT, CTRLC_handler)
    vs = VideoStream(usePiCamera=True, resolution=(320, 240)).start()
    time.sleep(2.0)

    obj = Bright(radie)

    while servoPin.value:
        frame = vs.read()
        orig = frame.copy()
        cenX.value, cenY.value = .5*int(frame.shape[1]), -.5*int(frame.shape[0])
        locXY = obj.bright(frame)
        objX.value, objY.value = locXY[0], -1*locXY[1]

        # Display the image
        orig = cv2.resize(orig, (1280, 720), interpolation=cv2.INTER_LINEAR)
        cv2.circle(orig, (int(1280*objX.value/320), int(-720*objY.value/240)),
                   radie, (0, 255, 0), 2)
        cv2.imshow("imshow", orig)

        # <ESC> to raise KeyboardInterrupt
        if cv2.waitKey(1) == 27:
            print("<ESC> pressed!\nPlease wait!")
            servoPin.value = False
            time.sleep(1)
            break

############################################################
# pidKontrollerX: Controls the angle of the servo that is
#                 responsible for the pan-motion
#
# IN:  Current angle, pid-gains, x-coordinates for center
#      and object
#
# OUT: Updates the process safe variable for the pan-angle
############################################################

def pidKontrollerX(output, p, i, d, loc, center, servoRuns):
    signal.signal(signal.SIGINT, CTRLC_handler)
    p = PID(p.value, i.value, d.value)
    p.initialize()

    print("X pid controller ready to go!")
    while servoRuns.value:
        err = center.value - loc.value
        toUpdate = p.update(err) + 90

        # Constraints
        if toUpdate > 169:
            toUpdate = 169
        if toUpdate < 11:
            toUpdate = 11
        output.value = toUpdate

############################################################
# pidKontrollerY: Controls the angle of the servo that is
#                 responsible for the tilt-motion
#
# IN:  Current angle, pid-gains, y-coordinates for center
#      and object
#
# OUT: Updates the process safe variable for the
#      tilt-angle
############################################################

def pidKontrollerY(output, p, i, d, loc, center, servoRuns):
    signal.signal(signal.SIGINT, CTRLC_handler)
    p = PID(p.value, i.value, d.value)
    p.initialize()

    print("Y pid controller ready to go!")
    while servoRuns.value:
        err = center.value - loc.value
        toUpdate = p.update(err) + 90

        # Constraints due to physical limitations
        if toUpdate > 97:
            toUpdate = 97
        if toUpdate < 40:
            toUpdate = 40
        output.value = toUpdate

############################################################
# setAngles: Converts the angle value for each servo to a
#            PWM signal and sets the desired angles
#
# IN:  Angles for both servos
#
# OUT: Sends PWM signal to the servos
############################################################

def setAngles(Xangle, Yangle, servoRuns):
    signal.signal(signal.SIGINT, CTRLC_handler)

    # Pin for pan servo
    xPin = 2
    # Pin for tilt servo
    yPin = 22

    # GPIO-library
    pi = pigpio.pi()

    if not pi.connected:
        exit()

    # So that GPIO-pins also can be set with sys.argv
    if len(sys.argv) == 1:
        G = [xPin, yPin]
    elif len(sys.argv) == 3:
        G = []
        for XYpins in sys.argv[1:]:
            G.append(int(XYpins))
    else:
        print("setAngles NOT ready to go!")
