DEGREE PROJECT IN TECHNOLOGY, FIRST CYCLE, 15 CREDITS
STOCKHOLM, SWEDEN 2016

Color Sorting Robot

Sorting algorithm by color identification

TOMAS FREDRIKSSON
SARA STRÖM

Color Sorting Robot

Sorting Algorithm By Color Identification

TOMAS FREDRIKSSON
SARA STRÖM

Bachelor's Thesis in Mechatronics
Supervisor: Nihad Subasic
Examiner: Martin Edin Grimheden
Approved: 2016-06-07


Abstract

Efficiency and automation can be improved in several ways. The focus of this report has been color identification and the creation of a smart robot. A simple robotic arm is used to apply the color sorting to a physical system. This model evaluates how well a robotic arm can sort different objects using a predefined color identification algorithm. A demonstrator was built to test sorting speed and color identification. The robotic arm can sort an object of predefined shape and size in 15.36 seconds. The color identification is sensitive to external factors and does not necessarily return the correct RGB value, depending on lighting and brightness. The R value often shows the largest error. To further improve the color sorting robot, another color identification method could be tested, other motor types could be incorporated, and a more precise sensor could be implemented.


Sammanfattning

Färgsorterande Robot

Improvements in efficiency and automation can be made in many different ways; in this report, a method based on color identification has been developed to create a smart robot. A simple robotic arm provides the physical implementation of the system, while the color sorting itself is performed by a mini-computer. This model evaluates how a robotic arm, with the help of a color identification algorithm, can sort different objects. The results show that the robotic arm can sort the specified object in 15.36 seconds. The color identification is, however, sensitive to external factors such as lighting and, for example, glossy surfaces. Because of these external factors, the program does not necessarily return the 'correct' RGB value; it is often the R value that shows the largest error. To improve the color sorting robot, another color identification method could be tested, the motor type could be replaced, and a more precise sensor could be implemented.


Preface

We would like to thank our supervisor Nihad Subasic for support and feedback and Staffan Qvarnström for all the help with the components and manufacturing.

Finally, many thanks to all the student assistants in the lab who have been there throughout the entire semester.

Tomas Fredriksson & Sara Ström
Kungliga Tekniska Högskolan, May 2016


Contents

Abstract
Sammanfattning
Preface
Contents
Nomenclature
1 Introduction
   1.1 Background
   1.2 Purpose
   1.3 Scope
   1.4 Method
   1.5 Related projects
2 Theory
   2.1 Arduino
   2.2 Raspberry Pi
   2.3 Camera Pi
   2.4 Adafruit Motor Shield
   2.5 Robotic Arm
   2.6 OpenCV
   2.7 SimpleCV
3 Demonstrator
   3.1 Problem Formulation
   3.2 Hardware
      3.2.1 Robotic arm
      3.2.2 Camera Pi
      3.2.3 Other assemblies
   3.3 Software
      3.3.1 Robotic arm connected to Arduino
      3.3.2 Robotic claw
      3.3.3 Camera Pi
      3.3.4 Color identification
      3.3.5 Connect Raspberry Pi and Arduino
   3.4 Electronics
      3.4.1 Robotic arm
      3.4.2 Robotic claw
      3.4.3 Connect Raspberry Pi and Arduino
4 Results
   4.1 Robot Arm
   4.2 Color Identification
5 Discussion and conclusions
   5.1 Discussion
      5.1.1 Different motors
      5.1.2 Raspberry Pi
      5.1.3 Camera Pi
      5.1.4 Color Identification
      5.1.5 Results
   5.2 Conclusions
6 Recommendations and future work
   6.1 Recommendations
   6.2 Future work
Bibliography
Appendices
   A Breadboard layout for H-bridge
   B Specifications of Raspberry Pi models
   C All colors tested with Camera Pi
   D Arduino Code for the Robot Arm

Nomenclature

Abbreviations

Abbreviation   Description
API            Application Programming Interface
CPU            Central Processing Unit
CSI            Camera Serial Interface
DC             Direct Current
GPIO           General-Purpose Input/Output
GPU            Graphics Processing Unit
HSB            Hue, Saturation, Brightness
IDE            Integrated Development Environment
I2C            Inter-Integrated Circuit
PWM            Pulse Width Modulation
RGB            Red, Green, Blue
RPi            Raspberry Pi
SEK            Swedish Krona
SPI            Serial Peripheral Interface Bus
SRAM           Static Random-Access Memory
UART           Universal Asynchronous Receiver/Transmitter


Chapter 1

Introduction

This chapter introduces the subject and defines the purpose, scope and method.

1.1 Background

Situations where humans work repetitively, performing the same process every cycle, tend to give uneven results. In the future, a wide range of such applications can be improved by using a robot. A robot's work gives the same result every time, as opposed to a human's: once the robot is programmed, it performs the exact same operations every cycle. A further advantage of using a robot is decreased time consumption, as the ability to repeat precise movements makes it easy to increase the speed of the process. Investing in a robot is a one-time cost, but in the long term the investment pays off due to the consistent quality and reduced labor costs. Robots can also work both day and night, compared to humans who need breaks and sleep. One task a robot can perform is sorting objects into categories, for instance by color, shape or size. This is, however, not always a simple combination. Therefore, in this thesis, a design for such a solution is presented and tested.

1.2 Purpose

The purpose of this project is to analyze how a robot with a camera module can respond to objects of different color, then grip and sort them depending on the color identification. To achieve this, we have focused this thesis on the following questions:

• How fast can a system identify a specific object, lift it using a robotic arm and sort it using color identification?

• How similar in color can two objects be for the robot to see the difference and sort them accordingly?


1.3 Scope

In order to complete this project within the time limit and with a budget of 1 000 SEK (Swedish Krona), some boundaries had to be set. The object picked up by the robot has specified dimensions, which can be seen in Tab 1.1. The camera is expected to be able to see the difference between two very different colors (e.g. red and blue). The robotic arm has no speed requirement or limits.

Table 1.1. Limitations of graspable objects

Parameter   Value
Weight      <100 g
Size        30x30x30 mm (Length x Depth x Height)
Shape       Cubic

1.4 Method

The Arduino and the Raspberry Pi (RPi) are combined to fulfill the purpose of this project. The logic for interacting with the robot hardware runs on the Arduino, while the calculations are done by the RPi, which sends control signals to the Arduino. The Arduino is a micro-controller, which means its clock speed and memory limit its ability to process images, so a mini-computer is required for the processing. For this project an RPi is selected because it is easy to connect a camera module to it. The RPi manages the image processing and the Arduino operates the motors. A camera programmed to identify colors is placed on the robot arm and connected to the RPi. Depending on what color the object has, the RPi sends different signals to the Arduino to move the robot arm.

1.5 Related projects

This project is inspired by a combination of two other projects. In one existing project, an Arduino is connected to a robot arm driven by five DC motors and controlled with information from potentiometers. A motor shield is used to control four of the motors. That project does not include control of the fifth motor, the claw, and the arm is programmed to follow a specific path. [Instructables, 2016]

Another project introduces image processing with the Raspberry Pi using Python. It presents how to capture an image, store it and show it on the screen. That project uses a regular webcamera and does not include identification of shape, size or color. [raspberrypi, 2016]


Chapter 2

Theory

This chapter presents the research on all components and software used for designing the robot arm.

2.1 Arduino

An Arduino board is a micro-controller that is able to read different inputs and turn them into outputs. Relevant specifications are listed in Tab 2.1. Instructions can be sent to the board using the Arduino programming language. [Arduino, 2016] The Arduino Uno is used in this project, see Fig 2.1.

Table 2.1. Specifications of Arduino Uno

Description         Value
Operating Voltage   5 V
Digital I/O Pins    14 (of which 6 provide PWM output)
Analog Input Pins   6
SRAM                1 KB
Clock Speed         16 MHz

Figure 2.1. Arduino Uno


2.2 Raspberry Pi

The RPi is a credit-card sized computer that is a powerful tool for its size, see Fig 2.2. In this project a Raspberry Pi A+ is used; relevant specifications are listed in Tab 2.2 [RaspberryPi, 2016]. The RPi uses an operating system (OS) called Raspbian, which is a Unix-like, open source OS. The RPi in this project runs the latest release, named "Raspbian Jessie". [RaspbianProject, 2016]

Table 2.2. Specifications of Raspberry Pi Model A+

Description   Value
CPU           700 MHz single-core ARM1176JZF-S
GPU           Broadcom VideoCore IV @ 250 MHz
Memory        256 MB (shared with GPU)

Figure 2.2. Raspberry Pi A+

2.3 Camera Pi

The camera module for the RPi is a five-megapixel fixed-focus camera that supports high-definition video as well as still pictures, with a resolution of up to 1920x1080, see Fig 2.3. The camera is used in this project because of its close integration with the RPi; the module is supported directly by the OS on the RPi, which makes it simple to install and use. [RaspberryPiFoundation, 2016]

Figure 2.3. Camera Module for Raspberry Pi


2.4. ADAFRUIT MOTOR SHIELD

2.4 Adafruit Motor Shield

A motor shield is a driver module that allows control of a motor's speed and direction [ArduinoMotorShield, 2016]. The motor shield used in this project is the Adafruit Motor Shield v2.3, which can control up to four DC motors, see Fig. 2.4. [Adafruit, 2016]

Figure 2.4. Adafruit Motor Shield v2.3

2.5 Robotic Arm

The robotic arm, see Fig 2.5, is a manually controlled robot with five DC motors, a gripping claw and four joints. Rotation is enabled in several directions. [OWIRobots, 2016]

Figure 2.5. Robotic Arm


2.6 OpenCV

OpenCV stands for Open Source Computer Vision and is a software library for real-time computer vision. It was originally written in C/C++ but now also features Python and Java interfaces. It supports various platforms including Windows, Linux, Mac OS, iOS and Android. OpenCV has many application areas, for example object identification and color filtering. [Itseez, 2016] In this project the library is imported into Python 2.7.

2.7 SimpleCV

SimpleCV is an open source computer vision framework that is built on and accesses the OpenCV libraries. It is written in Python and runs on Windows, Mac OS and Ubuntu Linux. Its applications are the same as OpenCV's, but simplified into an easy-to-use Application Programming Interface (API) [SightMachine, 2016].


Chapter 3

Demonstrator

This chapter describes the design and the development process of the robot arm and its sorting functions. It addresses the hardware, software and electronics used in this project.

3.1 Problem Formulation

To properly sort the objects, the demonstrator has to solve several problems.

• The robot arm is equipped with five DC motors; sensors need to be added to control movement.

• The Adafruit Motor Shield v2.3 can only control four DC motors.

• The fifth DC motor controls the claw; an H-bridge needs to be added to control it.

• The Arduino Uno is too slow for image processing.

• The Raspberry Pi needs to communicate with the Arduino.

• The pins of the Raspberry Pi operate at 3.3 volts while the Arduino's operate at 5 volts.

• The Camera Pi captures high-definition pictures, which need to be processed at high speed.

3.2 Hardware

The demonstrator consists of a robotic arm with four different axes of rotation, a claw, the camera module, the Raspberry Pi, the Arduino, and housing for the micro-controller and mini-computer. The robot can be seen as a whole in Fig. 3.1.


Figure 3.1. The construction seen from the side

3.2.1 Robotic arm

The robotic arm is equipped with four DC motors, each of which controls a specific rotational axis. The arm is also equipped with a gripping claw driven by a fifth DC motor. The robotic arm was bought as a kit of small parts and assembled following the blueprints provided with the parts. The blueprints can be found in the list of references. [Elfa, 2016]

A consequence of the robot arm being driven by DC motors is that it is not possible to know where in the room the arm is positioned. This problem was solved using four potentiometers, which function as angular sensors. The rotation of each shaft can be translated into the maximum and minimum angles that a specific axis is allowed to rotate through. Limited by these sensors, the arm will not pass its end points and break. The potentiometers were glued onto the robotic arm in such a way that the potentiometer's shaft is steady at all times during movement; instead, the base of the potentiometer rotates when the motors rotate the axis. The potentiometer solution can be seen in Fig. 3.2.

Figure 3.2. Close-up on potentiometer glued on the robot
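The idea of using a potentiometer as an angular sensor can be sketched as follows. This is a Python sketch of the principle only (the actual implementation is Arduino code, see Appendix D); the 270-degree electrical travel and the joint limits are assumed values for illustration, not measurements from the thesis.

```python
# Sketch: translating a 10-bit ADC reading from a potentiometer into a joint
# angle and checking it against end stops. The 270-degree travel and the
# per-joint limits are illustrative assumptions.

POT_TRAVEL_DEG = 270.0   # typical rotary potentiometer travel (assumption)
ADC_MAX = 1023           # Arduino Uno 10-bit ADC full scale

def adc_to_angle(reading):
    """Map an ADC reading (0..1023) onto the potentiometer's travel."""
    return reading / ADC_MAX * POT_TRAVEL_DEG

def within_limits(reading, min_deg, max_deg):
    """True if the joint is inside its allowed range, so the motor may run."""
    angle = adc_to_angle(reading)
    return min_deg <= angle <= max_deg

# Example: a hypothetical base joint limited to 40..230 degrees.
print(adc_to_angle(512))            # roughly mid-travel
print(within_limits(100, 40, 230))  # near an end stop, so the motor must stop
```

On the Arduino the same mapping is done with `analogRead()`, but the clamping logic is identical: the motor driving an axis is only powered while the reading stays inside that axis's allowed window.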


3.2.2 Camera Pi

The camera module is connected to the RPi using the Camera Serial Interface (CSI) port and is mounted on the robotic arm with a self-made 3D-printed case, see Fig. 3.3. The camera is stationary and has the same view during the rotation, with the consequence that the object has to be placed in the same position every cycle.

Figure 3.3. Close-up on how Camera Pi is attached

3.2.3 Other assemblies

The Arduino, the Adafruit Motor Shield and the RPi are mounted on a 3D-printed casing with screws and nuts. The final assembly can be seen in Fig. 3.4.

Figure 3.4. Housing for Arduino, Raspberry Pi, toggle switch and H-bridge


3.3 Software

The RPi analyzes the color of the object and, depending on which color the camera sees, sends either a 0 or a 1 to the Arduino's serial port. The Arduino uses these values to determine which DC motors should be powered and in which directions they should run. The software algorithm is explained in Fig. 3.5.

Figure 3.5. Flowchart for the demonstrator software algorithm

3.3.1 Robotic arm connected to Arduino

The software for controlling the robotic arm is written in the Arduino IDE and can be seen in Appendix D. The program contains different states depending on where in the process the arm is. After every complete cycle the arm returns to a standby position. The states are controlled based on the values from the potentiometers. The movement of the arm follows a predefined path that is hard coded. The target values were determined by manually controlling the motors: when the arm was in a desirable position, that value was programmed on the Arduino. This was done for all motors in all states. Because of this, the object needs to be placed in a specific position to be picked up properly.
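The state-based control described above can be sketched as follows. This is a Python sketch of the structure only (the real implementation is the Arduino code in Appendix D), and the state names and target potentiometer readings are hypothetical; the real targets were found by manually jogging the arm.

```python
# Sketch of the state machine idea: the arm steps through hard-coded states,
# and each movement state is left once every relevant potentiometer is close
# to its target reading. State names and target values are hypothetical.

STATES = ["standby", "move_to_object", "grip", "move_to_bin", "release"]

# Hypothetical target potentiometer readings (0..1023) per state and joint.
TARGETS = {
    "move_to_object": {"base": 300, "shoulder": 650},
    "move_to_bin":    {"base": 720, "shoulder": 500},
}

def state_done(state, readings, tolerance=10):
    """A movement state is finished when every joint is near its target.
    States without targets (e.g. timed claw actions) finish immediately."""
    targets = TARGETS.get(state, {})
    return all(abs(readings[j] - t) <= tolerance for j, t in targets.items())

def next_state(state):
    """Advance through the cycle and wrap back to standby."""
    i = STATES.index(state)
    return STATES[(i + 1) % len(STATES)]
```

The wrap-around in `next_state` mirrors the thesis's behavior of returning to a standby position after every complete cycle.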


3.3.2 Robotic claw

The claw is controlled with PWM signals on digital pin 3 of the Arduino. The claw is programmed to open and close at specific times during the arm's movement. This means that there is no sensor that tells us whether the claw is actually holding something. The PWM signal is set below its middle value to open the claw, and above it to close the claw.
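The open/close convention above can be illustrated in a small sketch. On the Arduino, `analogWrite()` takes a duty value in 0..255; the offsets used here are illustrative assumptions, as the thesis does not state the exact duty values.

```python
# Sketch of the claw's PWM convention: duty below the middle value opens the
# claw, duty above it closes the claw. The +/-60 offsets are assumed values
# for illustration only.

PWM_MAX = 255
PWM_MIDDLE = PWM_MAX // 2  # 127, the "no net movement" point

def claw_duty(action):
    """Return a PWM duty value (0..255) for the claw motor."""
    if action == "open":
        return PWM_MIDDLE - 60   # below middle -> open (assumed offset)
    if action == "close":
        return PWM_MIDDLE + 60   # above middle -> close (assumed offset)
    return PWM_MIDDLE            # hold position
```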

3.3.3 Camera Pi

The software for the camera is written in Python using the PiCamera library [PiCamera, 2016]. The camera takes a still picture every second, which is analyzed by the color identification part. The resolution is lowered to 640x480 pixels so that the RPi can work faster.

3.3.4 Color identification

The color identification process uses an open source library named OpenCV [Itseez, 2016], which is accessed through SimpleCV [SightMachine, 2016]. Both are installed and compiled on the RPi, with Python 2.7. The software uses the picture taken by the Camera Pi and analyzes the center pixel of the image. The RGB (Red, Green, Blue) color channel values of that pixel are analyzed and, depending on the red value, either a "0" or a "1" is sent from the serial port of the RPi. This means that the sorted object needs to be in the center of the image. The RGB value analyzed from the picture is dependent on how much light the camera receives. A consequence of this is that the robot may need to be calibrated when moved to a different location where the external lighting is different.
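The decision just described can be sketched without the SimpleCV dependency. The image is modeled here as a nested list of (R, G, B) tuples, and the red threshold of 150 is an assumed value; the thesis does not state the exact cut-off used.

```python
# Minimal sketch of the center-pixel decision: read the RGB value at the
# center of the image and emit "1" or "0" depending on the red channel.
# The threshold is an assumption for illustration.

RED_THRESHOLD = 150  # assumed cut-off, not from the thesis

def center_pixel(image):
    """Return the (R, G, B) tuple at the center of a row-major image."""
    return image[len(image) // 2][len(image[0]) // 2]

def classify(image):
    """'1' if the center pixel looks red enough, else '0' (sent over serial)."""
    r, g, b = center_pixel(image)
    return "1" if r >= RED_THRESHOLD else "0"

# A tiny 3x3 test image with a red center pixel:
img = [[(0, 0, 0)] * 3 for _ in range(3)]
img[1][1] = (200, 40, 40)
print(classify(img))  # a red center pixel -> "1"
```

In the real system the single character is then written to the RPi's serial port, and the Arduino reads it to choose the sorting direction.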

3.3.5 Connect Raspberry Pi and Arduino

The Arduino and the RPi are connected via the General-Purpose Input/Output (GPIO) pins on the Raspberry Pi and the serial pins on the Arduino. Two different methods were considered for connecting the two devices: the Serial Peripheral Interface (SPI) or the Inter-Integrated Circuit (I2C) interface. SPI was chosen because it is easy to set up and operates faster. [Harrison, 2016]

The communication is one-way because of the difference in voltage levels between the two devices: the RPi's serial pins operate at 3.3 volts while the Arduino's operate at 5 volts. Only the RPi sends data to the Arduino, because if the Arduino sent data to the RPi, the RPi could be damaged. The serial port called "ttyAMA0" is used to send data from the color identification program, which means all other communication on that specific port has to be disabled. By default, the RPi uses this port to communicate general information that could be useful in other applications, but not in this case, so it needs to be disabled for the method to work. This is done in the "/root/" and "/boot/" directories on the RPi, where all associations with this port have been disabled or removed. [Liang, 2016] This leaves the RPi to communicate on this port only from the color identification program.

3.4 Electronics

This section describes how all components, including the DC motors, potentiometers, H-bridge, micro-controller and mini-computer, are connected.

3.4.1 Robotic arm

The potentiometers are soldered to the analog input ports of the Arduino Uno board, and the four DC motors are connected to the Adafruit Motor Shield motor inputs, see Fig. 3.6. The motor shield also needs an external power source to run the motors; a 5 volt power source is chosen.

Figure 3.6. Arduino block diagram and wiring

3.4.2 Robotic claw

The PWM signals are sent on digital pin 3 of the Arduino. This pin is connected to an H-bridge that we built ourselves from a schematic distributed during the introduction of this project, see Appendix A [Grimheden, 2016]. The outputs from the H-bridge are soldered to a toggle switch so that the power provided to the claw can be turned off. This is needed when re-uploading the program to the Arduino: the PWM signal is then reset, which makes the DC motor run uncontrollably and could destroy the claw. The schematic can be found in Fig 3.7.


Figure 3.7. Toggle switch block diagram and wiring

3.4.3 Connect Raspberry Pi and Arduino

Since the RPi's serial pin operates at 3.3 volts and the Arduino's at 5 volts, a voltage divider, which is basically two resistors, must be used. [Liang, 2016] The schematic can be found in Fig 3.8.

Figure 3.8. Arduino and Raspberry Pi serial connection schematic
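The voltage divider's behavior follows directly from the two-resistor formula V_out = V_in · R2 / (R1 + R2). The resistor values below are one common choice that lands near 3.3 V; the thesis does not specify which values were actually used.

```python
# The divider drops the Arduino's 5 V TX level to about 3.3 V for the RPi's
# RX pin: V_out = V_in * R2 / (R1 + R2). The 10k/20k pair is an assumed,
# commonly used choice, not necessarily the one in the thesis.

def divider_out(v_in, r1, r2):
    """Output voltage of a two-resistor divider, measured across R2."""
    return v_in * r2 / (r1 + r2)

v_out = divider_out(5.0, r1=10_000, r2=20_000)
print(round(v_out, 2))  # 3.33 V, safe for the RPi's 3.3 V serial input
```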


Chapter 4

Results

This chapter presents the results gathered from testing the robotic arm.

4.1 Robot Arm

An experiment was made in which one movement cycle of the arm was observed. One cycle includes movement from standby, picking up the object, transporting the object, dropping the object and returning to standby. The measurements obtained from the experiments can be seen in Tab. 4.1.

Table 4.1. Time required for each cycle

Cycle no.   Time (s)
1           15.30
2           15.19
3           15.36
4           15.38
5           15.31
6           15.65
7           15.56
8           15.13
Avg.        15.36
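The reported average can be reproduced directly from the eight cycle times in Tab. 4.1:

```python
# Reproducing the average cycle time from the eight measurements in Tab. 4.1.

cycle_times = [15.30, 15.19, 15.36, 15.38, 15.31, 15.65, 15.56, 15.13]

average = sum(cycle_times) / len(cycle_times)
print(round(average, 2))  # 15.36 s, matching the reported average
```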

4.2 Color Identification

Several tests were made with the Camera Pi: 16 randomly selected colors were photographed by the camera and run through the color identification program to analyze their RGB values. All tested colors can be found in Appendix C. The RGB values output by the camera and color identification, as well as the input values provided from the test sheet, can be found in Fig. 4.1.


Figure 4.1. Difference in RGB values captured by Camera Pi.

All results from the tests are gathered in Tab. 4.2.


Table 4.2. All results from RGB tests

RGB (In)        RGB (Out)
255 255 255     180 148 133
0 0 0           55 59 58
255 0 0         195 88 106
0 255 0         215 214 0
0 0 255         166 96 71
255 255 0       222 226 115
0 255 255       211 207 195
255 0 255       239 139 193
125 125 125     179 146 137
200 50 50       204 104 128
200 120 30      203 152 149
40 100 70       155 145 58
200 35 125      216 118 165
150 30 30       180 86 111
255 175 255     231 178 184
50 50 100       148 89 89
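The observation that the R channel errs the most can be checked directly from the (in, out) pairs of Tab. 4.2, here using the mean absolute error per channel:

```python
# Mean absolute error per color channel, computed from the (in, out) pairs
# of Tab. 4.2, to check the observation that the R channel errs the most.

pairs = [
    ((255, 255, 255), (180, 148, 133)), ((0, 0, 0), (55, 59, 58)),
    ((255, 0, 0), (195, 88, 106)),      ((0, 255, 0), (215, 214, 0)),
    ((0, 0, 255), (166, 96, 71)),       ((255, 255, 0), (222, 226, 115)),
    ((0, 255, 255), (211, 207, 195)),   ((255, 0, 255), (239, 139, 193)),
    ((125, 125, 125), (179, 146, 137)), ((200, 50, 50), (204, 104, 128)),
    ((200, 120, 30), (203, 152, 149)),  ((40, 100, 70), (155, 145, 58)),
    ((200, 35, 125), (216, 118, 165)),  ((150, 30, 30), (180, 86, 111)),
    ((255, 175, 255), (231, 178, 184)), ((50, 50, 100), (148, 89, 89)),
]

mae = {}
for i, channel in enumerate("RGB"):
    errors = [abs(inp[i] - out[i]) for inp, out in pairs]
    mae[channel] = sum(errors) / len(pairs)

print({c: round(v, 1) for c, v in mae.items()})
# The R channel shows the largest mean absolute error of the three.
```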


Chapter 5

Discussion and conclusions

This chapter discusses and summarizes the results presented in the previous chapter. The summary is based on an analysis of the results and aims to answer the research questions.

5.1 Discussion

The robot can pick up an object and sort it according to color, which makes it possible for us to evaluate our research questions. However, we encountered many problems along the way, and our solution can be improved in many ways. For more recommendations and future work, see chapter 6.

5.1.1 Different motors

The main factor that could improve the project is changing the type of motor used by the robot arm. Using DC motors makes it difficult to control the robot: the potentiometers make it possible to control it in a limited fashion, but not well enough compared to other solutions. Servo motors, on the other hand, can control the positions of the motors. With the feedback that a servo motor provides, it becomes easier to pick up objects independent of their exact location, whereas using potentiometers limits us to picking up objects at a predefined position. Servo motors would also prevent the motors from running uncontrollably and possibly breaking, something we had to spend a lot of time on during this project. However, whether this would reduce how much of the predefined path needs to be hard coded is not something we have evaluated. Another problem we encountered with the motors was the high-pitch sound they generated while running on PWM signals. This was solved in the Arduino code by changing the frequency at which the PWM signal operates. A disturbing sound we could not fix was the creaking from the plastic gearboxes included with the DC motors. This could be fixed if another motor type were chosen, since another gearbox with higher strength and stability could then be used.


5.1.2 Raspberry Pi

We decided to use the Raspberry Pi Model A+ for this project, a choice motivated by the low price and the fact that the hardware could do all image processing needed for the project. A consequence of this choice was the additional time spent on installing the required libraries and software on the RPi. We had to buy a USB hub and an Ethernet-to-USB adapter, which resulted in a more expensive device than if we had, for example, bought the latest RPi model. Advantages of using the older model were its lower power usage, weight and size. This choice of mini-computer could have been evaluated more carefully, to prevent unnecessary time consumption. For reference, see the specifications of all Raspberry Pi models in Appendix B.

5.1.3 Camera Pi

The Camera Pi has great potential, which is not fully used in this project. The resolution is lowered, but the quality is still satisfactory. The camera is capable of detecting different shapes, but this is not included in this project; it is something one could continue to work with in another project. The connection between the camera module and the mini-computer is very simple and was not something we had to spend extra work on. The module is also very simple to mount because of the screw holes on the camera.

5.1.4 Color Identification

The color identification is very sensitive to external lighting, because it is purely based on the R value of the RGB value obtained from the image. The camera sends a picture to the RPi, which is not an ideal solution since the R value varies depending on e.g. the shutter speed. This means that the camera might not see the object's "real" R value and sort it incorrectly. One other solution could be to use the Hue, Saturation, Brightness (HSB) color space instead of basing the sorting on RGB values. This requires more research and is mentioned in chapter 6. Because our sorting algorithm only depends on the R value, a color recognized as red could in fact be magenta or yellow, depending on the G and B values of the image, which is another argument for changing the color identification method.
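The HSB alternative can be illustrated with Python's standard colorsys module (which calls the color space HSV; the V component corresponds to brightness). Two reds that differ only in lighting intensity share the same hue and differ only in brightness, which is the property that would make the sorting more robust:

```python
# Python's standard colorsys module provides the RGB -> HSV (HSB) conversion
# discussed above. The value/brightness component separates lighting
# intensity from hue.

import colorsys

def rgb_to_hsb(r, g, b):
    """Convert 8-bit RGB to (hue, saturation, brightness), each in 0..1."""
    return colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)

bright_red = rgb_to_hsb(255, 0, 0)
dark_red = rgb_to_hsb(128, 0, 0)

# Both reds share hue 0.0; only the brightness component differs.
print(bright_red, dark_red)
```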

5.1.5 Results

The final results of the project were in line with what we expected; the demonstrator could answer our questions in a satisfactory fashion. With the hardware provided we could not optimize the sorting path further, but the time of one sorting cycle is not fast enough for this project to be relevant for the market. Therefore, we would recommend continuing to work on the speed of the motors and toughening up the robot so that it can perform more precise sorting at a faster cycle speed. All colors in the color identification experiment were randomly chosen. One conclusion is that the camera is very sensitive in its red channel. For example, when the


input was 0, the system responded with an R value of 215. Calibrating this camera is not a difficult task, but it is something we have not spent time on. Depending on lighting, weather and surroundings, we had to re-calibrate the input channels each time to attain decent results. Even though this is a tiring task, it is still possible with our chosen method. We believe that if the robot were installed in a specific environment, this problem could be solved using different color channel modifications and changes to the camera's brightness and contrast. This is possible with the Camera Pi and RPi. To be satisfied with the results for our last research question, we would like to do more tests and work with another color identification method; this is discussed further in the next chapter.

5.2 Conclusions

Many time-consuming tasks throughout the project could have been avoided if we had invested in proper hardware from the beginning, which would also have given better results and more precise sorting. This would, however, have increased the cost of the project, which is a restriction we cannot overlook. With that in mind, we are happy with the results and satisfied with our demonstrator and the way it could answer our research questions.


Chapter 6

Recommendations and future work

This chapter provides recommendations for more detailed solutions and future work.

6.1 Recommendations

First and foremost, our recommendation for attaining better results is to use a different RPi model. The time spent on building workarounds for the old model could have been spent on improving several parts of the project. As mentioned in the discussion, changing the motor type would be beneficial for more precise movement. We would recommend servo motors, because they have a controllable built-in position parameter and often include a gearbox.

Another recommendation is to reconsider the color identification method based on RGB values. The lighting made it hard for the camera to identify the given RGB value. A better method that could be implemented in the project is the HSB color space. HSB categorizes a color by three different parameters, in comparison with RGB, which only categorizes the color channels. The advantage of this model is that one of the parameters is brightness, which could solve some of the problems in this project. [Tech, 2016]

6.2 Future work

One thing that could be improved further is the predefined path the robotic arm moves in. It could be optimized using linear interpolation, which is not done in its current state. The robot arm is adjusted manually, so it may not be moving along the fastest path. Sorting objects based on both color and shape is an easy next step for this project. The camera is capable of identifying several colors and shapes at the same time, which makes it possible to distinguish objects that contain more than one color but still have different shapes. Using SimpleCV with the OpenCV library, we can modify the color identification program so that the image is processed into black and white with the Binarize function. The next step is to use a method based on the Hough Transform

to define the contour of the object and then check whether it is a specific shape. These functions already exist in the SimpleCV library. [OpenElectronics, 2016] Through our research we have encountered several examples of robotic arms used in industry today. The latest robots often have six axes of movement [Andersson, 2016], enabling faster and more precise sorting. This is something one could look at when modifying our project idea. Another idea is to make the robot mobile, which broadens the range of applications. The camera and identification part used in this project could be modified so that it also detects where the object is placed and picks it up accordingly. A more integrated design could also be developed, making the robot more desirable on the market.
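The binarization step mentioned above can be sketched in pure Python. SimpleCV's Binarize does this on real images; here a grayscale image is modeled as a nested list of 0..255 values, and the threshold of 128 mirrors a common default and is an assumption.

```python
# Minimal sketch of binarization: map every pixel to white or black around a
# threshold, the preprocessing step before contour/shape detection. The
# default threshold of 128 is an assumed value.

def binarize(gray_image, threshold=128):
    """Map every pixel to 255 (white) or 0 (black) around a threshold."""
    return [[255 if px >= threshold else 0 for px in row]
            for row in gray_image]

gray = [[10, 200, 90],
        [130, 128, 127],
        [255, 0, 64]]
print(binarize(gray))  # [[0, 255, 0], [255, 255, 0], [255, 0, 0]]
```

A contour detector such as the Hough Transform would then operate on the resulting black-and-white image to decide whether the object has the expected shape.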


Bibliography

[Adafruit, 2016] Adafruit (2016). Overview | adafruit motor shield v2 for arduino. Available from: https://learn.adafruit.com/adafruit-motor-shield-v2-for-arduino/overview [cited 2016-04-15].

[Andersson, 2016] Andersson, J.-E. (2016). Slutrapport robotteknik i sagverk.pdf. Available from: http://www.ltu.se/cms_fs/1.82945!/file/SlutrapportRobotteknikisagverk.pdf [cited 2016-05-08].

[Arduino, 2016] Arduino (2016). What is arduino? Available from: https://www.arduino.cc/en/Guide/Introduction [cited 2016-04-12].

[ArduinoMotorShield, 2016] ArduinoMotorShield (2016). Arduino - arduino motor shield. Available from: https://www.arduino.cc/en/Main/ArduinoMotorShieldR3 [cited 2016-04-12].

[Elfa, 2016] Elfa (2016). owi 535 manual. Available from: https://www.elfa.se/Web/Downloads/_d/ts/c-9895_1_eng_dts.pdf?mime=application.pdf [cited 2016-05-02].

[Grimheden, 2016] Grimheden, M. E. (2016). Fim 2016 - experiment 3. Available from: https://www.kth.se/social/files/56a89c13f2765417dfd6fe1c/Experiment3.pdf [cited 2016-04-28].

[Harrison, 2016] Harrison, M. (2016). Tradeoffs when considering spi or i2c. Available from: http://electronics.stackexchange.com/questions/29037/tradeoffs-when-considering-spi-or-i2c [cited 2016-05-02].

[Instructables, 2016] Instructables (2016). Robotarm controlled with arduino. Available from: http://www.instructables.com/id/Intro-and-what-youll-need/step3/Load-the-Arduino-code/ [cited 2016-05-02].

[Itseez, 2016] Itseez (2016). Opencv. Available from: http://opencv.org/ [cited 2016-04-15].

[Liang, 2016] Liang, O. (2016). Raspberry pi and arduino connected over serial gpio - oscar liang. Available from: https://oscarliang.com/raspberry-pi-and-arduino-connected-serial-gpio/ [cited 2016-04-29].

[OpenElectronics, 2016] OpenElectronics (2016). Computer vision with raspberry pi and the camera pi module. Available from: http://www.open-electronics.org/computer-vision-with-raspberry-pi-and-the-camera-pi-module/ [cited 2016-04-27].

[OWIRobots, 2016] OWIRobots (2016). Owi-535 robotic arm edge. Available from: http://www.owirobots.com/store/catalog/robotic-arm-and-accessories/owi-535-robotic-arm-edge-kit-110.html [cited 2016-04-12].

[PiCamera, 2016] PiCamera (2016). Python picamera - raspberry pi documentation. Available from: https://www.raspberrypi.org/documentation/usage/camera/python/README.md [cited 2016-05-08].

[raspberrypi, 2016] raspberrypi (2016). Basic image processing. Available from: https://www.cl.cam.ac.uk/projects/raspberrypi/tutorials/robot/image_processing/ [cited 2016-05-02].

[RaspberryPi, 2016] RaspberryPi (2016). Raspberry pi. Available from: https://www.raspberrypi.org/help/what-is-a-raspberry-pi/ [cited 2016-04-12].

[RaspberryPiFoundation, 2016] RaspberryPiFoundation (2016). Camera module - raspberry pi. Available from: https://www.raspberrypi.org/products/camera-module/ [cited 2016-04-12].

[RaspbianProject, 2016] RaspbianProject (2016). Frontpage - raspbian. Available from: https://www.raspbian.org/FrontPage [cited 2016-04-28].

[SightMachine, 2016] SightMachine (2016). Simplecv. Available from: http://simplecv.org/ [cited 2016-04-20].

[Tech, 2016] Tech, N. M. (2016). Introduction to color theory. Available from: http://infohost.nmt.edu/tcc/help/pubs/colortheory/web/hsv.html [cited 2016-05-04].


Appendix A

Breadboard layout for H-bridge


Appendix B

Specifications of Raspberry Pi models


Appendix C

All colors tested with Camera Pi

(255,255,255) (0,0,0) (255,0,0) (0,255,0)

(0,0,255) (255,255,0) (0,255,255) (255,0,255)

(125,125,125) (200,50,50) (200,120,30) (40,100,70)

(200,35,125) (150,30,30) (255,175,255) (50,50,100)
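Since the report notes that the measured RGB value can deviate from the true one (often most in the R channel), a common way to make sorting robust to such errors is to classify the measured value against a set of reference colors by nearest distance in RGB space. The sketch below is illustrative rather than the thesis's actual algorithm; the label names and the reference subset (drawn from the tuples above) are assumptions:

```python
# A subset of the reference colors tested in Appendix C, with assumed labels.
REFERENCE = {
    "white": (255, 255, 255),
    "black": (0, 0, 0),
    "red": (255, 0, 0),
    "green": (0, 255, 0),
    "blue": (0, 0, 255),
    "yellow": (255, 255, 0),
}

def classify(rgb):
    """Return the label of the reference color closest to the measured
    RGB value, using squared Euclidean distance in RGB space."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(REFERENCE, key=lambda name: dist2(REFERENCE[name], rgb))
```

With this approach a measurement such as (200, 40, 30), whose R value is off by 55, is still matched to red, because red remains the closest reference color.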


Appendix D

Arduino Code for the Robot Arm



