Extraction and Application of Secondary Crease Information in Fingerprint Recognition Systems
Examensarbete LITH-ITN-MT-EX--05/007--S

Extraction and Application of Secondary Crease Information in Fingerprint Recognition Systems

Pontus Hymér
2005-03-09

Department of Science and Technology, Linköpings Universitet, SE-601 74 Norrköping, Sweden

LITH-ITN-MT-EX--05/007--S

Extraction and Application of Secondary Crease Information in Fingerprint Recognition Systems

Master's thesis (examensarbete) in Media Technology at Linköpings Tekniska Högskola, Campus Norrköping

Pontus Hymér
Supervisor: Henrik Storm
Examiner: Björn Kruse

Norrköping, 2005-03-09

Division, Department: Department of Science and Technology (Institutionen för teknik och naturvetenskap)
Date: 2005-03-09
Language: English
Report category: D-uppsats (Master's thesis)
ISRN: LITH-ITN-MT-EX--05/007--S
URL for electronic version: http://www.ep.liu.se/exjobb/itn/2005/mt/007/

Title: Extraction and Application of Secondary Crease Information in Fingerprint Recognition Systems
Author: Pontus Hymér

Abstract: This thesis states that cracks and scars, referred to as Secondary Creases, in fingerprint images can be used as a means of aiding and complementing fingerprint recognition, especially in cases where there is not enough clear data to use traditional methods such as minutiae-based or correlation techniques. A Gabor filter bank is used to extract areas with linear patterns, after which the Hough Transform is used to identify secondary creases in an (r, θ) space. The methods proposed for Secondary Crease extraction work well, and provide information about which areas in an image contain usable linear patterns. The methods for comparison are, however, not as robust, and generate a False Rejection Rate of 30% and a False Acceptance Rate of 20% on the proposed dataset, which consists of bad-quality fingerprints. In short, our methods make it possible to use fingerprint images earlier considered unusable in fingerprint recognition systems.

Keywords: biometrics, fingerprint analysis, feature extraction, secondary creases, template aging, Gabor filter, pattern recognition, Hough transform, vector clustering

Copyright

The publishers will keep this document online on the Internet - or its possible replacement - for a considerable time from the date of publication barring exceptional circumstances. The online availability of the document implies a permanent permission for anyone to read, to download, to print out single copies for your own use and to use it unchanged for any non-commercial research and educational purpose. Subsequent transfers of copyright cannot revoke this permission. All other uses of the document are conditional on the consent of the copyright owner. The publisher has taken technical and administrative measures to assure authenticity, security and accessibility. According to intellectual property law, the author has the right to be mentioned when his/her work is accessed as described above and to be protected against infringement.

For additional information about the Linköping University Electronic Press and its procedures for publication and for assurance of document integrity, please refer to its WWW home page: http://www.ep.liu.se/

© Pontus Hymér

Abstract

This thesis states that cracks and scars, referred to as Secondary Creases, in fingerprint images can be used as a means of aiding and complementing fingerprint recognition, especially in cases where there is not enough clear data to use traditional methods such as minutiae-based or correlation techniques. A Gabor filter bank is used to extract areas with linear patterns, after which the Hough Transform is used to identify secondary creases in an (r, θ) space. The methods proposed for Secondary Crease extraction work well, and provide information about which areas in an image contain usable linear patterns. The methods for comparison are, however, not as robust, and generate a False Rejection Rate ≈ 30% and a False Acceptance Rate ≈ 20% on the proposed dataset, which consists of bad-quality fingerprints. In short, our methods make it possible to use fingerprint images earlier considered unusable in fingerprint recognition systems.

Keywords

biometrics, fingerprint analysis, feature extraction, secondary creases, template aging, Gabor filter, pattern recognition, Hough transform, vector clustering

Preface

This thesis and its goal have been formulated and initiated in cooperation with Fingerprint Cards AB, Göteborg. I would like to thank the team at Fingerprint Cards for valuable assistance during the work process, especially the algorithm group consisting of my project supervisor Henrik Storm and Eric Setterberg. Without their knowledge and solid experience in the area of fingerprint analysis, this thesis would have been far more difficult to finish. Last but not least, thanks go out to my opponent Karin Dahlberg for her valuable comments on the report, and to my examiner Björn Kruse at ITN. It has truly been an interesting experience.

Göteborg, 9 March 2005
Pontus Hymér

Contents

1 Introduction ... 3
  1.1 Reader Prerequisites ... 3
  1.2 Problem Description ... 3
  1.3 Goal ... 4
  1.4 Outline of report ... 5
  1.5 Methodology ... 5
  1.6 Tools ... 6

2 Biometry Overview ... 7
  2.1 The Principles of Biometry ... 7
  2.2 Fingerprint Identification ... 8
  2.3 Identification vs Verification ... 9
  2.4 Template Ageing ... 10

3 Methods ... 11
  3.1 System Architecture ... 11
  3.2 Preprocessing ... 11
    3.2.1 Normalization ... 12
  3.3 Segmentation ... 13
    3.3.1 Masking ... 13
    3.3.2 Secondary Crease Detection ... 14
  3.4 Feature Extraction ... 19
    3.4.1 R/θ Parameter Space ... 19
    3.4.2 The Hough Transform ... 20
  3.5 Feature Comparison ... 22
    3.5.1 Normalization ... 22
    3.5.2 Comparison Algorithm ... 22

4 Experiments and Results ... 25
  4.1 Dataset Description ... 25
  4.2 Results and Analysis ... 26
    4.2.1 Segmentation ... 26
    4.2.2 Secondary Crease Detection ... 28
    4.2.3 Feature Comparison ... 28

5 Discussion ... 32
  5.1 Conclusion ... 32
  5.2 Future Work ... 33

Chapter 1

Introduction

This chapter aims to present an overview of the thesis in terms of goals, limitations, environment and structure. The reader will be oriented in the overall purpose and in which parts of the report concern different matters.

1.1 Reader Prerequisites

To fully assimilate the information presented in this thesis report, the reader should be oriented in the areas of multivariable calculus, linear algebra and image analysis. Basic knowledge in the area of biometry in general, and fingerprint recognition in particular, is also preferred.

1.2 Problem Description

Human fingerprints comprise a series of valleys and ridges. The uniqueness of the pattern these whorls create is used in different ways to identify a user in fingerprint recognition systems. For some fingers, though, the pattern is corrupted due to sweat, dryness, scarred tissue etc., which causes interruptions in, and sometimes complete absence of, the linear pattern found in good-quality fingerprint images. These creases, below referred to as Secondary Creases or SC, today complicate the process of extracting distinct features from the fingerprint at hand using most studied techniques. In some fingerprints of really bad quality, these creases are the only distinguishable features. The main question this thesis aims to answer is thereby set: can these creases be used in a meaningful way?

This thesis is produced on location and in cooperation with Fingerprint Cards AB (below FPC) in Göteborg, Sweden. The company has since 1996 been developing systems for analyzing and matching unique fingerprint patterns to determine or verify a person's identity. The systems include microchips with algorithms which, without the help of external CPU power,

read, store and compare textural patterns in fingerprint images. The company has developed two types of capacitive sensors: the area sensor and the smaller line sensor [4]. The latter, being the latest model, has a much smaller sensing area and instead requires the user to swipe his/her finger over the sensor.

Figure 1.1: Examples of fingerprint quality. (a) Good quality print; (b) bad quality print.

1.3 Goal

We state that abnormalities in fingerprints, such as creases and scars, can be used for identification, since they are relatively stable over time, and may sometimes be the only visible unique features in a fingerprint image. The goal of this thesis is divided into two main parts:

1. Locate areas in fingerprint images containing Secondary Creases
2. Parametrize these creases, and develop and implement a method for comparison

A system for presenting solutions to these issues is designed.

Requirements

The system to be designed is required to:

• Handle high and poor quality images in the same environment without necessary manual intervention, such as parameter adjustment
• Be robust and trustworthy over large quantities of data
• Be easy to use as an educational overview of a Fingerprint Identification System

Delimitation

The system does not:

• Accommodate large rotations or translations in comparison, since this is a different problem already handled by the present core algorithm
• Handle input data from arbitrary image collection systems, since the algorithms are specifically designed for the FPC area sensor
• Consider computational efficiency and memory usage

1.4 Outline of report

This report is divided into five chapters:

1. Chapter 1 - Introduction gives an overview of this report and its parts, and is soon at its end
2. Chapter 2 - Biometry Overview introduces the reader to the subject of biometry and is a good start for a novice in that area
3. Chapter 3 - Methods describes the work done in evaluating and implementing different methods throughout the entire process
4. Chapter 4 - Experiments and Results presents the results derived for the chosen test data with the methods described in Chapter 3
5. Chapter 5 - Discussion concludes the thesis with a review and discussions about further work in the area

In general throughout the report, images, and often series of these, have been used to provide a descriptive environment for the methods investigated. More often than not in this project, drawings have proven to perform the descriptive task for the reader a lot better than lengthy theoretical expositions. This is clearly a report in imagery; therefore imagery should be the tool to describe it with.

1.5 Methodology

To design a system for identification/verification of fingerprints using information on SC, an overall system architecture was decided on early, as described in section 3.1 below. The different stages have then been investigated one by one, keeping them clearly separable with predefined in- and out-data in between (fig 3.1). Each stage, as seen below, has been the subject of a "search, adapt and evaluate" strategy to be able to cover many different

approaches, of which some have been determined to fit the purpose better than others with respect to the tools used.

1. Preprocessing - The fingerprint image is adjusted for optimal performance
2. Segmentation - Foreground and background are separated, and information about areas containing SC is extracted
3. Feature extraction - A parametric representation of the SC is produced
4. Feature comparison - Information on two different data sets is compared to verify a possible match

Criticism of methodology

The work has very much resembled a "trial-and-error" process, which is unfortunately inevitable in an area this sparse and multi-faceted. Many times the method has been time consuming, and we ended up with a lot of dead ends. A more theoretical background investigation could have been conducted, after which the results could have been implemented. We have, however, found the latter approach impossible due to the amount of details to take into consideration. The uncertainty in the possibilities of the tools used was also a deciding factor. The aim is further to aid future investigation in these areas of research, which is why a negative result is as valuable as any as long as it is well documented.

1.6 Tools

Software: MATLAB 6.5 is used in algorithm and GUI development, and has been provided by FPC on location. LaTeX and TeXnicCenter are used for report writing and have been downloaded as freeware [7].
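The four numbered stages above can be pictured as a skeleton pipeline. The following sketch is an illustrative outline only, written in Python rather than the thesis's MATLAB; every function name and body here is a placeholder, not the thesis's actual code:

```python
# Illustrative skeleton (placeholder names, not the thesis code) of the
# four-stage flow: preprocessing -> segmentation -> feature extraction
# -> feature comparison. Each stage is a stub standing in for the
# methods described in Chapter 3.

def preprocess(image):
    """Adjust the fingerprint image for optimal performance (stub)."""
    return image  # would return the normalized image, Iprep

def segment(image):
    """Separate foreground from background; flag SC areas (stub)."""
    return image, {"sc_areas": []}  # would return Imask and SC info

def extract_features(image, sc_info):
    """Produce a parametric (r, theta) representation of the SC (stub)."""
    return [(0.0, 0.0)]  # placeholder vector data

def compare(template, candidate):
    """Compare two feature sets; True signals a possible match (stub)."""
    return template == candidate

def run_pipeline(image):
    prep = preprocess(image)
    seg, sc_info = segment(prep)
    return extract_features(seg, sc_info)

template = run_pipeline([[0, 1], [1, 0]])    # made-up 2x2 "image"
candidate = run_pipeline([[0, 1], [1, 0]])
print(compare(template, candidate))          # prints: True
```

The point of the structure is the one stated in section 1.5: each stage has predefined in- and out-data, so any single stage can be swapped out without touching the rest.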

Chapter 2

Biometry Overview

The main objective of this chapter is to introduce the reader to biometry in general and fingerprint identification in particular. Some terminology is presented and the history of fingerprint technology is briefly mentioned.

2.1 The Principles of Biometry

Biometry is the science of verifying the identity of a specific person using information on the unique physiological or behavioral traits of that person. Examples of physiological traits include the iris, fingerprints, palm prints or the shape of a person's head, while examples of behavioral traits include vocal and pen signatures, keyboard stroke and walking dynamics. The accuracy of systems using these traits for authentication is evaluated by some widely acknowledged factors, described below. Since the measures are all dependent on each other, the numbers should only be used as approximate references while reading this report.

False Rejection Rate, FRR: The number of times a "valid person" is rejected (false negative) compared to the total number of tests. This factor reveals the algorithm's efficiency in rather clear numbers, with typical values of 5-10%, and is usually the value compared between algorithms.

False Acceptance Rate, FAR: The number of times an impostor is accepted (false positive) compared to the total number of tests. FAR is often preset to a value, depending on the wanted security level of the system. High security settings mean a low FAR, while FRR in that case often goes up. A typical high security setting would imply FAR = 1/10 000.

Failure To Enroll, FTE: The number of individuals rejected enrollment in the system compared to the total number of tests. This is a number infected with discussions among suppliers of fingerprint verification algorithms. Many

claim their systems perform with an FTE < 0.1%, resulting in an untold increase in FRR later on. This is because prints of bad quality are allowed to enter the system as templates, resulting in problems when matching the same finger at a later time instance. In reality, and with respect to the entire population, the number should be around 2-5%.

2.2 Fingerprint Identification

Fingerprints, as a biometric measure, are, as mentioned before, flow-like ridges present on the human fingers. It is the pattern created by these ridges and valleys that makes a fingerprint unique. Fingerprints have a long history of use in police forensic science, and the fingerprint is to this date the most convenient biometric element with which to identify a person. Other biometric technologies, such as the ones mentioned above, are not as mature and are considered intrusive with high implementation costs, making them impractical for widespread use [1].

To reduce processing time, a fingerprint's features are usually divided into three scales of detail. Of these three levels, classification is the first step in identifying a fingerprint. The system used today for classification is the same one introduced in 1899 by a British policeman, Sir Edward Richard Henry, hence the name: the Henry Classification System [2]. The system classifies a print into five different categories of global structure of the ridges and valleys: Whorl, Left loop, Arch, Tented arch and Right loop, some of them presented in fig 2.1. The system remains intact today, with extensions including additional types such as Double loop, Central pocket loop and Accidental.

Figure 2.1: Examples of fingerprint classes. (a) Whorl; (b) Left loop; (c) Arch.

The second level of detail comprises information about discontinuities in the ridge pattern, often called minutiae points. These points are defined by location and direction in a fingerprint, and minutiae are the most widely spread means of identification today.
In the system studied at FPC, these minutiae points are typically used, but not exclusively, to define "Distinct Areas" in a fingerprint [4]. The third level of detail includes fine features in the fingerprint, such as ridge and valley width, and sweat pores present in the imprint of the finger. Analysis at this level requires high-quality images and is often used as a

complement to the normal feature extraction at higher levels [3].

An AFIS¹, where all the above mentioned aspects are handled, is an automatic pattern recognition system which basically consists of five main stages:

1. Data acquisition, where fingerprint data is collected
2. Image enhancement, where the data is preprocessed
3. Feature extraction, where information about unique areas in the fingerprint is extracted
4. Feature comparison, where information from at least two fingerprints is compared
5. Decision, where a match is confirmed/rejected

Figure 2.2: A model of a generic Automatic Fingerprint Identification System (data acquisition/enrollment → image enhancement → feature extraction → feature comparison against stored templates → match/no-match decision).

2.3 Identification vs Verification

The last step in the process depicted above typically results in a score used to decide whether the fingerprints match. The matching process is conducted under different conditions depending on which of the two authentication tasks described below is requested from the system.

Identification: Fingerprint identification refers to the process of matching a query fingerprint against a database of information to determine the identity of the individual, i.e. the system asks "Who is he?". This is the task performed by the police at a crime scene, comparing fingerprints from the crime scene with their database of fingerprints from criminals. It is also used in door entry systems, where the system decides whether the user belongs to an allowed group or not.

¹ Automated Fingerprint Identification System

Verification: The person at hand tells the system who he is with the help of e.g. a code or access card, and the task for the system is then, with the help of an image of the person's fingerprint, to establish whether the person is the one he/she claims to be. The relationship is therefore a 1:1 match, and much less time consuming than the former type.

2.4 Template Ageing

It is widely acknowledged that most of our biometric traits vary slightly with time, and so do our fingerprints. The general underlying pattern stays the same, but on a smaller scale valleys will vary in width, pores will be more or less visible, and the wetness of the skin will vary from time to time. Even SC will vary, but empirical evaluations on the proposed dataset have shown that, compared to the fine patterns in a fingerprint, SC are more stable over time and can more easily be distinguished from one time to another.

For at least 2-5% of the population, the underlying pattern in the fingerprint is of such poor quality that it cannot be used in an ordinary AFIS [6]. This fraction is probably even higher when the population consists of (i) older people; (ii) people exposing their fingertips to extreme conditions, such as the heavy-duty manual work found in workshops and manufacturing engineering; (iii) people living in dry weather conditions or having skin problems; (iv) people with poor fingerprints due to their genetic or racial attributes [11]. For these people, the use of SC may actually be the only way to enable the use of fingerprint authentication.
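As an aside, the error rates defined in section 2.1 follow directly from counting decisions against a score threshold. The following is an independent illustrative sketch (Python, not the thesis's MATLAB code; all names and example scores are made up):

```python
# Illustrative sketch (not from the thesis): computing FRR and FAR
# from match scores at a given decision threshold. Scores above the
# threshold are treated as "match".

def error_rates(genuine_scores, impostor_scores, threshold):
    """Return (FRR, FAR) for a score threshold.

    genuine_scores  -- scores from comparing a finger with itself
    impostor_scores -- scores from comparing different fingers
    """
    false_rejects = sum(1 for s in genuine_scores if s <= threshold)
    false_accepts = sum(1 for s in impostor_scores if s > threshold)
    frr = false_rejects / len(genuine_scores)
    far = false_accepts / len(impostor_scores)
    return frr, far

genuine = [0.9, 0.8, 0.4, 0.95, 0.7]    # made-up example scores
impostor = [0.1, 0.3, 0.6, 0.2, 0.05]
frr, far = error_rates(genuine, impostor, 0.5)
print(frr, far)  # prints: 0.2 0.2
```

The coupling noted in section 2.1 is visible here: raising the threshold can only decrease FAR while increasing (or keeping) FRR, which is why a single number in isolation says little about an algorithm.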

Chapter 3

Methods

This chapter aims to describe the methods in the designed system algorithm, and by what means different issues have been solved. Many different algorithms have been investigated, and some may have proven inadequate for the task at hand. These are all described in theory below, with the emphasis on the methods used for the final optimal performance. The reader will first obtain an overview of the system, after which detailed descriptions of the different important methods are presented for analytical purposes.

3.1 System Architecture

The architecture seen in figure 3.1 was established at an early stage. Its structure was based on experiences from existing systems and on a theoretical background on possible areas of difficulty. For most of the stages below, multiple methods have been investigated, later to be summarized into an optimal design, described last in each stage. For this sake, the stages have been clearly separated with predecided entities in between (I, Iprep, Imask, Ihough), to be able to replace certain steps in the process without affecting the rest of the system. After the first cycle of design, the system was restructured and optimized for best possible performance.

3.2 Preprocessing

Fingerprint images are rarely of perfect quality. They may be degraded and corrupted due to image noise, impression conditions and variations in the skin. Thus, image enhancement techniques must be used prior to feature extraction.

Figure 3.1: An overview of the system algorithm. (a) The extraction process: the sensor data I is preprocessed (normalization of data), segmented (background masking with the waterfall technique, followed by Secondary Crease detection via Gabor filtering, linear symmetry and a thresholding algorithm, with a morphological process producing the image mask), and passed to feature extraction (Hough transform of the skeleton image Ihough, extraction of the n maximum values, filtering of candidates), yielding vector data [r, θ]. (b) The comparison process: template and candidate [r, θ] data are normalized using standard deviation and mean values; a vector space is created from all template vectors to all candidate vectors; a nearest-neighbour algorithm awards points to vectors pointing in the same direction (a possible common translation/rotation); a match is declared at more than m points.

3.2.1 Normalization

The first step in the preprocessing stage is to normalize the fingerprint image so that its values span a prespecified range. This results in a maximum span of the greyscale variation in the image, spreading the histogram of the image across the entire spectrum. This is done by analyzing the minimum and maximum values of the image:

Iprep(x, y) = (I(x, y) − min(I)) / (max(I) − min(I))    (3.1)

with I being the input image pixel values and Iprep ∈ [0, 1].
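The normalization of eq 3.1 can be sketched in a few lines. This is an independent illustration in Python (the thesis used MATLAB); the image is represented as a plain list of rows:

```python
# Illustrative sketch (not the thesis code): min-max normalization of a
# greyscale image, as in eq 3.1. Stretches the histogram so the output
# spans [0, 1].

def normalize(image):
    """Rescale pixel values linearly so the result spans [0, 1]."""
    lo = min(min(row) for row in image)
    hi = max(max(row) for row in image)
    span = hi - lo
    if span == 0:  # flat image: nothing to stretch
        return [[0.0 for _ in row] for row in image]
    return [[(p - lo) / span for p in row] for row in image]

img = [[50, 100], [150, 200]]    # made-up 2x2 greyscale patch
print(normalize(img))            # values linearly stretched to [0, 1]
```

Note that the darkest and brightest pixels alone determine the stretch, so a single outlier pixel can dominate; eq 3.1 has the same property.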

Figure 3.2: Before and after normalization. (a) Original image; (b) original histogram; (c) normalized image; (d) normalized histogram.

3.3 Segmentation

The segmentation process consists of two main issues: masking of background information, and Secondary Crease detection. Each of these consists of submethods, which are described below.

3.3.1 Masking

Before any analysis of the fingerprint image can be initiated, it needs to be decided which parts of the image correspond to the ridges and valleys, i.e. the areas of the finger which have been in contact with the sensor. The result from this stage can be used as an area index, AI, by measuring how much of the sensor area has been rated as foreground information. An image is considered high in quality and usability with a high AI value. AI = 1 indicates an image that is recognized as all foreground information, which is highly improbable.

AI = Σ_F Iprep(x, y) / Σ_Ω Iprep(x, y),   AI ∈ [0, 1]    (3.2)

with F being the pixel positions where foreground information has been extracted, and Ω indicating the whole set of image pixels. If too much of the image data is rated as background/useless information, the print can be deemed useless.

Window Variance: This method simply thresholds the original image at a calculated threshold value, based on the variance in local regions of the image:

Var = Σ_w (I − Ī)²,   w ∈ Ω    (3.3)

where w describes the pixel neighbourhood over which to sum the quadratic errors. The variance is then used as a threshold to decide whether the region

w is foreground or background. High variance indicates varying patterns, e.g. ridges and valleys, while low values indicate homogeneous background data. High-quality images result in very correct background masking, but the method quickly becomes useless when image quality decreases. The method was rejected due to a too coarse scale in the resulting mask, thus generating even more pattern-like areas for the system to misinterpret as fingerprint pattern. The method also lacked the possibility to allow lighter areas within a fingerprint, which can be the case with large SC. Finally, the method is very time consuming. These disadvantages made us discard the method, in search of a more appropriate one.

"Waterfall variance": The method used is a further developed variance-based method in the spatial domain, where one could imagine dripping a drop of water into each pixel column, to see where it hits a pixel with a neighbourhood variance over a certain adaptive threshold. This routine is performed on each column from above, with some boundary conditions, after which the same is applied from below. The mask, Imask, is then filtered with a mean value filter to soften sharp edges, and finally applied to the image according to eq 3.4, with the mean value representing the background. This way, the transition from background to foreground information will not be misinterpreted as edges. The resulting image is passed on to the next stage, and Imask is stored in memory for use further along in the process.

Imasked = [1 − (1 − Imask) ∗ (1 − Iprep)] · Iprep    (3.4)

Figure 3.3: Overview of the image masking process. (a) Preprocessed image, Iprep; (b) extracted mask, Imask; (c) masked image, Imasked.

3.3.2 Secondary Crease Detection

This stage has the task of accentuating visible edges in the image derived in segmentation, and of filtering out what may be interesting edges for the purpose, since the regular fingerprint pattern will also respond to the

edge-enhancing methods. This stage has without doubt been the greatest challenge of the thesis. Methods proposed for this purpose are widely investigated, and most methods we came across turned out to produce a slightly different output than what we desired. An important task is to establish what distinguishes an SC from an ordinary valley in the fingerprint image. Early studies showed a few primary separable features:

• SC tend to run longer through a fingerprint than the fingerprint's valleys
• SC do not follow any global pattern, other than most of them being horizontal, and in that way seem independent of each other

At the same time, some features seemed to be inseparable, in opposition to what was expected from theoretical studies:

• SC are not always wider than the valleys, i.e. the two cannot be separated by width alone

With these features in mind, a few methods were evaluated, described below.

Horizontal averaging filter: Imasked was convolved with an averaging horizontal filter, F, in the spatial domain to accentuate present secondary creases:

F = [1 1 1 1 2 2 2 1 1 1 1] · (1/14)    (3.5)

Although this method still produces a fairly good result on the dataset used, it does not in any way consider secondary creases which differ more than π/4 from horizontal alignment, which is why the method was discarded.

Figure 3.4: Filtering with a horizontal averaging filter. (a) Input image, Imasked; (b) filtered image; (c) thresholded version of (b).

Linear symmetry: A method for locating areas in an image with a good-quality linear pattern has been evaluated, presented in [8]. In short, local gradients in a neighbourhood of the image are compared to see whether a

useful pattern is present. We found that the method works well on good-quality fingerprints, such as the one in fig 3.5, but its usefulness quickly deteriorates with decreasing image quality, as in fig 3.5(c). Therefore the method was discarded in search of a more robust, quality-invariant method. There will, however, be reasons to return to this method later on.

Figure 3.5: Resulting linear symmetry masks. (a) Good quality image; (b) linear symmetry mask for (a); (c) bad quality image; (d) linear symmetry mask for (c).

Gabor filter bank: The method that performed the edge detection task best is based upon the Gabor function. It has been recognized as a very useful tool in computer vision and image processing, especially for texture analysis, due to its ability to localize properties in both the spatial domain (the image plane, pixel by pixel) and the frequency domain. In the case of SC in fingerprint images, studies have shown that the features we are looking for are often characterized by regions where ridges and SC cross each other at nearly right angles. This is why we use multiple angles for the Gabor filters in our convolution.

The 2-D Gabor function is a harmonic oscillator, composed of a sinusoidal plane wave of a particular frequency f and orientation θ (radians) within a Gaussian envelope, with σx and σy denoting the variance of the envelope in the X and Y directions:

g(x, y; θ) = exp(−(1/2)[x_θ²/σx² + y_θ²/σy²]) · cos(2π x_θ f / 360)    (3.6)

x_θ = x cos θ + y sin θ    (3.7)

y_θ = y cos θ − x sin θ    (3.8)

For further understanding, we can decompose eq 3.6 into two orthogonal parts, one parallel and the other perpendicular to the orientation θ. The following formula will then be deduced:

g(x, y, f, \theta) = h_x(x; f, \theta) \cdot h_y(y; \theta) = \exp\left(-\frac{x_\theta^2}{2\sigma_x^2}\right) \cos\left(2\pi x_\theta \frac{f}{360}\right) \cdot \exp\left(-\frac{y_\theta^2}{2\sigma_y^2}\right)

The first part, h_x, behaves as a 1-D Gabor function, which is a band-pass filter, and the second, h_y, represents a Gaussian function, which is a low-pass filter. In practice, what is performed is a low-pass filtering along the orientation θ and a band-pass filtering orthogonal to that angle. The band-pass property is related to σ_x in the sense that low values of σ_x result in a low-pass function.

Figure 3.6: Filter images at f = 68 Hz, θ = π/6. (a) h_x, spatial; (b) h_y, spatial; (c) g, spatial.

A number of authors [5], [12], [13] have used a Gabor filter bank to extract local image features, and so do we. Typically, an input image I(x, y), (x, y) ∈ Ω, is convolved with the 2-D Gabor function above to obtain a Gabor feature image. In our experiments, the filter bank comprises 36 Gabor filters, resulting from six different preferred spatial frequencies f = 20 + 10j, j = 1, 2, ..., 6, and six different equidistant preferred orientations θ = kπ/6, k = 1, 2, ..., 6. The application of such a filter bank to an input image results in a 6×6 matrix of filtered images, seen in fig. 3.7(b).

As seen in fig. 3.7(b), relatively clear patterns occur in each image, corresponding to the areas with that specific frequency and orientation. This includes both regions with that pattern and individual lines, the latter often corresponding to SC in the image, as seen in the marked image in fig. 3.7(b). What we want, though, is information about lines that are not part of a local pattern. This takes us back to our evaluation of the linear symmetry method, described in 3.3.2, where we needed good-quality images to mask out regions with a local linear pattern. This is exactly the case here. Therefore, we add all images in each orientation, one cluster being marked at the bottom of fig. 3.7(b). The resulting 6×1 array from the entire 6×6 result bank is then fed through our linear symmetry algorithm, rendering 6 responses where the linear patterns have been suppressed, a subset of that

Figure 3.7: The Gabor filter bank. (a) The 6×6 filter bank; (b) the bank of resulting images.

vector being shown in fig. 3.8. By adding these 6 responses, we get a resulting image with high values where multiple local directions are present and no local pattern is present, as in the case of a typical SC.

Figure 3.8: Two out of 6 orientations with the linear patterns suppressed. (a) Orientation 3π/4, I_edges(3); (b) orientation π, I_edges(1).

Thresholding and thinning. We threshold the sum of I_edges to prepare it for the upcoming feature extraction, with the help of a simple static threshold value (0.5). The image is then skeletonized with a morphological process to speed up the upcoming Hough transform; using the merely thresholded image in the Hough transform would result in unnecessarily long computation times, with so many more pixels to process.

Quality Index. Based on the previous segmentation, where the Area Index was set for an image, and on the information from thresholding I_edges, a Quality

Figure 3.9: Before and after thresholding and skeletonizing. (a) Secondary edge image, I_edges; (b) thresholded image; (c) skeletonized image, I_hough.

Index is introduced to grade an image's usability in Distinct Area Detection. The aim is to grade an image depending on how much of its area is unusable, i.e. areas containing SC or belonging to the background. The resulting image is not used further on in the feature extraction, but is produced simply for evaluation purposes at this stage. To calculate QI, an image mask, IM_QI, is built up, with values equal to 1 (white) representing usable area:

IM_{QI} = 1 - \left[\left[1 - (I_{mask} > 0.5)\right] \oplus \left(I_{edges} > 1.4 \cdot \bar{I}_{edges}\right)\right] \quad (3.9)

with ⊕ denoting a logical OR operation on the two resulting images, and \bar{I}_{edges} representing the mean value of the image data I_edges. The factors 0.5 and 1.4 have been set based on empirical tests. QI is then calculated in the same manner as the Area Index in eq. 3.2.

3.4 Feature Extraction

With a given binary image as input, from the section above, the task here is to extract the linear features, the SC, into a fabricated parameter space appropriate for storage and comparison. This results in each found SC being represented by a straight line/vector through the image. The technique described below is based on the assumption that SC are characterized by linear ridges. Having extracted the pixels on and around these lines in the previous step, it remains to find the best-fitting vector for these pixels. The parameter space in which to represent these vectors was chosen to coincide with the one used in the Hough transform, described below.

3.4.1 r/θ Parameter Space

The equation of a straight line is given in parametric form by:

x\cos\theta + y\sin\theta = r \quad (3.10)

where r is the length of a normal to the vector from the origin of the image and θ is the angle that the normal makes with the X-axis. For any given vector, r and θ are known. In this case, however, we have a set of pixels that we would like to represent with single vectors, and for this we use eq. 3.10.

3.4.2 The Hough Transform

The computations here are quite straightforward, although some processing of the data in the Hough transform space is required for robust extraction of lines. From eq. 3.10, the solution is computed for each pair (x_i, y_i) where I_hough(x_i, y_i) = 1, yielding a set of values for r and θ. These values are then recorded by incrementing an element of a 2-D array, known as the Hough accumulator, for each (r, θ). Two pixels in the input image lying on the same straight line will thus both increment the same cell in the Hough accumulator.

Figure 3.10: Hough analysis of a reference image. (a) Reference image, I_hough; (b) Hough space with marked maxima; (c) reference image with superimposed vectors.

Detection of local maxima. From the image in the fabricated r/θ space given above, a number of maxima (< nL_max) are extracted, each maximum representing the best-matching vector for the pixels (x_i, y_i) contributing to that maximum. In practice, the solution-set curves in r/θ space do not intersect in a single point, so the maximum values cannot be detected simply by thresholding fig. 3.11(b) at one predefined threshold value; that would extract a number of vectors from local maxima lying close to each other in r/θ space, resulting in many vectors representing the same pixel cluster in the original image. To avoid this, a specified region around a chosen maximum in r/θ space is suppressed before the next maximum is chosen. A lowest threshold is also set here, which terminates the search for

more vectors at a certain level, when there is actually no more SC in the image.

Figure 3.11: Hough analysis of a fingerprint image. (a) Fingerprint, I; (b) extracted Hough space; (c) fingerprint image with superimposed vectors, V; (d) extracted data, i.e. pairs (θ, r): (-0.06, 9), (-0.28, 48), (-0.65, 0), (-2.47, 3), (-2.75, -37), ...

Feature filtering. After the application of the threshold to find local maxima in r/θ space, a set of candidate SC has been obtained. Not all of these vectors do, in fact, coincide with an actual SC in the fingerprint image. To further aid the accuracy of the system, the extracted vectors are therefore fed back to the original image for a "reality check". For each vector, a profile is extracted by scanning the original image along the chosen vector and its immediate surroundings. Based on an adaptive threshold working with the mean value of the scanline, and a fixed minimum "hit length" in the original image, each vector is judged on whether it seems to have spotted a SC or not. The resulting filtered vector set, V_f, is therefore a subset of V.

Figure 3.12: Example of the feature filtering algorithm. (a) I_edges with two vectors superimposed; (b) the profile of the rightmost vector, hitlength > hitlength_min.
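The accumulator voting from eq. 3.10 and the greedy maximum extraction with neighbourhood suppression described above can be sketched roughly as follows. This is a minimal NumPy sketch; the θ resolution, suppression radius, and vote threshold are illustrative assumptions, not the thesis settings.

```python
import numpy as np

def hough_accumulate(binary, n_theta=180):
    """Vote r = x*cos(theta) + y*sin(theta) (eq. 3.10) for each foreground pixel."""
    ys, xs = np.nonzero(binary)
    thetas = np.linspace(-np.pi / 2, np.pi / 2, n_theta, endpoint=False)
    diag = int(np.ceil(np.hypot(*binary.shape)))
    acc = np.zeros((2 * diag + 1, n_theta), dtype=int)
    for x, y in zip(xs, ys):
        r = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int)
        acc[r + diag, np.arange(n_theta)] += 1   # one vote per (r, theta) cell
    return acc, thetas, diag

def extract_peaks(acc, n_max=7, radius=3, min_votes=5):
    """Greedily take maxima, suppressing a region around each chosen one."""
    acc = acc.astype(float).copy()
    peaks = []
    for _ in range(n_max):
        idx = np.unravel_index(acc.argmax(), acc.shape)
        if acc[idx] < min_votes:                 # no more SC worth extracting
            break
        peaks.append(idx)
        r0, t0 = idx                             # suppress the neighbourhood
        acc[max(r0 - radius, 0):r0 + radius + 1,
            max(t0 - radius, 0):t0 + radius + 1] = 0.0
    return peaks

# A vertical line x = 5 votes strongly at theta = 0, r = 5.
img = np.zeros((12, 12), dtype=bool)
img[:, 5] = True
acc, thetas, diag = hough_accumulate(img)
r_idx, t_idx = np.unravel_index(acc.argmax(), acc.shape)

# A toy accumulator: the cell (11, 11) is a near-duplicate of (10, 10)
# and is swallowed by the suppression step.
toy = np.zeros((40, 40))
toy[10, 10], toy[11, 11], toy[30, 5] = 20, 18, 12
peaks = extract_peaks(toy)
```

The suppression step is what prevents many almost-identical vectors from being extracted for the same pixel cluster, exactly the failure mode of plain thresholding discussed above.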

Feature representation. The resulting vectors are passed on in the process in the format V_{f,i}(r, θ), i ∈ [1, nl], nl being the number of vectors extracted for each image. To plot these vectors in the original image, eq. 3.10 can be transformed into

y = -\frac{\cos\theta}{\sin\theta}\, x + \frac{r}{\sin\theta} \quad (3.11)

for each vector. This corresponds to the normal form y = kx + m, where the slope of the vector, k, is given by k = -\cos\theta / \sin\theta.

3.5 Feature Comparison

The task here is to compare two sets of vectors, V_{fc} and V_{ft}, candidate vectors and template vectors respectively, each being a possible partial subset of the other.

3.5.1 Normalization

To equalize the influence of differences in the two parameters r and θ, both variable sets are normalized using the following equations, working with μ (mean value) and σ (standard deviation) of the r and θ values of the pooled candidate and template data:

\mu_r = \frac{\sum r_c + \sum r_t}{n_c + n_t} \qquad \mu_\theta = \frac{\sum \theta_c + \sum \theta_t}{n_c + n_t} \quad (3.12)

Standard deviation in r and θ:

\sigma_r = \sqrt{E(r_{all}^2) - \mu_r^2} \qquad \sigma_\theta = \sqrt{E(\theta_{all}^2) - \mu_\theta^2} \quad (3.13)

With these values, we derive the normalized parameter values:

r_{t,norm} = \frac{r_t - \mu_r}{\sigma_r} \qquad \theta_{t,norm} = \frac{\theta_t - \mu_\theta}{\sigma_\theta} \quad (3.14)

r_{c,norm} = \frac{r_c - \mu_r}{\sigma_r} \qquad \theta_{c,norm} = \frac{\theta_c - \mu_\theta}{\sigma_\theta} \quad (3.15)

The output of this transformation is a parameter space as seen in fig. 3.13, displaying all candidate and template parameters in r/θ space.

3.5.2 Comparison Algorithm

The system now needs to establish whether a sufficient number of vectors are present in both template and candidate data, allowing for a possible common translation and rotation of the SC in the spatial domain. This possible change in both translation and rotation can be hard to overview.
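The normalization in eqs. 3.12-3.15 pools the candidate and template values before computing μ and σ. A minimal NumPy sketch (the biased standard deviation sqrt(E(x²) − μ²) of eq. 3.13 is exactly what `np.std` computes; the candidate values borrow the extracted data from fig. 3.11(d), while the template values are made up):

```python
import numpy as np

def normalize_sets(r_c, th_c, r_t, th_t):
    """Normalize candidate and template (r, theta) with pooled mean/std."""
    r_all = np.concatenate([r_c, r_t])        # pooled values, eq. 3.12
    th_all = np.concatenate([th_c, th_t])
    mu_r, mu_th = r_all.mean(), th_all.mean()
    sd_r, sd_th = r_all.std(), th_all.std()   # sqrt(E(x^2) - mu^2), eq. 3.13
    return ((r_c - mu_r) / sd_r, (th_c - mu_th) / sd_th,   # eq. 3.15
            (r_t - mu_r) / sd_r, (th_t - mu_th) / sd_th)   # eq. 3.14

r_c = np.array([9.0, 48.0, 0.0, 3.0, -37.0])        # from fig. 3.11(d)
th_c = np.array([-0.06, -0.28, -0.65, -2.47, -2.75])
r_t = np.array([10.0, 45.0])                        # made-up template vectors
th_t = np.array([-0.10, -0.30])
rcn, thcn, rtn, thtn = normalize_sets(r_c, th_c, r_t, th_t)
```

After this transformation, the pooled values of each parameter have zero mean and unit standard deviation, so r and θ contribute on equal footing in the comparison.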

Figure 3.13: Normalized space.

To find a common translation, we extract all possible movements from the template vectors to all possible, reasonable (within a certain distance) candidate vectors, trying to find matching couples. These movement vectors are placed in a new space, with all vectors starting at the origin (0, 0).

Figure 3.14: Parts of the comparison algorithm. (a) Normalized space with vectors drawn from template data; (b) vectors moved to a common origin (upscaled).

A cluster of vectors, such as the one indicated in fig. 3.14(b), indicates a common movement for many of the vectors. To evaluate this numerically, a nearest-neighbour scoring algorithm is engaged. The algorithm visits all vectors to find out how many neighbours each one has within a preset Euclidean distance, calculated with eq. 3.16, where p and q are the two points in question, with subscripts 1 and 2 denoting the two dimensions:

d_E = \sqrt{(p_1 - q_1)^2 + (p_2 - q_2)^2} \quad (3.16)

If there is a clear cluster of vectors, the number of vectors in the cluster itself will contribute a score high enough to confirm a positive match of the fingerprint. The threshold level for a positive match depends on how many vectors have been extracted from the template and the candidate (n_t

and n_c respectively):

threshold = \max\left(1, \min(n_c - 2,\; n_t - 2)\right) \quad (3.17)

When a match has been confirmed, the system approves the comparison, and the resulting mean translation and rotation are calculated.
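A minimal sketch of this comparison step, assuming plain NumPy: every template point is paired with every candidate point within a cutoff, the displacements are collected at a common origin, and a nearest-neighbour count inside the Euclidean distance of eq. 3.16 scores the densest cluster against the threshold of eq. 3.17. The cutoff and cluster distance here are illustrative values, not the thesis settings.

```python
import numpy as np

def match(template, candidate, cutoff=2.0, cluster_dist=0.2):
    """Score a common translation between two normalized (r, theta) point sets."""
    moves = []
    for t in template:                       # all reasonable movements
        for c in candidate:
            d = c - t
            if np.hypot(*d) <= cutoff:       # only nearby counterparts
                moves.append(d)
    moves = np.array(moves)
    best = 0
    for m in moves:                          # nearest-neighbour scoring, eq. 3.16
        best = max(best, int(np.sum(np.hypot(*(moves - m).T) <= cluster_dist)))
    n_t, n_c = len(template), len(candidate)
    return best >= max(1, min(n_c - 2, n_t - 2))   # eq. 3.17

template = np.array([[0.0, 0.0], [1.0, 0.5], [2.0, 1.0], [0.5, 2.0]])
shifted = template + np.array([0.3, -0.1])   # same vectors, common translation
far = template + np.array([10.0, 10.0])      # no movement within the cutoff
```

With a genuine common translation, all true pairs contribute the identical displacement, so the cluster count alone clears the threshold; with no nearby counterparts, the movement set stays empty and the comparison fails.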

Chapter 4

Experiments and Results

This chapter describes the fingerprint database used, and presents the results of this thesis work in terms of measurable quantities for the tested methods. Conclusions and the most interesting test results are presented here, to keep the chapter as analytical as possible without becoming too drawn out. A brief comparison with existing algorithms is also undertaken.

4.1 Dataset Description

The fingerprint database used as dataset for the described methods is a subset of data provided by FPC, selected with the aim of handling images that are a challenge for today's algorithms over long periods of time. The data belongs to the 3-4% of the population with bad fingerprint quality, and shows ridges as dark and valleys as light pixels. To avoid excessive computation times and to isolate performance on bad-quality prints, a few individuals who well represent that segment of the population have been selected, for whom data is available from 4 different occasions. These occasions span a total of about 3 months, and table 4.1 shows the dataset characteristics. Although the changes over time in the images are the subject of interest here, no attention is given to the exact timespan between the images; the focus is instead on the actual changes in the images. As seen from the samples, different persons represent different fingerprint qualities, which makes it possible to measure the system's individual performance on different sets of data. The same finger at a later time instance in the database is also shown, to demonstrate the obvious difficulties faced by the algorithm.

Table 4.1: Characteristics of the dataset

Group        Characteristic                  Value
Population   Number of subjects              4
             Number of fingers per subject   6
             Number of images per finger     6
             Number of time instances        4
             Total number of images          576
             Timespan                        80 days
Sensor       Model                           FPC4010 (area sensor)
             Measurement principle           Reflective capacitive
             Image size                      200 x 152 pixels
             Resolution                      363 DPI
             Pixel depth                     8 bits
             Sensing area size               10.64 x 14.00 mm

Figure 4.1: Samples from the fingerprint database at time instance 1. (a) Person #1; (b) Person #2; (c) Person #3; (d) Person #4.

Figure 4.2: Samples from the fingerprint database at time instance 2, same fingers. (a) Person #1; (b) Person #2; (c) Person #3; (d) Person #4.

4.2 Results and Analysis

4.2.1 Segmentation

To evaluate the performance of the first measurable step in the process, the system has been tested on the described dataset, and its performance has

been measured in the entity AI, the Area Index, described in section 3.3.1. Following the Waterfall Masking method, the fingerprint database has been evaluated over all time instances for each person, with the test results shown in table 4.2. High values indicate an image with large usable areas, with AI = 1.00 indicating that no pixels at all were rejected as background material.

Table 4.2: Test results for Waterfall Masking

Test person        AI     Min    Max
Person 1 (JS)      0.940  0.788  0.965
Person 2 (JB)      0.759  0.604  0.874
Person 3 (IM)      0.898  0.688  0.960
Person 4 (CS)      0.883  0.663  0.942
Average/low/high   0.87   0.60   0.96

Time for calculation: 12 sec; number of prints: 144.

Waterfall Masking provided a reliable quality measure for the purpose, since it allowed large areas within the fingerprint that could be SC. We see that no print consists entirely of background information, which is reasonable, since every image should contain some useful features. Neither was any image perfect, which also reflects reality. The method also allows large SC in images without them being considered background information, which is a required feature. What the method lacks, though, is reliability when it comes to fingerprint images with a bad contact area in the middle of the image, such as the one in fig. 4.3. This was considered a minor liability, since the present algorithm handles "low contact percentage" images, i.e. an image such as the one below would never be allowed to enroll in the system.

Figure 4.3: Example of an image that is difficult to segment, being very misplaced on the sensor.
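Both the Area Index and the Quality Index are simple area fractions over a binary usability mask. A toy sketch of the QI mask from eq. 3.9 follows; the arrays are made up, 0.5 and 1.4 are the empirical factors from the text, and the index is taken as the fraction of usable (white) pixels, consistent with AI = 1.00 meaning that nothing was rejected.

```python
import numpy as np

# Toy stand-ins for the segmentation mask and the secondary-edge image.
I_mask = np.array([[1.0, 1.0, 0.0],
                   [1.0, 0.2, 0.0],
                   [1.0, 1.0, 1.0]])
I_edges = np.array([[0.0, 0.9, 0.0],
                    [0.0, 0.0, 0.0],
                    [0.0, 0.0, 0.0]])

background = ~(I_mask > 0.5)              # 1 - (I_mask > 0.5) in eq. 3.9
crease = I_edges > 1.4 * I_edges.mean()   # strong secondary-crease response
IM_QI = ~(background | crease)            # usable area is True (white)
QI = IM_QI.mean()                         # fraction of usable pixels
```

Here four of the nine pixels are rejected (three as background, one as a probable SC), so the toy image scores QI = 5/9.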

4.2.2 Secondary Crease Detection

The Quality Index, QI, determines how much area in the image is usable for Distinct Area Detection, i.e. it grades background area and areas containing SC as bad areas. Studies of the results have shown that the method very neatly marks the areas where SC may be present. A drawback, though, is that it seems to misinterpret fingerprint ridges and valleys as SC when they get too wide. This drawback is ascribed to the linear symmetry method applied in 3.3.2.

Table 4.3: Test results for QI

Test person        QI     Min    Max
Person 1 (JS)      0.723  0.561  0.759
Person 2 (JB)      0.558  0.402  0.686
Person 3 (IM)      0.695  0.458  0.772
Person 4 (CS)      0.700  0.467  0.761
Average/low/high   0.67   0.40   0.77

Calculation time: 314 sec; number of prints: 144.

Important to note about the results in table 4.3 is that the QI includes quality degradation both due to background information (the Area Index) and due to unusable areas. That way, it can be trusted to only let through areas with clear linear patterns, which is what a regular algorithm needs to extract a qualified template.

4.2.3 Feature Comparison

To evaluate the False Rejection Rate (FRR) and the False Acceptance Rate (FAR), tests have been run on the database, and individual values for all persons have been recorded, in order to isolate the types of fingerprints these methods handle better than others. From this data, general conclusions can be drawn regarding the usability of the proposed methods.

FRR. Tests were run here with two slightly different purposes:

• To test the influence of the maximum number of vectors, nL_max. This test gave the best results (lowest FRR) for nL_max = 7 and revealed quite a lot about the parameter dependencies: as nL_max increases, FRR instantly decreases, while FAR increases somewhat. This is natural, due to the matching algorithm: the odds of a template

vector finding a "fake" counterpart in the candidate data increase as more and more vectors exist in the candidate space.

Table 4.4: Total FRR over all time instances, 7 vectors

Test person     Fails  Tests  FRR
Person 1 (JS)   387    642    60%
Person 2 (JB)   379    648    58%
Person 3 (IM)   219    648    34%
Person 4 (CS)   459    648    71%
Total           1444   2587   55%

nL_max: 7; failed enrolls (FTE): 1 (0.03%); time for calculation: 2h 6m 33s.

Table 4.5: Total FRR over all time instances, 20 vectors

Test person     Fails  Tests  FRR
Person 1 (JS)   206    648    32%
Person 2 (JB)   172    642    27%
Person 3 (IM)   129    648    19%
Person 4 (CS)   334    648    51%
Total           841    2587   32%

nL_max: 20; failed enrolls (FTE): 1 (0.03%); time for calculation: 2h 24m 3s.

• To test the performance over different time instances. The tests were run with template generation from the first time instance, after which each template was compared to its corresponding finger at the indicated time instance.

Figure 4.4: The same finger at the four time instances. (a) Time instance 1; (b) time instance 2; (c) time instance 3; (d) time instance 4.

Table 4.6: FRR for time instance 2, 20 vectors

Test person     Fails  Tests  FRR
Person 1 (JS)   85     210    40%
Person 2 (JB)   55     216    25%
Person 3 (IM)   50     216    23%
Person 4 (CS)   65     216    30%
Total           255    859    29%

nL_max: 20; failed enrolls (FTE): 6 (0.1%); time for calculation: 48m 29s.

Table 4.7: FRR for time instance 3, 20 vectors

Test person     Fails  Tests  FRR
Person 1 (JS)   86     216    39%
Person 2 (JB)   79     216    37%
Person 3 (IM)   27     216    12%
Person 4 (CS)   87     216    40%
Total           279    864    32%

nL_max: 20; failed enrolls (FTE): 0 (0.0%); time for calculation: 50m 27s.

Table 4.8: FRR for time instance 4, 20 vectors

Test person     Fails  Tests  FRR
Person 1 (JS)   84     216    39%
Person 2 (JB)   141    216    65%
Person 3 (IM)   31     216    14%
Person 4 (CS)   86     216    40%
Total           342    864    39%

nL_max: 20; failed enrolls (FTE): 0 (0.0%); time for calculation: 49m 55s.

The differing results shown in tables 4.6-4.8 indicate a changing environment (dry weather etc.) between the time instances. Over the whole time period, we see a successive increase in the total FRR value. This is natural; what is interesting is that for test person #3 the value instead decreases over time, which suggests a good-quality

template generation at time instance 1. The best total FRR, over all persons in the dataset, was 29%, which is very high compared to a generic AFIS. Considering the dataset used, though, the results must be considered rather good.

FAR. This measure is of less importance and is today a minor problem in the algorithms. The reason this number is often very low in present algorithms is that they deal directly with spatial features in the image; in other words, an accidental "hit" on another finger is highly unlikely. This causes problems when working in another parameter space, since a lot of information is lost in the process. The possibility of two vectors accidentally matching (within margins) between two different fingerprints is considerably greater than that of two spatial areas looking the same.

Table 4.9: Total FAR

Test person     Accepts  Tests  FAR
Person 1 (JS)   144      324    44%
Person 2 (JB)   140      324    43%
Person 3 (IM)   111      324    34%
Person 4 (CS)   129      324    39%
Total           524      1296   40%

Failed enrolls (FTE): 0 (0.0%); time for calculation: 1h 9m 53s.

True enough, the tests gave FAR values around 40% with the same parameter settings as in the FRR tests. This value is far greater than that of the average system (3-4% on an average dataset), but it is of less importance, since prints of this bad quality are usually not even allowed to enter a system.
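As a sanity check on the tables above, both rates are plain ratios; the thesis reports whole percentages, so agreement is only checked to within one percentage point here.

```python
# FRR = failed genuine attempts / attempts, FAR = false accepts / attempts,
# using the totals from table 4.5 (20 vectors) and table 4.9.
def rate(errors, tests):
    return 100.0 * errors / tests

frr_total = rate(841, 2587)   # reported as 32% in table 4.5
far_total = rate(524, 1296)   # reported as 40% in table 4.9
```
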

Chapter 5

Discussion

This chapter concludes the work performed during this thesis and emphasizes the meaning of the results in terms of overall usability and robustness. Future possibilities for the work are also discussed.

5.1 Conclusion

The proposed system, with a Gabor filter bank for detection of secondary creases in fingerprint images as described in this thesis, is robust and has proven to work well on the proposed dataset. The methods for comparison, though, turned out to be a greater challenge. The transition from working in the spatial domain to a fabricated domain (r, θ) evidently generates a loss of information too great to overcome. In terms of the recognized performance measure FRR, the system admittedly performs a few percent below present algorithms, while FAR is a lot higher for the proposed data. It should be noted once again, though, that the algorithms in this thesis were never intended to replace today's, but instead to complement them where needed. The traditional measures FRR and FAR may therefore be a poor qualitative check for the developed vector comparison method, but they are admittedly the most widely known quality measures in the business. As for the detection of secondary creases, the material generated is a great help to existing algorithms, providing information about what areas in a fingerprint image (of bad or good quality) to avoid for Distinct Area Detection. A measure for this definition of a good-quality fingerprint has also been established: the Quality Index. Overall, the work has shown that fingerprints normally avoided by fingerprint recognition systems can in fact possess features that are extractable and usable for verification purposes, and that these methods work better than traditional methods for that specific dataset.

5.2 Future Work

Due to the complexity of the system described, many parts of it can be further optimized to fit their purpose even better. Having developed the system from scratch, however, the issues below would be prioritized in further work.

Spatial information on vectors. Information about the profile of each vector could be stored, to provide knowledge of where along the vector in the image the SC is detected to exist.

Matching algorithm. Further investigation into optimal clustering and graph matching, with adaptive thresholds in match scoring.

Improved background masking. Further development of the background masking algorithm to allow and identify bad contact areas in fingerprint images.

Bibliography

[1] Federal Bureau of Investigation, The Science of Fingerprints: Classification and Uses, U.S. Government Printing Office, Washington D.C., 1984.

[2] A.K. Jain, L. Hong, S. Pankanti, and R. Bolle, "An Identity-Authentication System Using Fingerprints", Proceedings of the IEEE, Vol. 85, No. 9, Sept. 1997, pp. 1365-1388. http://citeseer.ist.psu.edu/jain97identity.html

[3] A. Roddy and J. Stosz, "Fingerprint Features - Statistical Analysis and System Performance Estimates", Proceedings of the IEEE, 85(9), pp. 1390-1421, 1997. http://citeseer.nj.nec.com/roddy99fingerprint.html (2004-11-05)

[4] Fingerprint Cards, http://www.fingerprint.se (2004-10-01, 2005-02-01)

[5] N. Petkov and P. Kruizinga, "Nonlinear Operator for Oriented Texture", IEEE Transactions on Image Processing, vol. 8, no. 10, October 1999.

[6] A. Jain and S. Pankanti, Automated Fingerprint Identification and Imaging Systems, http://citeseer.ist.psu.edu/453622.html (2004-10-26)

[7] Binaries for LaTeX report writing, available at http://www.math.aau.dk/~dethlef/Tips/download.html (2004-10-05); the client TeXnicCenter is available at http://www.toolscenter.org/front_content.php?idcat=50 (2004-10-05)

[8] A.M. Bazen and S.H. Gerez, "Directional Field Computation for Fingerprints Based on the Principal Component Analysis of Local Gradients", Proceedings of the ProRISC/IEEE Workshop, November 30 - December 1, 2000.

[9] K.W. Bowyer and P.J. Phillips, Empirical Evaluation Techniques in Computer Vision, The Institute of Electrical and Electronics Engineers, Inc., 1995, ISBN 0-8186-8401-1.

[10] M. van Ginkel, C.L. Luengo Hendriks, and L.J. van Vliet, A Short Introduction to the Radon and Hough Transforms and How They Relate to Each Other, http://www.ph.tn.tudelft.nl/~michael/mvanginkel_radonandhough_tr2004.pdf (2004-10-07)

[11] Fingerprint Door Locks Inc., Biometrics Overview, http://www.fingerprintdoorlocks.com/info_biometrics.html (2005-01-10)

[12] N. Petkov and P. Kruizinga, "Computational Models of Visual Neurons Specialised in the Detection of Periodic and Aperiodic Oriented Visual Stimuli: Bar and Grating Cells", Biological Cybernetics 76, pp. 83-96, Springer Verlag, 1997.

[13] R. Thai, Fingerprint Image Enhancement and Minutiae Extraction, School of Computer Science and Software Engineering, The University of Western Australia, 2003.

[14] M. Sonka, V. Hlavac, and R. Boyle, Image Processing, Analysis, and Machine Vision, 2nd ed., PWS Publishing, 1998, ISBN 0-534-95393-X.
