
Improving texture analysis performance in biometrics by adjusting image sharpness

Kunai Zhang (a), Da Huang (b), Bob Zhang (c), David Zhang (a,∗)

(a) Biometrics Research Centre, Department of Computing, The Hong Kong Polytechnic University, Hong Kong, China
(b) Department of Automation, Tsinghua University, Beijing, China
(c) Department of Computer and Information Science, University of Macau, Macau, China

∗ Corresponding author. Email address: [email protected] (David Zhang)

Abstract

In this paper, a method to improve texture analysis performance in biometrics by adjusting image sharpness is presented. Images of high sharpness are usually considered high-quality data for texture analysis; therefore, the imaging sensor and lens of an image acquisition system are carefully selected and calibrated in order to capture clear images. However, our experiments show that the performance of texture analysis in biometrics can be improved by filtering clear images to a lower sharpness. The experiments were conducted on the PolyU Palmprint Database using two algorithms (CompCode and POC), as well as on the CASIA Iris Database using IrisCode. In this paper, Gaussian filtering is applied to the images during the pre-processing stage to adjust the image sharpness. Results indicate that there is an optimal range of image sharpness, and if all the images are filtered to this range, the performance of texture analysis on the whole dataset is optimized. A scheme is also proposed to find the optimal range and to filter an image into it.

Keywords: Image Sharpness, Image Filtering, Texture Analysis, Palmprint Recognition, Iris Recognition


1. Introduction

Texture analysis has been studied for decades, as it is an important and useful area of computer vision. In the physical world, the surfaces of most objects appear textured, so a successful vision system must be able to analyze texture features [1]. A large number of texture analysis algorithms have been developed; among them, the use of Gabor filters for extracting textured image features has been proved optimal in the sense of minimizing the joint two-dimensional uncertainty in space and frequency [2]. Texture classification and segmentation [3, 4], image recognition [5, 6, 7], image registration and motion tracking [8] are typical applications of Gabor filters.

Gabor features are also widely utilized in biometric recognition, particularly in iris recognition and palmprint recognition. Beyond Gabor features, many new methods have been developed to provide better representations or to extract better features, such as the sparse representation [9] and deep learning [10] techniques used in face recognition. With these new techniques, the performance of face recognition has improved significantly, but some problems remain challenging; in face recognition, for example, identifying identical twins is still unsolved. Like face recognition, fingerprint recognition is a very popular biometric technology in both academia and industry. Fingerprint recognition has an even longer history than face recognition and has achieved great success in the market [11, 12]; however, it can be spoofed in several ways, especially by gelatin-made fake fingers [13]. Compared with face recognition and fingerprint recognition, iris recognition and palmprint recognition are relatively more reliable. Although iris recognition is relatively new, first developed in 1987 [14], it is one of the most reliable biometric modalities [15, 16]. Palmprint recognition is an even younger technology, developed in 2003 [17], but it offers accuracy and reliability comparable to iris recognition [18] and much higher accuracy than fingerprint recognition and face recognition. Although many different algorithms using different features exist for iris recognition and palmprint recognition, the most popular ones use texture features extracted by 2-D Gabor filters, which are a useful tool for texture analysis.

Usually, researchers use clear images with high sharpness for texture analysis. In iris recognition and palmprint recognition, all images in the public databases are of high sharpness. However, in our previous experiments on removing noise from palmprint images, we found that filtering the images to a lower sharpness can increase recognition performance. After Gaussian filtering, the sharpness of the ROI images decreased, yet the filtered images performed even better than the clear ones.

To further validate this finding, more experiments were conducted on both palmprint and iris images (Fig. 1), and all of them indicated the same result: recognition accuracy can be improved by adjusting the image sharpness to a certain range. In the following sections, the optimal range of image sharpness is calculated, and the relationship between recognition rate and image sharpness is analyzed based on the experimental results.

In the past ten years, many algorithms based on texture features have been developed for iris recognition and palmprint recognition. Among them, the most famous one for iris recognition is IrisCode, which was developed in 1993 and continuously improved by Daugman [19, 20, 21, 22]. IrisCode has been widely used in commercial iris recognition systems, and more than 50 million persons have been enrolled in such systems [22]. In palmprint recognition, the most popular and successful algorithm is CompCode, developed by Zhang in 2003 [17]. CompCode is the most accurate, reliable and efficient algorithm for palmprint recognition [18], and several public palmprint databases collected from hundreds of palms have been established. In this paper, iris recognition and palmprint recognition are taken as two case studies of texture analysis to verify that images of lower sharpness can perform better than those of higher sharpness. A group of Gaussian filters with different parameters is applied to iris and palmprint images to change the image sharpness. After filtering, IrisCode and CompCode are used to evaluate the verification results of iris recognition and palmprint recognition, respectively. To make the results more convincing, another palmprint recognition algorithm, Palmprint Orientation Code (POC) [23], which is not based on Gabor filters, was also tested. The results of all these experiments show that lower-sharpness images can achieve better recognition accuracy than higher-sharpness images. Based on the results, an optimal range can be determined; when all the images are filtered to this range, the performance improves. The EER of palmprint recognition can be improved by at least 19.2%, and that of iris recognition by 11.5%.

The paper is organized as follows: Section 2 presents the criteria for image filtering, Section 3 presents the experimental results, Section 4 gives the performance analysis, and the last section summarizes the conclusions drawn from the experiments.

2. Strategy for image sharpness adjustment

To determine the optimal range, a group of Gaussian filters were applied to the images in the database:

G(x, y) = \frac{1}{2\pi\sigma^{2}} e^{-\frac{x^{2}+y^{2}}{2\sigma^{2}}}    (1)

In the MATLAB code, the Gaussian filter function has two main parameters: the window size and σ. Table 1 shows the results of testing different window sizes. In this experiment, σ was fixed to 1.635 while the window size varied from 9×9 to 19×19. The last row is the average execution time of the Gaussian filter on a single image. To analyze how σ affects the performance, a Gaussian filter window size of 9×9, which achieves the lowest EER with similar GAR and time complexity, is used in all the following experiments in this paper. After filtering, the optimal range is determined according to the EER.
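As an illustration only (the authors' MATLAB implementation is not reproduced here), the filtering step of Table 1 can be sketched in Python as follows; the kernel construction follows Eq. (1), and the window size and σ values are the ones listed in the table.

import numpy as np
from scipy.signal import convolve2d

def gaussian_kernel(size, sigma):
    """Build a normalized 2-D Gaussian kernel of shape (size, size), following Eq. (1)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    kernel = np.exp(-(x**2 + y**2) / (2.0 * sigma**2)) / (2.0 * np.pi * sigma**2)
    return kernel / kernel.sum()          # normalize so the overall brightness is preserved

def gaussian_filter_image(image, size=9, sigma=1.635):
    """Low-pass filter a grayscale image (2-D float array) to reduce its sharpness."""
    return convolve2d(image, gaussian_kernel(size, sigma), mode="same", boundary="symm")

# Example: a 9x9 window with sigma = 1.635, as in Table 1
# filtered = gaussian_filter_image(roi.astype(float), size=9, sigma=1.635)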

2.1. Determining the optimal range

Image sharpness refers to the contrast between edges and the background in an image. There are many methods to evaluate sharpness quantitatively.

Fig. 1. (a) Original clear palmprint image. (b) Palmprint image after filtering. (c) Original clear iris image. (d) Iris image after filtering.

Table 1: Gaussian filter window size experiments on PolyU Palmprint Database

Window Size        9×9        11×11      13×13      15×15      17×17      19×19
σ                  1.635      1.635      1.635      1.635      1.635      1.635
eav                14.7045    14.6973    14.6985    14.6978    14.6978    14.6978
variance of eav    0.8625     0.8607     0.8608     0.8607     0.8607     0.8607
EER (%)            0.00025    0.00028    0.00033    0.00033    0.00031    0.00031
GAR (FAR = 0)      0.999703   0.999703   0.999778   0.999703   0.999703   0.999703
average time (ms)  19.88      19.19      18.75      18.78      19.45      20.58

In this paper, the Edge Acutance Value (EAV) of [24] is introduced to calculate the sharpness of an image:

eav = EAV(I) = \frac{\sum_{i=1}^{m \times n} \sum_{\alpha=1}^{8} df/dx}{m \times n}    (2)

where m and n denote the number of rows and columns of the image, df is the difference in gray values between two pixels, and dx is the distance between the two pixels. The term df/dx is evaluated over the eight-neighborhood of each pixel: in the horizontal or vertical direction the weight is 1, while in the 45° or 135° direction the weight is 1/√2. A small eav implies that the image sharpness is low. By applying Gaussian filters with different σ to the same image, the image can be filtered to different levels of sharpness. In Fig. 2, the palmprint ROI of highest sharpness is (a) and that of lowest sharpness is (c); the iris ROI of highest sharpness is (d) and that of lowest sharpness is (f). After calculation using Eq. (2), the result is EAV(a) > EAV(b) > EAV(c) and EAV(d) > EAV(e) > EAV(f).

The mean EAV is defined to measure the average eav of a dataset:

\overline{eav} = \frac{\sum_{i=1}^{N} EAV(I_i)}{N}    (3)

where I_i is the i-th image in a dataset that contains N images.
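A minimal Python sketch of Eqs. (2)-(3) under our reading of the definition above: the gray-level differences over the 8-neighborhood are accumulated with weight 1 horizontally and vertically and 1/√2 diagonally, and an absolute value is taken; the exact normalization used in [24] may differ.

import numpy as np

def eav(image):
    """Edge Acutance Value of a grayscale image, in the spirit of Eq. (2):
    accumulated absolute gray-level difference over the 8-neighborhood,
    with diagonal neighbors weighted by 1/sqrt(2), averaged over m*n pixels."""
    img = image.astype(np.float64)
    m, n = img.shape
    total = 0.0
    # (row offset, column offset, weight) for the 8 neighbors
    offsets = [(-1, 0, 1.0), (1, 0, 1.0), (0, -1, 1.0), (0, 1, 1.0),
               (-1, -1, 1 / np.sqrt(2)), (-1, 1, 1 / np.sqrt(2)),
               (1, -1, 1 / np.sqrt(2)), (1, 1, 1 / np.sqrt(2))]
    inner = img[1:m - 1, 1:n - 1]                 # border pixels are skipped for simplicity
    for dr, dc, w in offsets:
        neighbor = img[1 + dr:m - 1 + dr, 1 + dc:n - 1 + dc]
        total += np.abs(inner - neighbor).sum() * w
    return total / (m * n)

def mean_eav(images):
    """Mean eav over a dataset, Eq. (3)."""
    return sum(eav(img) for img in images) / len(images)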

Usually, all images in a biometric image dataset are clear. After a Gaussian filter is applied to every image in the dataset, the sharpness of each image decreases and, as a result, the mean eav decreases. We hypothesize that there is an optimal range [E1, E2] on the EAV axis: when the mean eav is adjusted to this range, the recognition performance on the dataset improves (Fig. 3).

To find the optimal range [E1, E2], we used Gaussian filters with different σ to filter the images and calculated the corresponding mean eav and EER. Such an experiment was carried out on the PolyU Palmprint Database Session 2, and the optimal range [E1, E2] was found. All the images of PolyU Palmprint Database Session 1 and the CASIA Iris Database were then filtered to [E1, E2] to test the reliability of this range. Both testing datasets performed better after filtering, which validates our hypothesis that there is an optimal range on the EAV axis.

An approach is therefore proposed to adjust image sharpness, which can be used as a preprocessing step in any texture analysis task in biometrics (Fig. 4).

2.2. Applying appropriate filtering to an image

Usually, all the original sample images in a dataset are clear. Before an image is passed to texture analysis, its eav should be calculated using Eq. (2). If the eav is less than E1, the image is discarded because it is too blurry for texture analysis. If the eav is larger than E2, the image is filtered to decrease its eav; the filtered image is then used for texture analysis once its eav falls within [E1, E2]. The above process involves three actions:

A = \{Filtering(I, \sigma + \epsilon), \; TextureAnalysis(I), \; \mathrm{discard}\ I\}    (4)

where A is the action set and TextureAnalysis(I) denotes the texture analysis of image I. The eav of the image I is computed at every step and compared to the thresholds, after which an action is taken according to the comparison result:

Action = \begin{cases} \mathrm{discard}\ I, & eav < E_1 \\ Filtering(I, \sigma + \epsilon), & eav > E_2 \\ TextureAnalysis(I), & \mathrm{otherwise} \end{cases}    (5)

Fig. 2. Palmprint ROIs and iris ROIs after filtering. (a) σ = 0.662. (b) σ = 1.40. (c) σ = 12.0. (d) σ = 0.662. (e) σ = 1.0. (f) σ = 2.0.

Fig. 3. The optimal range on the EAV axis.


Fig. 4. Flowchart of the filtering stage before texture analysis.

The algorithm is described in Algorithm 1.

Algorithm 1 Pseudocode of the proposed method
1:  Input: the clear image I; σ, initialized to a specific value; ε, the step, initialized to a very small value.
2:  Output: the properly filtered image I for texture analysis; or discard I.
3:  eav ← EAV(I)                      // use Eq. (2) to calculate eav
4:  if eav < E1 then
5:      return false                  // discard I
6:  else if eav > E2 then
7:      Filtering(I, σ + ε)
8:      TextureAnalysis(I)
9:  else
10:     TextureAnalysis(I)
11: end if
12: return I
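A hypothetical Python rendering of the decision rule in Eq. (5) and Algorithm 1, reusing the eav and gaussian_filter_image helpers sketched earlier; the loop and the fixed σ increment are simplifications for illustration, not the authors' exact procedure (Section 4.3 gives their step selection).

def prepare_for_texture_analysis(image, E1, E2, sigma=0.662, eps=0.05, max_iter=20):
    """Return an image whose eav lies in [E1, E2], or None to discard it.

    Too-blurry images (eav < E1) are discarded; too-sharp images (eav > E2)
    are Gaussian-filtered from the original with a growing sigma."""
    current = image
    for _ in range(max_iter):
        e = eav(current)
        if e < E1:
            return None                   # too blurry for texture analysis: discard
        if e <= E2:
            return current                # eav is inside [E1, E2]: ready for texture analysis
        current = gaussian_filter_image(image, size=9, sigma=sigma)
        sigma += eps                      # Filtering(I, sigma + eps)
    return current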

3. Experimental results

Experiments were conducted on the palmprint dataset using both CompCode [17] and POC [23], and on the iris dataset using IrisCode [22]. Both CompCode and IrisCode use 2-D Gabor filters to extract texture features, and they are among the most popular algorithms in palmprint recognition and iris recognition, respectively, while POC uses four directional templates to extract features.


3.1. Calculation of the optimal range

The PolyU Palmprint Database [25] has two sessions containing 7605 grayscale palm images collected on two separate occasions about two months apart. 386 different palms were captured, with about 10 images of each palm in each session. The most successful palmprint recognition algorithm, CompCode [17], was used in this experiment. A group of Gaussian filters with σ increasing from 0.662 to 2.15 in steps of about 0.2 was applied to the palmprint images (Fig. 5). All the genuine and impostor matching scores were computed, as well as the Equal Error Rate (EER). In the experiments, the EER is used to evaluate recognition performance: the closer the EER is to zero, the better the recognition performance. Table 2 and Table 3 show the results on the palmprint database using CompCode, while Table 4 and Table 5 show the results using POC. Taking Table 2 as an example, experiment No. 4 achieves the best EER of 0.0353 and the corresponding eav is 16.5463. Note that experiments No. 3 to No. 7 all perform much better than the original; we can therefore use the eav of No. 3 and No. 7 to determine the optimal range, which is [14.8, 17.0] after rounding. The selection of the optimal range is discussed further in Section 4.
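The sweep described above can be sketched as follows; compute_eer is a placeholder for the full CompCode/POC/IrisCode feature extraction and matching pipeline, which is not reproduced here, and gaussian_filter_image and mean_eav are the helpers sketched in Section 2.

def sweep_sigma(images, labels, sigmas, compute_eer, window=9):
    """For each sigma, filter every image and record the dataset mean eav and the EER.

    compute_eer(filtered_images, labels) is assumed to run feature extraction and
    matching (e.g. CompCode) and return the Equal Error Rate in percent."""
    results = []
    for sigma in sigmas:
        filtered = [gaussian_filter_image(img, size=window, sigma=sigma) for img in images]
        results.append((sigma, mean_eav(filtered), compute_eer(filtered, labels)))
    return results   # list of (sigma, mean eav, EER) rows, analogous to Table 2

# Example sweep, roughly matching the sigma values of Table 2:
# rows = sweep_sigma(images, labels, [0.662, 0.935, 1.12, 1.2, 1.3, 1.4, 1.635, 2.15], compute_eer)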


3.2. Testing the optimal range on PolyU Palmprint Database

In this subsection, we use the optimal range [14.8, 17.0] obtained from Table 2 to test on PolyU Palmprint Database Session 1 (Table 3). In Table 3, the eav of No. 3 and No. 4 are within the optimal range [14.8, 17.0], and the eav of No. 2 and No. 5 are close to its boundaries. All four of these rows have an EER lower than the original, which verifies that if the eav is within the optimal range, the recognition accuracy can be improved. This conclusion is also confirmed by Table 4 and Table 5, where POC is applied to the PolyU Palmprint Database.


3.3. Testing the optimal range on CASIA Iris Database

The CASIA Iris Database [26] was also employed to validate the optimal range [14.8, 17.0] obtained from the palmprint recognition experiments. This iris database contains 663 different irises, with 6 images of each iris. The most popular iris recognition algorithm, IrisCode [16], was adopted. Similar to the palmprint recognition experiments, a group of Gaussian filters with σ increasing from 0.31 to 0.662 in steps of about 0.03 was applied to the iris images (Fig. 6). Table 6 shows the results on the iris database.

Fig. 5. (a) Original clear palmprint image. (b)–(i) Images after filtering by Gaussian filters using different σ.

Like the results on the palmprint database, all the experiments within the optimal range (No. 2 to No. 5 in Table 6) have better or comparable performance.


4. Performance analysis

This section analyzes the palmprint recognition experiments that use the CompCode algorithm (Table 2 and Table 3). To find the reason why image filtering can improve texture analysis performance in biometrics, the genuine distance and the impostor distance are taken into consideration.

Fig. 6. (a) Original clear iris image. (b)–(i) Images after filtering by Gaussian filters using different σ.

Table 2: Gaussian filtering on PolyU Palmprint Database Session 2 (CompCode)

Experiment ID   σ       eav       variance of eav   EER(%)    dprime   GAR (FAR = 0)
Original        N/A     40.0518   25.2406           0.0437    6.7979   0.997821
1               0.662   23.0409   4.2902            0.0434    7.0982   0.998178
2               0.935   18.6188   2.5018            0.0433    7.3593   0.998254
3               1.12    17.0546   2.0424            0.0371    7.5512   0.998198
4               1.2     16.5463   1.9126            0.0353    7.6280   0.998055
5               1.3     16.0603   1.7983            0.0355    7.7326   0.998287
6               1.4     15.5612   1.6907            0.0426    7.7132   0.998225
7               1.635   14.741    1.5309            0.0428    7.6839   0.997911
8               2.15    13.5912   1.3439            0.0561    7.5792   0.997502

Table 3: Gaussian filtering on PolyU Palmprint Database Session 1 (CompCode)

Experiment ID   σ       eav       variance of eav   EER(%)     dprime   GAR (FAR = 0)
Original        N/A     42.3633   29.7047           0.0066     7.0585   0.999185
1               0.662   18.7503   1.7261            0.00135    7.6821   0.999380
2               0.935   17.1138   1.3058            0.000701   7.8471   0.999383
3               1.29    16.0637   1.0881            0.000827   8.0532   0.999633
4               1.42    15.4792   0.9810            0.000412   8.2289   0.999640
5               1.635   14.7045   0.8630            0.00028    8.4715   0.999703
6               2.15    13.5541   0.7189            0.000742   8.4212   0.998581
7               2.71    12.7588   0.6445            0.000933   8.3182   0.997710
8               4.2     11.6731   0.5670            0.0198     7.6146   0.987152

Experiments indicate that both the genuine distance and the impostor distance decrease after the Gaussian filter is applied. The selection of the optimal range and of the filtering step are discussed at the end of this section.

4.1. Genuine distance and impostor distance

From the experimental results in the previous section, it can be observed that although the sharpness of the images becomes lower after filtering, the EER of the whole dataset also becomes lower, which means the performance becomes better.

Table 4: Gaussian filtering on PolyU Palmprint Database Session 1 (POC)

Experiment ID   σ       eav       variance of eav   EER(%)   dprime   GAR (FAR = 0)
Original        N/A     42.3633   29.7047           0.0518   5.2606   0.997211
1               0.662   18.7503   1.7261            0.0511   5.8277   0.997217
2               0.935   17.1138   1.3058            0.0363   6.3192   0.998247
3               1.29    16.0637   1.0881            0.0221   7.0681   0.998534
4               1.42    15.4792   0.9810            0.0201   7.8661   0.99879
5               1.635   14.7045   0.8630            0.0136   8.7731   0.99891
6               2.15    13.5541   0.7189            0.0325   8.6378   0.998112
7               2.71    12.7588   0.6445            0.0550   8.5731   0.997134
8               4.2     11.6731   0.5670            0.0741   7.4179   0.98488

Table 5: Gaussian filtering on PolyU Palmprint Database Session 2 (POC)

Experiment ID   σ       eav       variance of eav   EER(%)   dprime   GAR (FAR = 0)
Original        N/A     40.0518   25.2406           0.1893   5.1228   0.958332
1               0.662   23.0409   4.2902            0.1725   5.6535   0.958692
2               0.935   18.6188   2.5018            0.1587   6.0963   0.963929
3               1.12    17.0546   2.0424            0.1378   6.4459   0.969881
4               1.2     16.5463   1.9126            0.1369   6.6065   0.969878
5               1.3     16.0603   1.7983            0.1173   6.7791   0.974328
6               1.4     15.5612   1.6907            0.1145   7.0093   0.974333
7               1.5     14.741    1.5309            0.1344   6.8884   0.970210
8               2.15    13.5912   1.3439            0.1673   6.7930   0.962899

The EER of the two palmprint database sessions decreases by 19.2%, from 0.0437% to 0.0353% (Table 2), and by 95.8%, from 0.0066% to 0.00028% (Table 3), respectively. To find the reason why image filtering can improve texture analysis performance in biometrics, we employ the normalized Hamming distance of [17] to measure the difference between two palmprint images: a distance of 0 means the two palmprint images are exactly the same, while a distance of 1 means they are completely different.

Table 6: Filtering experiments using Gaussian filters on CASIA Iris Database

Experiment ID   σ       eav       variance of eav   EER(%)   dprime   GAR (FAR = 0)
Original        N/A     17.2648   9.8971            0.1809   5.0298   0.955857
1               0.31    17.1582   9.5901            0.1808   5.0297   0.955857
2               0.345   17.0238   9.2782            0.1810   5.0310   0.957563
3               0.364   17.0546   9.0346            0.1685   5.7146   0.956451
4               0.382   16.8348   8.9173            0.1601   6.0213   0.961431
5               0.415   16.3560   8.1980            0.1674   5.7737   0.962635
6               0.45    14.1799   6.1511            0.1710   5.5048   0.96033
7               0.528   15.2415   7.0665            0.1892   5.3423   0.958898
8               0.662   12.3411   4.7354            0.1778   5.2392   0.965101

The distance between two palmprint images from the same palm is called the genuine distance gDist, and the distance between two palmprint images from different palms is called the impostor distance iDist. The experiments with the best performance in Table 2 and Table 3 are chosen for comparison with the original ones: in Table 2, No. 4 is compared to the original, as shown in Fig. 7(a); in Table 3, No. 5 is compared to the original, as shown in Fig. 7(b). The distributions of genuine and impostor distances are compared before and after filtering (Fig. 7). After filtering, both the genuine distance distribution and the impostor distance distribution shift towards 0, but the shift of the genuine distance is much larger; that is, the decrease of the genuine distance is greater than the decrease of the impostor distance. To measure the separation of the genuine and impostor distance distributions, dprime is introduced as

dprime = \frac{mean(iDist) - mean(gDist)}{\sqrt{var(iDist)/2 + var(gDist)/2}}    (6)
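Eq. (6) can be computed directly from the two sets of matching distances; a minimal sketch:

import numpy as np

def dprime(genuine_distances, impostor_distances):
    """Decidability index of Eq. (6): separation between the impostor and genuine
    normalized Hamming distance distributions."""
    g = np.asarray(genuine_distances, dtype=float)
    i = np.asarray(impostor_distances, dtype=float)
    return (i.mean() - g.mean()) / np.sqrt(i.var() / 2.0 + g.var() / 2.0)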

A higher dprime usually means a lower EER, because the genuine and impostor distance distributions are more separated, making it easier to use a single threshold for classification. Fig. 8 depicts how dprime changes as eav decreases.

Fig. 7. Distributions of genuine and impostor distances on the PolyU Palmprint Database: (a) Session 2. (b) Session 1.

At the beginning, dprime keeps increasing as eav decreases, but after the optimal EER is reached, dprime keeps decreasing. In conclusion, the main contribution to the performance improvement is the increase of dprime, which is mainly caused by the decrease of the genuine distance.


4.2. Optimal range

Fig. 9 describes the EER-eav curve. In the palmprint and iris datasets, the original images are clear. When filtering the images, as σ increases, the eav of the dataset keeps decreasing, and so, initially, does the EER. The EER calculated on the original images without filtering is EER_ori, and E_o1 and E_o2 are the corresponding eav values at which EER_ori is reached. As eav decreases, it reaches a value EAV_opt at which the EER reaches its smallest value EER_opt, meaning the performance is optimal; after that, as eav continues to decrease, the EER keeps increasing. We can define a tolerance parameter α > 0 and find the two points on the EER-eav curve where the EER equals EER_opt + α; the eav values at these two points are defined as E1 and E2. The optimal range [E1, E2] can therefore be determined. EAV_opt and EER_opt are unique for a specific texture analysis algorithm on a specific dataset; however, [E1, E2] varies depending on the selection of α. Given that the average EAV_opt is around 15.0 according to Table 2 and Table 3, Table 7 gives the suggested selection of α for palmprint recognition based on our experiments.

Fig. 8. dprime-eav curves on the PolyU Palmprint Database: (a) Session 2. (b) Session 1.

Fig. 9. The optimal range and the EER-eav curve.

Table 7: Suggested α for palmprint recognition

EAV_opt   EER_opt   α        EER Range        Optimal Range
15.0      0.035%    0.008%   [0.035, 0.043]   [40.0, 15.5]
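Assuming the (eav, EER) pairs of a sweep such as the one in Section 3.1, the optimal range of this subsection can be read off the sampled curve; a sketch (linear interpolation between samples is omitted):

def optimal_range(eavs, eers, alpha):
    """Return (EAV_opt, EER_opt, E1, E2) from sampled points of the EER-eav curve.

    E1 and E2 bound the eav values whose EER stays within EER_opt + alpha."""
    best = min(range(len(eers)), key=lambda k: eers[k])
    eer_opt, eav_opt = eers[best], eavs[best]
    within = [eavs[k] for k in range(len(eers)) if eers[k] <= eer_opt + alpha]
    return eav_opt, eer_opt, min(within), max(within)

# Example with the Session 2 CompCode sweep (Table 2) and the suggested alpha = 0.008:
# eav_opt, eer_opt, E1, E2 = optimal_range(eavs, eers, alpha=0.008)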

4.3. Filtering step

In the filtering stage of Fig. 4, the increase of σ at each step should be neither too large nor too small. If the step ε is too large, the eav will decrease too fast and may miss the optimal range; if ε is too small, it will take a long time to filter the image into the optimal range. To find an appropriate initial σ and ε, the relation between σ and eav is analyzed in Fig. 10. The curves indicate that the relation between σ and eav is very similar in the two sessions of the PolyU Palmprint Database. The two curves in Fig. 10 also suggest that the relation can probably be expressed as

\frac{1}{\sigma} = ax + b    (7)

where x = eav. After computing σ^{-1}, the relation between σ^{-1} and eav is as shown in Fig. 11. Letting y be σ^{-1}, the relation between x and y can be represented by a linear function:

y = ax + b    (8)

a and b can be estimated by linear approximation: for Session 1, y = 0.1137x − 1.0687; for Session 2, y = 0.1019x − 0.8950. Once the linear function is estimated, the σ corresponding to a given eav can be calculated for image filtering. If the eav of an image is larger than E2, it should be filtered into the optimal range [E1, E2]. The initial σ and the optimal ε can be obtained by the following steps.

• Compute the σ corresponding to E1 and E2, and denote the results σ1 and σ2. Set the initial σ to σ2 for image filtering.

• Compute ε = (σ1 − σ2)/2. After the first filtering, if the eav is still larger than E2, filter the image again with an updated σ = σ2 + ε; if the eav is less than E1, discard the filtered image and filter the original image with an updated initial σ = σ2 − ε (see the sketch below).
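A sketch of these two steps, assuming the coefficients a and b of Eq. (8) are known (they could also be re-estimated from a sweep with numpy.polyfit); the example values in the comment are the Session 1 fit reported above.

import numpy as np

def fit_inverse_sigma(eavs, sigmas):
    """Fit 1/sigma = a*eav + b (Eqs. (7)-(8)) by least squares."""
    a, b = np.polyfit(np.asarray(eavs, float), 1.0 / np.asarray(sigmas, float), deg=1)
    return a, b

def initial_sigma_and_step(E1, E2, a, b):
    """Sigma values mapping an image to the edges of [E1, E2], and the step size."""
    sigma1 = 1.0 / (a * E1 + b)    # sigma needed to reach the lower eav bound E1
    sigma2 = 1.0 / (a * E2 + b)    # sigma needed to reach the upper eav bound E2
    eps = (sigma1 - sigma2) / 2.0
    return sigma1, sigma2, eps

# Example with the Session 1 fit (a = 0.1137, b = -1.0687) and the range [14.8, 17.0]:
# sigma1, sigma2, eps = initial_sigma_and_step(14.8, 17.0, 0.1137, -1.0687)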

This choice of initial σ greatly reduces the computational time of filtering, while ε guarantees that the image can be filtered into the optimal range. Algorithm 2 gives the procedure of this filtering step adjustment, which is also described in Fig. 12.

Algorithm 2 Pseudocode of the filtering step adjustment
1:  Input: the palmprint image I; σ; ε.
2:  Output: the filtered image, ready for texture analysis.
3:  σ ← σ2
4:  ε ← 0
5:  n ← 0
6:  loop:
7:  Gaussian(I, σ + ε)
8:  n ← n + 1
9:  eav ← EAV(I)
10: if eav > E2 then
11:     ε ← n(σ1 − σ2)/2
12:     goto loop
13: else if eav < E1 then
14:     ε ← −n(σ1 − σ2)/2
15:     goto loop
16: else
17: end if
18: return I
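A hypothetical Python rendering of Algorithm 2, reusing the gaussian_filter_image and eav helpers sketched earlier and assuming sigma1 > sigma2 from the linear fit; the iteration cap is an added safeguard that is not part of the original pseudocode.

def filter_to_optimal_range(image, E1, E2, sigma1, sigma2, max_iter=20):
    """Filter the original image with an adjusted sigma until its eav falls in [E1, E2]."""
    eps, n = 0.0, 0
    base_step = (sigma1 - sigma2) / 2.0
    filtered = image
    for _ in range(max_iter):
        filtered = gaussian_filter_image(image, size=9, sigma=sigma2 + eps)
        n += 1
        e = eav(filtered)
        if e > E2:                 # still too sharp: increase sigma
            eps = n * base_step
        elif e < E1:               # over-filtered: restart from the original with a smaller sigma
            eps = -n * base_step
        else:
            return filtered        # eav now inside [E1, E2]
    return filtered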

Fig. 10. σ-eav curves of the PolyU Palmprint Database.

Fig. 11. σ^{-1}-eav curves of the PolyU Palmprint Database.

Fig. 12. Flowchart of the filtering step adjustment inside Filtering(I, σ + ε) in Fig. 4.


4.4. Computational time

As illustrated in Fig. 4, the functions EAV(I) and Filtering(I, σ + ε) have been added to the system. To compare the time complexity of the proposed method with that of the existing method, the average computational time of each operation on a single image is given in Table 8. Taking CompCode as an example, the complexity of the algorithm depends on the feature extraction time and the feature matching time; the proposed method increases the complexity by adding the computational time of EAV(I) and Filtering(I, σ + ε).

Table 8: Computational time of the proposed method

Algorithm   Feature Extraction   Feature Matching   EAV(I)     Filtering(I)
CompCode    25.02 ms             0.26 ms            78.36 ms   19.88 ms

5. Conclusion

In this paper, we presented the finding that the performance of texture analysis in biometrics can be improved by filtering images. Several experiments were conducted on the PolyU Palmprint Database and the CASIA Iris Database using Gaussian filtering as the filtering method. The EER of palmprint recognition can be improved by at least 19.2% and that of iris recognition by 11.5%. The results show that the performance of both palmprint and iris recognition is significantly improved and that there is an optimal range [14.8, 17.0]. The analysis of the palmprint recognition case study shows that when all the images are filtered to the optimal range, the genuine distance is reduced and dprime is increased; as a result, the recognition performance is improved. Further analysis shows that the relationship between the reciprocal of the Gaussian filter parameter σ and eav can be expressed by a linear function, so the optimal filtering step and the initial σ can be calculated from this linear function during filtering to reduce the computational time. This paper provides the experimental support for our future work: further experiments will be conducted to theoretically analyze why adjusting clear images to a lower sharpness can improve texture analysis performance in biometrics.


Acknowledgments

The authors would like to thank the editor and the anonymous reviewers for their help in improving the paper. The work is partially supported by the GRF fund from the HKSAR Government, the central fund from Hong Kong Polytechnic University, the NSFC fund (61332011, 61272292, 61271344), the Shenzhen Fundamental Research fund (JCYJ20130401152508661, JCYJ20140508160910917), and the Key Laboratory of Network Oriented Intelligent Computation, Shenzhen, China.


References


[1] M. Tuceryan, A. Jain, et al., Texture analysis, Handbook of Pattern Recognition and Computer Vision 2 (1993) 207–248.

[2] J. G. Daugman, Complete discrete 2-D Gabor transforms by neural networks for image analysis and compression, IEEE Transactions on Acoustics, Speech, and Signal Processing 36 (7) (1988) 1169–1179. doi:10.1109/29.1644.

[3] A. C. Bovik, M. Clark, W. S. Geisler, Multichannel texture analysis using localized spatial filters, IEEE Transactions on Pattern Analysis and Machine Intelligence 12 (1) (1990) 55–73. doi:10.1109/34.41384.

[4] B. S. Manjunath, R. Chellappa, A unified approach to boundary perception: edges, textures, and illusory contours, IEEE Transactions on Neural Networks 4 (1) (1993) 96–108. doi:10.1109/72.182699.

[5] J. G. Daugman, High confidence visual recognition of persons by a test of statistical independence, IEEE Transactions on Pattern Analysis and Machine Intelligence 15 (11) (1993) 1148–1161. doi:10.1109/34.244676.

[6] M. Lades, J. C. Vorbruggen, J. Buhmann, J. Lange, C. von der Malsburg, R. P. Wurtz, W. Konen, Distortion invariant object recognition in the dynamic link architecture, IEEE Transactions on Computers 42 (3) (1993) 300–311. doi:10.1109/12.210173.

[7] B. S. Manjunath, R. Chellappa, C. von der Malsburg, A feature based approach to face recognition, in: Proceedings of the 1992 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR '92), 1992, pp. 373–378. doi:10.1109/CVPR.1992.223162.

[8] B. Manjunath, C. Shekhar, R. Chellappa, A new approach to image feature detection with applications, Pattern Recognition 29 (4) (1996) 627–640. doi:10.1016/0031-3203(95)00115-8. URL http://www.sciencedirect.com/science/article/pii/0031320395001158

[9] R. He, W. S. Zheng, B. G. Hu, Maximum correntropy criterion for robust face recognition, IEEE Transactions on Pattern Analysis and Machine Intelligence 33 (8) (2011) 1561–1576. doi:10.1109/TPAMI.2010.220.

[10] Y. LeCun, Y. Bengio, G. Hinton, Deep learning, Nature 521 (7553) (2015) 436–444.

[11] A. Jain, L. Hong, R. Bolle, On-line fingerprint verification, IEEE Transactions on Pattern Analysis and Machine Intelligence 19 (4) (1997) 302–314. doi:10.1109/34.587996.

[12] D. Maio, D. Maltoni, R. Cappelli, J. L. Wayman, A. K. Jain, FVC2000: fingerprint verification competition, IEEE Transactions on Pattern Analysis and Machine Intelligence 24 (3) (2002) 402–412. doi:10.1109/34.990140.

[13] C. Barral, A. Tria, Fake fingers in fingerprint recognition: Glycerin supersedes gelatin, in: Formal to Practical Security, Springer, 2009, pp. 57–69.

[14] L. Flom, A. Safir, Iris recognition system, US Patent 4,641,349 (Feb. 3, 1987).

[15] D. Zhang, Automated Biometrics: Technologies and Systems, Vol. 7, Springer Science & Business Media, 2013.

[16] R. P. Wildes, Iris recognition: an emerging biometric technology, Proceedings of the IEEE 85 (9) (1997) 1348–1363. doi:10.1109/5.628669.

[17] D. Zhang, W.-K. Kong, J. You, M. Wong, Online palmprint identification, IEEE Transactions on Pattern Analysis and Machine Intelligence 25 (9) (2003) 1041–1050. doi:10.1109/TPAMI.2003.1227981.

[18] D. Zhang, W. Zuo, F. Yue, A comparative study of palmprint recognition algorithms, ACM Computing Surveys 44 (1) (2012) 2:1–2:37. doi:10.1145/2071389.2071391. URL http://doi.acm.org/10.1145/2071389.2071391

[19] J. Daugman, How iris recognition works, IEEE Transactions on Circuits and Systems for Video Technology 14 (1) (2004) 21–30. doi:10.1109/TCSVT.2003.818350.

[20] J. Daugman, Probing the uniqueness and randomness of IrisCodes: Results from 200 billion iris pair comparisons, Proceedings of the IEEE 94 (11) (2006) 1927–1935. doi:10.1109/JPROC.2006.884092.

[21] J. Daugman, New methods in iris recognition, IEEE Transactions on Systems, Man, and Cybernetics, Part B 37 (5) (2007) 1167–1175.

[22] A. W. K. Kong, D. Zhang, M. S. Kamel, An analysis of IrisCode, IEEE Transactions on Image Processing 19 (2) (2010) 522–532. doi:10.1109/TIP.2009.2033427.

[23] X. Wu, K. Wang, D. Zhang, Palmprint Authentication Based on Orientation Code Matching, Springer Berlin Heidelberg, Berlin, Heidelberg, 2005, pp. 555–562. doi:10.1007/11527923_57. URL http://dx.doi.org/10.1007/11527923_57

[24] H. Wang, W. Zhong, J. Wang, D. Xia, Research of measurement for digital image definition, Journal of Image and Graphics 9 (7) (2004) 828–831.

[25] PolyU Palmprint Database. URL http://www4.comp.polyu.edu.hk/~biometrics/

[26] CASIA Iris Database. URL http://biometrics.idealtest.org/findTotalDbByMode.do?mode=Iris