Radial symmetry guided particle filter for robust iris tracking

Francis Martinez, Andrea Carbone and Edwige Pissaloux

Université Pierre et Marie Curie (UPMC), CNRS, UMR 7222, Institut des Systèmes Intelligents et de Robotique (ISIR)

4 place Jussieu, 75005 Paris, France
[email protected]

Abstract. While pupil tracking under active infrared illumination is now relatively well-established, current iris tracking algorithms often fail due to several non-ideal conditions. In this paper, we present a novel approach for tracking the iris. We introduce a radial symmetry detector into the proposal distribution to guide the particles towards the high probability region. Experimental results demonstrate the ability of the proposed particle filter to robustly track the iris in challenging conditions, such as complex dynamics. Compared to some previous methods, our iris tracker is also able to automatically recover from failure.

Keywords: iris tracking, particle filter, radial symmetry.

1 Introduction

Visual gaze tracking is of particular interest in many applications [1], ranging from human-computer interaction to cognitive studies. A first step in estimating the direction of the gaze often relies on tracking eye features such as the pupil or the iris. Then, the point-of-regard can be computed thanks to a mapping between the eye and the environment, usually a planar surface. We refer to [2] for an extensive survey on eye tracking and gaze estimation.

In this paper, we address the problem of tracking the iris in close-up images with a low-cost camera and under uncontrolled conditions. In comparison with pupil tracking [3], iris tracking still remains a challenging task due to eye- (motion, shape variation, eyelid/eyelash occlusion, pupillary dilation, ethnicity), camera- (focus, resolution) and environment- (light variation, passive illumination, external occlusion) dependent factors. It can be categorized into two separate classes: non-probabilistic [4][5][6][7] and probabilistic [8][9] approaches.

Non-probabilistic approaches. These are mostly improvements of the Starburst algorithm proposed in [3], originally designed for pupil tracking. These methods rely on feature detection followed by a RANSAC [10] ellipse fitting. Extensions of the algorithm include: a distance filter and constraints on the directions of rays [4], constraints on the size of the iris and the number of inliers [5], a limbus-pupil switching mechanism using an adaptive threshold to account for light variations [6], and a constrained search for strong gradients along normal lines under the implicit assumption of smooth movements [5][7].

Probabilistic approaches. Hansen et al. [8] were the first to propose an iris tracking algorithm based on a particle filter. The contour log-likelihood ratio is modelled thanks to a Generalized Laplacian to approximate the distribution of gray-level differences and a Gaussian distribution to consider the deformations along the measurement lines [11]. In [9], Wu et al. present an iris and eyelid tracking method based on a particle filter and a 3D eye model. An intensity cue along with edge computation is employed to update the weights of the particles.

Relying solely on feature detection and ellipse fitting is often prone to failure, even if strong constraints are applied in both stages. Moreover, the iris motion is constrained by the eyeball rotation and hence, a bounded state space could be accordingly defined. However, in practice, such a strategy requires a large number of particles to sample from. Iris distortions also become more important in close-up images and should be considered in the algorithm. In this work, a more general framework is provided to, at the same time, guide and constrain the feature detection and the ellipse fitting. Our contribution is to combine an iris-based detector, namely radial symmetry, and a Sequential Monte Carlo approach to robustify iris tracking and improve state-of-the-art approaches.

2 Radial symmetry transform

The radial symmetry transform used in our work is the one proposed by Loy et al. [12]. We briefly review the algorithm and, for more details, the reader is referred to [12]. The idea behind the radial symmetry transform is to accumulate orientation and magnitude contributions at different radii from a position p = (x, y) in the direction of the gradient:

p_{\pm} = p \pm \mathrm{round}\!\left( \frac{g(p)}{\|g(p)\|} \, r \right) \qquad (1)

where r ∈ N_r is the radius, with N_r being the discrete set of radii, and g(·) is the gradient computed thanks to the 3x3 Sobel operator. At each step, orientation and magnitude projection images are updated depending on positively- or negatively-affected pixels:

O_r(p_{\pm}) = O_r(p_{\pm}) \pm 1, \qquad M_r(p_{\pm}) = M_r(p_{\pm}) \pm \|g(p)\| \qquad (2)

For each radius r, the radial symmetry transform then becomes:

S_r = F_r * A_r \qquad \text{with} \qquad F_r = \frac{M_r}{k_r} \left( \frac{|\tilde{O}_r|}{k_r} \right)^{\alpha} \qquad (3)

where A_r is a Gaussian filter that allows spreading the symmetry contribution, α is the radial-strictness parameter and k_r is a normalizing factor so that M_r and O_r can be represented on a similar scale. Õ_r(p) and k_r are given by:

\tilde{O}_r(p) = \begin{cases} O_r(p) & \text{if } O_r(p) < k_r \\ k_r & \text{otherwise} \end{cases} \qquad \text{with} \qquad k_r = \begin{cases} 9.9 & \text{if } r > 1 \\ 8 & \text{otherwise} \end{cases} \qquad (4)

Finally, the symmetry contributions can be averaged over the set of radii:

S = \frac{1}{|N_r|} \sum_{r \in N_r} S_r \qquad (5)
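To make Eqs. (1)-(5) concrete, here is a minimal Python/NumPy sketch of the transform for a small set of radii. It is an illustrative reimplementation, not the authors' C code: the use of SciPy operators, the gradient threshold and the width chosen for the Gaussian A_r are our own assumptions.

import numpy as np
from scipy.ndimage import sobel, gaussian_filter

def radial_symmetry(image, radii, alpha=2.0):
    """Sketch of the radial symmetry transform of Loy et al., Eqs. (1)-(5)."""
    img = image.astype(float)
    gx, gy = sobel(img, axis=1), sobel(img, axis=0)    # 3x3 Sobel gradient g(p)
    mag = np.hypot(gx, gy)
    h, w = img.shape
    ys, xs = np.nonzero(mag > 1e-3)                    # ignore near-zero gradients (threshold is ours)
    ny, nx = gy[ys, xs] / mag[ys, xs], gx[ys, xs] / mag[ys, xs]
    S = np.zeros_like(img)
    for r in radii:
        O = np.zeros_like(img)
        M = np.zeros_like(img)
        for sign in (+1, -1):                          # positively-/negatively-affected pixels, Eq. (1)
            py = np.clip(np.rint(ys + sign * ny * r).astype(int), 0, h - 1)
            px = np.clip(np.rint(xs + sign * nx * r).astype(int), 0, w - 1)
            np.add.at(O, (py, px), sign)               # orientation projection, Eq. (2)
            np.add.at(M, (py, px), sign * mag[ys, xs]) # magnitude projection, Eq. (2)
        k = 9.9 if r > 1 else 8.0                      # normalizing factor, Eq. (4)
        O_tilde = np.minimum(np.abs(O), k)
        F = (M / k) * (O_tilde / k) ** alpha           # Eq. (3)
        S += gaussian_filter(F, sigma=0.25 * r)        # convolution with A_r (sigma is our choice)
    return S / len(radii)                              # average over radii, Eq. (5)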

3 Proposed iris tracker

Particle filter. Let x_t ∈ X denote the hidden state and z_t ∈ Z the observation at time step t. The principle of the particle filter is to infer the marginal posterior distribution p(x_t|z_{1:t}) of the state x_t given the observation sequence z_{1:t}. Given Bayes' theorem and the Chapman-Kolmogorov equation, it can be expressed by the following Bayesian recursive formula under a first-order Markovian assumption:

p(x_t|z_{1:t}) = \frac{p(z_t|x_t)\, p(x_t|z_{1:t-1})}{p(z_t|z_{1:t-1})} \propto p(z_t|x_t) \int p(x_t|x_{t-1})\, p(x_{t-1}|z_{1:t-1})\, dx_{t-1} \qquad (6)

However, in practice, because the integral is intractable, this distribution is approximated by a set of weighted samples \{x_t^{(n)}, w_t^{(n)}\}_{n=1}^{N}:

p(x_t|z_{1:t}) \approx \sum_{n=1}^{N} w_t^{(n)}\, \delta(x_t - x_t^{(n)}) \qquad (7)

where δ(·) denotes the Dirac function and the unnormalized weights are sequentially updated according to:

w_t^{(n)} = w_{t-1}^{(n)}\, \frac{p(z_t|x_t^{(n)})\, p(x_t^{(n)}|x_{t-1}^{(n)})}{q(x_t^{(n)}|x_{0:t-1}^{(n)}, z_{1:t})} \qquad (8)

Typically, the proposal distribution is set equal to the prior transition, leading to a simplified update scheme known as the Bootstrap filter or Condensation algorithm [13]: w_t^{(n)} = w_{t-1}^{(n)}\, p(z_t|x_t^{(n)}).
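For readers less familiar with particle filtering, the following generic sequential importance resampling step illustrates Eq. (8) and its Bootstrap/Condensation special case. The function interface and the systematic resampling scheme are our own choices, not part of the paper.

import numpy as np

def sir_step(particles, weights, propose, trans_pdf, prop_pdf, likelihood, z):
    """One generic SIR iteration implementing Eq. (8). propose(x_prev, z) draws from
    q(x_t | x_{t-1}, z_t); trans_pdf(x_t, x_prev) evaluates p(x_t | x_{t-1});
    prop_pdf(x_t, x_prev, z) evaluates q; likelihood(z, x_t) evaluates p(z_t | x_t).
    All callables are placeholders. With propose drawing from the prior and
    prop_pdf == trans_pdf, the update reduces to the Bootstrap/Condensation rule."""
    N = len(particles)
    # Systematic resampling: afterwards the previous weights are uniform,
    # so the w_{t-1} factor of Eq. (8) drops out.
    positions = (np.arange(N) + np.random.rand()) / N
    idx = np.minimum(np.searchsorted(np.cumsum(weights), positions), N - 1)
    particles = particles[idx]
    # Propagate each particle through the proposal distribution.
    new_particles = np.array([propose(x, z) for x in particles])
    # Importance weights, Eq. (8), then normalization.
    w = np.array([likelihood(z, xt) * trans_pdf(xt, xp) / prop_pdf(xt, xp, z)
                  for xt, xp in zip(new_particles, particles)])
    return new_particles, w / w.sum()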

State space. The iris shape is represented by an ellipse: x = [ c_x  c_y  a  b  θ ]^T ∈ R^{5×1}, where (c_x, c_y) are the center coordinates, (a, b) the semi-minor/major axes and θ the angle of the ellipse with respect to the horizontal axis.

Dynamic model. The idea of including a detector within the particle filter to boost the tracking is not new and was proven to be very effective [14]. Furthermore, the radial symmetry transform is well-suited to detect the pupil and to aid iris localization in non-ideal conditions [15]. It relies only on the boundaries, which makes it suitable to ignore specular reflections (often located inside or outside the iris). In our work, we apply the radial symmetry transform at a low resolution on a restricted set of radii based upon the iris size at t−1. Moreover, because the iris is darker than its surroundings, only negatively-affected pixels p_− resulting from strong gradients are considered. To integrate the radial symmetry detection, the proposal distribution is modelled by a mixture of Gaussian distributions:

q(x_t|x_{t-1}, z_t) = \alpha\, q_{obs}(x_t|x_{t-1}, z_t) + (1 - \alpha)\, p(x_t|x_{t-1}) \qquad (9)

where α is the mixture coefficient. If α = 0, the proposal distribution equals the prior transition, i.e. the particle filter is only driven by a random walk. The advantage of the proposed state evolution is that it makes use of the current observation thanks to the strong detector in order to generate particles in the region of high probability, and it allows recovering from failure. Particles are generated according to:

x_t^{(n)} \sim q_{obs}(x_t|x_{t-1}^{(n)}, z_t) = \mathcal{N}\!\left(x_t;\, x_t^{d}(x_{t-1}^{(n)}, z_t),\, \Sigma_1\right)
x_t^{(n)} \sim p(x_t|x_{t-1}^{(n)}) = \mathcal{N}\!\left(x_t;\, x_{t-1}^{(n)},\, \Sigma_2\right) \qquad (10)
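As an illustration of Eqs. (9)-(10), a possible way to draw particles from the mixture proposal is sketched below, assuming the state layout [c_x, c_y, a, b, θ] and a detector estimate of the iris center. The per-particle Bernoulli selection and the diagonal standard deviations are illustrative choices, not the paper's implementation.

import numpy as np

def sample_mixture_proposal(prev_particles, detector_center, alpha, sigma1, sigma2, rng=None):
    """Draw particles from the mixture proposal of Eqs. (9)-(10). prev_particles is an
    (N, 5) array with layout [c_x, c_y, a, b, theta]; detector_center is the (c_x, c_y)
    estimate of the radial symmetry detector; sigma1 and sigma2 are the diagonal
    standard deviations of Sigma_1 and Sigma_2 (to be tuned, values not from the paper)."""
    if rng is None:
        rng = np.random.default_rng()
    N = prev_particles.shape[0]
    # Per-particle Bernoulli draw approximates selecting N1 = alpha*N guided particles.
    guided = rng.random(N) < alpha
    means = prev_particles.copy()
    # q_obs: replace the center by the detector estimate, keep each particle's own
    # shape and orientation (strategy (i) in the text); the other component is a
    # random walk around the previous particle, p(x_t|x_{t-1}).
    means[guided, 0:2] = detector_center
    stds = np.where(guided[:, None], sigma1, sigma2)
    return rng.normal(means, stds)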

Fig. 1. Radial symmetry guided particle filter. The gray particles are generated according to p(x_t|x_{t-1}) while the white particles are propagated by q_{obs}(x_t|x_{t-1}, z_t).

where x_t^d(x_{t-1}^{(n)}, z_t) is the new proposed state obtained by the radial symmetry transform and Σ_k = diag(σ²_{c_x,k}, σ²_{c_y,k}, σ²_{a,k}, σ²_{b,k}, σ²_{θ,k}), k ∈ {1, 2}. Σ_1 favors iris deformations and Σ_2 puts more emphasis on translational movements. The principle of the radial symmetry guided particle filter is depicted in Fig. 1. Defining x_t^d(x_{t-1}^{(n)}, z_t) is not straightforward as compared to [14]. The radial symmetry detector only provides an estimate of the center. For the remaining parameters of the state model, two main strategies are possible: either the shape and orientation of (i) each particle or (ii) the estimated state at t−1 are kept. These strategies have a non-negligible impact on tracking: the latter one allows much more freedom in deformation than considering each particle independently. Based on this observation, we intuitively chose the first approach because the iris keeps the same size, even if it is occluded. This intuition was then confirmed experimentally.

Likelihood estimation. As a trade-off between tracking robustness and computational cost, we use a simple yet effective likelihood model based on contour information. Under a conditional independence assumption, we define the joint measurement density by:

p(z_t|x_t^{(n)}) = \prod_{u \in \Omega} p(z_t(u)|x_t^{(n)}) \qquad (11)

with Ω a discrete set of N_{ML} measurement lines of length L and:

p(z_t(u)|x_t^{(n)}) \propto |\nabla I_p(\nu_m(u))|^2 \exp\!\left( -\frac{\|\nu_m(u) - \nu_o(u)\|^2}{2\sigma_c^2} \right) \qquad (12)

where ν_o and ν_m denote respectively the reference and the detected feature point on the normal line to a hypothesised contour. The gradient magnitude projected on the normal line at ν_m is also incorporated in the likelihood to provide higher weights to boundaries having larger magnitudes. [8] instead assumes a homogeneous intensity distribution inside and outside the object through the Generalized Laplacian.
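A possible implementation of the contour likelihood (11)-(12) is sketched below: for each measurement line (the experiments use N_ML = 30 lines of length L = 20 pixels), the strongest gradient response ν_m is searched along an approximate normal to the hypothesised ellipse and scored against the reference point ν_o. The normal approximation, the value of σ_c and the log-domain return value are our own simplifications.

import numpy as np

def contour_likelihood(grad_mag, ellipse, n_lines=30, half_len=10, sigma_c=3.0):
    """Sketch of Eqs. (11)-(12): sum over measurement lines of the log of
    |grad|^2 * exp(-||v_m - v_o||^2 / (2 sigma_c^2)). grad_mag is a precomputed
    gradient-magnitude image; returns the log-likelihood (exponentiate and
    normalize across particles in practice)."""
    cx, cy, a, b, theta = ellipse
    ct, st = np.cos(theta), np.sin(theta)
    log_lik = 0.0
    for phi in np.linspace(0.0, 2 * np.pi, n_lines, endpoint=False):
        # Reference contour point v_o and a crude (circle-like) normal direction.
        ex, ey = a * np.cos(phi), b * np.sin(phi)
        vo = np.array([cx + ex * ct - ey * st, cy + ex * st + ey * ct])
        n = np.array([np.cos(phi) * ct - np.sin(phi) * st,
                      np.cos(phi) * st + np.sin(phi) * ct])
        # Sample the gradient magnitude along the normal line of length 2*half_len.
        offsets = np.arange(-half_len, half_len + 1)
        pts = vo[None, :] + offsets[:, None] * n[None, :]
        xs = np.clip(pts[:, 0].round().astype(int), 0, grad_mag.shape[1] - 1)
        ys = np.clip(pts[:, 1].round().astype(int), 0, grad_mag.shape[0] - 1)
        samples = grad_mag[ys, xs]
        k = np.argmax(samples)                 # detected feature point v_m
        d = abs(offsets[k])                    # distance ||v_m - v_o|| along the line
        log_lik += 2 * np.log(samples[k] + 1e-9) - d**2 / (2 * sigma_c**2)
    return log_lik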

Algorithm 1 Proposed iris tracker

Input - Set of weighted samples {x_{t-1}^{(n)}, w_{t-1}^{(n)}}_{n=1}^{N}

Resampling: Duplicate high-weighted particles and eliminate low-weighted particles to form a new set {x_{t-1}^{(n)}, 1/N}_{n=1}^{N}.

Prediction: x_t = x_t ∪ x_t^d
  1. Randomly select N_1 = αN particles from x_t^d defined by:
     x_t^d = [ c_x^d(z_t)  c_y^d(z_t)  a_{t-1}^{(n)}  b_{t-1}^{(n)}  θ_{t-1}^{(n)} ]^T
     where (c_x^d(z_t), c_y^d(z_t)) are the center coordinates obtained by the radial symmetry detector. Then, generate x_t^d according to (10).
  2. Randomly select N_2 = (1−α)N particles from x_{t-1} and generate x_t based on (10).

Update: Evaluate the weights according to the likelihood (11) and normalize them: w_t^{(n)} = w_t^{(n)} / Σ_{i=1}^{N} w_t^{(i)}.

Output - Estimated state given by (15).
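As a complement to Algorithm 1, a possible Python organization of one tracking iteration is sketched below; detect_center, sample_mixture_proposal and contour_likelihood are placeholders for the radial symmetry detector of Sect. 2, the proposal (9)-(10) and the likelihood (11)-(12) (e.g. the earlier sketches). The interface and parameter handling are our own assumptions, not the authors' C implementation.

import numpy as np

def iris_tracker_step(particles, weights, frame_grad, detect_center,
                      sample_mixture_proposal, contour_likelihood,
                      alpha_d, sigma1, sigma2, rng=None):
    """One iteration of the guided particle filter (a sketch of Algorithm 1).
    particles: (N, 5) array of states [c_x, c_y, a, b, theta]; weights: normalized
    weights from the previous frame; frame_grad: gradient-magnitude image of the
    current frame. The helper callables are placeholders with our own interface."""
    if rng is None:
        rng = np.random.default_rng()
    N = particles.shape[0]
    # Resampling: duplicate high-weight particles, eliminate low-weight ones.
    particles = particles[rng.choice(N, size=N, p=weights)]
    # Detection: radial-symmetry estimate of the iris center and its confidence,
    # used to gate the guidance as in Eq. (14).
    center, confident = detect_center(frame_grad)
    alpha = alpha_d if confident else 0.0
    # Prediction: mixture of detector-guided and random-walk proposals, Eqs. (9)-(10).
    particles = sample_mixture_proposal(particles, center, alpha, sigma1, sigma2, rng)
    # Update: evaluate the contour log-likelihood of each hypothesis, Eq. (11),
    # then normalize the weights in log space for numerical stability.
    log_w = np.array([contour_likelihood(frame_grad, x) for x in particles])
    w = np.exp(log_w - log_w.max())
    weights = w / w.sum()
    # Point estimate: highest-weighted particle as the MAP estimate of Eq. (15).
    x_map = particles[np.argmax(weights)]
    return particles, weights, x_map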

Confidence measure. In some cases, the detector is not reliable and is attracted by spurious distractors. However, the likelihood evaluation sometimes alleviates the problem of false positives. To further mitigate tracking failure, we introduce a confidence measure acting on the guidance strategy:

\mathrm{conf}(S_t^m, d_t) = \begin{cases} 1 & \text{if } S_t^m > S^{ref} \ \vee\ p_{\mathcal{N}}\!\left( \frac{S_t^m - S^{ref}}{S^{ref}} \right) p_{\mathcal{N}}\!\left( \frac{d_t}{M_{t-1}^{a,b}} \right) > \tau \\ 0 & \text{otherwise} \end{cases} \qquad (13)

where p_N(x) ≜ N(x; 0, σ²), S^{ref} and S_t^m are respectively the reference and the current maximum magnitude of the symmetry contribution given by (5), d_t is the distance between the current center detected by the radial symmetry transform and the estimated center at time step t−1, M_{t-1}^{a,b} = max(a_{t-1}^{MAP}, b_{t-1}^{MAP}) and τ is a constant threshold set to 0.25. The underlying assumptions are that: (i) if the symmetry contribution is low, the output of the detector is not reliable and (ii) if, furthermore, the distance to the detector is high, the output of the detector is ignored. The product of the Gaussian distributions allows accounting for the uncertainty about the high/low transition of the symmetry contribution. The hard decision given by the confidence measure then serves to modify the mixture coefficient α of the proposal distribution:

\alpha = \alpha_d \cdot \mathrm{conf}(S_t^m, d_t) \qquad (14)

with α_d a user-defined threshold reflecting the mixture strategy.
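Eqs. (13)-(14) translate into a few lines of code; since σ is not specified in the text above, it is left as a parameter here, and the helper names are our own.

import numpy as np

def gaussian_pdf(x, sigma):
    """Zero-mean Gaussian density p_N(x) = N(x; 0, sigma^2) used in Eq. (13)."""
    return np.exp(-x * x / (2.0 * sigma * sigma)) / (np.sqrt(2.0 * np.pi) * sigma)

def mixture_coefficient(S_m, S_ref, d, M_ab, alpha_d, sigma, tau=0.25):
    """Eqs. (13)-(14): keep the detector guidance if its symmetry response exceeds
    the reference, or if the joint Gaussian score of the relative response drop and
    the normalized center displacement exceeds tau; otherwise fall back to the
    pure random-walk proposal (alpha = 0)."""
    score = gaussian_pdf((S_m - S_ref) / S_ref, sigma) * gaussian_pdf(d / M_ab, sigma)
    conf = 1.0 if (S_m > S_ref or score > tau) else 0.0
    return alpha_d * conf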

Local refinement. For visualization, the point estimate is given by the maximum a posteriori (MAP) state estimate:

x_t^{MAP} = \arg\max_{x_t} p(x_t|z_{1:t}) \approx \arg\max_{x_t} \sum_{n=1}^{N} w_t^{(n)}\, \delta(x_t - x_t^{(n)}) \qquad (15)

which is refined to locally capture boundaries. Feature points are detected along normals to the ellipse and the best fitting ellipse is computed using a RANSAC method such as in [4]. Within the ellipse fitting step, the evaluation of the ellipse follows two criteria (a direct check is sketched below):

1. Similarity in size: |a_t − a_{t-1}| < δ_a, |b_t − b_{t-1}| < δ_b
2. Bounded size ratio: R_min < a_t / b_t < R_max
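A direct check of these two criteria could look as follows; the thresholds δ_a, δ_b, R_min and R_max are user-set values that are not specified in the text.

def ellipse_is_valid(a_t, b_t, a_prev, b_prev, delta_a, delta_b, r_min, r_max):
    """Acceptance test applied to the RANSAC-refined ellipse: size similarity with
    the previous estimate and a bounded semi-minor/semi-major axis ratio."""
    similar_size = abs(a_t - a_prev) < delta_a and abs(b_t - b_prev) < delta_b
    bounded_ratio = r_min < a_t / b_t < r_max
    return similar_size and bounded_ratio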

Table 1. Experimental results obtained for moderate and abrupt motion.

                     Moderate motion               Abrupt motion
Subject              A         B         C         D         E
Nb. frames           763       524       854       244       351
Max amp. (pixels)    63.39     47.01     42.72     95.71     81.56
Pm                   3.90 %    2.87 %    2.70 %    16.87 %   14.72 %
Ps                   79.46 %   76.86 %   85.92 %   65.61 %   70.11 %
Pd                   47.94 %   47.80 %   40.21 %   62.55 %   55.28 %

Tracking initialization. In order to initialize the tracking algorithm, the prior density p(x_0|z_0) ≜ p(x_0) has to be defined. We represent it as a Dirac function centered at the position given by the non-probabilistic approach described in [4], with the seed point obtained by the radial symmetry transform.

4 Experimental results

The algorithm was implemented in C (non-optimized), tested on an Intel Core i5 CPU at 2.27 GHz, and runs in real-time at less than 40 fps with a resolution of 640x480. For visual assessment, the reader is invited to see video demos at: http://people.isir.upmc.fr/martinez.

Datasets. Unfortunately, there exists no publicly available database of iris videos. For this reason, experiments were carried out on a self-made database consisting of 5 subjects (3 Europeans, 1 African and 1 Asian). Videos were captured with a webcam running at 30 fps and providing unfocused images. Hand-labeled ground truths, c_x and c_y, were estimated for each video. Please note that annotation of noisy iris images introduces an error highly dependent on the iris size.

Performance measure. The robustness was experimentally evaluated instead of the accuracy (which is similar to that of the other iris trackers). We define a robustness measure as the percentage of successfully tracked frames: P_s = N_S / N_T, with N_S the number of successfully tracked frames and N_T the total number of frames. A successful track is determined by setting a threshold τ_s relative to the average semi-minor/major axis. The percentage of abrupt motion, P_m, and the percentage of detector-generated x_t^{MAP}, P_d, are also indicated in the table.

Parameters setting. All parameters have been experimentally set, but most stayed unchanged for the whole dataset. N, N_{ML}, L and α_d were respectively set to 50, 30, 20 and 0.5. The optimal number of particles was computed by averaging P_s over the 5 subjects and is given in Fig. 2(a).

Results. Table 1 shows the results obtained for the 5 subjects. Fig. 2(b) shows snapshots of the abrupt motion dataset. Although the videos exhibit large iris movements and significant blur, the proposed method is able to track the iris. According to P_d, one can notice that the more abrupt motion occurs, the more the MAP estimate is generated by the detector.

Fig. 2. (a) Robustness as a function of the number of particles N, (b) Snapshots and corresponding point estimate for the dataset with abrupt motion (notice the high appearance change between some adjacent frames), and (c) Comparison of the x-coordinate trajectory of the iris center between state-of-the-art and proposed iris trackers.

The presented particle filter was also compared against other methods to evaluate its robustness: (i) the non-probabilistic tracker proposed in [7], (ii) our tracker without radial symmetry knowledge, with N = 100 and Σ_2 slightly increased (it actually has a behaviour similar to the one described in [8]), and (iii) our full tracker with radial symmetry knowledge. Results showed that all approaches perform well under smooth motion. However, state-of-the-art trackers are not able to track the iris when large-amplitude motion occurs. An example of such behaviour is illustrated in Fig. 2(c). The reason for track loss is that state-of-the-art methods do not incorporate specific knowledge to model the iris and essentially rely on modelling and detecting the contour. On the contrary, the radial symmetry transform provides two iris-specific pieces of information guiding the tracking: (i) a close-to-radial shape due to the contribution of convergent gradient directions and (ii) a sharp limbus obtained by only keeping negatively-affected pixels. Furthermore, even if our tracker fails in some situations, such as eyelid occlusion, it can still quickly recover from failure, which was never the case with the other methods. This last crucial point opens the door towards long-term and fully-automated tracking systems, not yet handled by state-of-the-art iris trackers.

5 Conclusion

The key idea presented in this paper is that iris tracking can be enhanced by embedding radial symmetry knowledge into the particle filter. Experimental results have shown the effectiveness of the proposed approach compared with state-of-the-art approaches to cope with complex dynamics and track loss. However, there is still room for improvement. The iris side-appearance can significantly affect the radial symmetry detector and should be better modelled to handle stronger elliptical shapes and partial occlusions. Future work could also model the eye state (open/closed) to handle eyelid occlusion while tracking, and integrate a gaze estimation method to infer the point-of-regard.

6 Acknowledgements

This work was supported by the European Commission under the 7th FWP project AsTeRICS (Assistive Technology Rapid Integration & Construction Set, G.A. No. 247730).

References

1. Duchowski, A.T.: A breadth-first survey of eye-tracking applications. Behavior Research Methods, Instruments & Computers 34 (2002) 455–470

2. Hansen, D.W., Ji, Q.: In the eye of the beholder: A survey of models for eyes and gaze. IEEE Trans. on PAMI 32 (2010) 478–500

3. Li, D., Winfield, D., Parkhurst, D.J.: Starburst: A hybrid algorithm for video-based eye tracking combining feature-based and model-based approaches. In: Proc. of CVPR. (2005) 79–86

4. Li, D., Parkhurst, D.: Open-source software for real-time visible-spectrum eye tracking. In: COGAIN. (2006) 18–20

5. Colombo, C., Comanducci, D., Del Bimbo, A.: Robust iris localization and tracking based on constrained visual fitting. In: Proc. of ICIAP. (2007) 454–460

6. Ryan, W.J., Duchowski, A.T., Birchfield, S.T.: Limbus/pupil switching for wearable eye tracking under variable lighting conditions. In: Proc. of ETRA. (2008) 61–64

7. Ryan, W.J., Duchowski, A.T., Vincent, E.A., Battisto, D.: Match-moving for area-based analysis of eye movements in natural tasks. In: Proc. of ETRA. (2010) 235–242

8. Hansen, D.W., Pece, A.E.C.: Iris tracking with feature free contours. In: Proc. of AMFG. (2003) 208–214

9. Wu, H., Kitagawa, Y., Wada, T., Kato, T., Chen, Q.: Tracking iris contour with a 3D eye-model for gaze estimation. In: Proc. of ACCV. (2007) 688–697

10. Fischler, M.A., Bolles, R.C.: Random sample consensus: a paradigm for model fitting with applications to image analysis and automated cartography. Communications of the ACM 24 (1981) 381–395

11. Pece, A.E.C., Worrall, A.D.: Tracking with the EM contour algorithm. In: Proc. of ECCV. (2002) 3–17

12. Loy, G., Zelinsky, A.: Fast radial symmetry for detecting points of interest. IEEE Trans. on PAMI 25 (2003) 959–973

13. Isard, M., Blake, A.: Condensation - conditional density propagation for visual tracking. IJCV 29 (1998) 5–28

14. Okuma, K., Taleghani, A., De Freitas, N., Little, J.J., Lowe, D.G.: A boosted particle filter: Multitarget detection and tracking. In: Proc. of ECCV. (2004) 28–39

15. Zhang, W., Li, B., Ye, X., Zhuang, Z.: A robust algorithm for iris localization based on radial symmetry. In: Proc. of CISW. (2007) 324–327