MMUGait Database and Classification Baseline Results


Hu Ng1, Wooi-Haw Tan1, Hau-Lee Tong1, Chiung Ching Ho1, Kok-Why Ng1, Timothy Tzen-Vun Yap1, Pei-Fen Chong1, Junaidi Abdullah1, Chikkannan Eswaran1

1 Faculty of Computing and Informatics, Multimedia University, 63100 Cyberjaya, Malaysia

{nghu, twhaw, hltong, ccho, kwng, timothy, pfchong, junaidi, eswaran}@mmu.edu.my

Abstract. This paper describes the acquisition setup and development of a new gait database, MMUGait DB. The database was captured in side and oblique views, where 82 subjects participated under normal walking conditions and 19 subjects walked under 11 covariate factors. The database includes sarong and kain samping as changes of apparel, which are the traditional costumes of ethnic Malays in South East Asia. Classification experiments were carried out on MMUGait DB and the baseline results are presented for validation purposes.

Keywords: gait database, gait biometrics, segmentation, classification.

1 Introduction

Biometrics is an approach to identifying individuals through their physical and behavioral characteristics, such as gait, fingerprint, face, iris and speech. These characteristics are known as biometric identifiers, which are both distinctive and stable over time for any individual. Gait is an emerging biometric that has gained public recognition and high acceptance as one of the security assessment tools. This is mainly because gait does not require any intervention from the user or close contact with the capturing device. Besides that, it can still recognize people from a distance even if other biometric identifiers are intentionally obscured, for example by gloves covering the fingerprints or a mask covering the face. Since 1998, many gait datasets [1-5] have been

developed for performance evaluation of gait recognition systems. While good gait datasets are available, we argue that all of them have neglected one important challenge: none of the male subjects wear long fabrics that cover the legs. Most of the currently available datasets contain male subjects who predominantly wear short trousers. The closest comparison is the Soton dataset [1], where

there are two female subjects (out of 115 subjects) wearing a long blouse and an Indian traditional garment (shalwar kameez). Apart from the Soton dataset, the other comparable dataset is the OU-ISIR Treadmill Dataset B [2], which includes a short skirt as a covariate factor. However, the short skirt only partially covers the legs, while the knees remain clearly visible. The main reason why subjects were previously not recorded wearing long fabrics is to prevent the legs from being occluded. This limitation is imposed because most model-based gait recognition systems [7-9] require the legs to be labeled manually during the feature extraction process. Occluded legs cause the hip and knee trajectories to be generated incorrectly and hence result in a low Correct Classification Rate (CCR). However, our system does not depend on the detection of each of the lower limbs, and can therefore handle occluded silhouettes. In reality, males do wear long fabrics, such as the 'sarong'

and 'kain samping' worn by the ethnic Malays in South East Asia, the long 'dhoti' worn by ethnic Indians in Southern Asia, and the kilt worn by the Scots in Europe. These reasons motivated the authors to build a gait dataset with subjects wearing long fabrics in the form of 'sarong' and

'kain samping'. A further motivation is to show that our gait recognition system still performs well even when the legs are not detectable.

The rest of this paper is organized as follows. Section

2 reviews the existing gait databases. In Section 3, the MMUGait database setup and development is presented. In Section 4, the proposed gait recognition system is briefly explained. Section 5 discusses the experimental setups and the corresponding results. Lastly, Section 6 concludes the paper.

2 Review of Existing Gait Databases

The existing major gait databases are summarized in Table 1.

Table 1 Existing major gait databases

Database | Covariate factors | Subjects | Walking sequences
UCSD DB [3] | 1 view with personal clothing | 6 | 42
Soton Large DB [1] | 2 views with personal clothing | 115 | 2,163
Soton Small DB [1] | 2 views with 15 covariate factors (apparel, bags & speed) | 11 | 3,178
HumanID Gait Challenge DB [4] | 2 views with 5 covariate factors (apparel, bags, surface & time) | 122 | 1,870
CASIA Dataset A [5] | 3 views with personal clothing | 20 | 240
CASIA Dataset B [5] | 11 views with 3 covariate factors (apparel & bag) | 124 | 13,640
OU-ISIR Treadmill Dataset A [2] | 1 view with 9 covariate factors (speed) | 34 | 612
OU-ISIR Treadmill Dataset B [2] | 1 view with 32 covariate factors (apparel & bag) | 68 | 2,746

The University of California San Diego (UCSD) gait dataset [3], captured in 1998, was the first gait dataset. However, it is limited to six subjects with 42 outdoor walking sequences. In 2002, the University of Southampton released the Soton dataset [1], which comprises the Large DB with 115 subjects and the Small DB with 11 subjects involving 15 covariate factors. In 2001, CASIA Dataset A [5], captured in three view angles (side view, 45° and 90°), became available for public use. In 2005, the HumanID Gait Challenge DB [4], which comprises 122 subjects with 5 major covariate factors, was made publicly available. In the same year, CASIA Dataset B [5], which covers 11 different view angles and 3 covariate factors, was released. In 2012, Osaka University released databases containing subjects walking on treadmills: OU-ISIR Treadmill Datasets A and B [2]. Treadmill Dataset A involves 34 subjects with nine different walking speeds. Treadmill Dataset B involves 68 subjects with 32 combinations of clothing variation. Although the OU-ISIR Treadmill Datasets A and B contain clothing variations similar to those in the MMUGait Database (MMUGait DB), Lee and Hidler [6] demonstrated that walking on a treadmill differs biomechanically from walking on solid ground.

MMUGait DB contains 82 subjects recorded under normal conditions, for reliable performance evaluation on a large population, and 19 subjects recorded with 11 covariate factors, to support evaluation of the effects of covariate factors, in particular long fabrics, on gait recognition. It contains side-view and oblique-view videos, the extracted silhouette frames, still face images of the subjects, and ancillary data such as subject-specific information, camera setups and floor measurements.

3 The MMUGait Database

The MMUGait database (MMUGait DB) is part of the MMU GASPFA database [10]. It was captured and recorded in the Set and Background Studio located in the Faculty of Creative Multimedia, Multimedia University. The acquisition was done over a period of four days in December 2011 and involved 82 subjects. Consent was obtained from the subjects, who signed a consent form prior to volunteering. The recording of MMUGait DB was done in an indoor environment with a green backdrop and a white solid surface. Details of the development of the database are discussed in the following subsections.

3.1 Acquisition of the MMUGait DB

Two SONY HDR-XR160E full HD video camera recorders (camcorders) were used during the recording. The recorded videos are in MPEG Transport Stream (MTS) format with a resolution of 1920 × 1080 (width × height) pixels. The video stream was captured using progressive scan at a frame rate of 50 frames per second (fps). The camcorders captured the walking sequence from two different view angles: side view (fronto-parallel) and 60° oblique view.

The subjects walked back and forth on a track continuously and were captured in both directions. Both ends of the walking track served as turning points, where subjects turned around and repeated the walking sequence in the other direction. Figures 1a and 1b show the acquisition environment and the recording layout, respectively.

82 subjects took part in the recording of the MMUGait DB. They were of different genders, nationalities and age groups. For gender, 15 subjects were female, while the remaining 67 were male. Figures 2a and 2b show the distributions of nationalities and age groups, respectively.

Two categories of dataset were collected in this phase. The first set (MMUGait Large DB) contains 82 subjects who each walked for at least 20 sequences in both directions with personal clothing (own shoes and own clothes). The second set (MMUGait Covariate DB) contains 19 subjects who walked in both directions wearing different types of clothes and shoes, carrying bags, and varying their walking speed. This set covers 11 covariate factors: sarong, kain samping, personal clothing, carrying a handbag, slinging a barrel bag over the shoulder, carrying a barrel bag, carrying a rucksack, walking slowly, walking quickly, walking in flip flops and walking barefoot. In total, there were approximately 110 sequences per subject.

Figure 1 (a) The acquisition environment. (b) The recording layout.

Figure 2 (a) Distribution of nationalities. (b) Distribution of age groups.

To the best of our knowledge, this is the first paper that introduces a gait database featuring the traditional costumes of ethnic Malays, which are common attire in South East Asia, especially during Friday prayers and religious festivals. In this case, we recorded subjects wearing kain samping and sarong as apparel-related covariate factors. We believe that the database can act as a benchmark for performance evaluation of other gait recognition systems.

3.2 Data Processing

The original videos were filmed in MPEG Transport Stream (MTS) format at 50 frames per second (fps). For further processing, all the recorded videos were converted to Audio Video Interleave (AVI) format with a resolution of 1920 × 1080 pixels. After that, the videos were extracted into individual frames in Joint Photographic Experts Group (JPEG) format.
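As a sketch of this extraction step, the snippet below decodes a converted AVI file with OpenCV and writes each frame to disk as a JPEG; the file names and directory layout are hypothetical, not the database's actual structure.

```python
import os
import cv2  # OpenCV (pip install opencv-python)


def extract_frames(video_path, out_dir):
    """Decode a converted AVI video and save each frame as a JPEG image."""
    os.makedirs(out_dir, exist_ok=True)
    cap = cv2.VideoCapture(video_path)
    idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:  # end of stream
            break
        cv2.imwrite(os.path.join(out_dir, f"frame_{idx:06d}.jpg"), frame)
        idx += 1
    cap.release()
    return idx

# Hypothetical usage:
# n_frames = extract_frames("subject01_side.avi", "frames/subject01_side")
```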

3.3 Silhouette Generation

The silhouette generation process is described as follows:

Step 1: Extract the human silhouette from each frame as the region of interest (ROI) by employing a background subtraction technique. We improved the conventional background subtraction technique by summing the gray-level results of (a) the background image subtracted by the foreground image, and (b) the foreground image subtracted by the background image. This new technique amplifies the difference between the foreground and background images, making the foreground object more distinct from the background.

Step 2: Increase the contrast of the foreground object image by adjusting the pixel intensities. An optimal threshold is obtained by applying Otsu's method [11]. Once the threshold is found, it is used to rescale the gray-level values such that values between the lowest input value and the threshold are scaled to values between 0 and 1.

Step 3: Convert the rescaled foreground object image to a binary image using another threshold obtained from Otsu's method. If a pixel value is below the threshold, it is set to zero; otherwise it is set to one in the binary image.

Step 4: Apply morphological operations with a 7x7 diamond-shaped structuring element to enhance the generated foreground object. Morphological opening is applied to separate the shadow into isolated regions, while morphological closing is used to close small gaps in the foreground object.

Step 5: Eliminate pseudo objects in the image by applying connected component labeling to label all regions; regions with an area smaller than 1500 pixels are then removed. Figure 3 shows the resulting images at different stages of silhouette generation.

Figure 3 (a) Original foreground image. (b) Color image of foreground subtracted by background. (c) Color image of background subtracted by foreground. (d) Gray-level image of (b). (e) Gray-level image of (c). (f) Combination of (d) and (e). (g) Foreground object after intensity adjustment. (h) Foreground object in binary values. (i) Foreground object after morphological opening. (j) Final extracted silhouette.
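For concreteness, here is a minimal OpenCV sketch of the five steps above, operating on grayscale background and foreground frames. The function names and the saturating-arithmetic details are our interpretation of the description, not the authors' code.

```python
import cv2
import numpy as np


def diamond(radius):
    """Diamond-shaped structuring element (7x7 when radius=3)."""
    size = 2 * radius + 1
    y, x = np.ogrid[:size, :size]
    return (np.abs(y - radius) + np.abs(x - radius) <= radius).astype(np.uint8)


def extract_silhouette(bg_gray, fg_gray, min_area=1500):
    """Sketch of the five-step silhouette generation described above."""
    # Step 1: sum both one-sided subtractions to amplify the difference.
    d = cv2.add(cv2.subtract(bg_gray, fg_gray), cv2.subtract(fg_gray, bg_gray))
    # Step 2: Otsu threshold used to stretch intensities toward [0, 1].
    t, _ = cv2.threshold(d, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    lo = float(d.min())
    stretched = np.clip((d.astype(np.float32) - lo) / max(t - lo, 1.0), 0, 1)
    stretched = (stretched * 255).astype(np.uint8)
    # Step 3: a second Otsu threshold converts the stretched image to binary.
    _, binary = cv2.threshold(stretched, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # Step 4: opening detaches shadow fragments, closing fills small gaps.
    k = diamond(3)
    binary = cv2.morphologyEx(binary, cv2.MORPH_OPEN, k)
    binary = cv2.morphologyEx(binary, cv2.MORPH_CLOSE, k)
    # Step 5: connected-component labeling; drop regions below min_area pixels.
    n, labels, stats, _ = cv2.connectedComponentsWithStats(binary)
    out = np.zeros_like(binary)
    for i in range(1, n):  # label 0 is the background
        if stats[i, cv2.CC_STAT_AREA] >= min_area:
            out[labels == i] = 255
    return out
```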

4 Gait Recognition System

We present a hybrid gait recognition system which combines a model-free approach, extracting the subject's height, width, step-size and crotch height as static features, with a model-based approach, extracting joint angular trajectories as dynamic features. The processes are discussed in the following subsections.

4.1 View-point Normalization

To normalize the oblique walking sequence into the side-view plane, a perspective correction technique is employed. First, all silhouettes in a walking sequence are superimposed into a single image, as shown in Figure 4a. Next, lines X and Y are drawn horizontally based on the highest and lowest points among the silhouettes. As the normal gait cycle is periodic, a sinusoidal line is formed when the highest points of all silhouettes in a walking sequence are connected. Line Z is then drawn by connecting the first peak and the last peak of the sinusoidal line.

The perspective correction technique consists of two stages: vertical and horizontal adjustments. For vertical adjustment, each silhouette is vertically stretched from line Z towards line X. In addition, each silhouette is also vertically stretched from the bottom towards line Y. Details of the normalization technique can be found in [12]. Figure 4b shows the superimposed silhouettes after perspective correction.

Figure 4 (a) Superimposed silhouettes from one walking sequence, with lines X, Y and Z marked. (b) Superimposed silhouettes after perspective correction.
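The full normalization is defined in [12]; purely to illustrate the vertical adjustment, the simplified sketch below stretches a single binary silhouette so that its extremes land on target rows. In the actual method the per-frame top target follows line Z rather than a fixed row, so this is an approximation under that stated assumption.

```python
import cv2
import numpy as np


def stretch_silhouette_vertically(sil, y_top, y_bottom):
    """Rescale one binary silhouette vertically so that its highest
    foreground row maps to y_top and its lowest to y_bottom."""
    rows = np.where(sil.any(axis=1))[0]
    if rows.size == 0:
        return sil.copy()  # empty frame: nothing to adjust
    body = sil[rows[0]:rows[-1] + 1]
    new_h = max(y_bottom - y_top + 1, 1)
    resized = cv2.resize(body, (sil.shape[1], new_h),
                         interpolation=cv2.INTER_NEAREST)
    out = np.zeros_like(sil)
    out[y_top:y_top + new_h] = resized
    return out
```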

4.2 Gait Feature Extraction

Referring to a priori knowledge of body proportions [13], the vertical positions of the hip, knee and ankle are estimated as 0.48H, 0.285H and 0.039H with reference to the body height H. The lower-body joints that define the pivot points in human gait are then identified and the joint trajectories are computed. Details of the gait feature extraction technique can be found in [12]. The joint angular trajectory (θ) is determined using the following equations:

θ1 = tan⁻¹[(p2y − p1y) / (p2x − p1x)]    (1)

θ2 = tan⁻¹[(p3y − p1y) / (p3x − p1x)]    (2)

θ = θ1 − θ2    (3)

where p1x, p2x and p3x are the x-coordinates of joints p1, p2 and p3, respectively, and p1y, p2y and p3y are the y-coordinates of joints p1, p2 and p3, respectively. Figure 5a illustrates how the joint angular trajectory is determined from the joint positions.

In our gait system, five joint angular trajectories have been extracted, corresponding to the five main joints of the lower limbs: the hip angular trajectory (θ1), front knee angular trajectory (θ2), back knee angular trajectory (θ3), front ankle angular trajectory (θ4) and back ankle angular trajectory (θ5).

The height and width of the human silhouette are also measured. The Euclidean distance between the ankles is used to represent the subject's step-size (S), and the Euclidean distance between the ground and the subject's crotch is calculated as the crotch height (CH). If the crotch height is found to be lower than the knee height, it is set to zero, as the crotch is considered occluded.
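As an illustration, the sketch below computes a joint angle following Eqs. (1)-(3), the joint rows implied by the body-proportion priors [13], and the step-size S. We use np.arctan2 in place of a plain arctangent for numerical robustness; the helper names are ours, not the original system's.

```python
import numpy as np


def joint_angle(p1, p2, p3):
    """Angle theta between segments p1->p2 and p1->p3 (Eqs. 1-3).
    Each point is an (x, y) pair in image coordinates."""
    theta1 = np.arctan2(p2[1] - p1[1], p2[0] - p1[0])  # Eq. (1)
    theta2 = np.arctan2(p3[1] - p1[1], p3[0] - p1[0])  # Eq. (2)
    return theta1 - theta2                             # Eq. (3)


def estimated_joint_rows(ground_row, height_px):
    """Vertical joint positions from the body-proportion priors [13]:
    hip, knee and ankle lie at 0.48H, 0.285H and 0.039H above the ground."""
    return {name: int(round(ground_row - f * height_px))
            for name, f in [('hip', 0.48), ('knee', 0.285), ('ankle', 0.039)]}


def step_size(front_ankle, back_ankle):
    """Step-size S: Euclidean distance between the two ankle points."""
    return float(np.hypot(front_ankle[0] - back_ankle[0],
                          front_ankle[1] - back_ankle[1]))
```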

As the presence of outliers in the extracted features would hinder the classification process, a Gaussian filter with sigma (σ) equal to 2.5 is applied to remove them. To make the extracted features, which have various dimensions, independent and standardized, a linear scaling technique [14] is applied to normalize each feature component to the range between 0 and 1. Figure 5b shows a sample human silhouette with the nine extracted gait features.

Figure 5 (a) Joint angular trajectory computation. (b) Nine extracted gait features.
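The outlier removal and normalization just described can be sketched as follows, assuming each feature is a 1-D series of per-frame values; SciPy's gaussian_filter1d stands in for the paper's Gaussian filter.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d


def smooth_and_scale(series, sigma=2.5):
    """Suppress outliers with a Gaussian filter (sigma = 2.5), then
    min-max scale the sequence into [0, 1] as in linear scaling [14]."""
    smoothed = gaussian_filter1d(np.asarray(series, dtype=float), sigma)
    lo, hi = smoothed.min(), smoothed.max()
    if hi == lo:
        return np.zeros_like(smoothed)  # constant series: map to zeros
    return (smoothed - lo) / (hi - lo)
```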

4.3 Feature Vector and Feature Selection

To construct the feature vector, the maximum hip angular trajectory (θ1max) was determined over a walking sequence. When θ1max was identified, the corresponding S, W, H, θ2, θ3, θ4, θ5 and CH were also determined. To better describe the human gait, 24 features were used to construct the feature vector: the nine features sampled at θ1max (θ1max, θ2, θ3, θ4, θ5, S, W, H and CH), the nine averages AW, AH, ACH, Aθ1, Aθ2, Aθ3, Aθ4, Aθ5 and AS, and the six ratios RAH, RACH, RAS, RCH, RH and RS. Here AW, AH, ACH, Aθ1, Aθ2, Aθ3, Aθ4, Aθ5 and AS are the averages of the local maxima detected for the width, height, crotch height, hip angular trajectory, front knee angular trajectory, back knee angular trajectory, front ankle angular trajectory, back ankle angular trajectory and step-size, respectively; RAH, RACH, RAS, RCH, RH and RS are the ratios of AH, ACH, AS, CH, H and S to W, respectively.

The performance of a recognition system is determined by the effectiveness of the selected features, which should maximize inter-class variance. In our work, Ranker [15] is used to rank the features by their individual evaluations, which helps to identify the extracted features that contribute positively to the recognition process. Based on the scores obtained, all twenty-four features exhibited a positive contribution; thus, all of them are used in our system. Figure 6 shows examples of successful joint detection from self-occluded silhouettes and silhouettes with external occlusion.

Figure 6 Examples of joint detection on occluded human silhouettes. (a) Wearing kain samping. (b) Wearing sarong. (c) Barrel bag slung over shoulder. (d) Walking barefoot. (e) Carrying rucksack. (f) Carrying barrel bag by hand.
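The explicit 24-element layout did not survive extraction cleanly, so the sketch below assembles the feature vector as reconstructed above: nine values sampled at the frame of θ1max, nine local-maxima averages, and six ratios to the width. The dictionary layout, helper names and the choice of denominator are our assumptions, not the authors' code.

```python
import numpy as np
from scipy.signal import find_peaks


def local_maxima_mean(series):
    """Average of the local maxima detected in a per-frame feature series."""
    x = np.asarray(series, dtype=float)
    peaks, _ = find_peaks(x)
    return float(x[peaks].mean()) if peaks.size else float(x.max())


def build_feature_vector(f):
    """Assemble the 24-element vector from per-frame series.
    `f` maps 'W', 'H', 'CH', 'theta1'..'theta5', 'S' to 1-D arrays
    (a hypothetical layout; the original system's storage may differ)."""
    keys = ['W', 'H', 'CH', 'theta1', 'theta2', 'theta3', 'theta4', 'theta5', 'S']
    t = int(np.argmax(f['theta1']))                    # frame of theta1_max
    sampled = [float(f[k][t]) for k in keys]           # 9 sampled features
    avg = {k: local_maxima_mean(f[k]) for k in keys}   # 9 averaged features
    w = avg['W']  # which W serves as denominator is our assumption
    ratios = [avg['H'] / w, avg['CH'] / w, avg['S'] / w,   # R_AH, R_ACH, R_AS
              float(f['CH'][t]) / w,                       # R_CH
              float(f['H'][t]) / w,                        # R_H
              float(f['S'][t]) / w]                        # R_S
    return np.array(sampled + list(avg.values()) + ratios)
```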

4.4 Classification Technique

To evaluate the performance of our approach, a multi-class Support Vector Machine (SVM) with a Radial Basis Function (RBF) kernel was applied to the proposed gait recognition system, as the RBF kernel has been shown to perform better than other SVM kernels [16]. The SVM was implemented using the LIBSVM package [17]. The kernel parameter γ (gamma) and the regularization parameter C were tuned to find the best correct classification rate. Three quality measures were used in the experiments: correct classification rate (CCR), true positive rate (TPR) and false positive rate (FPR).

Ten-fold cross-validation was employed in this work, where the walking sequences from the gait databases were randomly divided into ten disjoint subsets; nine subsets were used for training and one for validation. The cross-validation process was repeated for ten iterations, with the feature vectors of each disjoint subset in turn channeled into the classifier as the validation set. The mean correct classification rate was then obtained by averaging the cross-validation results.
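The paper uses LIBSVM [17]; the sketch below reproduces the same protocol with scikit-learn's SVC (which wraps LIBSVM), tuning C and gamma by grid search under 10-fold cross-validation. The parameter grid is illustrative, not the authors' actual search range.

```python
import numpy as np
from sklearn.model_selection import GridSearchCV, StratifiedKFold
from sklearn.svm import SVC


def evaluate_gait_features(features, labels):
    """Tune C and gamma for an RBF-kernel SVM and report the mean CCR
    (accuracy) over ten stratified cross-validation folds."""
    cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
    grid = GridSearchCV(
        SVC(kernel='rbf'),
        param_grid={'C': [1, 10, 100, 1000],          # illustrative ranges,
                    'gamma': [1e-3, 1e-2, 1e-1, 1]},  # not the paper's values
        cv=cv,
        scoring='accuracy')
    grid.fit(np.asarray(features), np.asarray(labels))
    return grid.best_params_, grid.best_score_

# Example: params, mean_ccr = evaluate_gait_features(X, y)
# where X has shape (n_sequences, 24) and y holds subject identities.
```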

5 Experimental Results and Discussion

This section presents and discusses the results of experiments aimed at assessing the recognition rate of the proposed system with respect to view normalization, a large population and covariate factors. Two datasets were employed for performance evaluation: MMUGait Large DB and MMUGait Covariate DB. For each dataset, the analysis is performed on walking sequences captured from the side view (Side), the normalized oblique view (NorOb) and a combination of both views (Com).

5.1 Experimental Results of MMUGait Large DB

The performance was evaluated on 80 subjects from MMUGait Large DB. The numbers of walking sequences from the side view, the normalized oblique view and the combination of both views were 2,961, 2,843 and 5,804, respectively. The overall CCR results are summarized in Table 2.

Table 2 CCRs of MMUGait Large DB

View | CCR (%) | TPR (%) | FPR (%)
Side | 95.4 | 95.4 | 0.1
NorOb | 91.6 | 91.6 | 0.1
Com | 93.2 | 93.2 | 0.1

The gait recognition system managed to obtain high CCRs and TPRs (above 91%) in the large-population recognition, while achieving low FPRs (0.1%).

5.2 Experimental Results of MMUGait Covariate DB

The performance was evaluated on 19 subjects from MMUGait Covariate DB. The numbers of walking sequences from the side view, the normalized oblique view and the combination of both views were 3,780, 3,713 and 7,493, respectively. The overall CCR results are summarized in Table 3.

Table 3 CCRs of MMUGait Covariate DB

View | CCR (%) | TPR (%) | FPR (%)
Side | 96.0 | 96.0 | 0.2
NorOb | 93.6 | 93.6 | 0.4
Com | 94.2 | 94.2 | 0.3

For the group covariate factor analysis, the database was categorized into five groups: Group 1 (G1), different speeds; Group 2 (G2), variety of footwear; Group 3 (G3), carrying various objects; Group 4 (G4), various types of clothes; and Group 5 (G5), personal clothing without carrying any object. The overall CCR results for the group and individual covariate factors are summarized in Tables 4 and 5, respectively.

Table 4 CCRs of group covariate factors for the MMUGait Covariate DB

Group | Side CCR/TPR/FPR (%) | NorOb CCR/TPR/FPR (%) | Com CCR/TPR/FPR (%)
G1 (speed) | 95.9 / 95.9 / 0.2 | 94.2 / 94.2 / 0.3 | 95.1 / 95.1 / 0.3
G2 (shoes) | 95.8 / 95.8 / 0.2 | 94.4 / 94.4 / 0.3 | 95.0 / 95.0 / 0.3
G3 (carrying) | 97.7 / 97.7 / 0.1 | 95.0 / 95.0 / 0.3 | 95.7 / 95.7 / 0.2
G4 (apparel) | 95.5 / 95.5 / 0.3 | 94.2 / 94.2 / 0.3 | 94.3 / 94.3 / 0.3
G5 (personal clothing) | 96.6 / 96.6 / 0.2 | 94.2 / 94.2 / 0.3 | 95.2 / 95.2 / 0.3

Table 5 CCRs of individual covariate factors from the MMUGait Covariate DB

Covariate factor | Side CCR/TPR/FPR (%) | NorOb CCR/TPR/FPR (%) | Com CCR/TPR/FPR (%)
Wearing sarong | 95.1 / 95.1 / 0.3 | 93.7 / 93.7 / 0.3 | 93.7 / 93.7 / 0.4
Wearing kain samping | 97.1 / 97.1 / 0.2 | 96.4 / 96.4 / 0.2 | 96.3 / 96.3 / 0.2
Carrying handbag | 98.2 / 98.2 / 0.1 | 95.5 / 95.5 / 0.3 | 96.1 / 96.1 / 0.2
Slung barrel bag | 97.7 / 97.7 / 0.1 | 94.1 / 94.1 / 0.3 | 96.4 / 96.4 / 0.2
Carrying barrel bag | 96.8 / 96.8 / 0.2 | 93.6 / 93.6 / 0.4 | 95.0 / 95.0 / 0.3
Carrying rucksack | 98.1 / 98.1 / 0.1 | 96.0 / 96.0 / 0.2 | 96.4 / 96.4 / 0.2
Walking slowly | 97.3 / 97.3 / 0.2 | 95.3 / 95.3 / 0.3 | 95.6 / 95.6 / 0.3
Walking quickly | 95.3 / 95.3 / 0.3 | 93.7 / 93.7 / 0.3 | 94.4 / 94.4 / 0.3
Walking in flip flops | 96.6 / 96.6 / 0.2 | 93.9 / 93.9 / 0.3 | 95.1 / 95.1 / 0.3
Walking barefoot | 95.2 / 95.2 / 0.3 | 94.9 / 94.9 / 0.3 | 95.2 / 95.2 / 0.3

In general, our gait recognition system managed to obtain high CCRs and TPRs (above 94%) in the experiments, with low FPRs in the range of 0.1% to 0.4%. From Table 4, it can be observed that our system is robust to covariate factors, as it produced high CCRs throughout. Group G1 generated a high CCR because the duration of the walking cycle was not included as a feature. Similarly, the high CCRs from groups G2, G3 and G4 show that changes of shoe type and occlusion by bags or apparel did not affect the extracted features.

From Table 5, it can be observed that our system managed to provide high CCRs even when the subjects were wearing long fabrics. Nevertheless, wearing the sarong resulted in the lowest CCR, as it was not possible to identify the crotch height due to occlusion by the sarong.

6 Conclusion

We have developed a new gait database that includes sarong and kain samping as changes of apparel. Our automated multi-view gait recognition system is able to extract gait features from silhouettes effectively. It can detect the body joints even in self-occluded silhouettes or silhouettes occluded by apparel or bags. In addition, the high CCRs and TPRs and low FPRs show that it is robust and achieves good performance on gait databases with various covariate factors, large populations of subjects and multiple view angles.

The authors are planning to allow public access to the MMUGait DB in the near future. We believe that the walking sequences with special apparel will be invaluable for the performance evaluation of other gait recognition systems.

Acknowledgments

The authors would like to thank all the anonymous participants and volunteer helpers in this research work, as well as the management of Multimedia University, for making this data collection possible.

References

1. Shutler, J., Grant, M., Nixon, M.S., Carter, J.N.: On a Large Sequence-based Human Gait Database. Proc. of 4th International Conference on Recent Advances in Soft Computing, pp. 66-71, Nottingham, UK (2002)

2. Makihara, Y., Mannami, H., Tsuji, A., Hossain, M.A., Sugiura, K., Mori, A., Yagi, Y.: The OU-ISIR Gait Database Comprising the Treadmill Dataset. IPSJ Trans. on Computer Vision and Applications, vol. 4, pp. 53-62 (2012)

3. Little, J., Boyd, J.: Recognizing People by Their Gait: The Shape of Motion. Videre: Journal of Computer Vision Research, vol. 1, no. 2, pp. 1-32 (1998)

4. Sarkar, S., Phillips, P.J., Liu, Z., Vega, I.R., Grother, P., Bowyer, K.W.: The HumanID Gait Challenge Problem: Data Sets, Performance, and Analysis. IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 27, no. 2, pp. 162-177 (2005)

5. Yu, S., Tan, D., Tan, T.: A Framework for Evaluating the Effect of View Angle, Clothing and Carrying Condition on Gait Recognition. Proc. of International Conference on Pattern Recognition, vol. 4, pp. 441-444 (2006)

6. Lee, S.J., Hidler, J.: Biomechanics of Overground vs. Treadmill Walking in Healthy Individuals. Journal of Applied Physiology, vol. 104, no. 3, pp. 747-755 (2008)

7. Yam, C., Nixon, M.S., Carter, J.N.: Automated Person Recognition by Walking and Running via Model-Based Approaches. Pattern Recognition, vol. 37, no. 5, pp. 1057-1072 (2004)

8. Wagg, D.K., Nixon, M.S.: On Automated Model-Based Extraction and Analysis of Gait. Proc. of 6th IEEE International Conference on Automatic Face and Gesture Recognition, pp. 11-16 (2004)

9. Bouchrika, I., Nixon, M.S.: Model-based Feature Extraction for Gait Analysis and Recognition. Proc. of Mirage: Computer Vision / Computer Graphics Collaboration Techniques and Applications, pp. 150-160 (2007)

10. Ho, C.C., Ng, H., Tan, W.H., Ng, K.W., Tong, H.L., Yap, T.T.V., Chong, P.F., Eswaran, C., Abdullah, J.: MMU GASPFA: A COTS Multimodal Biometric Database. Pattern Recognition Letters (in press)

11. Otsu, N.: A Threshold Selection Method from Gray-level Histograms. Automatica, vol. 11, no. 285-296, pp. 23-27 (1975)

12. Ng, H., Tan, W.H., Abdullah, J.: Multi-view Gait Based Human Identification System with Covariate Analysis. International Arab Journal of Information Technology (accepted for publication)

13. Dempster, W.T., Gaughran, G.R.L.: Properties of Body Segments Based on Size and Weight. American Journal of Anatomy, vol. 120, pp. 33-54 (1967)

14. Aksoy, S., Haralick, R.: Feature Normalization and Likelihood-based Similarity Measures for Image Retrieval. Pattern Recognition Letters, vol. 22, no. 5, pp. 563-582 (2001)

15. Hall, M., Frank, E., Holmes, G., Pfahringer, B., Reutemann, P., Witten, I.: The WEKA Data Mining Software: An Update. ACM SIGKDD Explorations Newsletter, vol. 11, no. 1, pp. 10-18 (2009)

16. Byun, H., Lee, S.W.: A Survey on Pattern Recognition Applications of Support Vector Machines. International Journal of Pattern Recognition and Artificial Intelligence, vol. 17, no. 3, pp. 459-486 (2003)

17. Chang, C.C., Lin, C.J.: LIBSVM: A Library for Support Vector Machines (2001). Software available at http://www.csie.ntu.edu.tw/~cjlin/libsvm