Black Pearl: An Alternative for Mouse and Keyboard

Rajesh Kumar, Anupam Kumar

Department of Electrical Engineering, MNIT, Jaipur, India-302017

[email protected], http://www.mnit.ac.in

Abstract

This paper presents the "Black Pearl" system, a programmable human-computer interface through which one can control the computer's mouse cursor and keyboard using movements of the head, the mouth, or any other body part the user chooses. It provides hands-free control of the computer in real time. The system tracks the user's movements and translates them into movements of the mouse pointer. It consists of a webcam and a GUI developed in Visual Basic. The cursor position is navigated by calculating the correlation coefficient of the tracking window in image space. The size of the sub-image window, the speed of the mouse, and the "dwell time" (the time to click on a particular object or icon) can be varied according to the user's convenience. With its on-screen keyboard, "Black Pearl" also provides a way to enter and manipulate data, and hence acts as an alternative to the keyboard. Experimental results show that "Black Pearl", with its simple GUI, can control the mouse cursor efficiently and can therefore serve as a medium for any physically challenged person who wants to interact with the computer but cannot operate a conventional mouse and keyboard.

Keywords: Computer vision, Hands-free control, Human-computer interface, Perceptual User Interfaces (PUI), Virtual mouse and keyboard.

1. Introduction

During the last two decades, extensive research has been done in the field of human-computer interaction to enable one to interact with the computer in a natural way, instead of through hand-controlled input devices such as the mouse and keyboard. Advances in computer vision and human-computer interaction have brought great changes to the living conditions of physically challenged children suffering from Sensory Integration Disorder (SID) and of the elderly. The perceptual user interface (PUI) is one such technique: it tracks human body movements from video input, from which the user's intention can be conveyed to the computer for further response. Such techniques are especially useful for physically challenged people.

They provide a method to access the internet, express emotions, and improve one's life considerably through the computer. PUI gives computers the ability to segment, track, and understand the pose, gestures, and emotional expressions of a human.

Most of this research has focused on controlling the computer's mouse cursor. Early solutions were proposed in hardware, where one enters data into the computer by pressing switches, or through infrared emitters and reflectors attached to the user's glasses, head band, or cap, with a transmitter over the monitor [1, 2]. These solutions are not widely adoptable, as some people feel uncomfortable wearing helmets, glasses, mouth sticks, or other interfacing devices. The eye-gaze method [3] is a popular way to trace the movement of the eye. It uses an infrared light source, and the change in the position of the eyes is fed into the computer for further processing. The limitations of this system are the bright-eye effect and the requirement that the user keep his or her head in a nearly stationary position. The "Eagle Eye" system uses the electrooculographic potential (EOG) [4, 5] to detect eye movements and thereby control the mouse cursor. The change in the EOG of the eyes is measured using electrodes attached to the user's face, but children feel uncomfortable with electrodes on their faces, and in some cases the user perspires. The "Camera Mouse" [6] is one solution that requires no body attachments, is easy to use, and needs no calibration; its limitation lies in its visual tracking algorithm [10]. Many systems have been proposed as Perceptual User Interface (PUI) solutions. "Face as Mouse through Visual Face Tracking" [7, 8, 9] introduced a novel camera-mouse system driven by a 3D-model-based visual face-tracking technique. One such system is able to track multiple faces reliably in real time under varying pose, tilt, and rotation [8]. Recently a new webcam-based mouse system named "hMouse" [9] has appeared; its algorithm combines reliable tracking of the user's head roll, tilt, yaw, scaling, and horizontal and vertical motion for mouse control. Nouse is another solution that tracks the face using the convex shape of the nose as a feature and performs face tracking with two off-the-shelf cameras.


This allows one to track faces robustly and precisely in both 2D and 3D with the help of low-resolution cameras [10].

In this paper the authors propose a new system named "Black Pearl" as a successor to the "Camera Mouse", with an improved image-trace module, a user-friendly Graphical User Interface (GUI), and an on-screen keyboard. Existing systems can control the mouse cursor position only by tracing the head position, whereas the proposed system can control the cursor using any part of the body, and its on-screen keyboard provides fully hands-free control of the computer. The paper is organized as follows. Section 2 summarizes the basis of all webcam-based mice. Section 3 describes the proposed system, giving an overview of the algorithm and GUI. Section 4 presents the experimental results with detailed discussion. Section 5 summarizes the paper and gives some directions for future work.

2. Web-Cam Based Mouse

Webcam-based mouse control systems are more robust than other Perceptual User Interface (PUI) systems, as they require no body attachments, are easy to control, and can easily trace a body part. The framework of the "camera mouse" system is shown in figure 1 [7].

Figure 1: Framework of the camera mouse.

As shown in the framework, a webcam-based mouse system captures the motion of the user with a webcam. After capturing the image, the system traces the required feature. In most experiments this feature is the head, as the head is statistically consistent in color, shape, and texture, which allows the computer to detect and track it robustly and accurately [7, 9]. Different methods and algorithms are used for feature tracking, such as the correlation coefficient [6, 7], the SNoW technique [11], and Continuously Adaptive Mean Shift (CAMSHIFT) [12]. After the change in position of the required feature is found, the motion parameters are sent to the mouse control module, which handles all mouse events. The initial "Camera Mouse" [6] consisted of two linked computers (a "vision computer" and a "user computer"). The vision computer executes the visual tracking algorithm and sends the position of the tracked feature to the user computer. The limitation of the "Camera Mouse" [6] lies only in its visual tracing module, which fails under extreme movement, user jumps, multi-user occlusion, hand/object occlusion, and turning around. "Face as Mouse through Visual Face Tracking" [7, 8] introduced a 3D camera-mouse system that decomposes head movement into rigid movements, such as rotation and translation, and non-rigid movements, such as the opening and closing of the mouth and eyes and facial expressions.

This system is not fully optimized in its algorithm and speed. "Reliable and Fast Tracking of Faces under Varying Pose" [8] presents a reliable real-time system able to track multiple faces with large tilts and rotations in fast motion with high accuracy; the authors also describe an online-learning-based face-tracking algorithm. The "hMouse" [9] system consists of a robust real-time head tracker, a head pose/motion estimator, and a virtual mouse control module; its algorithm is based upon Continuously Adaptive Mean Shift (CAMSHIFT) [12, 13]. The limitations of the "hMouse" system are its cursor control mode and the limited precision of its head pose estimation module.

3. Black Pearl

The authors propose a system named "Black Pearl", a robust hands-free perceptual user interface that can act as an alternative to both the mouse and the keyboard. Like all "Camera Mouse" systems, it consists of an image trace module and a mouse control module. Its algorithm is based on image matching using the correlation coefficient. The experimental findings show that it works well under different conditions such as user jumps, multi-user occlusion, and large-degree rotation, and that it can be used to trace any body feature. An additional feature of the system is an on-screen keyboard, which has not been discussed in any previous "Camera Mouse" system to date. Using this on-screen keyboard one can easily enter data into the computer without a conventional keyboard.

a. System Overview

The "Black Pearl" system comprises a webcam for capturing images and a GUI (Graphical User Interface) presenting the user interface and the on-screen keyboard. An overview of the system is shown in figure 2.

Figure 2: System overview.

The system gets its frame input from the webcam and feeds it to the Image Tracer Module, which crops the required feature as a sub-image from the main image. Using the correlation method it finds the new position of the required feature and passes it to the Mouse Control Module, which moves the cursor according to the change in the feature's position and, depending on the dwell time, generates click events. The user can also use the on-screen keyboard for typing or entering data into the computer. A minimal code sketch of this data flow follows below.
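As an illustration of this pipeline, here is a minimal, self-contained sketch in Python with OpenCV. It is not the authors' implementation (the original system is a Visual Basic 6.0 application); the library, the template size, and the seed coordinates are assumptions made for illustration only.

```python
import cv2

def track(seed_x, seed_y, size=20):
    """Trace a user-selected feature across webcam frames."""
    cam = cv2.VideoCapture(0)                      # frame input from webcam
    ok, frame = cam.read()
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Crop the selected feature as the reference sub-image A(x, y).
    template = gray[seed_y - size:seed_y + size, seed_x - size:seed_x + size]
    while True:
        ok, frame = cam.read()                     # call for next frame input
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Image Tracer Module: correlation search for the template in B(x, y).
        result = cv2.matchTemplate(gray, template, cv2.TM_CCOEFF_NORMED)
        _, _, _, (x, y) = cv2.minMaxLoc(result)    # brightest pixel = best match
        # The Mouse Control Module would map (x, y) to a cursor move here.
        print("feature at", x + size, y + size)
    cam.release()
```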


b. Algorithm

Pattern recognition is an important part of any PUI system. The recognition procedure can be divided into two steps: image segmentation and recognition of image patterns. Various algorithms and principles for the recognition procedure are discussed in references [14, 15].

Image segmentation: Image segmentation involves partitioning an image into a set of homogeneous and meaningful regions, such that the pixels in each region share an identical set of properties, which may be grey levels, contrast, spectral values, or textural properties. An image is thus described by a set of regions that are connected and non-overlapping, so that each pixel acquires a unique label indicating the region it belongs to. An image $R$ is segmented into a finite set of $N$ regions $\langle R_1, R_2, R_3, \ldots, R_N \rangle$ satisfying the following conditions:

(i) $R = R_1 \cup R_2 \cup R_3 \cup \cdots \cup R_N$

(ii) $R_i$ is a connected region, $i = 1, 2, \ldots, N$

(iii) $R_i \cap R_j = \Phi \;\; \forall\, i \neq j$

(iv) $P(R_i) = \text{true} \;\; \forall\, i$

(v) $P(R_i \cup R_j) = \text{false}$ for adjacent regions $R_i$, $R_j$, $i \neq j$

where $P$ denotes the chosen property predicate over a region and $\Phi$ is the null set. Condition (i) indicates that the segmentation must be complete, i.e. every pixel must belong to a region. Condition (ii) requires that the points in a region be connected in some predefined sense. Condition (iii) indicates that the regions must be disjoint. Condition (iv) states that the property must be satisfied by all pixels in a segmented region. The last condition indicates that adjacent regions $R_i$ and $R_j$ are different in the sense of the predicate $P$. Segmentation algorithms are based on one of two basic properties of grey-scale values: discontinuity and similarity among the pixels [16].

In the "Black Pearl" system we have used the clustering technique, which groups pixels or sub-regions into clusters. Cluster-oriented segmentation uses multidimensional data to partition the image pixels into clusters using different algorithms (watershed transform, distance transform, or gradients). We segment the image into clusters of fixed size, so that the resultant image consists of clusters holding the correlation-coefficient data.

Recognition of image patterns: Once an image is segmented, the next step is to recognize the segmented objects in the scene. Each object is a pattern, and the measured values are the features of the pattern; a pattern may thus be described by a set of features. A number of pattern classification techniques have been used for the recognition of patterns. Here we use an unsupervised technique to recognize the best position of the required image. Unsupervised clustering methods include hierarchical methods, K-means methods, and graph-theoretic methods. In a hierarchical algorithm the data set is partitioned into a number of clusters in a hierarchical fashion; in the K-means approach the input is divided into K partitions, and distances may be computed only from the centroids. After obtaining the image of clusters holding the correlation data, we find the cluster having the maximum correlation coefficient with the desired image. This cluster is the feature the system must trace; in the next frame the found cluster becomes the desired cluster and the search is repeated for it. A short sketch of this selection step follows below.
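The following is a hedged NumPy sketch of that selection step, assuming the correlation results have already been collected into a 2-D map; the block size and the synthetic test map are illustrative assumptions, not values fixed by the paper.

```python
import numpy as np

def best_cluster(corr_map, block=10):
    """Partition a correlation map into fixed-size clusters and return
    the top-left corner and score of the cluster with the maximum
    correlation coefficient."""
    h, w = corr_map.shape
    best_score, best_pos = -np.inf, (0, 0)
    for i in range(0, h - block + 1, block):
        for j in range(0, w - block + 1, block):
            score = corr_map[i:i + block, j:j + block].max()
            if score > best_score:
                best_score, best_pos = score, (i, j)
    return best_pos, best_score

# Example: a synthetic 120x160 correlation map with its peak at (35, 52).
corr = np.zeros((120, 160))
corr[35, 52] = 0.97
print(best_cluster(corr))   # -> ((30, 50), 0.97)
```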

Figure 3: Algorithm used in the Black Pearl system.

The detailed algorithm of "Black Pearl" is shown in figure 3. When the user starts the system, the webcam begins capturing frames, which are shown on the screen. The user may then provide the system with a "dwell time" and the size of the image to be cropped, according to his convenience. The user then selects the feature to be tracked by clicking on it in the image. If the user clicks directly on the feature without entering any data, the system takes a default dwell time of 1000 ms and a default crop size of 20 pixels. A square window with black boundaries, double the size the user entered, is drawn around the feature to be tracked. The sub-image A(x, y) within this square is cropped out of the image frame and stored for future comparison. After saving the sub-image, the system calls for the next frame B(x, y). The correlation coefficient $r_{x,y}$ between the reference sub-image and the frame is computed as follows:

$$
r_{x,y} = \frac{\sum_{x}\sum_{y}\left(A_{xy}-\bar{A}\right)\left(B_{xy}-\bar{B}\right)}
{\sqrt{\left(\sum_{x}\sum_{y}\left(A_{xy}-\bar{A}\right)^{2}\right)\left(\sum_{x}\sum_{y}\left(B_{xy}-\bar{B}\right)^{2}\right)}}
$$

where

$$
\bar{A} = \frac{1}{N}\sum_{x}\sum_{y} A_{xy}, \qquad
\bar{B} = \frac{1}{N}\sum_{x}\sum_{y} B_{xy}
$$

and $N$ is the number of image pixels.

The stored sub-image A(x, y) containing the desired feature is then searched for in the new frame B(x, y) using the correlation-coefficient algorithm. For this we use the correlate_images function, which takes the two images (the sub-image A(x, y) and the test image B(x, y)) as input.


It gives r(x, y) as output, where the resultant image is filled with pixel data representing the correlation coefficient between the reference image and the test image at each pixel location. The pixel values range from 0 (negative correlation) through 128 (no correlation) up to 255 (perfect positive correlation); a negative correlation would mean a good match with the negative image of the reference mark. We then use the sortpixel_val function to sort the pixel values of the resultant image and obtain the brightest pixel, which is the next position of the sub-image.

For movement of the mouse cursor, the relative change in the cursor position is calculated and the corresponding new mouse position is fed into the SetCursorPos function. The speed of the mouse movement can be controlled by changing the values of the scrx and scry variables, which scale the x- and y-coordinate shifts of the mouse cursor respectively. If $(a, b)$ is the current position of the mouse cursor and $\Delta x$ and $\Delta y$ are the relative changes in the position of the tracked feature, then the new cursor position is obtained using

$$
a \leftarrow a + scrx \times \Delta x, \qquad b \leftarrow b + scry \times \Delta y
$$
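A brief sketch of this update in Python follows; SetCursorPos is the Win32 API call the paper names, but invoking it through ctypes, and the default gains of 4, are illustrative assumptions:

```python
import ctypes

def update_cursor(a, b, dx, dy, scrx=4, scry=4):
    """Scale the tracked feature's displacement (dx, dy) into a cursor
    move and apply it with the Win32 SetCursorPos call (Windows only)."""
    a = a + scrx * dx                                   # a <- a + scrx * dx
    b = b + scry * dy                                   # b <- b + scry * dy
    ctypes.windll.user32.SetCursorPos(int(a), int(b))   # move the cursor
    return a, b
```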

The position is applied with setval = SetCursorPos(a, b). Mouse click event handling is done in Visual Basic 6.0. The on-screen keyboard, when started, enumerates the Windows applications running in memory, stores their process IDs, and arranges them by recency of use. After activating the desired application, the user can move the mouse cursor over the on-screen keyboard to activate it and can then click any button, subject to the dwell-time settings (a sketch of this dwell-click logic follows below). When the user clicks a button on the on-screen keyboard, that character is passed to the most recently activated window and appears on the screen.

Graphical User Interface of Black Pearl: The GUI of "Black Pearl", including its keyboard, is shown in figure 4. It is extremely user-friendly: the user can choose the feature to trace, set the "dwell time" (the time to click on a particular object or icon), and set the speed of mouse movement. The GUI is developed in Visual Basic 6.0. The interface has two text boxes as inputs, one for setting the dwell time and another for the size of the image to be cropped.
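A hedged sketch of dwell-time click detection, assuming a 5-pixel stillness radius and a do_click() callback (both are illustrative choices, not parameters specified by the paper):

```python
import time

class DwellClicker:
    """Generate a click when the cursor stays within `radius` pixels
    of the same point for `dwell_ms` milliseconds."""

    def __init__(self, dwell_ms=1000, radius=5):
        self.dwell_ms, self.radius = dwell_ms, radius
        self.anchor, self.since = None, time.time()

    def update(self, x, y, do_click):
        if self.anchor is not None and \
           abs(x - self.anchor[0]) <= self.radius and \
           abs(y - self.anchor[1]) <= self.radius:
            if (time.time() - self.since) * 1000 >= self.dwell_ms:
                do_click()                 # e.g. press the hovered key
                self.since = time.time()   # re-arm after the click
        else:
            self.anchor, self.since = (x, y), time.time()
```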

Figure 4: User interface of Black Pearl.

The GUI also contains "Video Format" and "Video Source" options. The "Video Format" option allows the user to change video-stream settings such as resolution, pixel depth, compression, and size. The "Video Source" option handles image properties such as brightness, contrast, gamma range, hue, saturation, sharpness, and source frequency. When the system starts, live video of the frames captured by the webcam appears (picture 1, shown at the top left of figure 4). The user then fills in the necessary details, namely the dwell time (in ms) and the size of the image to be cropped. On clicking the mouse over the desired feature to trace, a square black window of double that size, as explained earlier, appears around the selected feature, as shown in picture 2 of figure 4. The on-screen keyboard of the system, shown in figure 5, contains all the numeric, alphabetic, arithmetic, special-character, and other necessary keys. Whenever the mouse moves over the on-screen keyboard, the key under the cursor is enlarged so that the user can easily identify which key he is pressing (a small hit-test sketch of this behaviour follows below). The user can also switch the "Caps" status to the "on" or "off" state from the keyboard itself; the status is displayed underneath the keyboard keys. To type special characters, the user activates the shift key by clicking it first and then clicks the required key.
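As an illustration of the hover-zoom behaviour, here is a minimal hit-test sketch in Python; the list-of-dicts layout and the 1.5x zoom factor are assumptions for illustration only:

```python
def key_under_cursor(x, y, keys, zoom=1.5):
    """Return (key, display_rect) for the key under the cursor, with the
    display rectangle enlarged around the key's centre, else None."""
    for key in keys:
        kx, ky, w, h = key["rect"]
        if kx <= x < kx + w and ky <= y < ky + h:
            cx, cy = kx + w / 2, ky + h / 2
            zw, zh = w * zoom, h * zoom
            return key, (cx - zw / 2, cy - zh / 2, zw, zh)
    return None

# Example: a one-row layout of three 40x40 keys.
layout = [{"label": c, "rect": (i * 40, 0, 40, 40)} for i, c in enumerate("abc")]
print(key_under_cursor(50, 10, layout))   # -> key 'b', enlarged to 60x60
```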

Figure 5: On-Screen Keyboard of Black Pearl.

4. Experiments and Results

The experiments were performed on an Intel(R) Pentium(R) IV 3.00 GHz processor with 512 MB RAM running Windows XP Professional SP2 (Version 2002). A video camera (QHM500LM USB PC Camera) with a resolution of 500K pixels was used. We set the camera resolution to 160 × 120 with pixel depth and compression of RGB 24, giving a frame size of 160 × 120 × 3 = 57,600 bytes at a frequency of 60 Hz. To check the performance of the "Black Pearl" system, we performed several experiments; for all of them the dwell time was taken to be 500 ms.

Nose tracking: For tracking the nose, we fixed the size of the sub-image at 7 pixels, so the cropped image size becomes 14 pixels, as explained earlier. The nose is one of the most desirable tracking features, as it is always at the centre of the face and its occlusion factor is the lowest. Figure 6 shows the tracing result for the nose.

Figure 6: Tracing result for nose.


Eye tracking: For tracking the eye position, we fix the size of the sub-image at 10 pixels and fix the sub-image on one of the lenses of the spectacles worn by the user. The results show that the system tracks only the selected part of the image, as shown in figure 7. As seen in the figure, the user's glasses reflect the monitor during the experiment, yet the system keeps tracking the selection under all conditions, despite a similar feature being in close proximity.

Figure 7: Tracking the eye.

Head tracking: For tracking the head position, we fix the size of the sub-image at 15 pixels. During head tracing we found that increasing the size of the sub-image yields higher accuracy, as evident from figure 8, which shows the tracking of the head. This result also shows that the system works successfully under large-degree rotation.

Figure 8: Tracking of the head.

Head tracking under different circumstances: The next experiments check the system's performance under different conditions such as user jumps, extreme movement, large-degree rotation, turning around, hand/object occlusion, and multi-user occlusion. For this purpose we chose the head as the desired feature.

Figure 9: Tracking of the head under extreme movement.

Figure 9 shows the tracking result when the user undergoes extreme movement. It is evident that the system does not lose the track even when 90% of the tracked portion (i.e. the head) is covered by a hand. This shows the robustness of the Image Tracking Module, which is further demonstrated by tracking the head under object occlusion, as shown in figure 10. Figure 11 shows the system's performance under multi-user occlusion. All these results show that the Image Tracking Module of the "Black Pearl" system works well under all of these extreme circumstances.

Figure 10: Tracking of the head under object occlusion.

Figure 11: Tracking of Head under multi-user occlusion.

On-screen keyboard: We typed some words in MS Office 2007 and also tried the control movements of the on-screen keyboard; figure 12 depicts this process. Using this keyboard one can enter data into the computer and hence use it as a virtual keyboard. It performs better than the Windows on-screen keyboard, as its buttons are bigger and a button is zoomed when the user brings the cursor over it. Figure 13 shows the user working and playing in real time.

Figure 12: User typing in MS Word 2007 using the on-screen keyboard.

Figure 13: User playing a game with four other players in real time.


[Figure 14 plot: "Efficiency Comparison of Mouse Operations"; x-axis: button width and height (inches), 0.2 to 2.0; y-axis: time for clicking buttons (seconds), 0 to 16; series: Black-Pearl, hMouse, hand-controlled Windows Mouse Keys, and the 3D Camera Mouse in direct, joystick, and differential modes.]

System performance comparison: We carried out the experiment 5 times for each array of buttons of a given size. The average time to carry out a button click indicates how convenient it is for the subject to navigate the mouse cursor to the specified area and perform the click. We compared the average button-clicking time of our system with those of hMouse, the hand-controlled Microsoft Windows MouseKeys (left ALT + left SHIFT + NUM LOCK), and the three modes of the 3D camera mouse, as shown in figure 14. From the figure it is evident that the performance of the "Black Pearl" system is comparable to that of the hand-controlled Microsoft Windows MouseKeys and better than that of hMouse.

Figure 14: Efficiency comparison of different mouse operations.

5. Conclusion

Experimental results show that "Black Pearl", with its easy and user-friendly GUI, can act as an alternative to the conventional mouse and keyboard. The system will be of great help to people with severe disabilities who want to access a computer without using a hand-controlled mouse and keyboard. Its on-screen keyboard is robust in all senses, and its image tracer module is reliable, as it can trace any body part under different conditions. Once the camera-mouse mode is turned on, the user can comfortably navigate Windows and use any software of his choice. For all these reasons we can say that "Black Pearl" is a competent and efficient successor to the camera mouse system.

6. References

[1] R. C. Simpson and H. H. Koester, "Adaptive One-Switch Row-Column Scanning", IEEE Trans. Rehab. Eng., vol. 7, pp. 464–473, 1999.

[2] Y. L. Chen, F. T. Tang, W. H. Chang, M. K. Wong, Y. Y. Shih, and T. S. Kuo, "The New Design of an Infrared-Controlled Human-Computer Interface for the Disabled", IEEE Trans. Rehab. Eng., vol. 7, pp. 474–481, 1999.

[3] T. Hutchinson, K. P. White Jr., W. N. Martin, K. C. Reichert, and L. A. Frey, “Human-Computer Interaction Using Eye-Gaze Input”, IEEE Trans. Syst., Man, Cybern., vol. 19, pp. 1527–1533, 1989.

[4] Z. A. Keirn and J. I. Aunon, "Man-Machine Communications Through Brain-Wave Processing", IEEE Eng. Med. Biol., pp. 55–57, 1990.

[5] M. Pregenzer and G. Pfurtscheller, "Frequency Component Selection for an EEG-Based Brain-to-Computer Interface", IEEE Trans. Rehab. Eng., vol. 7, pp. 413–419, 1999.

[6] M. Betke, J. Gips, and P. Fleming, "The Camera Mouse: Visual Tracking of Body Features to Provide Computer Access for People with Severe Disabilities", IEEE Trans. on NSRE, vol. 10, pp. 1–10, 2002.

[7] J.L. Tu, T. Huang, and H. Tao, “Face As Mouse Through Visual Face Tracking”, IEEE CRV’05, pp. 339-346, 2005.

[8] T. Yang, S. Z. Li, Q. Pan, J. Li, and C. H. Zhao, "Reliable and Fast Tracking of Faces under Varying Pose", IEEE FGR'06, pp. 421–428, 2006.

[9] Yun Fu and Thomas S. Huang, "hMouse: Head Tracking Driven Virtual Computer Mouse", IEEE WACV'07, 2007.

[10] Dmitry O. Gorodnichy and Gerhard Roth, "Nouse 'Use Your Nose as a Mouse': Perceptual Vision Technology for Hands-Free Games and Interfaces", Image and Vision Computing, Elsevier, pp. 931–942, 2004.

[11] G.R. Bradski, “Real Time Face And Object Tracking As A Component Of A Perceptual User Interface,” IEEE Workshop on WACV ’98, pp. 214-219, 1998.

[12] J. L. Tu, H. Tao, and T. S. Huang, "Face as Mouse through Visual Face Tracking", CVIU, V4HCI, 2006.

[13] Gary R. Bradski, “Computer Vision Face Tracking For Use in a Perceptual User Interface”, Intel Technology Journal Q2 ’98, pp. 1-15, 1998.

[14] S. T. Gandhe, K. T. Talele, and A. G. Keskar, "Intelligent Face Recognition Techniques: A Comparative Study", ICGST International Journal on Graphics, Vision and Image Processing, pp. 59–66, 2007.

[15] A. Saradha and S. Annadurai, "A Hybrid Feature Extraction Approach for Face Recognition Systems", ICGST International Journal on Graphics, Vision and Image Processing, pp. 1–8, 2007.

[16] Weiqi Yuan and Binxiu Gao, “Iris Recognition System Based on Difference of Two Images Gray”, ICGST International Journal on Graphics, Vision and Image Processing, pp. 1-4, 2006.

Author Biographies

Rajesh Kumar received the Ph.D. degree in Intelligent Systems from the University of Rajasthan, India. He is working as a Reader at the Department of Electrical Engineering, MNIT, Jaipur. His research interests include intelligent systems, evolutionary algorithms, fuzzy and neural methodologies, and image processing. Dr. Kumar received the Career Award for Young Teachers (CAYT) in 2002. He is a life member of ISTE and a member of IE(I).

Anupam Kumar is pursuing the B.Tech degree in electronics and communication engineering at Malaviya National Institute of Technology (MNIT), Jaipur. His areas of interest include robotics, computer interfacing, and image processing.
