Press "Enter" to skip to content


A Smart Guiding System for the Physically Challenged Using Image Processing

S. AISHWARYA1, V. AISHWARYA2, A. CLADIUS MIDHUNA3, Mrs. P. BINI PALAS4

1,2,3 UG Students, Department of Electronics and Communication Engineering, Easwari Engineering College, Chennai-600089, India.

4 Assistant Professor, Department of Electronics and Communication Engineering, Easwari Engineering College, Chennai-600089, India.

Abstract

In this world, 19% of the population is physically disabled, and it is a difficult task for them to interface with and access a computer along with its accessories. Researchers have devised many techniques to help the physically challenged access computers; these techniques analyse biometric signals and various parts of the eye. However, they require additional equipment, which makes them impractical. To overcome these issues, we propose an OpenCV-based model that analyses the movement of the iris with the help of a camera. The position of the iris is identified and the cursor is moved according to that movement.

Keywords – OpenCV; iris; Haar cascade; webcam; EAR algorithm.

I. INTRODUCTION

People with an impaired skeletal structure find it difficult to access a computer. The computer was invented to perform arithmetic and logical operations, and it helps human beings complete complex tasks more easily and quickly. Able-bodied users can operate a computer without any difficulty, but for physically impaired people accessing the computer is a strenuous task.

In this model, the centroid of the iris is determined and the cursor is moved according to the movement of the eye. The eyeball-tracking mechanism has many applications, such as home automation through a Python GUI and robotic control. Existing methods use MATLAB to detect the iris and control the cursor, but estimating the centroid of the eye is difficult in MATLAB, so OpenCV is used instead. The only hardware required is a computer and a video camera to capture the image of the eye; no additional external hardware is needed. The camera obtains the input from the eye: first the centroid of the eye is detected, and then the variation in the pupil position provides different commands for the virtual keyboard.


II. METHODOLOGY

The proposed model obtains the streaming video from the camera, and the video is converted into frames. From the input frames, the face is detected using the Haar cascade algorithm. Within the detected face, the eye is then located using a Haar cascade and facial landmarks. The next step is to determine the centroid of the eye and the gradient of the iris movement.
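As a minimal sketch of the front end of this pipeline, assuming Python with OpenCV, the fragment below opens the webcam stream and converts each captured frame to grayscale for the detectors described in the next section; the camera index 0 and the number of frames collected are illustrative assumptions, not values from the paper.

```python
import cv2

cap = cv2.VideoCapture(0)                  # open the default webcam stream
gray_frames = []
while len(gray_frames) < 100:              # collect a short burst of frames (assumed count)
    ok, frame = cap.read()                 # grab one frame from the stream
    if not ok:
        break
    # Haar cascades and the later centroid step operate on grayscale images
    gray_frames.append(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
cap.release()
```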

III. IMPLEMENTATION

By following the steps below, the cursor can be controlled with the movement of the eye.

A. Face detection

Initially, the webcam records the image of the user and converts it into frames. The obtained input image frames are then converted into grayscale images.

The Haar classifier is trained with several images. The algorithm consists of several classifier stages, each associated with different facial landmarks. If a frame passes one stage, it proceeds to the next; otherwise, the image frame is discarded. Using this algorithm, the face in the input frame is detected and resized for processing.


Fig. 1: Detection of face using Haar cascade
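A possible sketch of this face-detection step is given below, using OpenCV's bundled frontal-face Haar cascade; keeping only the largest detection and resizing the face to 200 x 200 pixels are assumptions made for illustration.

```python
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def detect_face(gray):
    """Return the largest detected face region, resized for further processing."""
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
    if len(faces) == 0:
        return None                                     # discard frames with no face
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])  # keep the largest detection
    return cv2.resize(gray[y:y + h, x:x + w], (200, 200))
```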

B. Identification of eye

The obtained facial image is divided into different segments. Features such as the eyelashes, eyelids, eyebrows and the surrounding skin are used to find the position of the eye within the segmented images.


Fig. 2: Detection of eye
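The eye-localisation step can be sketched with OpenCV's bundled Haar eye cascade as shown below; restricting the search to the upper half of the face region is an assumption used to suppress false detections around the nose and mouth.

```python
import cv2

eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def detect_eyes(face_gray):
    """Return bounding boxes (x, y, w, h) of eyes found in the upper half of the face."""
    upper = face_gray[:face_gray.shape[0] // 2, :]    # eyes lie in the upper part of the face
    return eye_cascade.detectMultiScale(upper, scaleFactor=1.1, minNeighbors=3)
```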

C. Detection and analysis of iris movement

After the eye has been successfully identified, the position of the iris is determined from the edges of the eye. Markings are made over the outer region of the eye, and the location of the iris is obtained using the centroid formula.



Fig. 3: Determining the iris position

The centroid formula is

xc = (Σ xi) / N, yc = (Σ yi) / N

where (xi, yi) are the coordinates of the marked points on the eye region and N is the number of points.
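A minimal sketch of this computation in OpenCV is given below; the fixed threshold of 50 used to isolate the dark iris/pupil pixels is an assumption, and cv2.moments simply evaluates the sums in the centroid formula above. The deviation helper anticipates the analysis that follows.

```python
import cv2

def iris_centroid(eye_gray):
    """Return (xc, yc) of the dark iris region in a grayscale eye ROI, or None."""
    _, mask = cv2.threshold(eye_gray, 50, 255, cv2.THRESH_BINARY_INV)  # keep dark pixels
    m = cv2.moments(mask)
    if m["m00"] == 0:
        return None                      # no dark region found in this frame
    # m10/m00 and m01/m00 reduce to the mean x and y of the selected pixels
    return m["m10"] / m["m00"], m["m01"] / m["m00"]

def deviation(centroid, reference):
    """Offset of the current centroid from the initial (reference) position."""
    return centroid[0] - reference[0], centroid[1] - reference[1]
```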

The deviation of iris from its initial position in both x and y co-ordinates is analysed from the obtained images.


Fig. 4: Analysing x coordinate


Fig. 5: Analysing y coordinate

The graphs above show the variation in iris movement along both the x and y axes for 10 input images. The gradient is calculated from the data obtained from these graphs, and using this gradient value the cursor is made to move in the direction of the eye's movement.
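The paper does not name the library used to drive the cursor; the sketch below assumes pyautogui, and the gain and dead-zone values are purely illustrative.

```python
import pyautogui

GAIN = 15        # assumed pixels of cursor travel per pixel of iris deviation
DEAD_ZONE = 2    # ignore tiny jitter around the neutral iris position

def move_cursor(dx, dy):
    """Move the cursor proportionally to the iris deviation (dx, dy)."""
    if abs(dx) < DEAD_ZONE and abs(dy) < DEAD_ZONE:
        return
    pyautogui.moveRel(int(dx * GAIN), int(dy * GAIN))
```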

D. Eye blink detection

Each eye is represented by six (x, y) coordinates, starting from the left corner. The relationship between the eye's height and width is expressed by the Eye Aspect Ratio (EAR).

EAR = (||p2 − p6|| + ||p3 − p5||) / (2 ||p1 − p4||)

Here p1 to p6 are the six facial landmarks of the eye. The EAR drops to nearly zero during a blink and remains roughly constant while the eye is kept open.
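A minimal sketch of the EAR computation is shown below, assuming the six landmarks p1 to p6 are supplied in order by a facial-landmark detector; the 0.2 blink threshold is an assumption, since the paper does not state one.

```python
from scipy.spatial import distance as dist

def eye_aspect_ratio(p):
    """p is a sequence of six (x, y) eye landmarks ordered from the left corner."""
    vertical_1 = dist.euclidean(p[1], p[5])   # ||p2 - p6||
    vertical_2 = dist.euclidean(p[2], p[4])   # ||p3 - p5||
    horizontal = dist.euclidean(p[0], p[3])   # ||p1 - p4||
    return (vertical_1 + vertical_2) / (2.0 * horizontal)

def is_blink(p, threshold=0.2):
    """Register a blink when the EAR falls below the (assumed) threshold."""
    return eye_aspect_ratio(p) < threshold
```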

IV. CONCLUSION

A system that enables the physically challenged to interact with a computer was developed and tested. The method can be further enhanced for use in many other applications. The system can be adapted to help the disabled control home appliances such as TV sets, lights and doors, and it can also be adapted for individuals suffering from complete paralysis to operate and control a wheelchair. The eye mouse can further be used to detect driver drowsiness and thereby prevent vehicle accidents, and eye-movement detection and tracking also have potential uses in gaming and virtual reality.

