Title: Human Computer Interaction Using Hand Gestures
Publication Details: P. Premaratne, Human Computer Interaction Using Hand Gestures. Singapore: Springer.
Abstract: Human computer interaction (HCI) plays a vital role in bridging the 'Digital Divide', bringing people closer to consumer electronics control in the 'lounge'. The proposed work is part of a project that aims to control a videogame through hand gesture recognition.
This goal imposes two constraints: real-time response and operation in unconstrained environments. In this paper we present a real-time algorithm to track and recognise hand gestures for interacting with the videogame. The algorithm is based on three main steps: hand segmentation, hand tracking and gesture recognition from hand features.
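The three-step pipeline can be sketched as follows; the function names and stub logic are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def segment_hand(frame):
    """Hand segmentation stub: threshold a grayscale frame to a binary mask.
    (The paper uses the skin-colour cue; a fixed threshold stands in here.)"""
    return (frame > 128).astype(np.uint8)

def track_hand(mask, prev_centroid):
    """Hand tracking stub: return the centroid of the mask, falling back
    to the previous position when the mask is empty."""
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return prev_centroid
    return (float(xs.mean()), float(ys.mean()))

def recognise_gesture(centroid, prev_centroid):
    """Gesture recognition stub: label the hand as moving or still."""
    dx = centroid[0] - prev_centroid[0]
    dy = centroid[1] - prev_centroid[1]
    return "moving" if (dx * dx + dy * dy) ** 0.5 > 2.0 else "still"

def process_frame(frame, prev_centroid):
    """One iteration of the segmentation -> tracking -> recognition loop."""
    mask = segment_hand(frame)
    centroid = track_hand(mask, prev_centroid)
    gesture = recognise_gesture(centroid, prev_centroid)
    return centroid, gesture
```

Each video frame passes through the same loop, with the previous centroid carried forward so tracking can bridge frames where segmentation fails.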
In this modern age, advances in ubiquitous computing have made natural user interfaces increasingly necessary.
For the hand segmentation step we use the colour cue, exploiting the characteristic colour values of human skin, their invariance properties and their computational simplicity. To prevent segmentation errors from propagating, we add a second step, hand tracking. Tracking is performed assuming a constant-velocity model and using a pixel-labelling approach.
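A minimal sketch of colour-based skin segmentation and constant-velocity prediction follows; the RGB thresholds are a widely used rule of thumb, not the paper's parameters:

```python
import numpy as np

def skin_mask(rgb):
    """Skin segmentation by simple per-pixel RGB thresholding.
    Threshold values are a common rule of thumb, not the paper's."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    return (r > 95) & (g > 40) & (b > 20) & (r > g) & (r > b) & (np.abs(r - g) > 15)

def predict_position(pos, vel, dt=1.0):
    """Constant-velocity prediction of the hand centroid for the next frame.
    Pixels near the predicted position can then be labelled as 'hand'."""
    return (pos[0] + vel[0] * dt, pos[1] + vel[1] * dt)
```

The predicted centroid gives the tracker a search region, so a segmentation failure in one frame does not lose the hand entirely.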
Gestures may be static or dynamic: a waving hand meaning goodbye is a dynamic gesture, while the 'stop' sign is a static one. Creating a UI that responds to such a complex set of user actions is no simple feat, and merely emulating traditional input devices is a recipe for failure; defining effective gestures is surprisingly difficult. Sign language, meanwhile, is today predominantly associated with disabilities, whether congenital or caused by accidents.
From the tracking process we extract several hand features that are fed to a finite-state classifier which identifies the hand configuration. The hand can be classified into one of four gesture classes or one of four movement directions. For that purpose, a second database has been recorded using two cameras.
The goal of these gestures is to manipulate virtual objects on a screen. On this second database we investigate the state-of-the-art sequence-processing techniques used in the previous chapter.
We then discuss the results obtained using different features, and using images from one or two cameras. In conclusion, we propose a method for hand posture recognition (HPR) based on new feature extraction. For hand gesture recognition (HGR), we provide two databases and comparative results for two major sequence-processing techniques.
We also present possible applications of these techniques to two-handed gesture interaction. We hope this research will open new directions in the field of hand posture and gesture recognition.