With the rapid emergence of 3D applications and virtual environments in computer systems, the need for a new type of interaction device arises, because traditional devices such as the mouse, keyboard, and joystick become inefficient and cumbersome in these virtual environments. In other words, the evolution of user interfaces shapes the change in human-computer interaction (HCI). The intuitiveness and naturalness of hand gestures in HCI have been the driving force and motivation to develop an interaction device that can replace current unwieldy tools. This paper provides a survey of methods for analyzing, modeling, and recognizing hand gestures in the context of HCI. A taxonomy of the different algorithms is presented, organized by the applications for which they were developed and the approaches they use to represent gestures. In addition, directions for future development are discussed.