The future of virtual reality lies in natural interaction: users should be able to control what they see smoothly through hand position and gestures. Using Intel's depth camera, we will attempt to create a panel in a virtual-reality environment that can be manipulated with precise hand motions. We will test the limits of the precision required to operate the panel: moving it around the virtual space, enlarging and shrinking it, and pressing its buttons.
In this project we will develop and implement an application that supports these interactions.
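The three panel interactions described above (translating, resizing, and button presses) could be modelled roughly as follows. This is only a sketch under assumed conventions, not the project's implementation: the `Panel` class, its method names, and the idea of scaling by the ratio of two pinch distances and detecting a press when a fingertip crosses the panel's depth plane are all hypothetical illustrations; the real application would drive these updates from hand positions reported by the depth camera.

```python
from dataclasses import dataclass, field

@dataclass
class Panel:
    # Panel pose in virtual space (metres); names and defaults are assumptions.
    position: list = field(default_factory=lambda: [0.0, 0.0, 1.0])
    scale: float = 1.0

    def move(self, hand_delta):
        """Translate the panel by the tracked hand's displacement."""
        self.position = [p + d for p, d in zip(self.position, hand_delta)]

    def resize(self, pinch_start, pinch_now, min_scale=0.25, max_scale=4.0):
        """Scale the panel by the ratio of the current thumb-index distance
        to the distance at the start of the pinch, within clamped bounds."""
        if pinch_start > 0:
            new_scale = self.scale * pinch_now / pinch_start
            self.scale = max(min_scale, min(max_scale, new_scale))

    def is_pressed(self, fingertip, threshold=0.02):
        """Treat a button as pressed when the fingertip's depth comes
        within `threshold` metres of the panel's plane."""
        return abs(fingertip[2] - self.position[2]) < threshold


# Example: a hand drags the panel, pinches it to double size, then presses it.
panel = Panel()
panel.move([0.1, 0.0, 0.0])
panel.resize(pinch_start=0.05, pinch_now=0.10)
pressed = panel.is_pressed([0.1, 0.0, 0.99])
```

The `threshold` parameter is one place where the precision question raised above becomes concrete: the smaller it is, the more accurate the depth camera's fingertip estimate must be for presses to register reliably.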