Introduction

The goal of this project was to track the motion of a human face and to display a projection of a cube on a VGA monitor that changes with the motion of the user's face. We wanted it to appear as if the user were actually looking at a 3D cube: if the user moves their head to the right, for example, the projection should be drawn as if the cube were being viewed from the right, and if the user moves closer to the camera, the cube should grow larger. A cube was chosen for simplicity and as a proof of concept, but the project could be extended to more complicated objects or to virtual reality environments.

In this project, we connected a video camera to the FPGA and located the user's face in each frame by examining the color content of each pixel and deciding which pixels could represent human skin. The face's offset from the center of the camera's field of view, together with its apparent size, was then used to draw the appropriate projection of the cube.
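
To illustrate the idea, the following is a minimal software sketch (in C, not the FPGA hardware used in the project) of per-pixel skin classification followed by a centroid and width estimate. The threshold rule, frame dimensions, and struct names are illustrative assumptions, not the project's actual parameters.

/*
 * Sketch: classify each pixel as "skin" with a simple RGB threshold,
 * then take the centroid and horizontal extent of the skin pixels as a
 * stand-in for the face's position and size. Thresholds are placeholders;
 * a real detector might work in a chroma space such as YCbCr instead.
 */
#include <stdint.h>

#define FRAME_W 640
#define FRAME_H 480

typedef struct { uint8_t r, g, b; } pixel_t;

typedef struct {
    int cx, cy;   /* face center, in pixels                */
    int size;     /* rough face width, in pixels           */
    int found;    /* nonzero if any skin pixel was detected */
} face_t;

/* Rough skin test: strong red relative to green and blue. */
static int is_skin(pixel_t p)
{
    return p.r > 95 && p.g > 40 && p.b > 20 &&
           p.r > p.g && p.r > p.b &&
           (p.r - p.g) > 15;
}

face_t locate_face(const pixel_t frame[FRAME_H][FRAME_W])
{
    long sum_x = 0, sum_y = 0, count = 0;
    int min_x = FRAME_W, max_x = 0;

    for (int y = 0; y < FRAME_H; y++) {
        for (int x = 0; x < FRAME_W; x++) {
            if (is_skin(frame[y][x])) {
                sum_x += x;
                sum_y += y;
                count++;
                if (x < min_x) min_x = x;
                if (x > max_x) max_x = x;
            }
        }
    }

    face_t f = {0, 0, 0, 0};
    if (count > 0) {
        f.cx = (int)(sum_x / count);  /* horizontal face position */
        f.cy = (int)(sum_y / count);  /* vertical face position   */
        f.size = max_x - min_x;       /* wider => user is closer  */
        f.found = 1;
    }
    return f;
}

The resulting offset of (cx, cy) from the frame center would steer the apparent viewing angle of the cube, while size would scale it; the hardware version performs the equivalent accumulation as pixels stream in from the camera.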