Tracking of ball held by Cathy's arm, displayed on VGA screen

The above picture shows what the camera on the FPGA sees. The image is presented in black and white, with a bounding box drawn around the ball. The center of this box is fed into the neural network to calculate the x distance from the center of the screen, and thus how quickly BALL-E should move. The picture shows Cathy holding the lacrosse ball in the lab; the ball is highlighted in red while everything else is grayed out.

Overall, BALL-E performed as we had hoped: it tracked the ball in real time and followed it, reaching an estimated peak speed of about a foot per second. The turning was a little less precise. As one can see from the video, BALL-E had a few issues with traction when turning left. We speculate this was due to a lack of friction with the ground and to the limited power of the servo motors we used: the motors were fairly stable, but not as powerful as we would have liked for maximum traction.

We used approximately 9% of the board; our hardware configuration statistics are as follows:
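The steering calculation described above can be sketched as follows. This is an illustrative model only, not the actual FPGA logic: the function names, screen width, and gain constant are assumptions, and the real design computed the offset in hardware rather than software.

```python
SCREEN_WIDTH = 640  # assumed VGA horizontal resolution

def bounding_box_center_x(x_min, x_max):
    """Horizontal center of the ball's bounding box, in pixels."""
    return (x_min + x_max) // 2

def turn_command(center_x, width=SCREEN_WIDTH, gain=0.01):
    """Signed turn rate: negative steers left, positive steers right.

    Proportional to the ball's x distance from the screen center,
    so the farther the ball drifts from center, the faster the
    robot turns to re-center it.
    """
    offset = center_x - width // 2
    return gain * offset

# Example: a ball centered at x = 480 on a 640-wide screen
# yields a positive (rightward) turn command.
cmd = turn_command(bounding_box_center_x(440, 520))
```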

Hardware Statistics


The best way to show our results is with video. They are available as follows:
BALL-E's view of the world
Video from our demo day in December 2008