Zhe Yang (zy49)
Kun Yi (ky237)
We created a wearable game controller that uses accelerometers to capture hand actions and then maps each action to an arbitrary keystroke. The types of actions we recognize are chosen to be suitable as input controls for video games.
We placed three z-axis accelerometers on the tips of the thumb, index finger, and middle finger, and three accelerometers on the back of the hand for x, y, and z acceleration. The Atmega644 microcontroller reads the outputs of the accelerometers and runs a finite state machine to compute the gesture and motion of the hand. The gesture and motion information is then transmitted to the PC over a serial connection, where a Java program reads it and maps it to an arbitrary keystroke.
Some current game controllers, such as the Wii Remote, make use of sensors and can, to some extent, use the player's motion as input. However, none of these control methods takes full advantage of the expressiveness of natural gestures. As avid gamers, we decided that a hand-action-based controller would be a novel and fun input device.
By observation, we found that many meaningful hand actions (e.g. smash, swing, or push) can be described by the hand gesture together with the movement of the palm. Furthermore, the hand gesture is independent of hand movement and orientation, and changes only through movements of the fingers; the orientation, by contrast, depends only on the movement of the hand. In our project, we therefore propose a prototype for real-time hand action recognition on an 8-bit microcontroller, using acceleration data from the fingertips and the back of the hand.
Accelerometers: the accelerometers we used measure acceleration with a capacitive sensing cell (g-cell) that forms two back-to-back capacitors. As shown in Figure 1, when the center plate deflects due to acceleration, the capacitances change and the acceleration can be extracted.
Figure 1: Physical Model of the Accelerometer
Each accelerometer's reading consists of dynamic acceleration and static acceleration, namely gravity. For a z-axis accelerometer lying flat, Figure 2 shows the sign of the reading: the static acceleration is +1g, whereas the dynamic reading is positive when the acceleration points upward.
Figure 2: Sign of Measurement of the Accelerometer
By placing three accelerometers on the back of the hand, we set up a coordinate system. If the hand is still, we can measure the orientation of the hand plane. Moreover, if a finger moves slowly, each of its readings is mostly due to gravity.
Figure 3: Hand Coordinate System
If the hand moves rapidly, the motion is a sequence of acceleration then deceleration, so the output shows a large peak followed by a peak of the opposite sign. Figure 4 shows the z-axis output when a hand suddenly moves downward and returns.
Figure 4: Accelerometer Output for moving Hand
Figure 5: Logical Structure
The high-level design of our project is shown in Figure 5. The diagram traces the flow of data from when the user-end Java application opens the port for communication to when the information is transmitted to the application and mapped to a keystroke.
One of the major limitations of our project is the functionality of the accelerometers. Limited by budget, available materials, and our soldering technique, we used only one-axis low-g analog accelerometers. Although these require simpler hardware and software than digital or multi-axis accelerometers do, their outputs are very noisy. In addition, they have a measurement range of only 1.5g, so fast movements drive the output to the rails. This makes it difficult to integrate the acceleration to get velocity, let alone position. However, we applied several techniques to reduce the noise and were still able to extract much useful information from the accelerations. Specifics are covered in the Software and Hardware sections.
Another tradeoff is expandability versus usability. We cannot hard-code every possible action, yet adding more actions means adding states to the state machine, which makes the system harder for the user to configure. We had to define a system that works out of the box but can also be easily expanded.
Our hardware consists of three parts: the custom PCB for Atmega644, the accelerometer circuit, and the Pololu USB AVR programmer, which we used as the serial connection between PC and the custom PCB. Schematics for all hardware can be found in the Appendix III: Hardware Schematics.
We used the custom PCB designed by ECE 4760 instructor Bruce Land to interface Atmega644 with our circuits. The board design is shown in Figure 6 and the layout is shown in Figure 7.
The accelerometer circuit consists of four MMA1260D z-axis analog accelerometers, two MMA2260D x-axis analog accelerometers, and the external circuitry of decoupling capacitors and low-pass filters suggested by the MMA1260D and MMA2260D datasheets. In our tests, this circuitry decreased the magnitude of the output noise by 0.03V, which is desirable. The schematics are shown in Figure 8.
Figure 8: ADC Circuit for Accelerometer
The accelerometers are connected to Port A of the MCU for ADC, and the reference voltage is set to Vcc. The correspondence between accelerometers and pins is shown in the following table:
|Axis||Location||Pin|
|X-axis*||Back of the Hand||A.1|
|Z-axis||Back of the Hand||A.3|
|X-axis*||Back of the Hand||A.5|
*Note: Although both accelerometers are x-axis parts, we let them measure different axes by orienting one 90 degrees away from the other. See the picture below for details.
We used a Pololu USB AVR programmer as the USB-to-serial adaptor. The programmer provides two control lines, TX and RX, for asynchronous serial communication: when the programmer receives one byte over USB, it transmits that byte on TX. The lines transmit data 8 bits at a time with no parity and one stop bit. The programmer appears to the operating system as a standard serial port, so it can be accessed through ordinary serial port functions.
Figure 9: Picture and Schematics of the Pololu USB AVR Programmer
Our program naturally consists of two parts, following the logical structure: the hand action recognition code for the MCU and the PC-end application. The different aspects of the software are described in detail below:
The action recognition code makes up most of the code for the MCU. We used a state machine design for our program. From the acceleration data of the six accelerometers we acquire several pieces of information: hand motion, hand orientation, and each finger's status. The input to the state machine is all of this information combined. Each state corresponds to an action, and the next state depends on the current input and the current state. Figure 10 summarizes the program structure:
Figure 10: Program Structure
We used the analog-to-digital converter on the Atmega644 to obtain acceleration measurements from the 6 accelerometers. For an accelerometer, V_OH = 5V and V_OL = 0V, so we chose Vcc (5V) as the reference voltage. We took the full 10-bit ADC reading and subtracted 512 to obtain a signed acceleration value in the range [-512, 511].
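As a minimal sketch of this re-centering step (the function name is ours, not from the project code):

```c
#include <stdint.h>

/* Re-center a raw 10-bit ADC reading (0..1023) around zero.
   With Vcc as the reference, a motionless sensor sits near
   mid-scale, so the result is a signed acceleration value. */
int16_t adc_to_accel(uint16_t raw)
{
    return (int16_t)raw - 512;   /* range: [-512, 511] */
}
```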
During every loop of execution, each of the input channels is read in turn by incrementing ADMUX, the ADC input select register. Before beginning a conversion, we wait until the previous conversion is done, i.e. until hardware has cleared the ADSC bit.
We need to convert the raw acceleration data into three pieces of useful information: hand motion, hand orientation, and the status of each finger. As explained in the hardware tradeoff section, the acquired acceleration data is noisy, and this must be accounted for when we extract information about the hand's action.
To reduce the effect of noise, we decided to use a small noise threshold and a large peak threshold. After trial and error we set the noise threshold to 50 and the peak threshold to 150. Oscillations smaller than 50 are regarded as noise and suppressed, whereas excursions larger than 150 are detected as peaks, indicating rapid motion. Figure 11 shows the acceleration data of a moving finger.
Figure 11: Acceleration Data of a Finger, Together with Noise and Peak Thresholds
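The two-threshold scheme above can be sketched as follows (the function name and return codes are ours, chosen for illustration):

```c
#define NOISE_THRESHOLD 50
#define PEAK_THRESHOLD  150

/* Classify one re-centered acceleration sample.
   Returns  0 for noise (suppressed),
           +1 / -1 for a positive/negative peak (rapid motion),
            2 for a small but genuine, gravity-dominated change. */
int classify_sample(int accel)
{
    int mag = (accel < 0) ? -accel : accel;
    if (mag < NOISE_THRESHOLD) return 0;
    if (mag < PEAK_THRESHOLD)  return 2;
    return (accel > 0) ? 1 : -1;
}
```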
Hand motion is found by detecting peaks in acceleration. For example, if the z-axis accelerometer on the back of the hand detects a large negative peak, the hand is moving toward the negative end of the z-axis as defined in previous sections. The correspondence between peak sign and motion direction is summarized in Table 2:
In addition, hand direction can be calculated directly when the hand is not moving and gravity is dominant: the axis whose acceleration has the largest magnitude is the one aligned with gravity. While the hand is moving, no reliable direction can be computed, so the previous direction is retained.
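The largest-magnitude rule can be sketched like this (the axis encoding is our own, not from the project code):

```c
/* While the hand is still, gravity dominates, so the axis whose
   back-of-hand reading has the largest magnitude is the one aligned
   with gravity. Returns 0 for x, 1 for y, 2 for z; the sign of that
   axis's reading then distinguishes up from down. */
int dominant_axis(int x, int y, int z)
{
    int ax = (x < 0) ? -x : x;
    int ay = (y < 0) ? -y : y;
    int az = (z < 0) ? -z : z;
    if (ax >= ay && ax >= az) return 0;
    if (ay >= az)             return 1;
    return 2;
}
```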
Three statuses are defined for fingers: straight, bent, and moving. Detection of finger status depends on hand motion and direction. If the palm is still and facing upward or downward, a finger must be straight if its z-axis acceleration matches the hand's z acceleration, and bent otherwise. In other situations, the finger's dynamic acceleration is used, since a rapidly moving finger must produce a peak in acceleration. We therefore set the status of an originally bent/straight finger to moving once a peak is detected, wait until the reading stabilizes to remove the "bounce" in the acceleration, and then set its status to straight/bent.
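For the static case (palm still, facing up or down), the comparison can be sketched as below; the tolerance value is our own illustrative choice, not a figure from the project:

```c
enum finger_status { STRAIGHT, BENT, MOVING };

#define MATCH_TOLERANCE 50   /* illustrative margin for "same reading" */

/* Palm still and facing up/down: a finger is straight when its z-axis
   reading carries the same gravity component as the back of the hand,
   and bent otherwise. */
enum finger_status finger_static_status(int finger_z, int hand_z)
{
    int diff = finger_z - hand_z;
    if (diff < 0) diff = -diff;
    return (diff < MATCH_TOLERANCE) ? STRAIGHT : BENT;
}
```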
Through these simple methods we were able to retrieve the information needed by the main state machine. Note that slow finger motion can only be detected when gravity dominates the acceleration of both hand and fingers, since we used only one-axis accelerometers on the fingers.
The action recognition state machine is the most important and complex component of our algorithm. Recognizing that constructing a state machine for more than 30 actions, corresponding to more than 30 states, can be tedious and error-prone, we first set up a classification of the actions. Based on this classification, we used a mixed Mealy and Moore machine design to implement the main logic.
We divide hand actions into three groups: basic gestures, refined gestures, and actions. Basic gestures are the most general and depend only on the current input. Refined gestures are the same as basic gestures except that they place more restrictions on the input and are assigned meanings. For example, we defined the gesture "aim" as the input of a straight index finger and thumb, a bent middle finger, and a -X (downward) direction.
Actions are states that depend on both the current input and the previous state. For example, if the previous state is "Aim" and the input indicates that the hand moves, the state changes to "Fire".
Figure 12: Refined Gesture “Aim” and Action “Fire”
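The "Aim"-to-"Fire" transition can be sketched as a small fragment of such a machine; the state and input names here are simplified stand-ins for the project's full state set:

```c
enum state { IDLE, AIM, FIRE };
enum input_event { NO_INPUT, AIM_GESTURE, HAND_MOVE };

/* Next-state logic: the new state depends on the current state and
   the current combined input, as in the main state machine. */
enum state next_state(enum state cur, enum input_event in)
{
    switch (cur) {
    case IDLE: return (in == AIM_GESTURE) ? AIM  : IDLE;
    case AIM:  return (in == HAND_MOVE)   ? FIRE : AIM;
    case FIRE: return (in == NO_INPUT)    ? IDLE : FIRE;
    }
    return cur;
}
```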
To keep the action state library flexible and expandable, we follow a numbering convention: the index of a refined state is the index of its basic state plus a multiple of 10, and the index of an action state is the index of the previous state plus a multiple of 100.
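Under this convention the indices can be derived mechanically; a sketch (the helper names are ours):

```c
/* Refined gestures extend a basic gesture's index by a multiple of 10;
   action states extend the previous state's index by a multiple of 100. */
int refined_index(int basic, int k) { return basic + 10 * k; }
int action_index(int prev, int k)   { return prev + 100 * k; }
```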
We chose Java to implement the PC-end Control Manager for its serial connection support and cross-platform availability.
The serial connection transmits the printed strings of hand direction and action from the MCU to the PC. In our design the MCU is the transmitter and the PC is the receiver. Because the USB-to-serial converter presents itself to the computer as a serial port, the application can list the available serial ports for the user to choose from before it starts receiving data.
Since Sun does not provide built-in serial connection handling for Java, we chose the open-source package RXTX. The control manager uses functions from this package to handle the serial connection.
The key-mapping code parses the string received from the serial port and uses the Java Robot class to trigger the corresponding key event.
The application reads predefined strings from a .map configuration file describing which gestures the device can detect. These actions are presented in the window, and the user can map each action to a keystroke, a key press, a key release, a long keystroke, or a key combination.
Screenshots of the interface and a .map configuration file are shown below:
Figure 13: Screenshots of Control Mapping Manager
We originally planned to use 3-axis digital accelerometers provided by ST electronics. However, these accelerometers come in an LGA-16 package, and we did not have time to etch and solder a small PCB for each one.
We also tried implementing the PC-end control manager in Matlab, but found that Matlab would sometimes crash when an unrecognized string was received; switching to Java solved this problem. In addition, implementing key events in Matlab is significantly harder than in Java.
Following are some pictures of the complete hardware setup of our project.
Overall, the results of our device are satisfactory. The prototype we implemented can successfully recognize the actions defined by the predefined states, transmit each action to the PC, and map it to an arbitrary keystroke. We tested a mapping file for the game Super Mario, and the manager successfully mapped the left, right, jump, and fire gestures.
Our speed of execution is limited by the timing of the A/D conversion and the response time of the accelerometers. According to the Atmega644 datasheet, the A/D conversion time is significantly shorter than the sensor response time. Since the bandwidth of both the MMA1260D and MMA2260D is 50 Hz, sampling faster than that yields no additional information. Moreover, the electrical response time is 2 ms, so we allotted 10 ms for each A/D conversion. The total time to update one gesture is under 70 ms, so the update frequency of about 14.3 Hz exceeds the bandwidth of typical hand movement (~10 Hz). Based on our experience, the controller is sensitive and responds rapidly to hand actions.
We separately tested the accuracy of detecting hand movement, hand direction, finger movement, and the overall action. For each test, we made sure the actions were performed in every direction.
|Test Type||Total Tests||Correct||Accuracy (%)|
|Hand Move||60||47||78.3|
|Hand Direction||60||60||100.0|
|Finger Movement||120||97||80.8|
|Action||45||35||77.8|
Based on our results, we conclude that accuracy was very high for static measurements such as hand direction, while accuracy for dynamic movements was lower due to noise.
We enforced safety in several ways. We used insulating tape to completely wrap all exposed metal contacts so that no short circuit can occur. Consequently, the only parts the user touches are the device itself, the USB port, and the power plug, all of which are safe. In addition, we made sure the project contains no sharp objects or small loose components. Even so, we state that this project is not suitable for use by children under age 10. Finally, our project poses no interference issues, since it operates at a relatively low voltage and does not intentionally emit electromagnetic radiation.
The idea of our project is simple, but as explained before, there is a tradeoff between the expandability of the action library and usability. We took the approach of predefining all basic movements and the most common actions suitable for game input (such as aim/fire), while leaving the user the freedom to expand the set of recognizable gestures. The control mapping manager is easy to use, and once it is set up, the device behaves just like a keyboard.
Overall, our design met our initial expectations in several respects. It detected hand motion and gesture with high accuracy, correctly used a state machine to update the action, and successfully mapped the output to a keyboard event. If we were to do the project again, we would search for more suitable accelerometers, design a better hardware layout, improve the PC-end manager, and possibly improve our noise-control algorithm by adding a training function.
Our design conforms to the RS-232 standard for serial communication. The standard is handled by the USB-to-serial adaptor on the MCU side and the Java RXTX functions on the PC side.
As mentioned before, we used the open-source RXTX library to handle the serial connection. The library's wiki page is http://rxtx.qbang.org/wiki/index.php/Main_Page
This project does not pose any ethical concerns, and we closely observed the IEEE Code of Ethics. In making our decisions, we always kept design safety in mind by avoiding high voltages and making sure the project contains no short circuits, sharp objects, or small components. There are no conflicts of interest in this project. All claims made in this report are honest: we understand our limits as inexperienced engineers and had no intention of hiding, exaggerating, or misstating any observations or conclusions. Neither of us accepted bribery in any form, nor would we if it were offered.
In our design we made efforts to promote the understanding of technology, and nothing in the design discriminates against any particular race, religion, gender, or nationality. We believe technology should improve people's quality of life; although accelerometer technology is mature, we employed it in a new application and developed a novel and fun control device. Throughout the process we remained open to criticism and suggestions. We did not seek to injure others or their projects through false or malicious action, and we were glad to help fellow classmates whenever they had questions or resources to share.
contains all MCU-side code for hand action recognition.
contains the PC-end user interface. To use: 1. Extract the archive. 2. Install the RXTX library according to the included instructions. 3. Open Eclipse and switch the workspace to the extracted directory. 4. Choose "Run -> Run Configurations -> Java Application"
|Item||Quantity||Unit Cost ($)||Source|
|Custom PC board||1||4.00||Lab|
|Pololu USB Programmer||1||19.95||Pololu.com|
We would like to thank the following individuals and companies:
Freescale and ST electronics, for their donation of accelerometers.
Prof. Bruce Land, for teaching this course and providing assistance and suggestions
Allison Smyth and other 4760 TA’s for assisting in many aspects of the project
Thank you very much!