This project is an experiment in implementing a human-computer interface by tracking finger and wrist motions.
“Ever wish you could control a computer just by moving your fingers or your hand?”
The primary goal is to design and build a functional prototype implementing an intuitive interface for a user to interact with a computer (or computing device). The MCU provides a control module with direct access to the sensors and currently fills the computer role of the human-computer interface. To demonstrate the interface, some sample demo applications include:
- A calculator
- A monitor
- A stopwatch
- A rock-paper-scissors game
This project required us to create analog and amplifier circuits in addition to the software modules. Our goals were to create a streamlined device that has a low profile, low weight, and low power consumption, and that is comfortable to wear.
High Level Design
The project idea initially involved a robot that draws raster output on a sidewalk. It then became a portable multiplayer gaming device, then a virtual keyboard / abacus, and finally evolved into the current project.
One of the ECE 476 assigned readings, “Synthetic Serendipity”, features a device which uses gestures and body motions to interact with a wearable computer. We immediately realized that it would be very possible to create such a Human Controllable Interface (HCI) with PVDF film.
We focused on the interface aspect because computer interfaces have not changed drastically in over 30 years (since the mouse and its cousins touchpad, trackball, and thumbstick). A major obstacle to ubiquitous and portable computing is the intrusiveness of computer interfaces. Previously, the size and awkwardness of computers presented the major obstacles to portability. However, with today’s technology, it seems that today’s primary obstacle is the lack of an intuitive and convenient interface.
The orthodox interface for current computation is the keyboard and mouse, devices that do not lend themselves very well to portability. A significant amount of research has gone into alternatives such as voice interfaces, eye focus tracking, and one-handed keypads. We intended to experiment with a system that uses hand gestures and movements.
There are two main goals to this project.
- To create a simple input mechanism
- To create example applications that use the input scheme
The primary goal is to create an input device, the Gauntlets of MicroComputation (Gauntlets of uComputation), that can be used for any application. In addition, the software infrastructure should allow almost any external application to be integrated into the system. This means that this project might be able to control or run other projects!
The project high level design involves several modules:
The sensor module is responsible for reading the fingers and hand. The signals feed into the signal processing module, which cleans them up and amplifies them for the MCU in the control module. Finally, the output pipes to the LCD inside the display module.
The equation for the cutoff frequency of the low pass filter is:
f = 1/(2*pi*R*C) ≈ 3.1 Hz
Most people will not be moving a finger more than three times per second, so the filter does not restrict operation.
The gain of a non-inverting op-amp is equal to:
A = 1 + R1/R2 ≈ 7.1
This gain is necessary to amplify the signal from the piezoresistor for the ADC.
On the hardware side, we were unable to obtain a 3-axis accelerometer (which would allow up-down motion detection) from Kionix. However, since gesture sensing is the primary focus, a 2-axis accelerometer sampled from Analog Devices provides adequate functionality.
There is a size and weight tradeoff made in order to incorporate the 4×20 LCD. If a smaller (4×16 or 2×16) LCD were used, there would be some space and weight savings. In addition, the backlight cannot be disconnected, and the contrast level is fixed by the hardware.
Another issue is using lycra gloves for the foundation of the gesture sensing. While the double lycra glove allows more reliable and accurate sensing, they may make wearing the device less than completely transparent.
While accurate sensing is always a goal, reliability and device issues force the sensing and processing to rely on reading finger position as a binary state rather than a continuum of positions.
On the software side, the limited memory on the MCU prevents us from implementing any long and complicated programs. This requires a systems programming mindset focusing on specialized code and design simplicity.
Standards, Trademarks, Copyrights, and Patents
The device software is ANSI C compliant, as much as the Codevision compiler allows, and does not infringe on any trademarks or copyrights.
There are patents for gesture input systems such as the electric field sensing but none of them use piezoelectrics for sensing motions.
The device may seem similar to the Nintendo Power Glove but there are a few differences. First, the Power Glove uses optics and light intensity to gauge finger bend while this device uses piezoresistors. In addition, the Power Glove does not sense hand motions.
The whole device is mounted on a lycra glove from Campmor and a martial-arts arm guard. Lots of electrical tape and duct tape are used to reinforce different wires and cover the device.
There are three main non-MCU hardware components:
- Piezo film strips for the fingers
- Accelerometers for the hand
- An LCD for the display module
Piezoelectric materials deform when subjected to an electric field and produce an electric charge when mechanically deformed. The equivalent electrical circuit is an AC voltage source in series with a capacitor. Thus, these materials can be used as mechanical sensors by analyzing the voltage change across the film to deduce the deformation of the film. The voltage spike is in response to a changing mechanical stress and not to the absolute stress, essentially acting like a dynamic strain meter. However, by adding in a capacitor, the charge can be stored for a longer time and make it so that the charge generated is proportional to the absolute stress. These properties are the basis of the finger-sensing scheme.
Piezo films are laid across the fingers and taped down to ensure a proper response from the piezo. However, the current generated by the piezo is very small. If the piezo is connected only to a voltmeter, the voltage spike is on the order of 3 – 5 V. When the piezo is connected to other resistive elements, such as the rest of our circuitry, the voltage spike drops to 0.3 – 0.5 V, so the signal is amplified in the signal processing module.
A 2-axis accelerometer from Analog Devices is mounted on top of the hand to track left-right and forward-back hand movement. The accelerometer is taped down to prevent shifting. Unfortunately, we were unable to obtain a 3-axis accelerometer that would allow for up-and-down movement detection.
The Analog Devices ADXL203CE accelerometer has a range of ±1.7 g with a sensitivity of 1 V/g. This allows the accelerometer to be used directly without amplification. In addition, the accelerometer has a resting-state voltage of about 2.3 V.
Signal Processing Module
This stage amplifies signals from the piezo strips. The schematic for the amplifier is displayed below.
This amplifier has a gain of approximately 7, which is enough to ensure that the signal ranges from 0 to +5 V. A low pass filter with a 3 Hz cutoff removes noise; the cutoff is high enough, since it is unlikely that someone will move a finger more than three times a second. Diodes are placed at the input to the ADC to protect the MCU.
The signals from this module feed into Port A on the MCU.
For more detail: Gauntlet of uComputation