
Clap-E acoustic tracking robot using atmega1284




Introduction

For the ECE 4760 final project, we designed and built a sound-following robot named Clap-E. As its name implies, Clap-E hears a clap and moves toward the source of the clapping. After each subsequent clap it readjusts its path toward the new position of the sound source. We did not want our robot to be sensitive to human voice or other "common" surrounding sounds, so it detects mainly clapping. However, if someone generates a sound as loud and sharp as a clap, Clap-E will follow him or her.


High Level Design

Rationale and Source of Our Project Idea

Robotics is a fast-evolving field. Our desire to build a project that would give us experience in both hardware and software led us to design a sound-following robot.

The biggest challenge, and the first step, was identifying the direction of the sound. Three sensors were used in the final design to detect sound coming from any direction. Our initial design used the difference in the voltage amplitudes received by the sensors to determine direction: the microphone closer to the sound source should receive a sound of greater amplitude than the sensor farther away. However, this did not work reliably, because the sensors are separated by only 23 cm, which is not enough distance for the sound waves to be noticeably attenuated.

Background Math

Our next approach was to measure the difference in the time of arrival of the sound wave at two microphones. The direction can then be calculated using simple trigonometry. The sensors were arranged as shown in figure 1. The wavefronts were assumed to be parallel to each other, since the sound source is far from the sensors compared with their separation. In figure 1, S1 and S0 are the microphone sensors, separated by a distance d. When the sound source S is closer to sensor S0, the sound wave has to travel an additional distance of d·sin θ in order to reach sensor S1, where θ is the angle of the source measured from the front of the sensor pair. From the basic relation between speed, distance and time, we have:

d · sin θ = ΔT × V    (1)

where,

ΔT is the difference in arrival time of the sound at the two sensors,
V is the speed of sound in air (approximately 343 m/s),
d·sin θ is the extra distance travelled by the sound wave.

Logical Structure

Using equation (1) and simple program logic, we can compute the side on which the sound source lies (left or right) and its angle from the sensors (between 0 and 90 degrees). The position of the source is then described by a pair of two elements (Left/Right, angle). With just two sensors, however, we cannot tell whether the sound source is in front of or behind the robot. This ambiguity was resolved by introducing a third sensor, S2. The third sensor significantly improved the design and enabled us to locate the sound source over a full 360 degrees. Figure 2 shows the triangular three-sensor arrangement adopted in the project. The difference in arrival time (for calculating the angle θ) is measured between S1 and S0 only; S2 is used solely to determine whether the sound originated in front of or behind the robot.

The final circuitry used for the sensors is shown in figure 3. The analog signal from sensor S0 is fed into an amplifier. The voltage gain of the amplifier is set to 20 by using R1 = 2k and R2 = 10k. The amplified voltage is passed through a peak-detector circuit, which is in turn connected to a voltage comparator. We used LM358 ICs, each containing two op-amps, to form the amplifier and comparator circuitry for each sensor. The trimpot is adjusted so that the reference voltage is 2.56 V.

The comparator's output for a single clap, as observed on the oscilloscope, is expected to look as shown in figure 4. The algorithm for determining the direction of the sound source is given in figures 5a, 5b, 5c, 5d and 5e. The three external interrupt pins of the microcontroller were used: INT0, INT1 and INT2 are configured to generate an interrupt on a rising edge. The output of each sensor's comparator, as shown in figure 3, is connected to one of the external interrupt pins. Whenever a sensor detects sound, a rising edge appears at the output of its comparator and the corresponding external interrupt is invoked.
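A sketch of this rising-edge interrupt configuration on the ATmega1284 (a configuration fragment, not the authors' exact code; per the datasheet, ISCn1:ISCn0 = 1:1 selects rising-edge triggering):

```c
#include <avr/io.h>
#include <avr/interrupt.h>

volatile uint8_t first_sensor = 0xFF;   /* which sensor fired first */

void ext_int_init(void)
{
    /* Rising edge on INT0, INT1 and INT2 generates an interrupt. */
    EICRA = (1 << ISC01) | (1 << ISC00)
          | (1 << ISC11) | (1 << ISC10)
          | (1 << ISC21) | (1 << ISC20);
    EIMSK = (1 << INT0) | (1 << INT1) | (1 << INT2);  /* enable all three */
    sei();
}

/* Record only the first sensor to hear the clap. */
ISR(INT0_vect) { if (first_sensor == 0xFF) first_sensor = 0; }  /* S0 */
ISR(INT1_vect) { if (first_sensor == 0xFF) first_sensor = 1; }  /* S1 */
ISR(INT2_vect) { if (first_sensor == 0xFF) first_sensor = 2; }  /* S2 */
```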

The initial bearing of the sound source is determined by checking which sensor triggered the first interrupt. If S0 generates the first interrupt, the system knows the source is to the right. However, the robot does not yet know whether the source is at an angle greater than 90 degrees from its front. To resolve this, we must also consider which sensor triggers the second interrupt. If S1 interrupts before S2 (the back sensor), the sound source lies within 90 degrees of the front of the robot; if S2 interrupts before S1, the source lies at an angle greater than 90 degrees. Similarly, if S1 interrupts first, the source is to the left, and by considering the order of the S0 and S2 interrupts we can determine with certainty whether the source is in the front or the rear half. The last case is that S2 interrupts first, which means the source is at an angle greater than 90 degrees from the front of the robot; whether it is to the left or the right depends on whether S1 or S0 interrupts second, respectively.
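The ordering logic above can be sketched as a small decision function (names and types are hypothetical, for illustration only):

```c
enum side { LEFT, RIGHT };

/* Classify the sound source from the order in which the sensor
 * interrupts fired.  S0 = right sensor, S1 = left sensor, S2 = back
 * sensor.  *front is set to 1 when the source lies within 90 degrees
 * of the robot's front, 0 when it is behind that line. */
enum side classify(int first, int second, int *front)
{
    if (first == 0) {              /* S0 first: source on the right */
        *front = (second == 1);    /* S1 before S2 -> within 90 degrees */
        return RIGHT;
    }
    if (first == 1) {              /* S1 first: source on the left */
        *front = (second == 0);    /* S0 before S2 -> within 90 degrees */
        return LEFT;
    }
    /* S2 first: source is behind; the side depends on which of the
     * other two sensors interrupts second. */
    *front = 0;
    return (second == 0) ? RIGHT : LEFT;
}
```

For example, the sequence S0 then S1 classifies as (Right, front), while S2 then S0 classifies as (Right, rear).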

Clap-E acoustic tracking robot using atmega1284 schematic

There is also the scenario in which multiple interrupts are invoked at the same time. For example, if S0 and S1 receive the sound at the same moment, the angle θ is 0 degrees. If all three interrupts fire at once, the source is equidistant from all three sensors, which can be treated as a "destination reached" state. This multiple-interrupt situation is handled by checking the interrupt flags INTF0, INTF1 and INTF2 in the EIFR register.
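The simultaneity check can be sketched as tests on a snapshot of the flag bits (on the AVR this snapshot would come from the EIFR register; here a plain byte stands in for it, and the function names are hypothetical):

```c
#include <stdint.h>

#define F0 (1u << 0)  /* INTF0: S0 heard the clap */
#define F1 (1u << 1)  /* INTF1: S1 heard the clap */
#define F2 (1u << 2)  /* INTF2: S2 heard the clap */

/* All three sensors fired together: the source is equidistant from
 * the sensors, treated as the "destination reached" state. */
int destination_reached(uint8_t flags)
{
    return (flags & (F0 | F1 | F2)) == (F0 | F1 | F2);
}

/* S0 and S1 together, with S2 silent: the source is dead ahead
 * (theta = 0 degrees). */
int straight_ahead(uint8_t flags)
{
    return (flags & (F0 | F1)) == (F0 | F1) && !(flags & F2);
}
```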

After determining the angle θ, which defines the direction of the sound source, the robot has to rotate toward the sound. The drive system consists of three wheels: two driven by servos, with the third acting as a support wheel. Continuous-rotation servos were used to drive the wheels. The servos require pulses between 1.3 ms and 1.7 ms wide, repeated every 20 ms; the motor datasheet is attached in the appendix. The trimpot on each servo is adjusted so that a 1.5 ms pulse is the dead centre (no rotation). A 1.7 ms pulse rotates the servo anticlockwise at full speed, while a 1.3 ms pulse rotates it clockwise at full speed. The opto-isolator circuitry used for the servos is the same as the one used in Lab 4 of the ECE 4760 course, with a few modifications, since speed and direction are changed by varying the pulse width. The servo driver circuit is shown in figure 6.
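The pulse widths described above map naturally onto a signed speed command; a minimal sketch (the microsecond values come from the text, the function name is hypothetical):

```c
/* Map a speed command in [-1.0, +1.0] to a servo pulse width in
 * microseconds, repeated every 20 ms: 1500 us is dead centre,
 * 1700 us is full anticlockwise, 1300 us is full clockwise. */
unsigned pulse_width_us(double speed)
{
    if (speed >  1.0) speed =  1.0;   /* clamp out-of-range commands */
    if (speed < -1.0) speed = -1.0;
    return (unsigned)(1500.0 + 200.0 * speed);
}
```

Driving the two wheel servos with opposite-sign commands spins the robot in place, which is how it can turn through the angle θ before driving forward.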

Parts List:

Part                  Unit Price   Quantity   Total Price   Vendor
White board           $6           3          $18           4760 Lab
Atmega 1284p          $5           1          $5            4760 Lab
Microphone sensor     $0           3          $0            4760 Lab
Capacitor             $0           11         $0            4760 Lab
Resistor              $0           33         $0            4760 Lab
Trimpot               $0           3          $0            4760 Lab
LM358                 $0           2          $0            4760 Lab
4N35                  $0           2          $0            4760 Lab
1.5V Battery          $1.25        4          $4.99         7 Eleven
Servo                 $12.99       2          $25.98        Digikey
Wheel                 $5.95        2          $11.90        Jameco
Red LED               $0           2          $0            4760 Lab
Green LED             $0           8          $0            4760 Lab
Card Board            $0           2          $0            From us
Wires                 $0           Many       $0            4760 Lab
9V Battery for MCU    $7           1          $7            4760 Lab
Battery holder        $0           2          $0            4760 Lab
TOTAL:                                        $72.87

 

For more detail: Clap-E acoustic tracking robot using atmega1284
