Project 3 - The Conversationalist
It's a good thing we didn't need to do much mechanical work on the robot for this project, because I spent nearly all of my time working on circuitry. After all, I was the electrical guy on the team! This time around, our robot needed to be able to "hear" a 3.5kHz tone, figure out which direction it was coming from, and move to within a few inches of the source. My tolerance for annoying sounds was also pushed to the limit, as I was constantly subjected to a 3.5kHz tone emanating from the lab while everyone persistently tested and tweaked their own circuits. I still don't understand why we couldn't have used a less audible tone, say 40kHz...

Project Requirements

This robot only had a few tasks to complete. The challenge was to create a capable circuit and to plan an adequate strategy to complete said tasks!
  1. Listen for a 3.5kHz tone.
  2. Determine location of the sound source.
  3. Move to within a few inches of the sound source.
  4. Do not exit the playing field - a 3 meter wide circle outlined with black nylon tape.

Mechanical Design

As mentioned, we didn't change the mechanical design of the robot much from project 2 - just enough to fit the new sensors and circuitry. I also tried to remove a few pieces I felt were unnecessary from the original build.

Electrical Design

This wasn't a huge challenge for me; it just took a bit of basic electronics theory and the right parts. To start, I created a simple block diagram of the different circuit stages.

Obviously, the sound must be captured by something - in this case, a microphone. It is then amplified before being filtered so that only the 3.5kHz signal gets through. The next stage converts the AC signal into a DC voltage level, which is fed into the XBC for measurement. The XBC then (in theory) directs the robot where to go based on the intensity (volume) of the sound. Although power could have been drawn from the XBC directly, a 9V battery was converted to a usable 5V with an LM2931 regulator in parallel with a 0.1uF and a 10uF capacitor.

A gain amplifier and active bandpass filter are used to block out any sounds the robot doesn't need to hear. I wanted to create a bandpass filter centered at 3.5kHz with a frequency range of 500Hz in either direction. The signal first travels through a passive high-pass filter with a corner frequency calculated to be 3386.3Hz using Equation 1. The gain amp stage was created using an LM2274 op-amp in a typical first-order low-pass filter arrangement with a corner frequency of 4336.6Hz (Equation 1). The bandpass filtering stage was again created with the LM2274 op-amp, but in a higher-order filter arrangement. It was centered at 3497.6Hz (Equation 2) with a bandwidth of 677.3Hz (Equation 3) and a gain of 23.5 (Equation 4) within this band. The schematic is shown below. Resistors R3, R4, and R5 correspond to resistors R1, R2, and R3 of the equations, respectively, and the identical value of capacitors C3 and C4 is used in the equations for C. Simulating this circuit in PSpice yielded the following frequency spectrum result.
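The equation images themselves aren't reproduced here, but for a first-order RC stage and a multiple-feedback bandpass stage like the one described, the standard design formulas look like the following. (That these are the exact equations used is an assumption based on the circuit description, not taken from the original images.)

```latex
% Equation 1 - corner frequency of a first-order RC stage
f_c = \frac{1}{2\pi R C}

% Equation 2 - center frequency of the multiple-feedback bandpass stage
f_0 = \frac{1}{2\pi C}\sqrt{\frac{R_1 + R_3}{R_1 R_2 R_3}}

% Equation 3 - bandwidth of the bandpass stage
BW = \frac{1}{\pi R_2 C}

% Equation 4 - gain at the center frequency
A_0 = \frac{R_2}{2 R_1}
```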

The intensity measurement stage uses an active half-wave rectifier to ignore the negative voltages while monitoring the peak voltage, creating a DC voltage signal with minimal ripple across the sinusoidal wave peaks at the filter output. Because a few people were having problems with this part of the design, our instructor actually gave us the schematic to use. I hadn't actually gotten to this stage yet because we were still discussing how the robot should behave, but there was no sense in reinventing the wheel, so to speak, so I used it!
This circuit was also simulated in PSpice, with the output of the previous two stages serving as the input. A plot of voltage output vs. time was generated, with the circuit being initiated at 1ms. It can be noted that the input and output have different center lines. This is because the input of this stage is a sinusoidal waveform centered at 2.5 volts, while the output is desired to idle around 0V like a standard DC signal.
Although this circuit worked flawlessly, it was not enough to complete the tasks. A second microphone and an identical circuit were added so that the robot would have two sound readings to compare. A phase comparator was also added to monitor the difference between the two sound waveforms; its output is proportional to the offset angle between the sound source and a line running between the two microphones. Apparently, students were having trouble with this new circuit addition as well, so the solution was given to us yet again. Like before, I hadn't even reached this stage yet, but decided to use what was given to us to save time. The complete circuit was built on a breadboard and secured to the top of the robot. The circuit is shown below with the individual stages noted.
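We never ended up using the comparator's output, but the idea behind it can be sketched in software: given the timing offset between the two mic signals, the phase difference falls out of the tone's period. The function below is a hypothetical illustration of that relationship, not the circuit's actual behavior:

```c
/* Convert the time offset (seconds) between rising zero-crossings of the
   two mic signals into a phase difference in degrees, wrapped to [-180, 180). */
double phase_difference_deg(double t_left, double t_right, double freq_hz) {
    double period = 1.0 / freq_hz;
    double dt = t_right - t_left;
    while (dt >= period / 2.0) dt -= period;  /* wrap into half a period */
    while (dt < -period / 2.0) dt += period;
    return 360.0 * dt / period;
}
```

A zero result means the source sits on the perpendicular bisector between the mics, which is why the hardware comparator could, in principle, indicate direction.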


We had planned to use two microphones long before the "phase comparator" circuit was given to us, so we never even bothered incorporating it into our strategy. The robot would be placed in the center of the arena facing any direction. The sound source would then be placed at some arbitrary location within the arena with the speaker facing the center. To start, the robot would rotate in place, constantly comparing the current sound level from each microphone with both the other microphone and the previous readings. If either mic was generating a consistently higher level, the robot would continue to turn in that direction until that sound level began to drop. It would then rotate back to the previous angle and begin to drive towards the sound source, occasionally stopping to recheck the levels and compare between the two mics, slightly rotating in either direction if necessary.
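A single decision step of that comparison logic might look something like this (the enum, function name, and noise margin are illustrative assumptions; the real code ran on the XBC and is not shown here):

```c
/* One decision step of the seek strategy: turn toward the louder mic,
   or drive forward when the two levels are roughly balanced. */
typedef enum { TURN_LEFT, TURN_RIGHT, DRIVE_FORWARD } action_t;

action_t next_action(int left_level, int right_level) {
    const int margin = 10;  /* ignore small A/D noise between the mics */
    if (left_level > right_level + margin)  return TURN_LEFT;
    if (right_level > left_level + margin)  return TURN_RIGHT;
    return DRIVE_FORWARD;   /* balanced levels: the source is ahead */
}
```

The margin matters: without it, tiny mismatches between the two circuits would make the robot jitter left and right instead of driving forward.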

The total gain of the microphones had actually been designed to max out the A2D converter in the XBC when the robot reached the correct distance from the sound source, since no distance sensors of any kind could be used. To ensure the robot stayed within the arena, two light sensors were placed in dark tubes and pointed towards the ground. As long as they saw some reflection of light, the robot was within the arena. If the sensors saw an absence of light, the robot was about to cross the black outline of the arena, and it would redirect its course accordingly.
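Both of those conditions reduce to simple threshold checks, sketched below. The 8-bit A/D range and the dark threshold are assumptions about the XBC, not values from our actual code:

```c
/* Stop condition: the mic gain was tuned so the A/D saturates at the
   target distance (assuming an 8-bit converter reading 0-255). */
int close_enough(int mic_level) {
    return mic_level >= 255;
}

/* Boundary check: the downward-facing light sensors read high (dark)
   over the black tape outline (assumed sensor polarity). */
int at_boundary(int left_light, int right_light) {
    const int dark_threshold = 200;  /* illustrative A/D value */
    return left_light > dark_threshold || right_light > dark_threshold;
}
```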


Although our strategy seemed pretty sound at the time (no pun intended), there were many things we did not originally take into consideration. The microphone placement could have been better to more easily distinguish between the two intensity levels, and the phase comparator could have helped determine what direction the sound was coming from. We also had to change the way the robot moved and recorded data because of the Doppler effect. The biggest problem we faced was getting the robot to figure out what direction the sound was actually coming from. We humans take this ability for granted since we don't have to work at it, but it can be difficult even for us to determine the origin of a sound when just one of our ears is covered, injured, or muffled.

In the end, we almost completed the task successfully, but ended up running over the sound source instead of stopping in front of it. This happened because the robot got a little lost at one point while trying to find the source, and after turning away from the edge of the arena, it began to approach the sound source from the side. Although it was still able to drive directly towards the source, the measured intensity of the sound was much lower because of the approach angle, so the robot failed to realize how close it actually was. Oh well, at least the light sensors kept us from driving out of the arena and straight into a disqualification!

Our robot is back for more.

A closeup of the new front sensors on the bot.

Equation 1

Equation 2

Equation 3

Equation 4

Completed Block Diagram

The Phase Comparator

A portion of this exact circuit was reused in Project 5. Good thing it worked so well...