Inspired by the popular iDog from around 2005, our team developed an interactive robot dog that reacts to music commands. Each team member used the robot car, the Segbot, and various sensors to realize this project.
The robot dog responds to tones of different frequencies, such as whistle notes, to execute commands: sitting, begging, barking, giving a paw, fetching, and turning. It expresses different emotional states using LED lights and speakers.
Our team divided this project into three main tasks: implementing movement, expressing emotions, and developing user interaction features. The robot car represents the dog on all four legs, while the Segbot depicts the dog standing on its hind legs. We also developed a mechanism using PWM-driven RC servo motors that lets the robot car rise into the Segbot orientation, mimicking a dog sitting up and begging.
This project aims to go beyond a simple toy, exploring potential applications in assistance and service, security and surveillance, social interaction, and research. Each team member collaborated to implement these functions, with the ultimate goal of completing a realistic and interactive robot dog. For more info on how it works, check out our video, where we go into detail on how the code works!
CAD
Using SolidWorks and the base robot given to us, we mounted servo brackets to the base, modified the servo horns, and designed arms with some likeness to a dog's legs. For positioning, we ran a motion study in SolidWorks to simulate how the robot would behave as the arms pushed against the ground. This gave us an optimized position that kept our center of gravity low and pushed the correct half of the robot off the ground. The STL files are linked below.
Printing/Assembly
For print settings and orientation, use the standard 0.2 mm or 0.3 mm layer-height profile for your printer of choice, and make sure to mirror the arm components. The ears can be printed upright with no supports using the standard settings; they do not need to be mirrored, just duplicated. Assembly was done with M4 nuts and bolts, but whatever hardware you have on hand works as well; just drill the holes larger in your design or scale the parts appropriately.
State Machine
To track which motion needs to be performed for a given input command, we used a state machine. Many cases were used, with case 5 being the most frequently entered: this is the listening case, and the state machine remains in case 5 until a pitch is identified, at which point the corresponding case is entered. An important aspect of the state machine was ensuring that any command could be executed from within any state, so each step of the transition from one command to another had to be accounted for. For example, if the current case was 10 (begging mode) and the pitch for case 40 (lowering into robot car mode) was played, the state machine would first have to stop the begging motion, return the robot to balancing Segbot mode, and then lower it into car mode.
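A minimal sketch of this listening loop is below. The state numbers come from the description above (5 = listening, 10 = begging, 40 = lowering into car mode), but the function names and the `motion_done` flag are illustrative, not the project's exact code:

```c
/* State numbers taken from the writeup: 5 = listening,
   10 = begging, 40 = lowering into robot-car mode. */
enum { STATE_LISTEN = 5, STATE_BEG = 10, STATE_LOWER = 40 };

/* One step of a simplified state machine: stay in the listening
   state until a pitch is identified (pitch_state != 0), then jump
   to that command's state. Command states run until their motion
   completes, then return to listening. */
int next_state(int state, int pitch_state, int motion_done)
{
    switch (state) {
    case STATE_LISTEN:
        return (pitch_state != 0) ? pitch_state : STATE_LISTEN;
    case STATE_BEG:     /* keep begging until the motion finishes */
    case STATE_LOWER:   /* keep lowering until the motion finishes */
        return motion_done ? STATE_LISTEN : state;
    default:
        return STATE_LISTEN;
    }
}
```

The real state machine also chains intermediate cases (e.g. beg → balance → lower) so that any command is reachable from any state, rather than returning straight to listening.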
Pitch Identification
To command the robot dog to perform each task, musical notes are played and registered by the microphone. Since there are six actions, six notes were used: A, B, C, E, F, and G, all in the sixth octave. A fast Fourier transform (FFT) approach identifies which pitch is being played by constantly checking the power spectrum for a large-amplitude peak. When the detected frequency stays within the designated range of a pitch for at least five listening cycles, the note is acknowledged as a command and the corresponding case in the state machine is entered. The commands represented by each note are: A - sit, B - beg, C - lower to robot car mode, E - paw, F - fetch, and G - spin.
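The classify-then-debounce step might look like the sketch below. The note frequencies are standard sixth-octave equal-temperament values; the ±20 Hz tolerance and the function names are assumptions, since the writeup does not give the exact ranges:

```c
#include <math.h>

#define HOLD_CYCLES 5       /* cycles a pitch must persist (from writeup) */
#define TOL_HZ      20.0f   /* assumed tolerance around each note */

/* Sixth-octave note frequencies (Hz) and the command each triggers. */
static const float note_hz[6]  = { 1760.0f, 1975.5f, 1046.5f,
                                   1318.5f, 1396.9f, 1568.0f };
static const char  note_cmd[6] = { 'A', 'B', 'C', 'E', 'F', 'G' };

/* Return the command letter if freq falls in a note's band, else 0. */
char classify_pitch(float freq_hz)
{
    for (int i = 0; i < 6; i++)
        if (fabsf(freq_hz - note_hz[i]) < TOL_HZ)
            return note_cmd[i];
    return 0;
}

/* Debounce: only accept a note once it has been seen for HOLD_CYCLES
   consecutive listening cycles. Returns the accepted command, or 0
   while still waiting. */
char debounce_pitch(char note, char *last, int *count)
{
    if (note != 0 && note == *last) {
        if (++(*count) >= HOLD_CYCLES) { *count = 0; return note; }
    } else {
        *last  = note;
        *count = (note != 0) ? 1 : 0;
    }
    return 0;
}
```

In the real code the frequency would come from the dominant FFT bin each listening cycle; the five-cycle hold keeps a brief noise spike from triggering a command.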
Spin
When the robot hears the spin note, it turns a full circle. We store the robot's angle at the moment the note is heard in "initial_angle" and its current angle in "bearing". When bearing - initial_angle exceeds 2*pi radians (360 degrees), we stop the robot. We tried a range of turn speeds and settled on one that is neither so slow that the turn takes a long time nor so fast that the robot overshoots the desired angle.
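The stop condition reduces to a one-line check; the sketch below assumes `bearing` accumulates continuously (does not wrap at 2*pi), and the 0.6 rad/s turn rate is an illustrative value, not the team's tuned one:

```c
#define TWO_PI 6.28318530718f

/* Returns 1 once the robot has completed a full revolution since
   the spin note was heard; assumes bearing accumulates (no wrap). */
int spin_done(float bearing, float initial_angle)
{
    return (bearing - initial_angle) >= TWO_PI;
}

/* One control step: command a fixed turn rate until the revolution
   is complete, then stop. The rate is an assumed placeholder. */
float spin_turn_cmd(float bearing, float initial_angle)
{
    return spin_done(bearing, initial_angle) ? 0.0f : 0.6f;
}
```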
Fetch
As mentioned, we use musical notes to command our robot and a state machine coding approach to switch between commands and states. When the 'fetch' note is played, the 'fetch' variable is set to one, and the processor executes the fetch section inside the if (fetch == 1) statement. At the beginning of that section, we store the goal coordinates sent from LabVIEW. Before commanding the robot to move, we also save its initial position and bearing angle; these serve as the second goal, since we want the robot to return to its starting position after reaching the first goal. The next step uses the 'xy-control' function. This function receives the addresses of 'turn' and 'Vref', the goal and current positions, and threshold constants such as the target radius and the near-target radius; the latter determines how close the robot must get to the goal before it is considered reached. 'xy-control' sets 'Vref' and 'turn' so that the robot drives forward ('Vref') while adjusting its bearing angle ('turn') to eliminate the angle error as it moves. It also returns 1 when the robot reaches its goal. So when the robot reaches the goal, 'danNear' equals one, indicating that the next goal is the robot's initial position. When it returns there, 'danNear_2' becomes one, signifying that the fetch command is over and all flags should be reset to their default values.
Resource: please check out the user_ xy file.
Emotions and Barking
The robot expresses emotions and sounds using LEDs and a buzzer. When the dog is happy, the word 'HAPPY' is displayed on the LED while cheerful music plays through the buzzer. Similarly, when the mood is sad, the word 'SAD' is shown on the LED, accompanied by sad music. In this way, we did our best to represent a dog's emotions using the LED display and the sounds from the buzzer.
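One simple way to organize this is a melody table per mood that a buzzer routine steps through; the structure below is a sketch with made-up frequencies and durations, not the project's actual tunes:

```c
/* Hypothetical mood tables: each melody is a list of
   (frequency in Hz, duration in ms) pairs for the buzzer,
   shown alongside the LED text. Values are illustrative. */
typedef struct { float hz; int ms; } Note;

static const Note happy_tune[] = { {1047.0f, 150}, {1319.0f, 150}, {1568.0f, 300} };
static const Note sad_tune[]   = { { 880.0f, 400}, { 831.0f, 400}, { 784.0f, 600} };

/* Pick the melody for the current mood; writes its length to *len. */
const Note *mood_tune(int happy, int *len)
{
    *len = 3;
    return happy ? happy_tune : sad_tune;
}
```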
LabVIEW Interface for the Fetch Command
LabVIEW was used to communicate with the robot while executing the fetch command. Code Composer communicates with LabVIEW, with the robot's IP address used as the TCP connection input. The robot's x and y coordinates and bearing angle are transmitted to LabVIEW over Wi-Fi. To specify a goal position, the coordinates are entered into the LabVIEW interface, and clicking SEND transmits them back to the robot. The robot's current position is saved so it can return to its starting point. The code used to send data to LabVIEW is shown below.
if (NewLVData == 1) {
    NewLVData = 0;
    printLV1 = fromLVvalues[0];
    printLV2 = fromLVvalues[1];
    printLV3 = fromLVvalues[2];
    printLV4 = fromLVvalues[3]; // project: x_g sent from LabVIEW
    printLV5 = fromLVvalues[4]; // project: y_g sent from LabVIEW
    printLV6 = fromLVvalues[5]; // project: start flag
    printLV7 = fromLVvalues[6];
    printLV8 = fromLVvalues[7];
}

// project: we only use LabVIEW to receive x, y, and bearing;
// the remaining slots carry filler/debug values
if ((numTimer1calls % 62) == 0) { // change to the counter variable for your selected 4 ms timer
    DataToLabView.floatData[0] = xr;
    DataToLabView.floatData[1] = yr;
    DataToLabView.floatData[2] = bearing;
    DataToLabView.floatData[3] = printLV2;
    DataToLabView.floatData[4] = printLV2;
    DataToLabView.floatData[5] = (float)numTimer0calls;
    DataToLabView.floatData[6] = (float)numTimer0calls*4.0;
    DataToLabView.floatData[7] = (float)numTimer0calls*5.0;
    LVsenddata[0] = '*'; // header for LV data
    LVsenddata[1] = '$';
    // pack each 16-bit rawData word into the send buffer, low byte first
    for (int i = 0; i < LVNUM_TOFROM_FLOATS*4; i++) {
        if (i % 2 == 0) {
            LVsenddata[i+2] = DataToLabView.rawData[i/2] & 0xFF;
        } else {
            LVsenddata[i+2] = (DataToLabView.rawData[i/2] >> 8) & 0xFF;
        }
    }
    serial_sendSCID(&SerialD, LVsenddata, 4*LVNUM_TOFROM_FLOATS + 2);
}
Kat: CAD, Printing/assembly, Video Editing, State Machine
Diya: State Machine, Pitch Identification, Beg
Dahui: Spinning, Fetch
Fatemeh: Spinning, Fetch
Jun: Emotions and Barking