Taking a robotic toy to the next level
- Dave Coleman
- Ethan Landress
- Lauren Schmidt
- Vince Thiele
Expand the abilities of Pleo, the off-the-shelf animatronic dinosaur made by Ugobe. Using an overhead camera, Pleo is more aware of his world and can accomplish various goals. So far, Pleo can locate fruit, plan trajectories around obstacles, knock over model Tokyo skyscrapers, and avoid contact with a secondary robot.
- Commands sent to Pleo via USB serial port
- Overhead Logitech webcam captures environment
- Software written in MS Visual Studio C++
- A-Star algorithm currently implemented for path finding
- OpenCV vision library used for image processing
- Background Subtraction
- Blob Detection
- HSV Filtering
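As a rough sketch of what the HSV filtering step does under the hood (OpenCV's cvtColor and inRange handle this for us on whole images; the helper names and ranges below are ours, purely for illustration):

```cpp
#include <algorithm>
#include <cmath>

// Convert one RGB pixel (components in [0,1]) to HSV (H in [0,360),
// S and V in [0,1]) and test it against a hue/saturation/value window --
// roughly what cv::cvtColor followed by cv::inRange does per pixel.
struct Hsv { double h, s, v; };

Hsv rgbToHsv(double r, double g, double b) {
    double mx = std::max({r, g, b});
    double mn = std::min({r, g, b});
    double d  = mx - mn;
    double h  = 0.0;
    if (d > 0.0) {
        if (mx == r)      h = 60.0 * std::fmod((g - b) / d, 6.0);
        else if (mx == g) h = 60.0 * ((b - r) / d + 2.0);
        else              h = 60.0 * ((r - g) / d + 4.0);
        if (h < 0.0) h += 360.0;   // wrap negative hues into [0,360)
    }
    double s = (mx > 0.0) ? d / mx : 0.0;
    return {h, s, mx};
}

bool inHsvRange(const Hsv& p, const Hsv& lo, const Hsv& hi) {
    return p.h >= lo.h && p.h <= hi.h &&
           p.s >= lo.s && p.s <= hi.s &&
           p.v >= lo.v && p.v <= hi.v;
}
```

Pixels that pass the range test become the white blobs in the extracted images shown further down the page.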
**Project 1**
While we had discussed ideas and begun experimenting with Visual Studio, MiGIO, Pleo, and the OpenCV libraries before, today was our first official working meeting. Heretofore, Vince and Lauren had been working on getting the Pleo to walk and turn (both of which he now does quite well), Dave had started working on blob detection, to find Pleo and the fruit using the appropriate HSV combinations, and Ethan had been poking at code here and there, mostly hoping some major insight on one of our problems might just fall into his lap…it's not very effective. -_-
During the meeting, Lauren and Vince measured the Pleo's movements (how fast can he go, how much space can he turn in, etc.), and improved his movement abilities by making him receptive to touch, moving in different ways based on which of his tactile sensors felt any pressure. Meanwhile, Dave made great progress with blob detection and extraction, modifying the sample code to be able to select colors from an image from the camera and to isolate the colors of the Pleo and fruit.
As these seemingly disjoint areas of work now begin to converge upon one another, the main challenge before us is to make Pleo see and respond to the images we are extracting from the camera. Ethan has now written a basic control method to determine the angle and distance between Pleo and the fruit, with blank areas to be filled in later with code which actually moves the Pleo. Due to the way Pleo's motion currently works (tactile sensor input —> motor output), it seems that the most plausible solution will be to use camera information to send signals to Pleo which make him think that he is feeling pressure on one of his tactile sensors, which will in turn cause Pleo to move as we want, whether he knows why he is moving or not.
TL;DR - Pleo walks well, but doesn't really know where to go yet.
We met for a while tonight, with Vince and Lauren working on improving Pleo's movement functions, and Ethan trying to figure out how to both control Pleo (after finding his angle and distance from the fruit) and how to send information to Pleo which would, as mentioned above, make him think he had a sensor active, and hopefully move accordingly.
One of the major issues to be corrected with Pleo's movement is that he tends to sway and move his head a bit when he walks, which can introduce error into our control code, since that code relies on the positions of Pleo's head and backside. By modifying Pleo's default walking methods, we were able to make the control code more accurate.
Speaking of control code, no one really had any experience with serial ports and the like in C++, so Ethan decided to whip up a fresh batch of MiGIO-flavored copypasta. After some trial and error and a number of linking errors, the serial port code began to come together; however, we did not have the webcam at this meeting, so no real testing could be done.
We met for a while tonight as well, and finally succeeded in getting Pleo to move to the fruit (most of the time), based on the information he received from the camera. There are still minor problems to be worked out: Pleo still sways a bit and does not turn terribly sharply, it is sometimes difficult for the camera to find Pleo and the fruit in more than one setting, and the control logic could use a bit more fine tuning. Despite these remaining problems, Pleo is beginning to behave as desired, and he should be ready to demo when the time comes.
Ethan and Lauren met for a little while today to test out the Pleo's new (and improved?) control code, and to see how difficult it would be to adapt Pleo's blob detection to a different surface. For your information, it's pretty difficult. -_- As for progress up to now, as long as Pleo doesn't start, say, right next to the fruit, he is generally able to work his way to the fruit. Now essentially all that remains before demoing Pleo's behavior is to fine tune the changes we have already made, hopefully to the end that Pleo can move and know where to go on multiple surfaces. Hopefully our next meeting will see the completion of Pleo's code and our report.
During tonight's meeting, Pleo's movements were further improved, and the implementation of his angle finding was changed substantially. The original idea had been to use acos and an old CalcII formula, but this returned an unsigned angle, which made it really hard to decide if Pleo needed to change the direction of his turn. The new way, using atan2(), returns a signed angle, which fit in much more nicely with Pleo's existing control logic, and made it much easier for Pleo to find his fruit. Regarding Pleo himself, he now has fancy new shoes for traction and trendiness, and is no longer shy about letting us know how happy he is to reach his fruit.
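The atan2() fix can be sketched like this (the names are illustrative, not our actual code): given Pleo's tail and head points from blob detection plus the fruit's location, we want a signed turn angle, positive one way and negative the other.

```cpp
#include <cmath>

// Signed angle between Pleo's heading (tail -> head) and the direction
// to the fruit (head -> fruit). Positive means turn one way, negative
// the other -- which is exactly the sign that acos() throws away.
struct Pt { double x, y; };

double turnAngle(Pt tail, Pt head, Pt fruit) {
    double hx = head.x - tail.x,  hy = head.y - tail.y;   // heading vector
    double tx = fruit.x - head.x, ty = fruit.y - head.y;  // target vector
    // atan2 of the 2D cross product and dot product gives a signed
    // angle in (-pi, pi], unlike acos(dot / (|a||b|)).
    double cross = hx * ty - hy * tx;
    double dot   = hx * tx + hy * ty;
    return std::atan2(cross, dot);
}
```

With a signed angle, the control logic only has to check the sign to pick a turn direction, rather than guessing which way an unsigned angle "opens."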
Pleo performed fairly well during the demo for project 1, but not as well as we had hoped he would. Pleo easily found and reached the fruit in the first trial (fruit straight ahead), but required two attempts to find the fruit in the other trials (fruit to the side and fruit behind). While Pleo's movements and control seemed spot on (owing heavily to a new method of determining when to back up utilizing the ratio of the distances between Pleo's front to the fruit and Pleo's back to the fruit), we sometimes ran into trouble locating the objects (especially if they were moved). This problem was mainly caused by the fact that we always tested our program on a plain white table and reset the program between every trial, not accounting for the multiple colors in the floor or the fact that Pleo may be moved without allowing for a restart of our program. This will be an important issue to address for project 2.
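The back-up heuristic boils down to something like this (the threshold here is a placeholder, not the value we actually tuned): if Pleo's head is much farther from the fruit than his tail is, the fruit is behind him, and backing up beats trying to turn in place.

```cpp
#include <cmath>

struct P { double x, y; };

double dist(P a, P b) { return std::hypot(a.x - b.x, a.y - b.y); }

// If the head-to-fruit distance exceeds the tail-to-fruit distance by
// some ratio (1.2 is an invented example value), the fruit is behind
// Pleo and he should back up instead of turning toward it.
bool shouldBackUp(P head, P tail, P fruit, double ratio = 1.2) {
    return dist(head, fruit) > ratio * dist(tail, fruit);
}
```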
Due to hectic schedules, the team had hardly any time at all to meet during the week before Spring Break, so we all attempted to begin parts of the projects over the break. Fun Fact: At Georgia Tech, spring breaks you.
We met today, and worked for several hours. Dave perfected the vision algorithms he had been working on over spring break (which detect and amplify the differences in two images (i.e. the background before and after adding obstacles (lol, it's Lisp))). Meanwhile, Vince constructed a model of Pleo's possible motions to be used in modifying Pleo's movements to possibly include turning in place and other, more complex, behaviors. Ethan and Lauren are working on modifying A* search to search the graph that will be formed from Dave's image of differences mentioned above, to get Pleo a path to and from the fruit which does not touch any obstacles.
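Dave's image-differencing step amounts to something like this toy version (OpenCV's absdiff and threshold do the real work on full cv::Mat images; this operates on a flat grayscale buffer just to show the idea):

```cpp
#include <vector>
#include <cstdlib>

// Compare a background frame to the current frame (grayscale, row-major)
// and mark every pixel whose brightness changed by more than a threshold.
// The white (255) pixels in the mask are the new obstacles.
std::vector<unsigned char> diffMask(const std::vector<unsigned char>& bg,
                                    const std::vector<unsigned char>& cur,
                                    int threshold) {
    std::vector<unsigned char> mask(bg.size(), 0);
    for (std::size_t i = 0; i < bg.size(); ++i)
        if (std::abs(int(bg[i]) - int(cur[i])) > threshold)
            mask[i] = 255;   // changed pixel -> obstacle candidate
    return mask;
}
```

Down-sampling that mask onto a coarse grid is what turns the camera image into the occupancy grid the planner searches.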
Tonight's meeting consisted of working out bugs and incrementally improving all parts of the algorithms that will be used to guide Pleo to the fruit and back. Dave improved the already impressive functions for extracting obstacle, Pleo, and fruit locations from the environment. Vince continued to work on modeling and creating new and improved motions for the Pleo which might make it easier for him to navigate on the grid he doesn't realize we have him walking on. Lauren and Ethan worked on writing A*, but soon realized that trying to treat pointers as arrays in C++ leads to all kinds of failure.
The purpose of this meeting was to refine Pleo's algorithm for locating the fruit (without touching obstacles) enough that we could focus on motion and manipulation for the rest of the time allotted for the project. After working out all of the bugs in Pleo's A*, however, we discovered…that we had actually made the Bug2 algorithm (and here you thought writing dumb algorithms was hard). We did eventually complete the intended A* algorithm (see the pictures below), and it worked out for the best that we had messed up a bit before (algorithm #2, get!). Now the challenge remains to actually get Pleo to the fruit using the coordinates he has from his A* algorithm, and get him back to his starting location. Oh, and we have about one day to do it.
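For reference, a minimal grid A* in the spirit of what we ended up with (the grid encoding, 4-connected moves, and unit step costs are illustrative, not lifted from our code):

```cpp
#include <queue>
#include <vector>
#include <utility>
#include <functional>
#include <climits>
#include <cstdlib>
#include <algorithm>

// A* on a grid: 0 = free, 1 = obstacle. Moves are 4-connected with unit
// cost; the heuristic is Manhattan distance (admissible for this move
// set). Returns the path from start to goal as (row, col) cells, or an
// empty vector if the goal is unreachable.
using Cell = std::pair<int, int>;

std::vector<Cell> aStar(const std::vector<std::vector<int>>& grid,
                        Cell start, Cell goal) {
    int rows = int(grid.size()), cols = int(grid[0].size());
    auto idx = [cols](Cell c) { return c.first * cols + c.second; };
    auto h = [&](Cell c) {
        return std::abs(c.first - goal.first) + std::abs(c.second - goal.second);
    };
    std::vector<int> g(rows * cols, INT_MAX);      // best cost-so-far
    std::vector<int> parent(rows * cols, -1);      // for path reconstruction
    using QItem = std::pair<int, Cell>;            // (f = g + h, cell)
    std::priority_queue<QItem, std::vector<QItem>, std::greater<QItem>> open;
    g[idx(start)] = 0;
    open.push({h(start), start});
    int dr[] = {1, -1, 0, 0}, dc[] = {0, 0, 1, -1};
    while (!open.empty()) {
        Cell c = open.top().second; open.pop();
        if (c == goal) break;
        for (int k = 0; k < 4; ++k) {
            Cell n{c.first + dr[k], c.second + dc[k]};
            if (n.first < 0 || n.first >= rows ||
                n.second < 0 || n.second >= cols) continue;   // off-grid
            if (grid[n.first][n.second] == 1) continue;       // obstacle
            int ng = g[idx(c)] + 1;
            if (ng < g[idx(n)]) {                             // better route
                g[idx(n)] = ng;
                parent[idx(n)] = idx(c);
                open.push({ng + h(n), n});
            }
        }
    }
    std::vector<Cell> path;
    if (g[idx(goal)] == INT_MAX) return path;      // no route found
    for (int i = idx(goal); i != -1; i = parent[i])
        path.push_back({i / cols, i % cols});
    std::reverse(path.begin(), path.end());
    return path;
}
```

Feeding the resulting cell list to the control code (one waypoint at a time) is what "walking the grid" means in the entries above.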
We met one last time for Project 2, and alas, we didn't get quite done. Pleo can get to the fruit along his A* path just fine, but he is having trouble getting back. It's comforting that we still seem to be doing as well or better than most of the other groups…I guess the next best thing to winning alone is failing with everyone else. :) We got a minor miracle on one trial for Project 1; I guess that's the goal for this project's demo as well.
Pleo made it to the fruit, but did not bring the fruit back, which is nearly all we could hope for, given the time constraints. Our blob detection worked just fine this time, which means we have fixed our largest problem from Project 1, we just did not have enough time to work on getting Pleo back while holding the fruit.
At our first meeting for the final project, we decided what needs to be done to make Pleo destroy Tokyo without being stopped by Rovio, and assigned individual tasks. Basically, we will be treating Rovio as an obstacle, and have a queue of buildings to visit, shortest distance first. Pleo will visit each building, walk through it to knock it down, and move on to the next building by replanning with Rovio's new location and the next building in the queue as the goal. Other things to do include creating our building (the Mode Gakuen Cocoon Tower), and precisely modeling Pleo's movements mathematically.
Pleo's movements are being planned and simulated now based on a model of his actions. We are hopeful that through testing we will be able to create a very accurate model for Pleo which will get him to every tower and help him avoid Rovio (and not cause him to get off course due to modeling errors).
We met for many hours tonight, but finally got Pleo moving in response to the commands he gets sent from the model. The problem is, it is very hard to get the model to match Pleo's actual actions. This is because Pleo's movements depend on his state, so a left turn after a left turn goes farther than a left turn after a right turn. With the deadline drawing near, we may be faced with ignoring the model altogether, and using our old control code to move Pleo.
During this meeting, we did indeed decide to revert Pleo's control code to all of project 2's angle and distance minimizing glory. While the motion stub technique worked perfectly for modeling and simulation, the volatile motions of the Pleo were just too difficult to model reliably. This being done, our Pleo actually began demonstrating rudimentary intelligence, and became able to plan and execute a path to a building. One of the major challenges of the night was how to get points in the best locations of the buildings, and how to let Pleo know when he has actually hit a building. Rovio is being treated as an obstacle (project 2 style) with a pseudo-configuration space applied to his location. Pleo has furthermore been armored (and looks like a total badass), which will add stability and make it nearly impossible for an enemy Rovio to flip him or push him off balance.
Jurassic Park wins! Our hours and hours (and hours) of hard work were not wasted; our Pleo handily outperformed the competition, both Pleos and Rovios. What a great way to end the year!
2/21/10 - Some pictures marking our progress.
Some of the output from the blob detection/extraction that Dave has been working with, showing the image from the overhead camera on the left, and the resulting image, with desired colors extracted as white on black, on the right, and the code handling it all in the background.
The fruits of our blob extracting labor, a Pleo and an apple, with two marks on the Pleo (one near the head, one near the tail), and one mark in the center of the fruit, courtesy of Dave. These points are used to calculate the angle and distance separating Pleo's head from the sweet, sweet fruit he longs to nom.
Below are some pictures of the results of our planning algorithms. Each grid uses black for an empty square, green for Pleo or the fruit, and white for an obstacle.
Bug2: what we accidentally created while going for A*. Note how Bug2 Pleo stays right up against the configuration space when he doesn't know where to go. In this picture, the squares surrounding the obstacle represent the configuration space, and the blue squares represent Pleo's path. Just ignore the green square inside the obstacle…
A*: Now that's more like it, nice and optimal. In this picture, the configuration space is represented as black squares with a yellow outline, the path Pleo should take is made up of the blue squares, and the cyan squares represent every node considered during the A* algorithm.
These results are also quite relevant, as they show that Bug2 actually is a pretty good estimator for A* if one is short on time or resources.
2/23/10 - We now have a remote control for our Pleo
The files that make him work are in the files section below.
To compile the file to load on Pleo's SD card, make sure "pleo_control.upf" and "sensors.p" are in the folder \PleoDevelopmentKit\examples\pleo_control along with the build.bat file (for Windows). In the command window, call "build pleo_control" and it should create a bunch of folders. The file you want to move to the SD card is the one that's in the "Build" folder. Now put the SD card in Pleo, and connect the USB cable to your computer. Turn Pleo on and he should move into his neutral position and wait for your command.
To tell him what to do, run the pleoDriver.jar file. This will connect to him over the serial port and, if successful, open a remote control window. He should now walk in the directions you want him to go. I also included the Pleo.java file that I created the pleoDriver.jar file from, in case you want to modify it.