While we had discussed ideas and begun experimenting with Visual Studio, MiGIO, Pleo, and the OpenCV libraries before, today was our first official working meeting. Heretofore, Vince and Lauren had been working on getting the Pleo to walk and turn (both of which he now does quite well), Dave had started working on blob detection to find Pleo and the fruit using the appropriate HSV combinations, and Ethan had been poking at code here and there, mostly hoping some major insight on one of our problems might just fall into his lap…it's not very effective. -_-
During the meeting, Lauren and Vince measured the Pleo's movements (how fast he can go, how much space he needs to turn in, etc.) and improved his movement abilities by making him receptive to touch, moving in different ways based on which of his tactile sensors felt any pressure. Meanwhile, Dave made great progress with blob detection and extraction, modifying the sample code to select colors from the camera image and to isolate the colors of the Pleo and fruit.
As these seemingly disjoint areas of work now begin to converge upon one another, the main challenge before us is to make Pleo see and respond to the images we are extracting from the camera. Ethan has now written a basic control method to determine the angle and distance between Pleo and the fruit, with blank areas to be filled in later with code that actually moves the Pleo. Due to the way Pleo's motion currently works (tactile sensor input —> motor output), it seems the most plausible solution will be to use camera information to send signals to Pleo that make him think he is feeling pressure on one of his tactile sensors, which will in turn cause Pleo to move as we want, whether he knows why he is moving or not.
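The "fake tactile sensor" idea above could be sketched roughly like this. Everything here is hypothetical: the sensor names and the 15-degree threshold are placeholders we made up for illustration, and the actual mechanism for injecting a fake touch signal into Pleo is still to be worked out.

```java
// Hypothetical sketch of the camera-to-virtual-touch mapping described above.
// The sensor names and angle threshold are assumptions, not our real code.
public class VirtualTouchController {

    public static final String TOUCH_LEFT  = "left_side";   // assumed sensor name
    public static final String TOUCH_RIGHT = "right_side";  // assumed sensor name
    public static final String TOUCH_HEAD  = "head";        // assumed sensor name

    // Given the camera-derived angle to the fruit (degrees; here we assume
    // positive means the fruit is off to Pleo's left), pick which tactile
    // sensor to pretend was pressed so that his existing touch-driven
    // movement code turns him toward the fruit.
    public static String sensorFor(double angleDeg) {
        if (angleDeg > 15)  return TOUCH_LEFT;   // fruit well to the left: turn left
        if (angleDeg < -15) return TOUCH_RIGHT;  // fruit well to the right: turn right
        return TOUCH_HEAD;                       // roughly lined up: walk forward
    }
}
```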
TL;DR - Pleo walks well, but doesn't really know where to go yet.
2/21/10 - Some pictures (currently much too small) marking our progress.
Some of the output from the blob detection/extraction Dave has been working on: the image from the overhead camera on the left, the resulting image (with the desired colors extracted as white on black) on the right, and the code handling it all in the background.
The fruits of our blob extracting labor, a Pleo and an apple, with two marks on the Pleo (one near the head, one near the tail), and one mark in the center of the fruit, courtesy of Dave. These points are used to calculate the angle and distance separating Pleo's head from the sweet, sweet fruit he longs to nom.
OM NOM NOM
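The angle-and-distance calculation from those three marks could look something like the sketch below. This is only an illustration under our own assumptions (the class and method names are made up, and we assume the blob detector hands us pixel coordinates for the head mark, tail mark, and fruit mark); note that in image coordinates, where y grows downward, the sign of the angle may be flipped from what you'd expect on paper.

```java
// Sketch of the angle/distance math between Pleo and the fruit, assuming
// pixel coordinates for the head, tail, and fruit marks. All names here
// are hypothetical placeholders, not our actual code.
public class PleoGeometry {

    // Straight-line pixel distance from Pleo's head mark to the fruit mark.
    public static double distanceToFruit(double headX, double headY,
                                         double fruitX, double fruitY) {
        double dx = fruitX - headX;
        double dy = fruitY - headY;
        return Math.sqrt(dx * dx + dy * dy);
    }

    // Signed angle (degrees) between Pleo's heading (tail -> head vector)
    // and the direction from his head to the fruit. Zero means the fruit
    // is dead ahead; the sign tells us which way to turn.
    public static double angleToFruit(double tailX, double tailY,
                                      double headX, double headY,
                                      double fruitX, double fruitY) {
        double headingAngle = Math.atan2(headY - tailY, headX - tailX);
        double fruitAngle   = Math.atan2(fruitY - headY, fruitX - headX);
        double diff = Math.toDegrees(fruitAngle - headingAngle);
        // Normalize to (-180, 180]
        while (diff <= -180) diff += 360;
        while (diff > 180)   diff -= 360;
        return diff;
    }
}
```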
2/23/10 - We now have a remote control for our Pleo
The files that make him work are in the files section below.
To compile the file to load onto Pleo's SD card, make sure "pleo_control.upf" and "sensors.p" are in the folder \PleoDevelopmentKit\examples\pleo_control along with the build.bat file (for Windows). In the command window, run "build pleo_control" and it should create a bunch of folders. The file you want to move to the SD card is the one in the "Build" folder. Now put the SD card in Pleo and connect him to your computer via USB. Turn Pleo on and he should move into his neutral position and wait for your command.
To tell him what to do, run the pleoDriver.jar file. This will connect to him over the serial port and, if successful, open a remote control window. He should now walk in the directions you want him to go. I've also included the Pleo.java file from which I created pleoDriver.jar, in case you want to modify it.
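For a sense of what a driver like this does before any bytes hit the serial port, here's a toy sketch of a button-to-command mapping. The key bindings and command strings are invented for illustration; the real protocol lives in Pleo.java and the pleo_control script.

```java
// Hypothetical sketch of the kind of button-to-command mapping a remote
// control driver might use before writing to the serial port. The keys
// and command names below are assumptions, not the actual protocol.
public class RemoteCommands {

    public static String commandFor(char key) {
        switch (key) {
            case 'w': return "walk_forward";  // assumed command name
            case 's': return "walk_backward"; // assumed command name
            case 'a': return "turn_left";     // assumed command name
            case 'd': return "turn_right";    // assumed command name
            default:  return "stop";          // any other key halts him
        }
    }
}
```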
/*Team members feel free to add details (especially about things you've done that I don't know the details of), add pictures, correct errors, think of a cool team name, etc…*/