Thursday, May 31, 2007

Meeting 003

In our third meeting, we finally started to assemble the prototype candy sorter. This project is going to take a while, so the week-to-week progress seems small ...

Still, we are having a great time working together, and learning how tricky going from concept to implementation can be.


We've decided to use a rotating "bucket" that will accept a single piece of candy, rotate underneath a "scanner," and then rotate back to a particular hopper, where the candy piece will be dumped.

I spent part of this meeting trying to figure out why my rotation sensor wasn't returning any values, and ultimately discovered that, for reasons I still can't explain, it simply wouldn't work on one of the three input channels available on the RCX we were using. We expect the rotation/angle sensor to make our task much easier, as it seems to be quite accurate.

The image above is actually from a video you can view, if you would like a better idea of how we envision the candy sorter working.


Above: Part of our group: Irena S., Kimberly P., "LegoDoug," "Birt," and Peter P.

Thursday, May 24, 2007

Meeting 002

In the second meeting, we got started with the project itself.

Our first task was to determine, having chosen to build a candy-sorting machine, whether we could actually sort candies reliably based on color. Our answer would come from an investigation of the light sensor and how it read colors under various conditions. I suppose we could think of this as the basic research phase, as opposed to the applied research one normally associates with a robotics project.

We fired up the Bricx Command Center, which offers some great remote-control options for the prototyping and testing stage, as well as a very decent NQC development environment. After some initial stumbling (it has been a long time since I've used this), we were able to set the sensor types and get data back in its various formats.

Above: Ron, Irena, and Laura work on testing the M&Ms in our test apparatus.

We chose M&Ms as our candy to sort primarily because the color consistency is carefully quality-controlled, so we knew that even if we ate them all, we'd be able to get more in the same colors. In addition, we rejected jelly beans because of variation in color, especially among multiple brands. This particular M&M type is also "mostly round" in shape, which should help with the color identification process, at least in theory. We surmised, additionally, that the ratio of logo size to overall surface area on plain M&Ms might produce too much deviation in color readings, although we did not positively determine whether that was the case.

Discovery: The two different light sensors we have, seen in the test candy holder above, do not produce the same readings. There is enough variation to throw off any programmed color detection. We will need to account for this when we actually get down to writing code. (This is why it's good programming practice to define constants to allow for value tweaking.)
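To illustrate the kind of tweakable constants we have in mind, here is a rough sketch in Python (our actual program will be NQC; the sensor names and the offset value below are made-up placeholders, to be replaced with numbers from our own bench readings):

```python
# Sketch: per-sensor calibration via named constants, so the two
# mismatched light sensors can share one set of color thresholds.
# "A"/"B" and the offset value are hypothetical placeholders.

SENSOR_B_OFFSET = 12  # raw units sensor B reads above sensor A (made up)

def normalize(raw_reading, sensor):
    """Map a raw reading onto sensor A's scale so the same color
    thresholds work for readings taken with either sensor."""
    if sensor == "B":
        return raw_reading - SENSOR_B_OFFSET
    return raw_reading
```

The point is simply that the correction lives in one named constant, so when we re-measure the sensors we only have to change one number.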

We suspected that ambient light, as well as the supplied light, might greatly affect the readings, and our testing confirmed that it does.

We mounted the Lego light and the Lego light sensor tangent to the candy, and checked our readings with and without the light. This turned out to be important, because the readings we got for yellow and orange were very similar, but by adding and toggling the external light source, we could easily differentiate the two under those conditions.

Because ambient light levels so dramatically affected the readings we got, we put the whole apparatus under a large box to get our final readings. We will have to enclose or shade the test chamber in our final product.

Our final test bench, above, used a pocket set at an angle to keep the M&M in the proper place.

The readings, done under a box, gave us the following results using "raw" sensor data:

Color    Value    Value with Added Light
Orange   725      565
Red      735      558
Yellow   721      617
Blue     819      571
Green    779      565
Brown    785      564

Notice that the values for orange and yellow are very close, but that if we get into that range, we can programmatically activate the light to differentiate them. Otherwise, we shouldn't need to activate the second light.
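That two-stage logic can be sketched out roughly as follows (in Python for readability here; the real program will be NQC, and the nearest-value matching is just one way to do it, using the readings from the table above):

```python
# Sketch of the two-stage color check. Baseline values (no added
# light) come from our table above; WITH_LIGHT holds the two colors
# we need the second light to tell apart.

BASELINE = {"orange": 725, "red": 735, "yellow": 721,
            "blue": 819, "green": 779, "brown": 785}
WITH_LIGHT = {"orange": 565, "yellow": 617}

def nearest(reading, table):
    """Return the color whose table value is closest to the reading."""
    return min(table, key=lambda color: abs(table[color] - reading))

def classify(reading, read_with_light):
    """read_with_light: a callback that switches on the Lego light and
    takes a second reading -- only invoked for the ambiguous range."""
    color = nearest(reading, BASELINE)
    # Orange and yellow overlap at baseline; disambiguate with the light.
    if color in ("orange", "yellow"):
        return nearest(read_with_light(), WITH_LIGHT)
    return color
```

On the RCX, the callback would become a short routine that turns on the lamp output, waits briefly, and re-reads the sensor; keeping the light off by default should also save batteries.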

Next week we will begin prototyping the mechanism, our first bit of real Lego assembly. I am eager to see how it goes.

Thursday, May 17, 2007

Meeting 001

Although I envisioned doing this long ago, we finally started a group at work doing robotics using the Lego MindStorms Robotics Invention System.

Our first meeting was spent providing an introduction to the Lego MindStorms system, demonstrating the various sensors and motors, and explaining how the RCX works. We talked about project ideas, and chose a candy sorter as our first project.

The last thing we did in meeting one was kick around some very basic design ideas for the candy sorter. In doing so, we learned that our initial concepts varied greatly from one another. This will definitely be an advantage in problem-solving, and it will be fascinating to see how we narrow our designs down to one type, and then refine it.

In this case, although we had some other variations, the two predominant design concepts were a hopper-fed mechanism and a robot that would navigate within a defined area and retrieve candy scattered there. Both seemed valid initially, but we chose the hopper-fed design because of problems we foresaw with a gathering-type robot pushing candies outside the gathering area.