pololu 3pi ir tracking and control V2

I made some progress with the pololu tracking and control project.

To allow for better control, I updated the communication protocol between the desktop program and the robot. The previous version supported a minimal set of commands (stop, forwards, spin left, spin right). This set has been replaced by a pair of values directly controlling the speed of the left and right motors, which allows us to write a more “ambitious” target-following algorithm.
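
A minimal sketch of what such a command could look like, assuming a hypothetical framing with one start byte followed by the two signed speed bytes (the real byte layout may differ):

    // Hypothetical framing: a start byte followed by two signed speed bytes.
    // The actual wire format used by the project may differ.
    #include <cstdint>
    #include <vector>

    std::vector<uint8_t> makeMotorCommand(int8_t leftSpeed, int8_t rightSpeed) {
        const uint8_t START_BYTE = 0xFF;              // hypothetical packet marker
        return { START_BYTE,
                 static_cast<uint8_t>(leftSpeed),     // left motor speed, e.g. -100..100
                 static_cast<uint8_t>(rightSpeed) };  // right motor speed, e.g. -100..100
    }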

The new heuristic has two steps: speed and direction. We first determine the distance between the tracked robot and the target and set the speed component in proportion to that distance (close = slow, far = fast). The direction is calculated in a similar fashion: we turn towards the target, and the bigger the angle difference between the 3pi's current heading and the target, the stronger the turn. This remains a simple algorithm, but I kind of like the behavior; there is more life to it. A rough sketch of the idea follows, and the video below illustrates the behavior (the white dot represents the position of the target, the orange dot the position of the tracked 3pi robot).
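
In code, the two-step heuristic looks roughly like this; the gains, limits and vector math are placeholders, not the values actually used:

    // Sketch of the two-step heuristic: speed from distance, turn from angle error.
    // Gains and limits are made-up placeholders.
    #include <algorithm>
    #include <cmath>

    struct MotorSpeeds { float left; float right; };

    MotorSpeeds follow(float robotX, float robotY, float headingAngle,
                       float targetX, float targetY) {
        float dx = targetX - robotX;
        float dy = targetY - robotY;

        // Speed component: proportional to distance (close = slow, far = fast).
        float distance = std::sqrt(dx * dx + dy * dy);
        float speed = std::min(distance * 0.5f, 100.0f);          // placeholder gain and limit

        // Direction component: proportional to the angle error towards the target.
        float desiredAngle = std::atan2(dy, dx);
        float error = std::atan2(std::sin(desiredAngle - headingAngle),
                                 std::cos(desiredAngle - headingAngle)); // wrap to [-pi, pi]
        float turn = error * 30.0f;                                // placeholder gain

        return { speed - turn, speed + turn };
    }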

3pi pololu robot controlled via ir tracking: second test from david hodgetts on Vimeo.

pololu 3pi robot controlled via computer vision: first test

I was recently working on some boid-like procedural animations and I was thinking it would be fun to transpose such a system into physical space. Theo recently received a collection of 5 pololu 3pi robots, and I decided they would be ideal candidates for such a project.

The 3pi pololu robots are fairly simple, so I figured it would be easier to consider the robots as brainless and blind particles under the control of an all-seeing “smart” desktop computer.

The first problem was remote communication with the 3pi. Thankfully, Theo was kind enough to mount an xbee module on one of the bots. This allows remote communication via a serial port abstraction.

The second problem was vision: the desktop controller needs to know the position of the 3pi. As I wanted to create a test environment quickly, I fired up CCV instead of writing custom tracking code. CCV is an open-source tracker usually used for multitouch surfaces.

I thought it would be interesting to track the 3pi with infrared light, as this would allow an overhead projection to be added later without influencing the tracking. I used an IR-sensitive ps3 eye webcam, and Theo added an infrared led to the 3pi to make it visible to the camera (it’s the led on top of the mast in the 3pi picture). The setup was ready to track and send the position of the 3pi to the main desktop program. Now that I knew the position of the 3pi, it was time to make it find and move to a new arbitrary position (the target).

For a first test, I opted for a very naive strategy. The situation is the following:
1. we know the position of the 3pi via tracking but not its orientation (visually tracking the orientation is too big of a problem for this first test).
2. we can control the motors of the 3pi, making it move forwards and turn, but we can’t tell it to move a known distance or turn a known angle (for instance, you can’t tell it to turn 30 degrees).

However, if we tell the bot to move forwards and track the movement, we get a new position; comparing it with the starting position gives us a vector, and that vector is the robot’s current heading.
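
A minimal sketch of that idea (the names are made up):

    // Estimate the heading by comparing two tracked positions while driving forwards.
    #include <cmath>

    struct Vec2 { float x; float y; };

    // Returns the heading angle in radians, given the position before and after
    // a short forward move. Only meaningful if the robot actually moved.
    float estimateHeading(const Vec2& before, const Vec2& after) {
        return std::atan2(after.y - before.y, after.x - before.x);
    }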

The next step was to decide on a simple vocabulary of movements for the 3pi. I decided it could either move forwards, spin to the right, spin to the left, or stand still. The heuristic is then quite simple (a sketch in code follows the list):

1. move forwards

2. are we close enough to the target to consider it reached?
   if yes
      stop, all done
   else
      are we getting closer to the target?
      if yes
         continue forwards
      else
         decide if the target is to the left or to the right of the 3pi and spin in the corresponding direction, then goto 1
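
A rough sketch of that decision step, with placeholder thresholds and names; only the decision structure matches the list above:

    #include <cmath>

    struct Vec2 { float x; float y; };
    enum class Command { Forward, SpinLeft, SpinRight, Stop };

    float dist(const Vec2& a, const Vec2& b) {
        return std::hypot(a.x - b.x, a.y - b.y);
    }

    Command decide(const Vec2& previousPos, const Vec2& currentPos, const Vec2& target) {
        const float ARRIVAL_RADIUS = 10.0f;                // "close enough" threshold, made up

        if (dist(currentPos, target) < ARRIVAL_RADIUS)
            return Command::Stop;                          // close enough: all done
        if (dist(currentPos, target) < dist(previousPos, target))
            return Command::Forward;                       // getting closer: keep going

        // Getting further away: is the target to our left or to our right?
        Vec2 heading  = { currentPos.x - previousPos.x, currentPos.y - previousPos.y };
        Vec2 toTarget = { target.x - currentPos.x,      target.y - currentPos.y };
        float cross = heading.x * toTarget.y - heading.y * toTarget.x;
        // The left/right sign convention depends on the screen's coordinate system.
        return (cross > 0) ? Command::SpinLeft : Command::SpinRight;
    }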

Granted, this is very naive, but fine for a proof-of-concept stage.
I used openFrameworks to read the tracking information and to communicate serially (via the xbee) with the 3pi.
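
For the serial side, openFrameworks' ofSerial class is roughly what is involved; the port name, baud rate and the single command byte below are assumptions, not the project's actual values:

    #include "ofMain.h"

    ofSerial serial;

    void setupSerial() {
        serial.setup("/dev/tty.usbserial-A1001", 9600);   // hypothetical xbee port and baud rate
    }

    void sendCommand(unsigned char command) {
        serial.writeByte(command);                        // e.g. forwards / spin left / spin right / stop
    }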

You can see the first result in the following video. On the screen, the white dot represents the target and the blue dot is the tracked position of the 3pi.

As you can see the basic system is in place, but there is still a lot of work to get a more fluid experience : )

Flock test 02

Just a quick update to my flock system.
I added attractor/repulsor and wind forces. In this example, the movement of the mouse triggers a repulsor.
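
The repulsor is essentially a force pushing each boid away from the mouse, stronger when the boid is close; a sketch with made-up constants:

    // Sketch of a mouse repulsor. Strength and radius are placeholders.
    #include <cmath>

    struct Vec2 { float x; float y; };

    Vec2 repulsorForce(const Vec2& boidPos, const Vec2& mousePos) {
        const float RADIUS   = 150.0f;   // influence radius, placeholder
        const float STRENGTH = 500.0f;   // force scale, placeholder

        float dx = boidPos.x - mousePos.x;
        float dy = boidPos.y - mousePos.y;
        float d  = std::sqrt(dx * dx + dy * dy);
        if (d > RADIUS || d < 0.0001f) return { 0.0f, 0.0f };

        float push = STRENGTH / (d * d);             // falls off with distance squared
        return { dx / d * push, dy / d * push };
    }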

To make things interesting visually, each boid is the head of a chain of springs, and each chain of springs in turn drives a ribbon curve.
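
Roughly, each link of the chain springs towards the node ahead of it, and the chain's points become the control points of the ribbon; a sketch, with placeholder constants:

    // Sketch of a chain of springs trailing behind a boid. Each node is pulled
    // towards the previous one; the resulting points can feed a ribbon curve.
    // Spring stiffness and damping are placeholders.
    #include <vector>

    struct Node { float x, y; float vx = 0, vy = 0; };

    void updateChain(std::vector<Node>& chain, float headX, float headY) {
        const float K = 0.2f, DAMPING = 0.85f;    // made-up stiffness and damping
        if (chain.empty()) return;
        chain[0].x = headX;                       // the boid drives the head of the chain
        chain[0].y = headY;
        for (size_t i = 1; i < chain.size(); ++i) {
            chain[i].vx = (chain[i].vx + (chain[i - 1].x - chain[i].x) * K) * DAMPING;
            chain[i].vy = (chain[i].vy + (chain[i - 1].y - chain[i].y) * K) * DAMPING;
            chain[i].x += chain[i].vx;
            chain[i].y += chain[i].vy;
        }
    }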

To do: improve the visuals. Play with Perlin noise.

Flocking test

Playing around with some flocking code (adapted from Daniel Shiffman’s implementation).

I intend to add some controls to the canonical flocking forces (separation, alignment, and cohesion) to make the system reactive to external input (multitouch or camera). I also have to think about the rendering; the current colorama is just for fun. A rough sketch of how those weighted forces combine is below.
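
The control I have in mind is mostly about weighting those three forces, which is where the external input could plug in. The individual force vectors are assumed to be computed elsewhere (Shiffman-style); names and weights here are placeholders:

    // Combine the three canonical flocking forces with adjustable weights.
    struct Vec2 { float x; float y; };

    Vec2 combineFlockingForces(Vec2 separation, Vec2 alignment, Vec2 cohesion,
                               float sepWeight, float aliWeight, float cohWeight) {
        return { separation.x * sepWeight + alignment.x * aliWeight + cohesion.x * cohWeight,
                 separation.y * sepWeight + alignment.y * aliWeight + cohesion.y * cohWeight };
    }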