pololu 3pi ir tracking and control V2

I made some progress with the pololu tracking and control project.

To allow for better control, I updated the communication protocol between the desktop program and the robot. The previous version supported a minimal set of commands (stop, forwards, spin left, spin right). This set has been replaced by a tuple of values directly controlling the speed of the left and right motors. This allows us to write a more “ambitious” target following algorithm.
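
To give a concrete idea of what that looks like, here is a minimal sketch of how the desktop (openframeworks) side could pack and send such a tuple; the frame layout (header byte, speed range, the sendMotorSpeeds name) is an assumption for illustration, not the exact wire format we use.

/* Illustrative sketch only: the actual frame layout used by the project may differ. */
#include "ofMain.h"

void sendMotorSpeeds(ofSerial &serial, int left, int right) {
  // clamp to the speed range the robot expects (assumed -100..100 here)
  left  = ofClamp(left,  -100, 100);
  right = ofClamp(right, -100, 100);

  unsigned char frame[3];
  frame[0] = 0xFF;                          // assumed start-of-frame marker
  frame[1] = (unsigned char)(left  + 100);  // shift to 0..200 so each speed fits in one byte
  frame[2] = (unsigned char)(right + 100);
  serial.writeBytes(frame, 3);              // out over the xbee serial link
}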

The new heuristic has two components, speed and direction. We first determine the distance between the tracked robot and the target, and set the speed in proportion to that distance (close = slow, far = fast). The direction is calculated in a similar fashion: we turn towards the target, and the bigger the angle difference between the current heading of the 3pi and the target, the stronger the turn. This remains a simple algorithm, but I kind of like the behavior. There is more life to it. The following video illustrates it (the white dot represents the position of the target, the orange dot the position of the tracked 3pi robot).

3pi pololu robot controlled via ir tracking: second test from david hodgetts on Vimeo.
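
For reference, the gist of the speed/direction heuristic looks roughly like the sketch below (gains, ranges and the sendMotorSpeeds helper are illustrative assumptions, not the project's actual code):

#include "ofMain.h"

// Illustrative only: proportional speed/turn heuristic, not the exact code used.
void driveTowards(ofSerial &serial, ofVec2f robotPos, float robotHeading, ofVec2f targetPos) {
  ofVec2f toTarget = targetPos - robotPos;
  float distance   = toTarget.length();
  float desiredDir = atan2f(toTarget.y, toTarget.x);

  // speed component: proportional to distance (close = slow, far = fast)
  float speed = ofMap(distance, 0, 300, 0, 60, true);

  // direction component: the bigger the heading error, the stronger the turn
  float error = ofWrapRadians(desiredDir - robotHeading);
  float turn  = 40.0f * (error / PI);

  // differential drive: split speed and turn over the two motors
  sendMotorSpeeds(serial, (int)(speed - turn), (int)(speed + turn));
}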

pololu 3pi robot controlled via computer vision: first test

I was recently working on some boid-like procedural animations and I thought it would be fun to transpose such a system into physical space. Theo recently received a collection of 5 pololu 3pi robots, and I decided they would be ideal candidates for such a project.

The 3pi pololu robots are fairly simple, so I figured it would be easier to consider the robots as brainless and blind particles under the control of an all-seeing “smart” desktop computer.

The first problem was remote communication with the 3pi. Thankfully, Theo was kind enough to mount an xbee module on one of the bots. This allows remote communication via a serial port abstraction.

The second problem was vision: the desktop controller needs to know the position of the 3pi. As I wanted to create a test environment quickly, I fired up CCV instead of writing custom tracking code. CCV is an open source tracker usually used for multitouch surfaces.

I thought it would be interesting to track the 3pi with infrared light; this would allow an overhead projection to be added later without influencing the tracking. I used an IR-sensitive ps3 eye webcam, and Theo added an infrared led to the 3pi to make it visible to the camera (it’s the led on top of the mast in the 3pi picture). The setup was ready to track and send the position of the 3pi to the main desktop program. Now that I knew the position of the 3pi, it was time to make it find and move to an arbitrary target position.

For a first test, I opted for a very naive strategy. The situation is the following:
1. we know the position of the 3pi via tracking, but not its orientation (visually tracking the orientation is too big a problem for this first test).
2. we can control the motors of the 3pi to make it move forwards and turn, but we can’t tell it to move or turn a known amount (for instance, we can’t tell it to turn 30 degrees).

However, if we tell the bot to move forwards and track the movement, we get a new position; comparing it with the starting position gives us a vector, and now we have a direction.
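
Something along these lines (illustrative sketch, not the project code):

#include "ofMain.h"

// Illustrative: estimate the 3pi's heading from two successive tracked positions.
// Only meaningful if the robot was actually driving forwards between the two samples.
float estimateHeading(ofVec2f previousPos, ofVec2f currentPos) {
  ofVec2f movement = currentPos - previousPos;
  return atan2f(movement.y, movement.x); // heading angle in radians
}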

The next step was to decide on a simple vocabulary of movements for the 3pi. I decided that it could either be moving forwards, spinning to the right, spinning to the left or, finally, standing still. The heuristic is then quite simple:

1. move forwards
2. are we close enough to the target to be considered at the final position?
   if yes:
      stop, all done
   else:
      are we getting closer to the target?
      if yes:
         continue forwards
      else:
         decide if the target is to the left or to the right of the 3pi, spin in the corresponding direction, then go to 1

Granted, this is very naive, but fine for a proof-of-concept stage.
I used openframeworks to read the tracking information and to communicate serially (over the xbee link) with the 3pi.
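
The decision step of that loop could look roughly like this (command names, thresholds and the sign convention are assumptions for illustration):

#include "ofMain.h"

enum Command { STOP, FORWARD, SPIN_LEFT, SPIN_RIGHT };

// Illustrative sketch of the naive follow heuristic described above.
Command decideCommand(ofVec2f robotPos, ofVec2f previousPos, ofVec2f targetPos,
                      float previousDistance) {
  float distance = robotPos.distance(targetPos);

  if (distance < 20) return STOP;                   // close enough: all done
  if (distance < previousDistance) return FORWARD;  // getting closer: keep going

  // otherwise, spin towards the side the target is on
  ofVec2f heading  = robotPos - previousPos;        // direction of travel so far
  ofVec2f toTarget = targetPos - robotPos;
  float side = heading.x * toTarget.y - heading.y * toTarget.x; // 2D cross product
  return (side < 0) ? SPIN_LEFT : SPIN_RIGHT;       // sign depends on the screen's y axis
}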

You can see the first result in the following video. On the screen, the white dot represents the target and the blue dot is the tracked position of the 3pi.

As you can see the basic system is in place, but there is still a lot of work to get a more fluid experience : )

Hello RS485 with Arduino

Nothing fancy here, just trying to get started with RS485 and Arduino. So here we go with a basic setup: one master that only transmits and one slave that only listens. See the comments for the wiring:

/* RS 485 V1 Master, using a SN75176BP

                              -------
                          RO -|     |- VCC [connect to 5v]
                              |     |
                          RE -|     |- B-------------->   [connect to Slave's B]
                              |     |        | 120R (parallel resistor)
          connect to 5V   DE -|     |- A-------------->   [connect to Slave's A]
                              |     |
   connect to pin 1 (TX)  DI -|     |- GND [connect to GND]
                              -------

*/
const int ledPin = 13;      // the pin that the LED is attached to

void setup() {
  Serial.begin(9600);
  pinMode(ledPin, OUTPUT);
}

void loop() {

  byte b = 0;

  Serial.write(b);         // the byte goes out on TX/DI and onto the RS485 bus
  analogWrite(ledPin, b);  // mirror it on the LED (pin 13 has no PWM, so this is just off/on)
  delay(1000);

  b = 255;

  Serial.write(b);
  analogWrite(ledPin, b);
  delay(1000);
}

And now for the slave:

/* RS 485 V1 SLAVE, using a SN75176BP

                              -------
   connect to pin 0 (RX)  RO -|     |- VCC [connect to 5v]
                              |     |
   connect to GND  RE -|     |- B-------------->   [connect to Master's B]
                              |     |        | 120R (parallel resistor)
                          DE -|     |- A-------------->   [connect to Master's A]
                              |     |
                          DI -|     |- GND [connect to GND]
                              -------

YOU'LL HAVE TO DISCONNECT RO DURING UPLOAD TO I/O BOARD!!!!!!!!!                          

*/
const int ledPin = 13;      // the pin that the LED is attached to

void setup() {
  Serial.begin(9600);
  pinMode(ledPin, OUTPUT);
}

void loop() {
  byte brightness;

  if (Serial.available()) { // check if a byte has arrived from the master (via RO -> RX)

    brightness = Serial.read(); // read the most recent byte (0 to 255)

    analogWrite(ledPin, brightness); // set the LED accordingly (pin 13 has no PWM, so >= 128 reads as full on)
  }
}

If all goes according to plan, you should see something like this:



Labo workbench light, a bit more clever than a switch

Using an old Arduino, a MOSFET, 5 m of led strip and an old alarm detector (SIEMENS IR100B), I built an interesting little lighting setup.
The idea is that the leds switch on as you approach the desk.
The detector is a bit sensitive, but it does the job ok.
I was curious about the idle consumption when the leds are turned off.
Doing this project I got my answer: it’s very little, less than 100 mW according to the wattmeter.

Attached: code, schematics and video, for reuse and improvement.

Thanks to Fritzing.
/* Theo Reichel, Reichel Complex AI, 02/2010 */

int sensorPin = 2; // interrupt 0
int sensorAlimPin = 4;
int ledArrayPin = 9; // PWM
int buttonPin = 3; // interrupt 1
int ledPin = 11; // PWM

volatile bool sensor_status = LOW;
volatile bool button_pressed = LOW;

volatile unsigned int light_power;

unsigned long sensor_millis_diff = 0;
unsigned long sensor_status_age = 0;

volatile int menu;
int i;

void setup()   {
  Serial.begin(19200);
  Serial.println("Labo desk light with detector started");

  pinMode(ledArrayPin, OUTPUT);
  pinMode(sensorAlimPin, OUTPUT);
  pinMode(ledPin, OUTPUT);
  pinMode(buttonPin, INPUT);

  digitalWrite(sensorAlimPin, HIGH);

  attachInterrupt(0, sensor_trigger, CHANGE);
  attachInterrupt(1, user_button, FALLING);
}

void loop()
{
// menu
  while (menu == 0)
    detector();

  while (menu == 1)
    always_on();

  while (menu == 2)
    always_off();
}

void fadein()
{
  for (; light_power < 255; light_power++)
  {
    analogWrite(ledArrayPin, light_power);

    if (button_pressed) // button can interrupt fade
    {
      button_pressed = LOW;
      break;
    }

    delay(5);
  }
  if (light_power == 255) // electrical workaround
    digitalWrite(ledArrayPin, HIGH);
}

void fadeout()
{
  for (; light_power > 0; light_power--)
  {
    analogWrite(ledArrayPin, light_power);

    if (sensor_status) // sensor can interrupt during fadeout.
      break;

    if (button_pressed) // button can interrupt fade
    {
      button_pressed = LOW;
      break;
    }

    delay(10);
  }
  if (light_power == 0) // electrical workaround
    digitalWrite(ledArrayPin, LOW);
}

/////////// programs /////////////

void detector()
{
  analogWrite(ledPin, 50); // dim status LED to indicate detector mode (ledPin is PWM)

  if ( sensor_status )
  {
    Serial.print("update sensor_status_age: ");
    Serial.println(sensor_status_age);
    sensor_status_age = millis();  

    if (light_power < 255)
    {
      Serial.print("fadein from light power: ");
      Serial.println(light_power);

      fadein();
    }
  }
  else
  {
    sensor_millis_diff = millis() - sensor_status_age;

    if ( light_power > 0 && sensor_millis_diff > 60000 )
    {
      Serial.print("fadout from light power:");
      Serial.print(light_power);
      Serial.print(", duration without motion");
      Serial.println(sensor_millis_diff);

      fadeout();
    }
  }
}

void always_on()
{
  digitalWrite(ledPin, HIGH);

  if (light_power < 255)
    fadein();
}

void always_off()
{
  digitalWrite(ledPin, LOW); // status LED off in always_off mode

  if (light_power > 0)
    fadeout();
}

/////////// interrupts /////////////

void sensor_trigger()
{
  sensor_status = !digitalRead(sensorPin);
}

void user_button()
{
  button_pressed = HIGH;

  if (menu == 2)
    menu = 0;
  else
    menu++;
}

The next project along the same lines is to make a nice PCB, Arduino-compatible, with the FET and the detector on board.
By adding a few extras like an RTC, a light sensor and a little led display, I plan to build better light management.
For instance, if the room is bright enough the leds stay off.
If it is late in the night, meaning that I’m not supposed to be awake, the light is dimmed to keep it soft on my eyes.
And of course a single push button to cycle through the different modes.

There is probably a lot more to do to make “clever” lights.
Feel free to share your ideas, I’m very interested.
But please, don’t mention clapping your hands like in a sci-fi movie. I believe lighting is improved if it adapts to our presence without our explicit will or interaction.

Flock test 02

Just a quick update to my flock system.
I added attractor/repulsor and wind forces. In this example, the movement of the mouse triggers a repulsor.

To make things interesting visually, each boid is the head of a chain of springs, and each spring chain in turn drives a ribbon curve.
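
A rough sketch of the spring chain idea (constants and structure are illustrative, not the actual code):

#include "ofMain.h"
#include <vector>

// Illustrative: a chain of springs whose head follows a boid; the ribbon curve
// is then drawn through the chain's points.
struct SpringChain {
  std::vector<ofVec2f> points;
  std::vector<ofVec2f> velocities;

  SpringChain(int links) : points(links), velocities(links) {}

  void update(ofVec2f boidPos) {
    points[0] = boidPos;                        // the boid drives the head of the chain
    for (size_t i = 1; i < points.size(); i++) {
      ofVec2f stretch = points[i - 1] - points[i];
      velocities[i] += stretch * 0.2f;          // spring pull towards the previous link
      velocities[i] *= 0.85f;                   // damping
      points[i] += velocities[i];
    }
  }
};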

To do: improve the visuals, play with Perlin noise.

Flocking test

Playing around with some flocking code (adapted from Daniel Shiffman’s implementation).

I intend to add some controls to the canonical flocking forces (repulsion, alignment and cohesion) to make the system reactive to external input (multitouch or camera). I also have to think about the rendering; the current colorama is just for fun.
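
Concretely, the control I have in mind is just a weight per force that external input can change at runtime; a minimal sketch of that idea (names and the weighted-sum structure are illustrative assumptions):

#include "ofMain.h"

// Illustrative only: combine the three canonical flocking steering vectors,
// each scaled by a weight that external input (multitouch, camera) could
// modulate at runtime. The input vectors are assumed to come from
// Shiffman-style separate/align/cohesion calculations.
ofVec2f combineFlockingForces(ofVec2f repulsion, ofVec2f alignment, ofVec2f cohesion,
                              float repulsionWeight, float alignWeight, float cohesionWeight) {
  return repulsion * repulsionWeight
       + alignment * alignWeight
       + cohesion  * cohesionWeight;
}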



Multitouch table: finger tracking

As we mentioned in a previous post, we relied on the techniques discussed on the Nuigroup site to drive our table (see Nuigroup for specifics). They all use computer vision to solve the multitouch problem. In other words, the position of the fingers on the surface is tracked by camera.

A simple webcam can do the trick. However, it needs to be slightly modified to filter out visible light (so as to avoid capturing the projected image). Then, via a process of frame differencing and the aid of various thresholding and image processing filters, you obtain a pure black (0) and white (1) image describing the position of the elements in contact with the surface. This is then used as the basis for the tracking.
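
To give a rough idea of that processing step, here is a minimal OpenCV-style sketch of the difference-and-threshold stage (the trackers do considerably more: background adaptation, blob detection, id assignment, etc.):

#include <opencv2/opencv.hpp>

// Illustrative sketch: subtract a reference background frame and threshold the
// result to get the pure black/white blob image used as the basis for tracking.
// Both frames are assumed to be single-channel grayscale captures from the IR camera.
cv::Mat extractBlobs(const cv::Mat &frame, const cv::Mat &background, int level = 30) {
  cv::Mat diff, blobs;
  cv::absdiff(frame, background, diff);                       // what changed since the background frame
  cv::threshold(diff, blobs, level, 255, cv::THRESH_BINARY);  // everything above the level becomes white
  return blobs;
}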

In our case, we used a modified PS3 eye webcam, which is relatively cheap and has excellent frame rates (640×480 at 100 fps).

On the software side, we used Tbeta, an open source tracking solution written with the openframeworks c++ library. Tbeta tracks the elements in contact with the surface and broadcasts their position and id over UDP using the Tuio protocol.
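
On the receiving side, a client only has to listen for those Tuio messages. A minimal sketch with ofxOsc (Tuio is OSC over UDP, default port 3333; the message layout shown is the standard /tuio/2Dcur "set" profile, and the surrounding class structure is an assumption):

#include "ofMain.h"
#include "ofxOsc.h"

// Illustrative sketch: listen for Tuio cursor updates over UDP with ofxOsc.
class TuioListener {
public:
  ofxOscReceiver receiver;

  void setup() { receiver.setup(3333); }     // Tuio's default UDP port

  void update() {
    while (receiver.hasWaitingMessages()) {
      ofxOscMessage m;
      receiver.getNextMessage(&m);
      if (m.getAddress() == "/tuio/2Dcur" && m.getArgAsString(0) == "set") {
        int   id = m.getArgAsInt32(1);       // session id of the finger
        float x  = m.getArgAsFloat(2);       // position, normalised to 0..1
        float y  = m.getArgAsFloat(3);
        // hand id/x/y over to the application here
      }
    }
  }
};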

 

tbeta interface

This shows the tbeta interface in action. On the left, the source image from the webcam; on the right, the processed B/W image used for tracking.