Saturday, January 1, 2011

Lego Lab 14

Robot 2 to pick up active object mounted with infrared LED

Date: 16 - 17 December 2010
Duration: 7 hours
Participants: Kim Bjerge

Goals for the lab session
The goal for this lab session is to find a method for robot #2 to get close enough to the object to grip it and carry it away.

To achieve milestone goal:
2 - Robot 2 is able to pick up an object

From the WBS we have the following outstanding actions for robot #2:
2 - Pick up object and move it away
2 - Robot2 SW architecture of functional behaviour (input to the final architecture)

Robot #2 to pick up object
We learned from the previous lab session 13 that it is difficult to get close enough to the object to pick it up. We found that robot #2 needs to be very close to the object before the object can be detected.
Figure 1: Collector robot in front of the active object with a white high luminosity led inserted.

By mounting LEDs on the object we get better readings, see the table below. The object is a white round styrofoam ball, see the picture above. We used an object mounted with a white LED and an infrared LED. The table below shows the readings from the left and right RTX light sensors.
Distance from object | No LED (L-R light sensor) | White LED (L-R light sensor) | Infrared LED (L-R light sensor)
30 cm | 29-30 | 37-41 | 31-35
20 cm | 29-31 | 39-40 | 34-38
10 cm | 30-32 | 39-45 | 35-44
5 cm | 29-32 | 30-38 | 37-46
3 cm | 29-32 | 32-33 | 35-41
2 cm | 29-33 | 31-32 | 31-33
1 cm | 30-34 | 60-66 | 60-67

Table 1: Results of the reading experiments, showing how the values read by the light sensors vary with the distance between the sensors and the light source.

The plots below show the sensor reading values for the white and infrared LEDs at different distances from the object.

Figure 2: Reading differences between left and right sensors while white visible light is used.

Figure 3: Reading differences between left and right sensors while the IR LED is used.

Table 2: Original data used to plot graphs shown in figures 2 and 3.

The table below lists the readings from the left and right RTX light sensors at different angles, at a distance of 10 cm from the object. At this distance it is possible to detect the LED light at angles up to 30 - 45 degrees from the object.
10 cm from object
Degrees from LED | White LED (L-R light sensor) | Infrared LED (L-R light sensor)
0 | 46-44 (2) | 38-43 (5)
20 | 31-43 (12) | 31-40 (9)
45 | 29-30 (1) | 29-31 (2)
65 | 28-29 (1) | 28-30 (2)
90 | 27-29 (2) | 27-29 (2)

Table 3: Data read by the sensors when the white and IR LEDs are placed at different angles.

It seems that both the powerful white LED and the infrared LED can be used to adjust the approach towards the object. The infrared LED is selected since it seems to give better and more stable readings close to the object with less power consumption: the infrared LED consumes only half the power of the white LED. It is also the most “invisible” solution.

The object is mounted with the infrared LED replacing the white LED:

Figure 4: Active object with a high luminosity LED inserted. On the lower left part of the picture the IR beacon can be seen.

Figure 5: Frontal view of the active object with the IR beacon inserted.

Note that the camera is actually able to see the light from the infrared LED. To the human eye the light is only visible in the dark.

Figure 6: IR light seen through a digital camera.

Robot 2 using a PID controller to approach the infrared light

The robot 2 construction is changed by mounting the 2 RTX light sensors very close to each other at the front of the robot. The purpose is to use the light sensor readings to steer the robot in the direction of the object with the infrared emitting LED. Similar to lab session 5 [1], where we implemented a PID controller to follow a black line, we will try to use a PID controller to move the robot towards the infrared LED light. The readings from the left and right RTX light sensors are subtracted from each other and used as input to the PID controller, which controls the power to the left and right motors. This idea was generated by experimenting with the light sensors and by inspiration from previous lab sessions. We found that the best results were achieved when the light sensors were placed very close to each other.
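The steering rule can be sketched off-robot with a proportional-only controller. This is a plain-Java simplification of the PID used on the robot; the class name, gain and power constants here are illustrative, and the sign convention follows the pidCalculate method listed later in this report:

```java
// Host-side sketch of the differential steering rule (no leJOS hardware):
// the left-right light difference is the error term, and the turn value
// lowers one wheel's power while the other wheel keeps the default
// forward power Tp. Only the proportional term is shown.
public class SteeringSketch {
    static final int TP = 65;       // default forward power (Tp in the text)
    static final double KP = 5.0;   // proportional gain (Kp in the text)

    // Clamp a power value to the NXT motor range 0..100.
    static int limitPower(int power) {
        return Math.max(0, Math.min(100, power));
    }

    // One regulation step: returns {leftPower, rightPower}.
    public static int[] steer(int leftLight, int rightLight) {
        int error = leftLight - rightLight;
        int turn = (int) (KP * error);
        if (turn > 0) {
            // Left wheel keeps Tp, right wheel is slowed by 2*turn,
            // matching the convention of the robot's pidCalculate.
            return new int[] { limitPower(TP), limitPower(TP - 2 * turn) };
        } else {
            return new int[] { limitPower(TP + 2 * turn), limitPower(TP) };
        }
    }
}
```

With equal readings both wheels get the default power; an imbalance slows one wheel by twice the turn value, curving the robot until the readings balance again.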

The pictures below show how the RTX light sensors are mounted on the robot.

Figure 7: Frontal view of the collector robot. Sensing platform and lifted gripper can be seen.

Figure 8: Detailed frontal view of the collector robot. The mounted ultrasonic and RCX light sensors can be seen.

Software for the Robot 2 gripper
The problem addressed in this lab session is to grip the object once the location of the object has been reached. In the previous lab session we demonstrated that it was possible for robot 2 to move to the location given by robot 1. In this test we will focus on writing a test program that shows how robot 2 can be controlled to approach the object mounted with the infrared LED and carry the object away.

The subsumption architecture [2], as used for robot 1, is selected, composed of 2 different behaviours. This architecture is easy to work with. In this case a combination of reactive and sequential control strategies, as described by Fred Martin [3] in chapters 5.3 and 5.4, will be used. The strategy for this test of robot 2 is based on two independent behaviours, where the FindObject behaviour has a closed-loop reactive strategy using a PID controller with feedback from the RTX light sensors. The GripObject behaviour uses a sequential control strategy and is triggered by the ultrasonic distance sensor. The first behaviour, FindObject, uses the PID controller to adjust the movement of robot 2 while closing in on the infrared LED. When the robot is very close, the GripObject behaviour suppresses the FindObject behaviour and starts a control sequence that defines how to carry the object away.
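The suppression scheme can be sketched as a small priority arbiter. This is an illustrative plain-Java model, not the actual robot code (which uses threads with suppress/release calls); in this model GripObject would simply be registered ahead of FindObject:

```java
// Minimal sketch of two-behaviour arbitration: behaviours are listed
// highest priority first, and the first one whose trigger fires gets
// to act, suppressing everything below it.
interface Behaviour {
    boolean takeControl();  // does this behaviour want the robot now?
    String action();        // the action taken (stubbed as a label here)
}

public class Arbiter {
    private final Behaviour[] prioritized; // highest priority first

    public Arbiter(Behaviour... prioritized) {
        this.prioritized = prioritized;
    }

    // One arbitration step: run the first triggered behaviour.
    public String step() {
        for (Behaviour b : prioritized) {
            if (b.takeControl()) {
                return b.action();
            }
        }
        return "idle";
    }
}
```

Registering GripObject first and FindObject second gives exactly the described effect: the robot chases the light until the grip trigger fires, at which point the grip sequence takes over.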

The FindObject behavior uses the PID controller implemented and tested in lab session 5, modified to use the two RTX light sensors.

The PID constants are set to:

Kp = 5, Ki = 0.03 and Kd = 75, with a default forward power of Tp = 65

The constants have been selected based on experience from lab session 5 [1] and re-tuned experimentally for this setup.

Below are listed the essential methods of the PIDControl class:



private void pidCalculate(int lvalue) {
    int turn;
    int error;
    float derivative = 0;

    error = lvalue - offset;
    integral = (integral * 2) / 3 + error;  // leaky integral term
    derivative = error - lastError;
    turn = (int) (Kp * error + Ki * integral + Kd * derivative);

    // Try to reduce power in turns
    if (turn > 0) {
        powerLeft = limitPower(Tp);
        powerRight = limitPower(Tp - 2 * turn);
    } else {
        powerLeft = limitPower(Tp + 2 * turn);
        powerRight = limitPower(Tp);
    }

    lastError = error;
}

public void regulateStep() {
    lightValue = leftLight.getLightValue();
    lightValue -= rightLight.getLightValue();
    pidCalculate(lightValue);
}

The regulateStep method is called from the thread in the FindObject behaviour every 5 ms (sampling rate of 200 Hz):


public void run() {
    while (true) {
        pidControl.regulateStep();
        suppress();
        forward(pidControl.getLeftPower(), pidControl.getRightPower());
        release();
        delay(5);
    }
}

The GripObject behaviour is a sequential control behaviour that is triggered when the ultrasonic sensor detects that the object is very close (distance < 10).
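The trigger condition itself is a simple threshold test. The sketch below is an illustrative plain-Java version; on the robot the distance value would come from the leJOS ultrasonic sensor rather than a method parameter:

```java
// Sketch of the GripObject trigger condition. The threshold matches the
// "distance < 10" condition described in the text.
public class GripTrigger {
    static final int GRIP_DISTANCE = 10;

    // True when the measured distance says the object is close enough
    // to start the open-loop grip sequence.
    public static boolean takeControl(int distance) {
        return distance < GRIP_DISTANCE;
    }
}
```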

Below is listed the implementation of the sequential control part of GripObject. We are using open-loop control without any sensing input from the environment. This approach is fast to implement and is fine for testing the robot #2 behaviour of carrying the object away.
       


suppress(); // FindObject behaviour is suppressed

stop();

// Move a bit forward
forward(60, 60);
delay(300);

// Stop and grip object
stop();
lowerGripArm();
drawString("s");
delay(1000);

// Turn left
forward(60, 0);
delay(1000);

// Move object forward
forward(60, 60);
delay(2000);

// Stop and lift grip arm
stop();
liftGripArm();
delay(500);

// Move backward
backward(60, 60);
delay(1000);

// Turn right
forward(0, 60);
delay(2000);

release(); // FindObject behaviour is activated again

Complete source code for the Robot 2 gripper:


Conclusion

The videos below record robot 2 following the infrared light, gripping the object and carrying it away. The videos show that it was possible to find a method, based on infrared light mounted on the object, for robot 2 to get close to the object and move it away from the arena.

We have now demonstrated a method for robot 2 to pick up an object and move it away. We have also come a bit closer to a software architecture that can be used to combine a sequential and a reactive behaviour. The next step would be to combine the navigation of robot 2 from the previous lab session 13 with our new findings to complete the software control for robot 2.

Collector robot chasing an IR beacon:
http://www.youtube.com/watch?v=vSToTj7i_TE

Collector robot reaching the object and gripping it:
http://www.youtube.com/watch?v=yUYxmIm3gCk

References

[1] Lab session 5: Line follower (PID controlled) http://josemariakim.blogspot.com/2010/10/lab-session-5-line-follower_14.html
[2] Lab session 8: Braintenberg Vehicles with Subsumption Architectures
http://josemariakim.blogspot.com/2010/11/lab-session-8-braintenberg-vehicles.html
[3] Fred Martin, Robotic Explorations: A Hands-on Introduction to Engineering,
Prentice Hall, 2001.
