Saturday, 1 January 2011

Lego Lab 15

Robot 2 SW architecture and object position

Dates: 27 - 29 December 2010
Duration: 8 hours
Participants: Kim Bjerge



Goals for the lab session
The goal of this lab session is to find a method for robot #2 to navigate to the location of the object, carry the object away, and drop it off. The robot navigates to a position near the object from which the infrared LED can be seen. The object will always be positioned so that the LED points in a known direction from which robot #2 can see it.

Sprint #2
3 - Robot 2 is able to navigate to the location of the object
4 - Robot 2 is able to carry the object away and drop it off

From the WBS we have the following outstanding actions for robot #2.

2- Robot2 SW architecture of functional behavior
3- Go to a specified location (R2)
4- Drop object (R2)

Robot 2 SW architecture
In contrast to robot 1, we have chosen a sequential control strategy for robot 2, as described by Fred Martin in chapter 5.3 [1]. In this case the sequential strategy is a good fit since robot 2 knows what to do in advance: it has to wait for the location of the object, then move to the object and grip it, and finally bring the object back to the home position and drop it off. The software architecture we have chosen is the same as used for robot 1: the subsumption architecture based on the framework made by Ole Caprani [3]. It is very easy to use and allows us to have independent, prioritized behaviors that control the same motors, and we already have very good experience with it from the design of robot 1. For robot 2 we have only two different behaviors: the sequential strategy described above, and FindObject, which uses closed-loop control with the RCX light sensors and a PID regulator to get close to the active object. This is the same control as described in the previous lab session [4].

Figure 1: UML class diagram showing the software architecture of the robot.

The UML class diagram above shows the classes that define the software architecture of robot 2. The same basic principle is used as for robot 1. The SeqStrategy is the main controlling behavior of the robot. The FindObject behavior is suspended most of the time and is only active while navigating close to the active object. The Car class has been rewritten and now uses the steer method of the SimpleNavigator class instead of controlling the motors directly.
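
To illustrate the composition, here is a minimal sketch of how the two behaviors could be wired up. The Behavior base class, the constructor parameter and the start() mechanism are assumptions, since the framework code from [3] is not reproduced in this post.

// Hypothetical wiring sketch - the Behavior base class, a constructor
// taking a name, and thread start() are assumed from the framework [3].
public class Robot2Main {
    public static void main(String[] args) {
        // FindObject: closed-loop PID drive towards the infrared LED
        FindObject findObject = new FindObject("FindObject");
        // SeqStrategy: higher-priority behavior that suppresses
        // FindObject in all steps except while homing in on the object
        SeqStrategy seqStrategy = new SeqStrategy("SeqStrategy");
        findObject.start();
        seqStrategy.start();
    }
}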

See the code snippets below:

FindObject behavior thread:
public void run() {
    while (true) {
        // Compute one PID regulation step from the light sensor readings
        pidControl.regulateStep();
        // Take control of the motors, apply the regulated power, release
        suppress();
        forward(pidControl.getLeftPower(), pidControl.getRightPower());
        release();
        delay(1);
    }
}

Car forward:
public static synchronized void forward(int leftPower, int rightPower)
{
    // steer() takes values from -200 to 200, hence the factor of two
    robot.steer((leftPower - rightPower)*2);
}
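
The pidControl object implements the PID regulator from the previous lab session [4]. As a rough illustration of what regulateStep() computes, a sketch is shown below; the field names, gains, base power and sensor wiring are assumptions for illustration, not the actual implementation.

import lejos.nxt.LightSensor;
import lejos.nxt.SensorPort;

// Rough sketch of a PID regulator balancing two light sensor readings.
// Gains, ports and the power mapping are assumed values, chosen only
// to illustrate the structure of the regulator described in [4].
public class PidControl {
    private final LightSensor left = new LightSensor(SensorPort.S1);
    private final LightSensor right = new LightSensor(SensorPort.S2);
    private final float kp = 2.0f, ki = 0.1f, kd = 0.5f;
    private float integral = 0, lastError = 0;
    private int leftPower = 60, rightPower = 60; // base forward power

    public void regulateStep() {
        // Error is the difference between the two light readings
        float error = left.readNormalizedValue() - right.readNormalizedValue();
        integral += error;
        float derivative = error - lastError;
        lastError = error;
        float turn = kp * error + ki * integral + kd * derivative;
        leftPower = (int)(60 - turn);
        rightPower = (int)(60 + turn);
    }

    public int getLeftPower()  { return leftPower; }
    public int getRightPower() { return rightPower; }
}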

The sequential control strategy for robot 2 is straightforward. It consists of a series of procedural steps that follow an open-loop control approach. Only in the step where the FindObject behavior is released is closed-loop control used. Here the ultrasonic sensor measures the distance to the object, and when the object is detected to be very close it is gripped and moved away. The sequential steps that robot 2 performs are listed below.

1. Suspend the FindObject behavior
2. Wait for the object location to be transmitted by robot 1
3. Move to the location of the object, calculated from the pose of robot 1
4. Release the FindObject behavior
5. Wait for the ultrasonic sensor to detect the object
6. Suspend the FindObject behavior
7. Grip the object by moving a little forward and lowering the gripper
8. Bring the object home by moving to coordinates (0, 0)
9. Lift the gripper

See SeqStrategy code snippet below:

SeqStrategy behavior thread:

public void run()
{
    while (true)
    {
        suppress();
        // Wait for the location of the object, then move toward it
        WaitForObjLocation();
        MoveToObjLocation();
        release();

        // Let the FindObject behavior get close to the object
        int distance = us.getDistance();
        while (distance > tooCloseThreshold)
        {
            distance = us.getDistance();
            drawInt(distance);
        }
        suppress();
        // Object is now very close to the robot, grip it
        GripObject();
        // Bring the object home and release it
        BringObjectHome();
        release();
    }
}
public void WaitForObjLocation()
{
    // Simulate the pose received from robot #1
    ObjectLocation objLoc = new ObjectLocation(859, -91, -162, 210);

    // Convert the robot #1 pose to a location for robot #2
    Pose robot2pose = objLoc.GetRobot2Pose();
    x_loc = Math.round(robot2pose.getX());
    y_loc = Math.round(robot2pose.getY());
    head = Math.round(robot2pose.getHeading());
    // Display the robot #2 pose
    String msg = x_loc + "," + y_loc + "," + head;
    LCD.drawString(msg, 0, 7);
}
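
In the final system the pose will arrive from robot 1 over Bluetooth rather than being hard-coded as above. A minimal sketch of how the receive could look with the leJOS Bluetooth API is shown below; the wire format (x, y, heading and radius as four floats) is an assumption.

// Sketch: receive the pose of robot #1 over Bluetooth. Requires imports
// of java.io.DataInputStream, java.io.IOException,
// lejos.nxt.comm.BTConnection and lejos.nxt.comm.Bluetooth.
// The message layout is an assumption, not the actual protocol.
private ObjectLocation receiveObjLocation() throws IOException
{
    BTConnection btc = Bluetooth.waitForConnection();
    DataInputStream dis = btc.openDataInputStream();
    float x = dis.readFloat();
    float y = dis.readFloat();
    float heading = dis.readFloat();
    float radius = dis.readFloat();
    dis.close();
    btc.close();
    return new ObjectLocation(x, y, heading, radius);
}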
private void MoveToObjLocation()
{
    // Move to the position with heading, located from the pose of robot 1
    drawString("l");
    goTo(x_loc, y_loc, true);
    WaitMoving();
    drawString("h");
    rotateTo(head, true);
    WaitMoving();
    stop();
}
private void GripObject()
{
    stop();
    // Move a bit forward
    drawString("f");
    forward();
    delay(400);
    // Stop and grip the object
    stop();
    lowerGripArm();
    drawString("s");
    delay(500);
}
private void BringObjectHome()
{
    // Return to the home position
    goTo(0, 0, true);
    WaitMoving();
    liftGripArm();
    delay(2500); // Object must be removed manually
    rotateTo(0, true);
    WaitMoving();
}
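
The WaitMoving() helper used in the snippets above is not shown in the post; a minimal version, assuming the isMoving() method of the SimpleNavigator API [2], could look like this:

// Sketch of the WaitMoving() helper, assuming SimpleNavigator.isMoving()
private void WaitMoving()
{
    // Poll the navigator until the current move has completed
    while (isMoving())
        Thread.yield();
}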

Go to a specified location

When robot 1 has found the object, it sends its pose to robot 2. The pose contains the coordinates in the Cartesian coordinate system and the heading in degrees pointing towards the object. Based on this information, robot 2 needs to calculate a position to move to from which it can use the light sensors to navigate towards the infrared LED mounted on the object. The object needs to be placed with the LED pointing in a direction known by robot 2. We have chosen a setup where robot 1 will have a heading of 0 degrees starting from the home position (x=0, y=0).

The drawing below illustrates robot 1 (R1) located in position area A. Robot 2 (R2) moves to the computed position (x2, y2, heading).

Figure 2: Geometrical calculation of the desired position.

Circular geometry is used to calculate the desired position for robot 2. The circle is divided into areas A, B, C and D depending on the heading angle of robot 1. The position of robot 2 can be calculated using the equations below. No effort has been made to simplify the equations.

Figure 3: Expressions used for position calculation.
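
For area A (robot 1 heading between -180 and -90 degrees) the expressions are as follows, transcribed from the Java code later in this post; the expressions for areas B, C and D follow analogously from the circle geometry:

a  = 180 - |heading1|
ar = a * pi / 180
y2 = |y1| + r * sin(ar)
x2 = |x1| - r * (1 + cos(ar))

Here (x1, y1, heading1) is the pose of robot 1 and r is the radius of the circle around the object.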

The algorithm has been tested in MatLab by plotting the positions of R1 and R2 in a coordinate system; see the figure below. Test data was generated with different headings (-135, 135, -45 and 45) to ensure the computation is correct for headings in all areas A, B, C and D. Below, the result is shown for a heading of -135 (area A).

Figure 4: Graphical representation of the positions in MatLab.

MatLab code for the computation of the R2 position based on the pose received from R1.

The class ObjectLocation implements the final computation for converting the robot 1 position to the location where robot 2 has to move. To give robot 2 additional space to use the FindObject behavior to navigate with the PID controller, an x_offset (20 cm) is added to the algorithm. This gives robot 2 more distance to compensate for drift errors in the pose given by robot 1. We have previously found that we are able to see the infrared light and navigate towards it from a distance of up to 30 cm; see [4].

ObjectLocation class:

public class ObjectLocation {

    Pose robot1Pose;
    Pose robot2Pose;
    // X direction offset to let the PID control
    // handle getting close to the object
    private final float x_offset = 200; // Offset to object [mm]
    float radius;

    ObjectLocation(float x, float y, float heading, float r)
    {
        robot1Pose = new Pose(x, y, heading);
        radius = r;
        robot2Pose = null;
    }

    public Pose GetRobot1Pose()
    {
        return robot1Pose;
    }

    public Pose GetRobot2Pose()
    {
        double a, ar, yr = 0.0, xr = 0.0;
        double head = robot1Pose.getHeading();
        double x = Math.abs(robot1Pose.getX());
        double y = Math.abs(robot1Pose.getY());
        double r = radius;

        if (robot2Pose == null)
        {
            // Robot 1 in position area A of the object
            if ((head >= -180) && (head < -90))
            {
                a = 180 - Math.abs(head);
                ar = Math.abs(a) * Math.PI/180;
                yr = y + r*Math.sin(ar);
                xr = x - r*(1 + Math.cos(ar));
            }
            // ... code omitted - same as the MatLab version

            // Subtract offset from x coordinate
            if (xr >= x_offset) xr -= x_offset;
            else xr = 0;

            // Robot #2 default heading 0 degrees
            robot2Pose = new Pose((float)xr, (float)-yr, (float)0.0);
        }

        return robot2Pose;
    }
}

Conclusion
This lab session completes sprint #2 according to our plan: robot 2 is able to navigate to the location of the object, carry it away, and finally drop it off.

The video below demonstrates that we were able to find a software architecture and algorithm for navigating to a position near the object from which robot 2 can grip the object and carry it away. Since only one infrared LED is mounted on the object, robot 2 needs to find and move to a position in front of the light. It can then move close to the object in the direction of the infrared light using the light sensors and the PID controller. We have chosen circular geometry for calculating the position to which robot 2 navigates using the SimpleNavigator [2] from the leJOS API.

Since we have used an open-loop sequential strategy to navigate to the position of the object, robot 2 is not able to avoid obstacles on its way. With the subsumption architecture, however, it would be easy to add another, higher-prioritized behavior that uses the ultrasonic sensor to detect obstacles and navigate around them. This approach depends on the implementation of the SimpleNavigator: would it be able to handle a new command while a currently executing operation has not yet completed?
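
As a thought experiment, such a behavior could look like the sketch below. It is untested; the AvoidObstacle class, the sensor port, the threshold value and the backward()/rotate()/delay() helpers are assumptions on top of the framework from [3], and the open question about SimpleNavigator's handling of interleaved commands remains.

import lejos.nxt.SensorPort;
import lejos.nxt.UltrasonicSensor;

// Hypothetical, untested sketch of a higher-priority obstacle behavior.
// Assumes the framework's suppress()/release() and simple motion helpers.
public class AvoidObstacle extends Behavior {
    private final UltrasonicSensor us = new UltrasonicSensor(SensorPort.S3);
    private static final int OBSTACLE_THRESHOLD = 25; // cm, assumed value

    public void run() {
        while (true) {
            if (us.getDistance() < OBSTACLE_THRESHOLD) {
                suppress();      // take control from lower behaviors
                backward();      // back off from the obstacle
                delay(500);
                rotate(90);      // turn away before handing back control
                release();
            }
            delay(50);
        }
    }
}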

Video of the final robot 2 operation - robot #2 moves to the location and carries the object away

Complete MatLab code and source code for the Robot 2 navigation and gripper functionality:

References
[1] Fred G. Martin, Robotic Explorations: A Hands-On Introduction to Engineering, Prentice Hall, 2001.
[2] leJOS API, SimpleNavigator, http://lejos.sourceforge.net/nxt/nxj/api/index.html
[3] Ole Caprani, code for a subsumption architecture, http://www.legolab.daimi.au.dk/DigitalControl.dir/NXT/Lesson8.dir/
[4] Lego lab 14, http://josemariakim.blogspot.com/2011/01/lego-lab-14.html
