Don't Crash! Tailing Behaviour 🦆🦆🦆🦆

We implement autonomous driving that includes tailing behaviour behind other robots (Duckiebots).

Link to the GitHub repo

Contributors
Leen Alzebdeh and Tianming Han

Summary

In this exercise, we wrote a node that drives the robot autonomously inside the lane and keeps a safe distance from, and follows, a Duckiebot driving or turning ahead of it. We did not meet the goal of driving the robot two laps around the Duckietown while maintaining safe tailing behaviour, due to the program's inconsistent performance. Despite that, we encountered many programmatic and physical challenges in implementing a safe driving program in a multithreaded environment using software tools such as OpenCV and the ROS message bus and services, and we learned some possible ways to handle such challenges.

Objective

Our objective is to implement autonomous tailing behaviour on our Duckiebot, so that it can maintain a safe distance from the robot in front of it while driving and turning. We implement stop line detection and use the existing Duckiebot detection node for intersection handling and tailing.

Background

In the last exercise, we implemented autonomous lane following, where the Duckiebot stays inside the lane while driving straight and localizes its position in the world using AprilTags. In this exercise we need a map in order to know the legal turns at an intersection, so we also include our AprilTag detection code from the last exercise.

Methods and Results

The template code for this exercise contains a Duckiebot detection node that processes the camera input and extracts the position of the dot pattern on the back of the leading Duckiebot while we follow it. The detection output gives us the 3D distance to and the image position of the dot pattern, so we know where the robot in front of us is relative to our robot.

Before implementing tailing behaviour, we modified our lane following code from the last exercise so that our robot drives on the right side of the road. This is done by replacing each horizontal image coordinate x in the code with image_width - x. Running the resulting program on the robot shows that it can follow its own lane, but it bumps into an AprilTag after entering an intersection.
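The change itself is a one-line mirror of the horizontal coordinate; a minimal sketch (the function name is ours, not from the template):

```python
def mirror_x(x, image_width):
    """Mirror a horizontal image coordinate so that lane-following
    logic written for the left lane drives in the right lane."""
    return image_width - x
```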

Turning at an Intersection

To make a safe turn at an intersection, our robot needs to stop at the stop line and know which direction it should turn. We detect the stop line by applying a red color mask to the camera input, finding the largest contour, and comparing its y-value against a threshold: once the contour drops low enough in the image, the robot is close enough to the stop line and should stop.
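A minimal sketch of this check (the HSV bounds and pixel threshold below are placeholders, not our tuned values; the findContours signature assumes OpenCV 4):

```python
import cv2
import numpy as np

RED_LOWER = np.array([0, 120, 100])   # placeholder HSV bounds for red
RED_UPPER = np.array([10, 255, 255])
STOP_Y_THRESHOLD = 400                # placeholder: pixels from the top of the image

def stop_line_close(bgr_image):
    """Return True when the largest red contour (the stop line)
    is low enough in the image, i.e. close to the robot."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, RED_LOWER, RED_UPPER)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return False
    largest = max(contours, key=cv2.contourArea)
    _, y, _, h = cv2.boundingRect(largest)
    return y + h > STOP_Y_THRESHOLD  # contour's bottom edge is below the line
```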
After stopping at the stop line, the robot executes a predefined sequence of actions depending on the direction it wants to turn: to turn left, it drives forward into the intersection and makes a 90 degree turn to the left in place; to turn right, it still drives forward a little to avoid hitting the AprilTag on its right, and then makes a 90 degree turn to the right. After entering the target lane it resumes the lane following code. Note that when it starts to turn, we also set a timer so it waits a while before it starts detecting the stop line at the next intersection.
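A sketch of such an open-loop turn sequence is below; the topic name, velocities, and durations are placeholders rather than our tuned values:

```python
import rospy
from duckietown_msgs.msg import Twist2DStamped

rospy.init_node("turn_demo")
cmd_pub = rospy.Publisher("/hostname/car_cmd_switch_node/cmd",
                          Twist2DStamped, queue_size=1)

def drive(v, omega, duration):
    """Publish a constant car command for a fixed duration."""
    end = rospy.Time.now() + rospy.Duration(duration)
    rate = rospy.Rate(10)
    while rospy.Time.now() < end and not rospy.is_shutdown():
        msg = Twist2DStamped(v=v, omega=omega)
        msg.header.stamp = rospy.Time.now()
        cmd_pub.publish(msg)
        rate.sleep()

def turn_left():
    drive(v=0.3, omega=0.0, duration=1.5)   # drive into the intersection
    drive(v=0.0, omega=2.5, duration=1.0)   # rotate ~90 degrees in place

def turn_right():
    drive(v=0.3, omega=0.0, duration=0.5)   # clear the AprilTag on the right
    drive(v=0.0, omega=-2.5, duration=1.0)  # rotate ~90 degrees clockwise
```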
The robot also needs to know which directions are legal turns, so it does not suddenly turn off the road while driving autonomously by itself. We tried two approaches: using the camera input and using AprilTag localization. Since the Duckietown has a stop line at each entry of an intersection, we can detect those stop lines to determine which ways are legal to turn. We use the same red color masking as before and extract a few more contours; when those contours fall into specific regions of the image, the robot recognizes there is a road in that direction it can turn into. However, this approach has two issues, which led us to discard it in favour of AprilTag detection:

  1. When the robot makes a turn, it cannot see the target road on its left until after it has turned. This makes it difficult to judge whether there is a road on the left. Initially we let the robot rotate a little to the left to check, but the motor input is imprecise enough that the Duckiebot could not return its orientation to align with the current lane well enough to make a good turn. We then took a shortcut and detected only the roads in the forward and right directions, which appeared to work in the single-robot case.
  2. Later in the exercise, when we tried to let our robot follow the leader robot, this approach sometimes failed: the leader robot blocked the camera's view when it made a turn, so our robot could not see the stop lines at the other entries of the intersection.
After switching to AprilTag detection, the robot is able to locate itself on the map and know from which way it is entering the T-intersection. A lookup then gives the legal turns available to the robot. In testing, the robot sometimes misses an AprilTag while driving through the center of the Duckietown, so instead we take the first AprilTag it sees at an intersection and let the robot infer its future position from the turns it makes.
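A minimal sketch of such a lookup; the tag IDs and turn sets below are placeholders, not the actual IDs on our map:

```python
# Map from the first AprilTag ID seen at an intersection approach
# to the set of legal turns from that approach (placeholder values).
LEGAL_TURNS = {
    58: {"left", "straight"},   # e.g. entering the T from the stem
    62: {"right", "straight"},
    63: {"left", "right"},      # e.g. entering the T head-on
}

def legal_turns(tag_id):
    """Return the legal turns for this approach, or an empty set if unknown."""
    return LEGAL_TURNS.get(tag_id, set())
```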

Duckiebot Tailing Behaviour

Implementing Duckiebot following involved two main tasks:

1. Keep a safe distance from the leader.

To keep a safe distance behind the leader, we first get the distance from the back of the leader bot to our bot. We subscribed to the topic /hostname/duckiebot_distance_node/distance and compared the published distance to our threshold to decide whether we are too close and should stop. We chose a safe distance of 0.50 meters to maintain between the bots; if a reading lower than 0.50 m is received, the Duckiebot stops.

After our Duckiebot stops at a stop line, it waits until either the leader robot moves more than 0.80 meters away or the leader robot is no longer detected, and then makes the turn.
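A sketch of both checks is below; we assume std_msgs/Float32 as the message type of the distance topic, which may differ from the template's actual type:

```python
import rospy
from std_msgs.msg import Float32  # assumed message type for the distance topic

SAFE_DISTANCE = 0.50   # meters: stop when the leader is closer than this
CLEAR_DISTANCE = 0.80  # meters: resume once the leader has moved this far away

class TailingMonitor:
    """Tracks the latest leader distance and exposes the two checks we use."""

    def __init__(self):
        self.leader_distance = None  # None until the first reading arrives
        rospy.Subscriber("/hostname/duckiebot_distance_node/distance",
                         Float32, self.on_distance)

    def on_distance(self, msg):
        self.leader_distance = msg.data

    def too_close(self):
        return (self.leader_distance is not None
                and self.leader_distance < SAFE_DISTANCE)

    def clear_to_go(self):
        # Simplification: a fuller version would track the timestamp of the
        # last reading to tell when the leader is no longer detected, since
        # the topic simply stops updating when detection is lost.
        return (self.leader_distance is None
                or self.leader_distance > CLEAR_DISTANCE)
```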

2. Follow the direction that leader Duckiebot is turning.

The Duckiebot remembers where the dot pattern was last seen in the image and makes a turn decision based on that, using a simple linear decision boundary: it computes the distance between the dot pattern and a reference point on the image representing each direction it can turn to. However, this did not work well in testing due to observation errors. The robot can enter the intersection at a tilted angle and may observe the dot pattern on the right side even when the leader robot is going straight. The inconsistent Duckiebot detection described in the Discussion section compounds this issue.
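A minimal sketch of this nearest-reference-point classification; the anchor coordinates are placeholders that would depend on the camera and image size:

```python
import numpy as np

# Placeholder image-space reference points for each turn direction.
TURN_ANCHORS = {
    "left": np.array([120.0, 200.0]),
    "straight": np.array([320.0, 180.0]),
    "right": np.array([520.0, 200.0]),
}

def classify_turn(last_dot_xy, allowed):
    """Pick the allowed direction whose anchor is nearest to where
    the leader's dot pattern was last seen in the image."""
    return min(allowed,
               key=lambda d: np.linalg.norm(TURN_ANCHORS[d] - last_dot_xy))

# e.g. classify_turn(np.array([480.0, 210.0]), {"straight", "right"}) -> "right"
```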

Results

Video 1. Duckiebot Driving and Tailing: the Duckiebot follows the leader through two left turns, but fails at the end because the stop line timer is set too low and it mistakes the stop line on the other side of the intersection for the stop line of a new intersection.

QAs

How well did your implemented strategy work? Was it reliable? In what situations does it perform poorly?
When driving by itself, the robot follows the lane and turns relatively consistently. When following the leader robot, it detects when the leader stops most of the time, except when the Duckiebot detection node does not see the robot in front of the camera. When turning at an intersection, the robot often cannot determine which way the leader robot is turning, and the result is inconsistent from run to run. After visualizing the leader robot's dot pattern position in rqt_image_view, we found that the detection loses track of the dot pattern once the leader robot starts turning. However, when the robot does find the right direction, it makes the turn successfully most of the time.

Discussion

Below are some of the difficulties we encountered:

  • We had a bug caused by updating the PID controller while the robot was not in lane following mode (i.e., stopped behind a robot or turning at an intersection). The robot appeared to drift suddenly after it turned, and it took us a while to find the cause.
  • Occasionally the robot does not respond to wheel commands. We noticed that while turning, even though the code sends messages setting opposite velocities on the two wheels, the robot sometimes does not turn. We tried replicating the issue by letting it execute custom wheel commands remotely for a set period of time, and found that around one in ten times it skips the command, despite the console output showing the robot did receive it.
    • We hypothesize that this could come from one of the following causes: 1) the main thread of the lane following node is doing too much image-processing work, causing delays, or 2) the message is lost or delayed in the ROS message bus. However, we did not look into it further due to time constraints.
    • We tried two turning methods: gently steering into the target lane, and driving forward then turning 90 degrees toward the target direction. The issue persists in either case.
  • Inconsistent Duckiebot rear dot detection: looking at the rqt_image_view output of the Duckiebot detection node, we noticed that although we are able to detect the dots, the detection is error-prone even when the leader robot is standing still right in front of the current robot, and gets worse when the leader robot is leaning to one side or when the two robots are turning (when the dot pattern tilts more than about 45 degrees to the camera, detection always fails). Even when the detection does pick up the leader robot's dot pattern, it flickers from time to time, causing the current robot to sometimes believe the leader robot has moved.
    • Initially we thought either the detection rate was too low or the LED lights were disrupting the detection, but the issue persisted after we turned off the LEDs and increased the detection rate. We also tried cleaning the lens to improve detection consistency and added a timer so the robot waits a bit before concluding the leader robot has gone away.
  • Determining the turn direction of the leader: relying on the position of the rear dots proved difficult due to the inconsistent detection results. When turning left, the leader robot naturally 1) enters the intersection, 2) rotates to the left, and 3) enters the target lane. While it executes this sequence, the path of the rear dots in our robot's camera appears to move to the right and then disappears: the leader robot's front end moves left and its rear end moves right when it rotates counterclockwise, and after it completes the turn the camera can no longer pick up the dot pattern because of the viewing angle. As a result, our robot either makes no turn or turns the wrong way.
  • Sometimes the LED lights do not respond to service requests. We tried to make the lights flash when turning, but two of the four lamps did not respond to custom LED pattern requests. This was not a problem in previous exercises because the LED tasks were simpler, requiring only the same color light on all four lamps. We do not know what caused this issue, for lack of a way to debug the LED lights.

We also found that with the increased number of parameters to tune in the program, tuning became much more time consuming. This could be helped by a topic on which the program listens for parameter changes, letting us tune parameters on the fly.
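A minimal sketch of such a tuning channel; the topic name and the "<name> <value>" message format here are our own invention for illustration:

```python
import rospy
from std_msgs.msg import String

# Shared parameter dictionary the rest of the node reads from.
params = {"stop_y_threshold": 400.0, "safe_distance": 0.50}

def on_param_update(msg):
    """Parse a "<name> <value>" string and update the parameter in place."""
    name, _, value = msg.data.partition(" ")
    if name in params:
        params[name] = float(value)
        rospy.loginfo("updated %s to %s", name, value)

rospy.init_node("param_tuner_demo")
rospy.Subscriber("/hostname/param_tuner", String, on_param_update)
rospy.spin()
```

A parameter could then be changed from a laptop while the robot is driving, e.g. with rostopic pub /hostname/param_tuner std_msgs/String "data: safe_distance 0.6".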

References

dt-AprilTags repository README.md: https://github.com/duckietown/lib-dt-AprilTags
Duckiebot detection template repo: https://github.com/XZPshaw/CMPUT412503_exercise4
Led_emitter_node comments from dt-core: https://github.com/duckietown/dt-core/blob/daffy/packages/led_emitter/src/led_emitter_node.py