
🎯 Target Navigation#

The Target Navigation framework enables robots to detect and autonomously navigate toward various types of Targets.

The purpose of this chapter is to walk you through the usage of the available Target Detectors, how to create new ones, and how to make your robot navigate towards the detected Targets.

../_images/targets.svg

The Target Navigation framework consists of two main components:

  • Target Detection: This component is responsible for detecting a target and providing its position relative to the robot. In general, any object that can be detected by one or more of the robot’s sensors can be considered a target; examples include ArUco Markers, 2D Laser Patterns, and Humans.

  • Target Navigation: This component is implemented in the NavigateToTarget Navigator, a Nav2 Behavior-Tree Navigator that uses the output of the Target Detection to navigate the robot toward the target. This Navigator lets you define custom navigation logic with a Behavior Tree, specifying how the robot behaves as it approaches a target. For instance, if the target detection fails or the target is unreachable, the robot could skip the target, wait and retry, or abort its task. This approach offers more targeted navigation, where the goal is not just a fixed position in the environment but a specific object or target whose position can change dynamically.

Since these two components are decoupled, the Target Detection can be used independently of Navigation. For example, the Target Detection feature can be used to localize a target in the environment and create a map of the detected targets.
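As a sketch of this stand-alone use, the detection action can be invoked on its own with recurrent detection enabled, so the detector keeps re-estimating the target’s pose as the robot moves. The field names are taken from the DetectTarget examples later in this chapter; that `recurrent_detection: true` enables continuous detection is an assumption based on the field’s name.

```shell
# Detect target 0 with the Laser Target Detector, without navigating.
# recurrent_detection: true is assumed to keep the detector running so
# the target pose is re-estimated continuously (e.g. to map targets).
ros2 action send_goal /detect_target pal_nav2_msgs/action/DetectTarget "id: 0
detector: 'laser_target_detector'
recurrent_detection: true" -f
```

The `-f` flag prints the action feedback, so each new pose estimate is echoed to the terminal as it is produced.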

../_images/architecture.svg

Applications#

The Target Navigation framework allows for the creation of various advanced robot functionalities. Some examples include:

  • Static Target Navigation: The robot identifies the position of a static target (e.g. ArUco marker) and navigates toward it, maintaining a fixed distance from the target.

  • Dynamic Target Navigation: The robot identifies the position of a dynamic target (e.g. a Person) and navigates toward it. As the robot moves, it continuously updates the target’s position, allowing it to follow the target as it moves.

Note

A similar behavior can be achieved using the Dynamic Object Following feature. This demo uses the Follow Dynamic Point Behavior Tree.

  • Target Alignment: The robot identifies the position of a target and aligns itself accordingly. This is particularly useful when the robot needs to reach a precise position and orientation to perform a specific action, such as grasping an object or delivering materials.

  • Docking: The robot identifies the position of a docking station and navigates toward it until the contact is made. This is essential for tasks like autonomous battery charging or other actions requiring a physical connection to a docking station.

In addition to these examples, other key features of Target Navigation that can be leveraged to develop new applications include:

  • No need for a pre-existing map of the environment or precise odometry.

  • Compatibility with the traditional Nav2 stack.

  • Support for obstacle avoidance.

Usage#

The NavigateToTarget Navigator can be accessed either through the Command Line or through the PAL Navigation RViz Panel. In either case, the user simply needs to select the desired Behavior Tree implementing the navigation logic, the Target Detection plugin, and the ID of the target to be detected.
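Before sending a goal from the command line, you can check that the corresponding action servers are running. This uses the standard ros2 CLI; the action names match those used in the examples that follow.

```shell
# List the available action servers; /navigate_to_target and
# /detect_target should appear when the framework is running.
ros2 action list
```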

Laser Detection#

In this example, the Laser Target Detector is used to detect the specific pattern of the PAL Charging Station (ID 0) and to make the robot navigate towards it.

../_images/dock_pattern.svg

Once detected, the NavigateToTarget Navigator will use the output of the Target Detection to navigate the robot to a fixed distance from the target (0.5m).

This same behavior can be achieved using the Command Line Interface with the command:

ros2 action send_goal /navigate_to_target pal_nav2_msgs/action/NavigateToTarget "target_id: 0
detector: 'laser_target_detector'
behavior_tree: 'navigate_to_target'"
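If you are unsure which fields the goal accepts, the action definition can be inspected directly with the standard ros2 CLI. The package and action names are the ones used in the command above; the exact output depends on the installed version of pal_nav2_msgs.

```shell
# Print the goal, result, and feedback fields of the action
# (requires the pal_nav2_msgs package to be sourced in the environment).
ros2 interface show pal_nav2_msgs/action/NavigateToTarget
```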

Alternatively, if you just want to detect the target without navigating towards it, you can use the following command:

ros2 action send_goal /detect_target pal_nav2_msgs/action/DetectTarget "id: 0
detector: 'laser_target_detector'
recurrent_detection: false" -f

Attention

To properly detect the target, the robot must be equipped with a 2D Laser Sensor and must be close enough to the target for accurate detection.

ArUco Detection#

In this example, the ArUco Target Detector is used to detect an ArUco Marker placed on the wall (ID 20) and to make the robot navigate towards it.

../_images/aruco_pattern.svg

Once detected, the NavigateToTarget Navigator will use the output of the Target Detection to navigate the robot to a fixed distance from the target (0.5m).

This same behavior can be achieved using the Command Line Interface with the command:

ros2 action send_goal /navigate_to_target pal_nav2_msgs/action/NavigateToTarget "target_id: 20
detector: 'aruco_target_detector'
behavior_tree: 'navigate_to_target'"

Alternatively, if you just want to detect the target without navigating towards it, you can use the following command:

ros2 action send_goal /detect_target pal_nav2_msgs/action/DetectTarget "id: 20
detector: 'aruco_target_detector'
recurrent_detection: false" -f

Attention

To properly detect the target, the robot must be equipped with an RGB camera and must be close enough to the target for accurate detection.

See also#

To continue learning about the Target Navigation, how to implement it and how to create your own behavior trees and demos, check the following sections: