Applies to: TIAGo Pro, Kangaroo, TIAGo, ARI, TALOS, Mobile Bases

🎯 Target Navigation#

The Target Navigation framework enables robots to detect and autonomously navigate toward various types of Targets.

This chapter walks you through using the available Target Detectors, creating new ones, and making your robot navigate toward the detected Targets.

../_images/targets.svg

The Target Navigation framework consists of two main components:

  • Target Detection: This component is responsible for detecting a target and providing its position relative to the robot. In general, any object that can be detected using one or more of the robot's sensors can be considered a target. Examples of detectable targets include ArUco Markers, 2D Laser Patterns, and Humans.

  • Target Navigation: This component is implemented in the NavigateToTarget Navigator, a Nav2 Behavior-Tree Navigator that uses the output of the Target Detection to drive the robot toward the target. The Navigator lets you define custom navigation logic in a Behavior Tree, specifying how the robot behaves as it approaches a target. For instance, if the target detection fails or the target is unreachable, the robot could skip the target, wait and retry, or abort its task. This approach offers more targeted navigation: the goal is not just a fixed position in the environment, but a specific object or target whose position can change dynamically.

Since these two components are decoupled, the Target Detection can be used independently of Navigation. For example, the Target Detection feature can be used to localize a target in the environment and create a map of the detected targets.

../_images/architecture.svg
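
Because the detector can run on its own, small tools can be built directly on top of it. The snippet below is a minimal sketch of a node that collects detections and stores them in the map frame; the /target_detections topic name and the PoseStamped message type are illustrative assumptions rather than the actual detector interface, and a standard ROS 2 tf2 setup is assumed.

```python
# Minimal sketch: collect detected target poses and express them in the map
# frame. The '/target_detections' topic and PoseStamped message type are
# assumptions for illustration; the real detector interface may differ.
import rclpy
from rclpy.duration import Duration
from rclpy.node import Node
from geometry_msgs.msg import PoseStamped
from tf2_ros import Buffer, TransformException, TransformListener
import tf2_geometry_msgs  # noqa: F401  (registers PoseStamped support in tf2)


class TargetMapper(Node):
    """Stores every detected target pose, expressed in the 'map' frame."""

    def __init__(self):
        super().__init__('target_mapper')
        self.tf_buffer = Buffer()
        self.tf_listener = TransformListener(self.tf_buffer, self)
        self.targets = []  # accumulated target poses in the 'map' frame
        self.create_subscription(
            PoseStamped, '/target_detections', self.on_detection, 10)

    def on_detection(self, pose: PoseStamped):
        try:
            # Detections arrive in a robot-relative frame; re-express in 'map'
            pose_in_map = self.tf_buffer.transform(
                pose, 'map', timeout=Duration(seconds=0.5))
            self.targets.append(pose_in_map)
            self.get_logger().info(f'Stored target #{len(self.targets)}')
        except TransformException as ex:
            self.get_logger().warn(f'Could not transform detection: {ex}')


def main():
    rclpy.init()
    rclpy.spin(TargetMapper())


if __name__ == '__main__':
    main()
```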

Applications#

The Target Navigation framework allows for the creation of various advanced robot functionalities. Some examples include:

  • Static Target Navigation: The robot identifies the position of a static target (e.g. an ArUco marker) and navigates toward it, maintaining a fixed distance from the target (a minimal sketch follows this list).

  • Dynamic Target Navigation: The robot identifies the position of a dynamic target (e.g. a Person) and navigates toward it. As the robot moves, it continuously updates the target's position, allowing it to follow the target as it moves.

Note

A similar behavior can be achieved using the Dynamic Object Following feature. This demo uses the Follow Dynamic Point Behavior Tree.

  • Target Alignment: The robot identifies the position of a target and aligns itself accordingly. This is particularly useful when the robot needs to reach a precise position and orientation to perform a specific action, such as grasping an object or delivering materials.

  • Docking: The robot identifies the position of a docking station and navigates toward it until contact is made. This is essential for tasks like autonomous battery charging or other actions requiring a physical connection to a docking station.
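
As a concrete illustration of the Static Target Navigation case above, the sketch below computes an approach pose at a fixed standoff distance from a detected marker and sends it through the standard Nav2 stack via nav2_simple_commander. It intentionally uses the generic NavigateToPose interface rather than the NavigateToTarget Navigator, and the hard-coded marker pose and frame conventions are assumptions for illustration only.

```python
# Minimal sketch of Static Target Navigation using the standard Nav2 stack
# (nav2_simple_commander) instead of the NavigateToTarget Navigator itself.
# The marker pose is hard-coded and the frame conventions are assumptions.
import math

import rclpy
from geometry_msgs.msg import PoseStamped
from nav2_simple_commander.robot_navigator import BasicNavigator


def approach_pose(target: PoseStamped, standoff: float = 0.5) -> PoseStamped:
    """Build a goal 'standoff' metres away from the target, facing it."""
    q = target.pose.orientation
    # Yaw of the target frame (assumes its x-axis points away from the target)
    yaw = math.atan2(2.0 * (q.w * q.z + q.x * q.y),
                     1.0 - 2.0 * (q.y * q.y + q.z * q.z))
    goal = PoseStamped()
    goal.header.frame_id = target.header.frame_id
    goal.pose.position.x = target.pose.position.x + standoff * math.cos(yaw)
    goal.pose.position.y = target.pose.position.y + standoff * math.sin(yaw)
    goal_yaw = yaw + math.pi  # turn the robot back toward the target
    goal.pose.orientation.z = math.sin(goal_yaw / 2.0)
    goal.pose.orientation.w = math.cos(goal_yaw / 2.0)
    return goal


def main():
    rclpy.init()
    navigator = BasicNavigator()
    # Assumes the robot is already localized (e.g. initial pose set in RViz)
    navigator.waitUntilNav2Active()

    # In a real application this pose would come from the Target Detection
    # component; it is hard-coded here purely for illustration.
    marker = PoseStamped()
    marker.header.frame_id = 'map'
    marker.pose.position.x, marker.pose.position.y = 2.0, 1.0
    marker.pose.orientation.w = 1.0

    navigator.goToPose(approach_pose(marker))
    while not navigator.isTaskComplete():
        pass  # isTaskComplete() spins the node internally
    print('Navigation result:', navigator.getResult())


if __name__ == '__main__':
    main()
```

In a complete application the marker pose would come from the Target Detection component rather than being hard-coded, and could be refreshed as the robot approaches the target.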

In addition to these examples, other key features of Target Navigation that can be leveraged to develop new applications include:

  • No need for a pre-existing map of the environment or precise odometry.

  • Compatibility with the traditional Nav2 stack.

  • Support for obstacle avoidance.

See also#

To continue learning about Target Navigation, how to implement it, and how to create your own behavior trees and demos, check the following sections: