List of ROS Nodes#
This page lists all the nodes available in PAL OS 24.9. They are installed and running by default on the robot.
Note
Only the nodes contributing to the public API of PAL OS 24.9 are listed here.
Additional ROS nodes might be running on the robot for internal purposes.
Nodes related to 💬 Communication#
asr_vosk#
The vosk-based speech recognition (ASR) module.
See asr_howto for more information.
chatbot_rasa#
A RASA-based chatbot.
communication_hub#
A node routing speech between the ASR node, the chatbot(s) and the TTS.
i18n_manager#
Monitors the currently active language, and notifies other nodes when it changes. See internationalization for details.
language_center#
Monitors the currently active language, and notifies other nodes when it changes. See internationalization for details.
pal_chatbot_wrapper#
Interface between the RASA chatbot server and ROS topics/actions.
pal_tts#
Text-to-speech node, with several backends, including acapela and a non-verbal, ‘R2D2’-style one.
rasa_action_server#
This node loads and performs custom chatbot actions (small Python scripts executed in response to specific intents detected by the chatbot engine).
respeaker_node#
The driver for the reSpeaker microphone array.
soft_wakeup_word#
Monitors incoming speech for wake-up/sleep keywords, and publishes the result to /active_listening accordingly.
See Wake-up word detector for detailed documentation.
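As an illustration, the minimal rclpy sketch below reacts to that topic. The message type (std_msgs/msg/Bool) and the meaning of its data field are assumptions, not documented here; check them on the robot with ros2 topic info /active_listening.

import rclpy
from rclpy.node import Node
from std_msgs.msg import Bool  # assumed message type for /active_listening


class WakeupListener(Node):
    def __init__(self):
        super().__init__('wakeup_listener')
        # React whenever the wake-up word detector toggles active listening
        self.create_subscription(Bool, '/active_listening', self.on_change, 10)

    def on_change(self, msg: Bool):
        self.get_logger().info('Robot is listening' if msg.data else 'Robot stopped listening')


def main():
    rclpy.init()
    rclpy.spin(WakeupListener())


if __name__ == '__main__':
    main()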
tts_engine#
The text-to-speech engine, with several available back-ends, like acapela or a synthetic non-verbal backend.
vosk_asr#
The vosk-based speech recognition (ASR) module.
See asr_howto for more information.
Nodes related to 😄 Expressive interactions#
expressive_eyes#
The node generating the robot’s eyes and controlling their positions.
eyes_demo#
Controls the default gazing behaviour of the robot.
gaze_manager#
Coordinates eyes and head motion to generate natural gazing behaviours.
See attention-management for details.
Nodes related to ⚙️ Robot hardware#
ari_eyes_ros_driver#
arm_controller#
back_panel_battery_notification_node#
embedded_networking_supervisor#
gripper_controller#
head_controller#
head_front_img_throttle#
image_republisher#
joint_torque_state_controller#
joy#
Connects to the external Bluetooth gamepad, and publishes messages to remotely control the robot’s base, torso and head.
mm11#
The low-level motor driver.
pal_diagnostic_aggregator#
Aggregates all robot diagnostic messages, to present them in a unified way.
pal_diagnostic_reporter#
pal_led_manager#
Manages the LEDs of PAL’s robots.
pal_master_calibration#
point_head_action#
Exposes an action server to control the robot’s head in Cartesian space. See attention_manager for details.
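As a rough sketch (not part of the official API description), an rclpy action client could send a PointHead goal as below. The action name /head_controller/point_head_action and the frame names are assumptions to verify on the robot, e.g. with ros2 action list.

import rclpy
from rclpy.action import ActionClient
from rclpy.node import Node
from control_msgs.action import PointHead  # standard ROS 2 control_msgs action


def main():
    rclpy.init()
    node = Node('look_at_client')
    # Action name is an assumption; check the actual name with `ros2 action list`
    client = ActionClient(node, PointHead, '/head_controller/point_head_action')
    client.wait_for_server()

    goal = PointHead.Goal()
    goal.target.header.frame_id = 'base_link'        # point expressed in the robot base frame
    goal.target.point.x = 1.0                        # 1 m in front of the robot
    goal.target.point.z = 1.5                        # roughly at a person's face height
    goal.pointing_frame = 'head_front_camera_link'   # assumed frame carried by the head
    goal.max_velocity = 0.5

    future = client.send_goal_async(goal)
    rclpy.spin_until_future_complete(node, future)
    rclpy.shutdown()


if __name__ == '__main__':
    main()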
power_status_analyzer_node#
raspi_head_info_republish#
raspi_image_compressed_republish#
robot_head_display#
Displays the robot’s face, along with some basic controls to e.g. change the volume or the active language.
robot_state_publisher#
This package allows you to publish the state of a robot to tf2.
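Any other node can then look up the published transforms through tf2, as in the sketch below. The frame names are assumptions; list the real ones on the robot (for instance with tf2_tools).

import rclpy
from rclpy.node import Node
from rclpy.time import Time
from tf2_ros import Buffer, TransformListener


class FrameWatcher(Node):
    def __init__(self):
        super().__init__('frame_watcher')
        self.buffer = Buffer()
        self.listener = TransformListener(self.buffer, self)
        self.create_timer(1.0, self.lookup)

    def lookup(self):
        try:
            # Frame names are assumptions; adapt them to the robot's URDF
            t = self.buffer.lookup_transform('base_link', 'head_front_camera_link', Time())
            self.get_logger().info(f'camera is {t.transform.translation.z:.2f} m above base_link')
        except Exception as e:
            self.get_logger().warn(str(e))


def main():
    rclpy.init()
    rclpy.spin(FrameWatcher())


if __name__ == '__main__':
    main()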
rtt_deployer#
sick*#
ROS 2 driver for the SICK laser scanners.
torso_controller#
torso_front_camera/realsense2_camera#
The driver for the front RealSense RGB-D camera.
torso_front_camera/realsense2_camera_manager#
The driver for the front RealSense RGB-D camera.
Nodes related to 🛠 Robot management#
charging_monitor#
computer_monitor_control#
controller_manager#
Manages the lifecycle of controllers, providing an interface for loading, unloading, starting, and stopping ROS controllers.
See How to change your robot’s motion controller for details.
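Assuming the standard ros2_control services are exposed under /controller_manager (an assumption to confirm on the robot), a minimal sketch to list the loaded controllers and their state could look like this:

import rclpy
from rclpy.node import Node
from controller_manager_msgs.srv import ListControllers


def main():
    rclpy.init()
    node = Node('controller_lister')
    client = node.create_client(ListControllers, '/controller_manager/list_controllers')
    client.wait_for_service()

    future = client.call_async(ListControllers.Request())
    rclpy.spin_until_future_complete(node, future)
    for c in future.result().controller:
        # Print each loaded controller and whether it is active or inactive
        node.get_logger().info(f'{c.name}: {c.state}')
    rclpy.shutdown()


if __name__ == '__main__':
    main()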
pal_startup_control#
pal_topic_monitor#
user_preferences_backend#
Nodes related to 👋 Gestures and motions#
play_motion2#
Plays and handles pre-recorded motions in ROS 2.
See Upper body motion and play_motion2 for more information.
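A minimal, hedged sketch of triggering a pre-recorded motion from Python: the action name /play_motion2, the play_motion2_msgs/action/PlayMotion2 interface and the motion name 'wave' are assumptions to check against the motions actually installed on the robot.

import rclpy
from rclpy.action import ActionClient
from rclpy.node import Node
from play_motion2_msgs.action import PlayMotion2  # interface assumed to ship with play_motion2


def main():
    rclpy.init()
    node = Node('motion_player')
    client = ActionClient(node, PlayMotion2, '/play_motion2')
    client.wait_for_server()

    goal = PlayMotion2.Goal()
    goal.motion_name = 'wave'     # hypothetical motion name; use one defined on the robot
    goal.skip_planning = False    # let play_motion2 plan the approach to the first waypoint

    future = client.send_goal_async(goal)
    rclpy.spin_until_future_complete(node, future)
    rclpy.shutdown()


if __name__ == '__main__':
    main()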
Nodes related to 🧭 Navigation#
amcl#
Adaptive Monte Carlo Localization (AMCL) is a probabilistic localization module which estimates the position and orientation (i.e. pose) of a robot in a given known map using a 2D laser scanner.
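For instance, the localization estimate can be (re-)initialized by publishing an initial pose on /initialpose, the standard AMCL interface; the coordinates below are hypothetical.

import rclpy
from rclpy.node import Node
from geometry_msgs.msg import PoseWithCovarianceStamped


def main():
    rclpy.init()
    node = Node('initial_pose_setter')
    pub = node.create_publisher(PoseWithCovarianceStamped, '/initialpose', 10)

    msg = PoseWithCovarianceStamped()
    msg.header.frame_id = 'map'
    msg.pose.pose.position.x = 1.0      # hypothetical starting position in the map frame
    msg.pose.pose.position.y = 2.0
    msg.pose.pose.orientation.w = 1.0   # facing along the map x axis

    # Give the publisher a moment to connect before sending the one-shot message
    rclpy.spin_once(node, timeout_sec=1.0)
    pub.publish(msg)
    rclpy.shutdown()


if __name__ == '__main__':
    main()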
behavior_server#
The Behavior Server implements a task server for handling and executing various behaviors, such as recoveries and docking.
bt_navigator#
bt_navigator_navigate_through_poses#
This node is a BT Navigator plugin that implements the NavigateThroughPoses task interface.
bt_navigator_navigate_to_pose#
This node is a BT Navigator plugin that implements the NavigateToPose task interface.
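As an illustration, a navigation goal can be sent to this interface with a short rclpy action client; the goal coordinates below are hypothetical.

import rclpy
from rclpy.action import ActionClient
from rclpy.node import Node
from nav2_msgs.action import NavigateToPose


def main():
    rclpy.init()
    node = Node('goto_client')
    client = ActionClient(node, NavigateToPose, 'navigate_to_pose')
    client.wait_for_server()

    goal = NavigateToPose.Goal()
    goal.pose.header.frame_id = 'map'
    goal.pose.pose.position.x = 3.0     # hypothetical goal, 3 m along the map x axis
    goal.pose.pose.orientation.w = 1.0

    future = client.send_goal_async(goal)
    rclpy.spin_until_future_complete(node, future)
    rclpy.shutdown()


if __name__ == '__main__':
    main()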
controller_server#
The Controller Server handles requests to execute a given plan using a controller plugin. It takes in the path and the plugin names for the controller, progress checker and goal checker to use, and calls the appropriate plugins. It also hosts the local costmap.
dlo_ros#
The Direct Laser Odometry (DLO) node takes wheel odometry and laser data as input and provides corrected odometry as output.
eulero_manager#
Eulero is a centralized system for managing Environmental Metadata, supporting a variety of complex configurations such as Multi-Floor Navigation and Multi-Map Navigation.
global_costmap#
The Global Costmap is a 2D grid-based costmap for environmental representation, consisting of several “layers” of data about the environment. It covers the whole environment surrounding the robot.
joy_teleop#
A configurable node to map joystick controls to robot teleoperation commands.
joystick#
This node interfaces a generic Linux joystick to ROS. It is a lightweight, Linux-only node with no external dependencies.
joystick_relay#
The Joystick Relay node is part of the twist_mux package and offers several interfaces to control the robot using a joystick input. It implements a Priority interface to enable or disable the joystick and a Turbo interface to change the speed of the robot.
key_teleop#
A node to teleoperate the robot via the keyboard.
laserscan_multi_merger#
Merges multiple laser scans into a single scan.
lifecycle_manager_advanced_navigation#
The Lifecycle Manager implements a method for handling the lifecycle transition states for multiple Lifecycle Nodes in a deterministic way. In this case it is used to manage all the Advanced Navigation nodes.
lifecycle_manager_laser#
The Lifecycle Manager implements a method for handling the lifecycle transition states for multiple Lifecycle Nodes in a deterministic way. In this case it is used to manage all the Laser nodes.
lifecycle_manager_localization#
The Lifecycle Manager implements a method for handling the lifecycle transition states for multiple Lifecycle Nodes in a deterministic way. In this case it is used to manage all the Localization nodes.
lifecycle_manager_navigation#
The Lifecycle Manager implements a method for handling the lifecycle transition states for multiple Lifecycle Nodes in a deterministic way. In this case it is used to manage all the Navigation nodes.
local_costmap#
The Local Costmap is a 2D grid-based costmap for environmental representation, consisting of several “layers” of data about the environment. It covers a square of limited size centered on the robot.
map_mask_server#
PAL Map Masks provides a simple, plugin-based way to create Nav2 Costmap 2D filter masks that can be used to annotate maps.
map_saver#
Map Saver is a node running in the background that provides the /map_saver/save_map service, which reads maps expressed as OccupancyGrids from a ROS 2 topic and stores them in the filesystem.
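A minimal sketch of calling that service from Python, assuming the standard nav2_msgs/srv/SaveMap interface; the destination path below is hypothetical.

import rclpy
from rclpy.node import Node
from nav2_msgs.srv import SaveMap


def main():
    rclpy.init()
    node = Node('map_saver_client')
    client = node.create_client(SaveMap, '/map_saver/save_map')
    client.wait_for_service()

    req = SaveMap.Request()
    req.map_topic = 'map'
    req.map_url = '/tmp/my_office_map'   # hypothetical destination, written as .pgm/.yaml
    req.image_format = 'pgm'
    req.free_thresh = 0.25
    req.occupied_thresh = 0.65

    future = client.call_async(req)
    rclpy.spin_until_future_complete(node, future)
    node.get_logger().info(str(future.result()))
    rclpy.shutdown()


if __name__ == '__main__':
    main()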
map_server#
The Map Server hosts a map expressed as an OccupancyGrid and makes it available in ROS 2 through the /map topic and the /map_server/load_map service.
mobile_base_controller#
Converts a velocity message in the form of geometry_msgs/msg/Twist into a control command for each of the robot’s wheels.
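As a sketch, publishing a geometry_msgs/msg/Twist at a regular rate drives the base. The topic name below is an assumption, and on the robot velocity commands normally go through twist_mux rather than directly to the controller.

import rclpy
from rclpy.node import Node
from geometry_msgs.msg import Twist


class SlowForward(Node):
    def __init__(self):
        super().__init__('slow_forward')
        # Topic name is an assumption; check the controller's subscribed topic on the robot
        self.pub = self.create_publisher(Twist, '/mobile_base_controller/cmd_vel', 10)
        self.create_timer(0.1, self.tick)

    def tick(self):
        cmd = Twist()
        cmd.linear.x = 0.1    # 0.1 m/s forward
        cmd.angular.z = 0.0
        self.pub.publish(cmd)


def main():
    rclpy.init()
    rclpy.spin(SlowForward())


if __name__ == '__main__':
    main()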
pal_bt_navigator_through_waypoints#
Alongside Nav2 NavigateToPose and Nav2 NavigateThroughPoses, NavigateThroughWaypoints is a Nav2 BT Navigator intended to allow for flexibility in the waypoint navigation task and provides a way to easily specify complex robot behaviors.
pal_bt_navigator_to_target#
Alongside Nav2 NavigateToPose and Nav2 NavigateThroughPoses, NavigateToTargetNavigator is a Nav2 BT Navigator intended to allow robots to navigate towards a specified Target identified by an ID, using a specified detector and a Behavior Tree for the navigation logic.
pal_laser_filters#
This node implements a plugin-based filter chain for sensor_msgs/msg/LaserScan messages. It allows for the sequential application of multiple laser filters and is designed to be compatible with ROS 2 laser_filters.
planner_server#
slam_toolbox_sync#
Slam Toolbox is a set of tools and capabilities for 2D SLAM.
target_detector_server#
twist_mux#
This node provides a multiplexer for geometry_msgs/msg/Twist messages. It takes N input twist topics and outputs the messages from a single one at a time. The active topic is selected based on the priority of each input, the message timeout, and M input lock topics that can inhibit individual twist inputs.
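For illustration, an input can be inhibited by latching one of the lock topics (std_msgs/msg/Bool in twist_mux); the lock topic name below is hypothetical and must match the twist_mux configuration on the robot.

import rclpy
from rclpy.node import Node
from std_msgs.msg import Bool


def main():
    rclpy.init()
    node = Node('twist_lock')
    # Topic name is hypothetical; the actual lock topics are defined in the twist_mux configuration
    pub = node.create_publisher(Bool, '/twist_mux/locks/pause', 10)
    # Keep asserting the lock so it does not time out
    node.create_timer(0.5, lambda: pub.publish(Bool(data=True)))
    rclpy.spin(node)


if __name__ == '__main__':
    main()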
waypoint_follower#
Other nodes#
chitchat_demo#
incrementer#
pal_interaction_environment_node#
Nodes related to 💡 Knowledge and reasoning#
kb_rest#
The REST endpoint for the robot’s knowledge base (as well as the web-based ‘knowledge base explorer’). Learn more about the knowledge base and the robot’s reasoning capabilities.
knowledge_core#
The KnowledgeCore knowledge base, including its service-based ROS API.
people_facts#
The people_facts package.
Nodes related to 👥 Social perception#
attention_manager#
Computes where the robot should focus its gaze (visual attention).
hri_engagement#
ROS4HRI-compatible node to estimate the level of engagement of people in the robot’s vicinity.
hri_face_body_matcher#
Face-to-body matcher for ROS4HRI.
hri_face_detect#
Google Mediapipe-based face detection for ROS4HRI.
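As with the other ROS4HRI nodes, its output follows the REP-155 conventions; for example, the list of currently tracked faces can be read from /humans/faces/tracked, as in the sketch below.

import rclpy
from rclpy.node import Node
from hri_msgs.msg import IdsList  # ROS4HRI message listing the currently tracked IDs


class FaceCounter(Node):
    def __init__(self):
        super().__init__('face_counter')
        # /humans/faces/tracked follows the ROS4HRI (REP-155) conventions
        self.create_subscription(IdsList, '/humans/faces/tracked', self.on_faces, 10)

    def on_faces(self, msg: IdsList):
        self.get_logger().info(f'{len(msg.ids)} face(s) in view: {", ".join(msg.ids)}')


def main():
    rclpy.init()
    rclpy.spin(FaceCounter())


if __name__ == '__main__':
    main()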
hri_face_identification#
ROS4HRI-compatible package for face identification using dlib.
hri_fullbody#
ROS node implementing 2D/3D full-body pose estimation, using Google Mediapipe. Part of ROS4HRI.
hri_person_manager#
Combines information about people’s faces, bodies and voices into one consistent representation, following the ROS4HRI conventions.
hri_visualization#
Creates a video stream with overlays for faces and bodies.
input_action_server#
Nodes related to 🖥️ Touchscreen#
pal_chrome#
rosbridge_websocket#
Bridge between ROS topics, services, actions, and JavaScript.
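From a browser this is typically used through roslibjs; the same JSON protocol can also be exercised from Python, as in this sketch (host, port and topic are assumptions; 9090 is the rosbridge default).

import asyncio
import json
import websockets  # pip install websockets


async def main():
    # Host and port are assumptions; 9090 is the rosbridge default
    async with websockets.connect('ws://robot-ip:9090') as ws:
        # The rosbridge protocol is plain JSON: subscribe to a topic...
        await ws.send(json.dumps({'op': 'subscribe', 'topic': '/joint_states'}))
        # ...and every incoming frame with op == "publish" carries one message
        frame = json.loads(await ws.recv())
        print(frame['topic'], frame['msg'])


asyncio.run(main())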
web_subtitles_mgr#
Manages the subtitles displayed on the touchscreen when the robot hears or speaks.
Nodes with detailed documentation#
amcl
behavior_server
bt_navigator
bt_navigator_navigate_through_poses
bt_navigator_navigate_to_pose
controller_server
dlo_ros
eulero_manager
global_costmap
hri_face_body_matcher
hri_face_detect
hri_fullbody
joy_teleop
joystick
joystick_relay
key_teleop
knowledge_core
laserscan_multi_merger
lifecycle_manager_advanced_navigation
lifecycle_manager_laser
lifecycle_manager_localization
lifecycle_manager_navigation
local_costmap
map_mask_server
map_saver
map_server
mobile_base_controller
pal_bt_navigator_through_waypoints
pal_bt_navigator_to_target
pal_laser_filters
planner_server
robot_state_publisher
sick*
slam_toolbox_sync
target_detector_server
twist_mux
waypoint_follower