List of ROS Nodes#
This page lists all the nodes available in the SDK 23.12. They are installed and running by default on the robot.
Note
Only the nodes contributing to the public API of the pal-sdk-23.12 are listed here. Additional ROS nodes might be running on the robot for internal purposes.
Nodes related to Expressive interactions#
expressive_eyes#
Generates the robot’s eyes and controls their positions.
eyes_demo#
Controls the default gazing behaviour of the robot.
gaze_manager#
Coordinates eyes and head motion to generate natural gazing behaviours.
See Controlling the attention and gaze of the robot for details.
Nodes related to Robot hardware#
embedded_networking_supervisor#
point_head_action#
Exposes an action server to control the robot’s head in Cartesian space. See Attention management for details.
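Conceptually, aiming the head at a Cartesian target reduces to computing pan and tilt angles from the target’s coordinates in the head’s base frame. A minimal, ROS-free sketch of that geometry (illustrative only: the actual action interface takes a goal message, and the real controller accounts for joint limits and frame transforms):

```python
import math

def pan_tilt_to(x, y, z):
    """Pan/tilt angles (radians) that make the head's forward axis
    point at a target (x, y, z), using the usual ROS convention of
    x forward, y left, z up. Positive tilt means looking up."""
    pan = math.atan2(y, x)                    # rotate left/right toward target
    tilt = math.atan2(z, math.hypot(x, y))    # raise/lower gaze toward target
    return pan, tilt
```

For instance, a target one metre ahead and one metre up yields a pan of 0 and a tilt of 45 degrees.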
head_front_img_throttle#
image_republisher#
mm11#
The low-level motor driver.
pal_led_manager#
Manages the LEDs of PAL’s robots.
pal_master_calibration#
power_status_analyzer_node#
raspi_head_info_republish#
robot_state_publisher#
Publishes the state of the robot to tf2.
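The underlying idea — turning joint angles into link poses — can be illustrated with a ROS-free forward-kinematics sketch for a planar two-link arm (the real node computes 3D transforms from the robot’s URDF, which is considerably more involved):

```python
import math

def forward_kinematics(l1, l2, q1, q2):
    """End-effector (x, y) of a planar 2-link arm with link lengths
    l1, l2 and joint angles q1, q2 (radians): each link's pose is
    obtained by chaining the rotations of the joints before it."""
    x1 = l1 * math.cos(q1)              # elbow position
    y1 = l1 * math.sin(q1)
    x2 = x1 + l2 * math.cos(q1 + q2)    # end-effector position
    y2 = y1 + l2 * math.sin(q1 + q2)
    return x2, y2
```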
rtt_deployer#
torso_front_camera/realsense2_camera#
The driver for the front RealSense RGB-D camera.
torso_front_camera/realsense2_camera_manager#
The nodelet manager hosting the front RealSense RGB-D camera driver.
Nodes related to Robot management#
charging_monitor#
computer_monitor_control#
language_center#
Monitors the currently active language, and notifies other nodes when it changes. See Internationalisation and language support for details.
pal_startup_control#
pal_topic_monitor#
twist_mux#
user_preferences_backend#
Nodes related to Navigation#
key_teleop#
A node to teleoperate the robot via the keyboard.
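Keyboard teleoperation amounts to mapping key presses to velocity commands (published as geometry_msgs/Twist in ROS). A minimal, ROS-free sketch of such a mapping — the key bindings and speeds below are illustrative, not key_teleop’s actual ones:

```python
# (linear m/s, angular rad/s) command per key; values are illustrative
KEY_BINDINGS = {
    "up":    (0.5,  0.0),   # drive forward
    "down":  (-0.5, 0.0),   # drive backward
    "left":  (0.0,  1.0),   # rotate counter-clockwise
    "right": (0.0, -1.0),   # rotate clockwise
}

def twist_for(key):
    """Velocity command for a key press; stop if the key is unbound."""
    return KEY_BINDINGS.get(key, (0.0, 0.0))
```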
map_server#
Provides the 2D map of the environment for navigation.
pal_image_navigation_node#
parking_server#
zoi_detector#
Monitors whether the robot enters a pre-defined Zone of Interest (ZOI). See Defining Zones of Interest to create ZOIs.
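At its core, detecting entry into a zone is a test of whether the robot’s 2D position lies inside a polygon. A ROS-free sketch using the standard ray-casting test (how ZOIs are actually defined and published is described in the SDK documentation, not here):

```python
def inside_zoi(point, polygon):
    """Ray-casting point-in-polygon test.
    point: (x, y); polygon: list of (x, y) vertices in order."""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # does a horizontal ray from `point` cross edge (i, i+1)?
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside
```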
Nodes related to Knowledge and reasoning#
knowledge_core#
The KnowledgeCore knowledge base, including its service-based ROS API.
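KnowledgeCore represents facts as subject–predicate–object statements. A toy, ROS-free sketch of that idea (the real API is service-based and supports reasoning well beyond this pattern matching):

```python
class TripleStore:
    """Minimal subject-predicate-object store with wildcard queries."""

    def __init__(self):
        self.facts = set()

    def add(self, s, p, o):
        self.facts.add((s, p, o))

    def query(self, s=None, p=None, o=None):
        """Return facts matching the pattern; None acts as a wildcard."""
        return [
            f for f in self.facts
            if (s is None or f[0] == s)
            and (p is None or f[1] == p)
            and (o is None or f[2] == o)
        ]
```

For instance, after `kb.add("robot", "sees", "alice")`, the query `kb.query(p="sees")` returns that statement.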
kb_rest#
The REST endpoint for the robot’s knowledge base (as well as the web-based ‘knowledge base explorer’). Learn more about the knowledge base and the robot’s reasoning capabilities.
people_facts#
The people_facts package.
Nodes related to Social perception#
attention_manager#
Computes where the robot should focus its gaze (visual attention).
hri_face_body_matcher#
Face-to-body matcher for ROS4HRI.
hri_face_detect#
Google Mediapipe-based face detection for ROS4HRI.
hri_face_identification#
ROS4HRI-compatible package for face identification using dlib.
hri_fullbody#
ROS node implementing 2D/3D full-body pose estimation, using Google Mediapipe. Part of ROS4HRI.
hri_person_manager#
Combines information about people’s faces, bodies and voices into one consistent representation of each person, following the ROS4HRI conventions.
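Under ROS4HRI, faces, bodies and voices each carry transient IDs that must be grouped under a stable person. A minimal, ROS-free sketch of the association step using union-find (the real node does far more, weighing match likelihoods rather than taking pairwise matches at face value):

```python
def associate(matches):
    """Group feature IDs (e.g. 'face1', 'body2') into persons.
    matches: list of (id_a, id_b) pairs believed to belong to the
    same person. Returns a list of sets, one per person."""
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path compression
            x = parent[x]
        return x

    for a, b in matches:
        parent[find(a)] = find(b)           # union the two groups

    persons = {}
    for x in list(parent):
        persons.setdefault(find(x), set()).add(x)
    return list(persons.values())
```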
hri_visualization#
Creates a video stream with overlays for faces and bodies.
Nodes related to Speech and language processing#
pal_chatbot_wrapper#
Interface between the RASA chatbot server and ROS topics/actions.
rasa_action_server#
Loads and executes custom chatbot actions (small Python scripts run in response to specific intents detected by the chatbot engine).
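Dispatching a detected intent to its custom action is essentially a lookup from intent name to handler. A minimal, ROS-free sketch — the intent name, handler, and slot layout below are made up for illustration:

```python
def greet(slots):
    """Illustrative custom action: greet the user by name."""
    return f"Hello {slots.get('name', 'there')}!"

# intent name -> handler; registered actions are looked up by intent
ACTIONS = {"greet_user": greet}

def run_action(intent, slots):
    """Run the custom action for `intent`, or None if none is registered."""
    handler = ACTIONS.get(intent)
    if handler is None:
        return None
    return handler(slots)
```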
respeaker_node#
The driver for the ReSpeaker microphone array.
soft_wakeup_word#
Monitors incoming speech for wake-up/sleep keywords, and publishes the result to /active_listening accordingly.
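The wake-up/sleep logic can be sketched as a tiny state machine over recognised utterances. The keywords below are illustrative, not the robot’s actual ones; in the real node the resulting flag is published on /active_listening:

```python
WAKE_WORDS = {"hey robot"}       # illustrative wake-up keyword
SLEEP_WORDS = {"go to sleep"}    # illustrative sleep keyword

def update_listening(active, utterance):
    """New value of the active-listening flag after hearing `utterance`."""
    text = utterance.lower()
    if any(w in text for w in WAKE_WORDS):
        return True
    if any(w in text for w in SLEEP_WORDS):
        return False
    return active    # no keyword heard: state unchanged
```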
tts_server#
The text-to-speech server, based on Acapela.
vosk_asr#
The Vosk-based speech recognition (ASR) module.
See How-to: Automatic Speech Recognition (ASR) for more information.
Nodes related to Touchscreen#
rosbridge_websocket#
Bridges ROS topics, services, and actions to JavaScript clients over a WebSocket.
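Clients talk to rosbridge by exchanging JSON messages whose `op` field names the operation, following the rosbridge v2 protocol. A small sketch building two such messages (the topic and message type are examples; sending them over the WebSocket is not shown):

```python
import json

def subscribe_msg(topic, msg_type):
    """rosbridge v2 'subscribe' operation, as a JSON string."""
    return json.dumps({"op": "subscribe", "topic": topic, "type": msg_type})

def publish_msg(topic, msg):
    """rosbridge v2 'publish' operation, as a JSON string."""
    return json.dumps({"op": "publish", "topic": topic, "msg": msg})
```

For instance, `subscribe_msg("/chatter", "std_msgs/String")` produces the JSON a browser client would send to start receiving that topic.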
web_subtitles_mgr#
Manages the subtitles displayed on the touchscreen when the robot hears or speaks.