List of ROS Topics#
This page lists all the public topics exposed in PAL OS 24.9.
Caution
Only the topics contributing to the public API of PAL OS 24.9 are listed here.
Additional ROS topics might be present on the robot for internal purposes; they are, however, not part of the documented and supported robot API.
Alphabetic index#
/active_listening
/amcl_pose
/arm_controller/joint_trajectory
/arm_left_controller/command
/arm_left_controller/safe_command
/arm_right_controller/command
/arm_right_controller/safe_command
/audio_in/channel0
/audio_in/channel1
/audio_in/channel2
/audio_in/channel3
/audio_in/channel4
/audio_in/channel5
/audio_in/raw
/audio_in/sound_direction
/audio_in/sound_localization
/audio_in/speech
/audio_in/status_led
/audio_in/voice_detected
/audio_out/raw
/base_imu
/chatbot/trigger
/cmd_vel
/diagnostics
/diagnostics_agg
/dlo_ros/odom
/end_effector_camera/camera_info
/end_effector_camera/image_raw
/end_effector_left_camera/camera_info
/end_effector_left_camera/image_raw
/end_effector_right_camera/camera_info
/end_effector_right_camera/image_raw
/eulero_manager/feedback
/eulero_manager/update
/global_costmap/costmap
/global_costmap/footprint
/goal_pose
/gripper_controller/joint_trajectory
/hand_left_controller/command
/hand_right_controller/command
/head_controller/command
/head_controller/joint_trajectory
/head_front_camera/color/camera_info
/head_front_camera/color/image_raw/*
/head_front_camera/image_throttle/compressed
/hri_face_detect/ready
/hri_face_identification/ready
/humans/bodies/*/cropped
/humans/bodies/*/joint_states
/humans/bodies/*/position
/humans/bodies/*/roi
/humans/bodies/*/skeleton2d
/humans/bodies/tracked
/humans/bodies/*/velocity
/humans/candidate_matches
/humans/faces/*/aligned
/humans/faces/*/cropped
/humans/faces/*/landmarks
/humans/faces/*/roi
/humans/faces/tracked
/humans/persons/*/alias
/humans/persons/*/anonymous
/humans/persons/*/body_id
/humans/persons/*/engagement_status
/humans/persons/*/face_id
/humans/persons/known
/humans/persons/*/location_confidence
/humans/persons/tracked
/humans/persons/*/voice_id
/humans/voices/*/audio
/humans/voices/*/is_speaking
/humans/voices/*/speech
/humans/voices/tracked
/initialpose
/input_joy/cmd_vel
/intents
/joint_states
/joint_torque_states
/joy
/joy_priority
/joy_vel
/kb/active_concepts
/kb/add_fact
/kb/events/*
/kb/remove_fact
/keepout_map_mask/mask
/keepout_map_mask/mask_info
/key_vel
/local_costmap/costmap
/local_costmap/footprint
/local_plan
/look_at
/map
/map_metadata
/mobile_base_controller/cmd_vel_unstamped
/mobile_base_controller/odom
/particle_cloud
/pause_navigation
/phone_vel
/plan
/pose
/power/battery_level
/power/is_charging
/power/is_docked
/power/is_emergency
/power/is_plugged
/power_status
/robot_face/background_image
/robot_face/expression
/robot_face/image_raw/*
/robot_face/look_at
/rviz_joy_vel
/scan
/scan_front_raw
/scan_raw
/scan_rear_raw
/slam_toolbox/graph_visualization
/slam_toolbox/scan_visualization
/slam_toolbox/update
/sonar_base
/speed_limit
/speed_map_mask/mask
/speed_map_mask/mask_info
/target_detector/goal
/target_detector_server/image
/torso_back_camera/fisheye1/camera_info
/torso_back_camera/fisheye1/image_raw/*
/torso_back_camera/fisheye2/camera_info
/torso_back_camera/fisheye2/image_raw/*
/torso_controller/command
/torso_controller/joint_trajectory
/torso_controller/safe_command
/torso_front_camera/aligned_depth_to_color/camera_info
/torso_front_camera/aligned_depth_to_color/image_raw/*
/torso_front_camera/color/camera_info
/torso_front_camera/color/image_raw/*
/torso_front_camera/depth/camera_info
/torso_front_camera/depth/color/points
/torso_front_camera/depth/image_rect_raw/*
/torso_front_camera/infra1/camera_info
/torso_front_camera/infra1/image_rect_raw/*
/torso_front_camera/infra2/image_rect_raw/compressed
/touch_web_state
/updated_goal
/user_input
/web/go_to
/web_subtitles
/wrist_ft
/xtion/depth_registered/camera_info
/xtion/depth_registered/image_raw
/xtion/depth_registered/points
/xtion/rgb/camera_info
/xtion/rgb/image_raw
/xtion/rgb/image_rect_color
By capability#
Communication#
/active_listening
(documentation) Whether or not recognized speech should be further processed (e.g. by the chatbot). See overview_nlp for details.
/chatbot/trigger
(documentation) Publish here chatbot intents you want to trigger. This is especially useful to implement a pro-active behaviour, where the robot starts the conversation itself. See overview_nlp for details.
/humans/voices/*/audio
(documentation) The audio stream of the voice.
/humans/voices/*/is_speaking
(documentation) Whether verbal content is currently recognised in this voice's audio stream.
/humans/voices/*/speech
(documentation) The recognised text, as spoken by this voice.
/humans/voices/tracked
(documentation) The list of voices currently detected by the robot.
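For instance, speech processing can be paused or resumed by publishing on /active_listening. The sketch below is a minimal rclpy example; the message type (std_msgs/msg/Bool) is an assumption not stated on this page, so confirm it on your robot with `ros2 topic info /active_listening`.

```python
# Minimal sketch: toggling speech processing via /active_listening.
# Assumption (not confirmed by this page): the topic carries std_msgs/msg/Bool.
import rclpy
from rclpy.node import Node
from std_msgs.msg import Bool

class ListeningToggle(Node):
    def __init__(self):
        super().__init__('listening_toggle')
        self.pub = self.create_publisher(Bool, '/active_listening', 1)

    def set_listening(self, enabled: bool):
        msg = Bool()
        msg.data = enabled
        self.pub.publish(msg)

def main():
    rclpy.init()
    node = ListeningToggle()
    node.set_listening(False)  # e.g. stop forwarding recognized speech to the chatbot
    rclpy.spin_once(node, timeout_sec=0.5)
    node.destroy_node()
    rclpy.shutdown()

if __name__ == '__main__':
    main()
```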
Developing applications#
/intents
(documentation) An intent, encoding a desired activity to be scheduled by the robot (not to be confused with the chatbot intents). Read more about Intents.
Expressive interactions#
/look_at
(documentation) Set a target for the robot to look at. Uses both the eyes and the head position.
/robot_face/background_image
(documentation) Displays a ROS video stream as the background of the robot's face/eyes. See Background and overlays for details.
/robot_face/expression
(documentation) Set the expression of ARI's eyes. See Robot face and expressions for details.
/robot_face/image_raw/*
(documentation) The left and right images to be displayed on the robot's eyes. Published by default by the expressive_eyes node. If you want to publish your own face on this topic, you might want to first stop the expressive_eyes node.
/robot_face/look_at
(documentation) Sets the direction of the eyes. If you want to control the gaze direction, use /look_at instead. See attention-management for details.
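As an illustration, a gaze target can be published on /look_at. The sketch below assumes the topic accepts geometry_msgs/msg/PointStamped expressed in a base_link frame; neither assumption is stated on this page, so verify both with `ros2 topic info /look_at` before relying on it.

```python
# Minimal sketch: asking the robot to look at a point in front of it.
# Assumptions: /look_at takes geometry_msgs/msg/PointStamped and 'base_link' exists.
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import PointStamped

class LookAtDemo(Node):
    def __init__(self):
        super().__init__('look_at_demo')
        self.pub = self.create_publisher(PointStamped, '/look_at', 1)

    def look_at(self, x, y, z, frame='base_link'):
        target = PointStamped()
        target.header.frame_id = frame
        target.header.stamp = self.get_clock().now().to_msg()
        target.point.x, target.point.y, target.point.z = x, y, z
        self.pub.publish(target)

def main():
    rclpy.init()
    node = LookAtDemo()
    node.look_at(1.0, 0.3, 1.5)  # a point roughly at head height, slightly to the left
    rclpy.spin_once(node, timeout_sec=0.5)
    node.destroy_node()
    rclpy.shutdown()

if __name__ == '__main__':
    main()
```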
Robots hardware#
/audio_in/channel0
(documentation) Merged audio channel of the ReSpeaker's 4 microphones.
/audio_in/channel1
(documentation) Audio stream from the ReSpeaker's first microphone.
/audio_in/channel2
(documentation) Audio stream from the ReSpeaker's second microphone.
/audio_in/channel3
(documentation) Audio stream from the ReSpeaker's third microphone.
/audio_in/channel4
(documentation) Audio stream from the ReSpeaker's fourth microphone.
/audio_in/channel5
(documentation) Monitor audio stream from the ReSpeaker's audio input (used for self-echo cancellation).
/audio_in/raw
(documentation) Merged input audio channel from the microphone. For robots equipped with a ReSpeaker array, this is an alias for /audio_in/channel0.
/audio_in/sound_direction
(documentation) The estimated Direction of Arrival of the detected sound.
/audio_in/sound_localization
(documentation) The estimated sound source location.
/audio_in/speech
(documentation) Raw audio data of detected speech (published once the person has finished speaking).
/audio_in/status_led
(documentation) The topic controlling the ReSpeaker microphone LEDs. Do not use this topic directly. Instead, use /pal_led_manager/do_effect.
/audio_in/voice_detected
(documentation) Publishes a boolean indicating whether a voice is currently detected (i.e. whether someone is currently speaking).
/audio_out/raw
(documentation) Audio data published on this topic is directly played on the robot's loudspeakers.
/base_imu
(documentation) Inertial data from the IMU.
/end_effector_camera/camera_info
(documentation) Intrinsic and distortion parameters of the RGB endoscopic camera.
/end_effector_camera/image_raw
(documentation) RGB image of the endoscopic camera.
/end_effector_left_camera/camera_info
(documentation) Intrinsic and distortion parameters of the RGB endoscopic camera for the left arm.
/end_effector_left_camera/image_raw
(documentation) RGB image of the endoscopic camera for the left arm.
/end_effector_right_camera/camera_info
(documentation) Intrinsic and distortion parameters of the RGB endoscopic camera for the right arm.
/end_effector_right_camera/image_raw
(documentation) RGB image of the endoscopic camera for the right arm.
/head_front_camera/color/camera_info
(documentation) Camera calibration and metadata.
/head_front_camera/color/image_raw/*
(documentation) Color rectified image, RGB format.
/head_front_camera/image_throttle/compressed
(documentation) Compressed head image.
/joint_states
(documentation) The current state of the robot's joints (e.g. angular position of each joint).
/joint_torque_states
(documentation) The current state of the robot's joints (e.g. angular position of each joint), with the effort field reporting the measured torque instead of the current.
/sonar_base
(documentation) Readings of the sonar.
/torso_back_camera/fisheye1/camera_info
(documentation) Camera calibration and metadata (fisheye2).
/torso_back_camera/fisheye1/image_raw/*
(documentation) Fisheye image.
/torso_back_camera/fisheye2/camera_info
(documentation) Camera calibration and metadata (fisheye2).
/torso_back_camera/fisheye2/image_raw/*
(documentation) Fisheye image.
/torso_front_camera/aligned_depth_to_color/camera_info
(documentation) Intrinsic parameters of the aligned depth-to-color image.
/torso_front_camera/aligned_depth_to_color/image_raw/*
(documentation) Aligned depth-to-color image.
/torso_front_camera/color/camera_info
(documentation) Camera calibration and metadata.
/torso_front_camera/color/image_raw/*
(documentation) Color rectified image, RGB format.
/torso_front_camera/depth/camera_info
(documentation) Camera calibration and metadata.
/torso_front_camera/depth/color/points
(documentation) Registered XYZRGB point cloud.
/torso_front_camera/depth/image_rect_raw/*
(documentation) Rectified depth image.
/torso_front_camera/infra1/camera_info
(documentation) Camera calibration and metadata (infra1 and infra2).
/torso_front_camera/infra1/image_rect_raw/*
(documentation) Raw uint16 IR image.
/torso_front_camera/infra2/image_rect_raw/compressed
(documentation)
/wrist_ft
(documentation) Force and torque vectors currently detected by the Force/Torque sensor.
/xtion/depth_registered/camera_info
(documentation) Intrinsic parameters of the depth image.
/xtion/depth_registered/image_raw
(documentation) 32-bit depth image. Every pixel contains the depth of the corresponding point in meters.
/xtion/depth_registered/points
(documentation) Point cloud computed from the depth image.
/xtion/rgb/camera_info
(documentation) Intrinsic and distortion parameters of the RGB camera.
/xtion/rgb/image_raw
(documentation) RGB image.
/xtion/rgb/image_rect_color
(documentation) Rectified RGB image.
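The image topics above follow the usual ROS convention of publishing sensor_msgs/msg/Image (confirm with `ros2 topic info` for the stream you need). A minimal rclpy subscriber, here reading the torso front colour camera and converting frames with cv_bridge:

```python
# Minimal sketch: reading frames from one of the camera topics listed above.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image
from cv_bridge import CvBridge

class ImageLogger(Node):
    def __init__(self):
        super().__init__('image_logger')
        self.bridge = CvBridge()
        self.sub = self.create_subscription(
            Image, '/torso_front_camera/color/image_raw', self.on_image, 10)

    def on_image(self, msg: Image):
        # Convert the ROS image to an OpenCV BGR array and log its size.
        frame = self.bridge.imgmsg_to_cv2(msg, desired_encoding='bgr8')
        self.get_logger().info(f'Got a {frame.shape[1]}x{frame.shape[0]} frame')

def main():
    rclpy.init()
    rclpy.spin(ImageLogger())

if __name__ == '__main__':
    main()
```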
Robot management#
/diagnostics
(documentation)
/diagnostics_agg
(documentation)
/power_status
(documentation)
/power/battery_level
(documentation)
/power/is_charging
(documentation)
/power/is_docked
(documentation)
/power/is_emergency
(documentation)
/power/is_plugged
(documentation)
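/diagnostics conventionally carries diagnostic_msgs/msg/DiagnosticArray (and /diagnostics_agg the aggregated view, with the same message type). A minimal sketch that logs any component reporting a non-OK status:

```python
# Minimal sketch: watching the robot's diagnostics stream.
import rclpy
from rclpy.node import Node
from diagnostic_msgs.msg import DiagnosticArray, DiagnosticStatus

class DiagnosticsWatcher(Node):
    def __init__(self):
        super().__init__('diagnostics_watcher')
        self.create_subscription(DiagnosticArray, '/diagnostics', self.on_diag, 10)

    def on_diag(self, msg: DiagnosticArray):
        for status in msg.status:
            if status.level != DiagnosticStatus.OK:
                self.get_logger().warn(f'{status.name}: {status.message}')

def main():
    rclpy.init()
    rclpy.spin(DiagnosticsWatcher())

if __name__ == '__main__':
    main()
```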
Gestures and motions#
/arm_controller/joint_trajectory
(documentation) Sequence of positions that the joints have to reach in given time intervals.
/arm_left_controller/command
(documentation)
/arm_left_controller/safe_command
(documentation)
/arm_right_controller/command
(documentation)
/arm_right_controller/safe_command
(documentation)
/gripper_controller/joint_trajectory
(documentation) Sequence of positions that the joints have to reach in given time intervals.
/hand_left_controller/command
(documentation)
/hand_right_controller/command
(documentation)
/head_controller/command
(documentation)
/head_controller/joint_trajectory
(documentation) Sequence of positions that the joints have to reach in given time intervals.
/torso_controller/command
(documentation) This topic takes a sequence of positions that the torso joint needs to reach at given time intervals.
/torso_controller/joint_trajectory
(documentation) Sequence of positions that the joints have to reach in given time intervals.
/torso_controller/safe_command
(documentation) This topic takes a sequence of positions that the torso joint needs to reach at given time intervals; the motion is only executed if it does not lead to a self-collision.
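The joint_trajectory topics above typically accept trajectory_msgs/msg/JointTrajectory. A minimal sketch for the head controller follows; the joint names used (head_1_joint, head_2_joint) are an assumption not given on this page, so list the real ones on your robot with `ros2 topic echo /joint_states --once`.

```python
# Minimal sketch: sending a trajectory to one of the joint_trajectory topics above.
import rclpy
from rclpy.node import Node
from builtin_interfaces.msg import Duration
from trajectory_msgs.msg import JointTrajectory, JointTrajectoryPoint

class HeadMover(Node):
    def __init__(self):
        super().__init__('head_mover')
        self.pub = self.create_publisher(
            JointTrajectory, '/head_controller/joint_trajectory', 1)

    def nod(self):
        traj = JointTrajectory()
        traj.joint_names = ['head_1_joint', 'head_2_joint']  # assumed joint names
        point = JointTrajectoryPoint()
        point.positions = [0.0, -0.3]            # look slightly down
        point.time_from_start = Duration(sec=2)  # reach the target in 2 s
        traj.points.append(point)
        self.pub.publish(traj)

def main():
    rclpy.init()
    node = HeadMover()
    node.nod()
    rclpy.spin_once(node, timeout_sec=0.5)
    node.destroy_node()
    rclpy.shutdown()

if __name__ == '__main__':
    main()
```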
Knowledge and reasoning#
/kb/active_concepts
(documentation) Lists the symbolic concepts that are currently active (an active concept is a concept of rdf:type ActiveConcept). See the KnowledgeCore API for details.
/kb/add_fact
(documentation) Statements published to this topic are added to the knowledge base. The string must represent a <s, p, o> triple, with terms separated by a space. See the KnowledgeCore API for details.
/kb/events/*
(documentation) Event notifications for previously subscribed events. See /kb/events for details.
/kb/remove_fact
(documentation) Statements published to this topic are removed from the knowledge base. The string must represent a <s, p, o> triple, with terms separated by a space. See the KnowledgeCore API for details.
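Since the page describes the payload as a space-separated "<s> <p> <o>" string, a plain std_msgs/msg/String publisher is a reasonable sketch; the message type is an assumption, so confirm it with `ros2 topic info /kb/add_fact`. The triple used below is purely illustrative.

```python
# Minimal sketch: adding and removing a fact in the knowledge base.
# Assumption: /kb/add_fact and /kb/remove_fact carry std_msgs/msg/String.
import rclpy
from rclpy.node import Node
from std_msgs.msg import String

class KbDemo(Node):
    def __init__(self):
        super().__init__('kb_demo')
        self.add_pub = self.create_publisher(String, '/kb/add_fact', 1)
        self.remove_pub = self.create_publisher(String, '/kb/remove_fact', 1)

    def add_fact(self, triple: str):
        self.add_pub.publish(String(data=triple))

    def remove_fact(self, triple: str):
        self.remove_pub.publish(String(data=triple))

def main():
    rclpy.init()
    node = KbDemo()
    node.add_fact('myself sees person_anna')     # <s> <p> <o>, space-separated
    node.remove_fact('myself sees person_anna')
    rclpy.spin_once(node, timeout_sec=0.5)
    node.destroy_node()
    rclpy.shutdown()

if __name__ == '__main__':
    main()
```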
Touchscreen#
/touch_web_state
(documentation)
/web/go_to
(documentation) Sets the webpage to be displayed on the touchscreen.