PAL SDK 23.1 documentation

Navigation-related API

This section details ARI’s autonomous navigation framework. The navigation software uses Visual SLAM to perform mapping and localization with the RGB-D camera of the torso. It works by detecting keypoints (features) in the camera input and recognising previously seen locations, in order to build a map and localize within it. The resulting map is represented as an Occupancy Grid Map (OGM), which is later used by move_base (http://wiki.ros.org/move_base) to localize the robot and navigate autonomously in the environment.

Navigation architecture

The navigation software provided with ARI can be seen as a black box with the inputs and outputs shown in the figure below.

[Figure: ari-navigation.png — inputs and outputs of ARI’s navigation software]

As the figure shows, the user communicates with the navigation software through ROS (Robot Operating System: http://wiki.ros.org/) actions and services. RViz (http://wiki.ros.org/rviz) can also use these interfaces to help the user perform navigation tasks.

The ROS nodes that make up the navigation architecture communicate using topics, actions and services, as well as parameters on the ROS parameter server.

Topic interfaces

  • /move_base_simple/goal

Topic interface to send the robot to a goal pose specified in /map metric coordinates. Use this interface when no monitoring of the navigation status is required.
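Assuming a running ROS master on the robot, a one-shot goal can be published from the command line; the pose values below are placeholders to adapt to your map:

```shell
# Send a fire-and-forget navigation goal in /map coordinates
# (no status feedback). Pose values are placeholders.
rostopic pub --once /move_base_simple/goal geometry_msgs/PoseStamped \
"header:
  frame_id: 'map'
pose:
  position: {x: 1.0, y: 0.0, z: 0.0}
  orientation: {w: 1.0}"
```

Fields left unset (timestamp, remaining quaternion components) are filled with zeros by rostopic.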

  • /current_zone_of_interest

Topic that publishes the name of the zone of interest where the robot currently is, if any.
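The zone name can be monitored directly from a terminal:

```shell
# Continuously print the zone of interest the robot is in, if any
rostopic echo /current_zone_of_interest
```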

Action interfaces

  • /move_base

Action to send the robot to a pose specified in /map metric coordinates. This interface is recommended when the user wants to monitor the status of the action.
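As a quick command-line sketch that avoids writing an action client, the goal can be sent through the action’s underlying topics and its progress observed on the status and result topics (the pose values are placeholders):

```shell
# Send a goal via the action's goal topic (goal_id left to defaults)...
rostopic pub --once /move_base/goal move_base_msgs/MoveBaseActionGoal \
"goal:
  target_pose:
    header: {frame_id: 'map'}
    pose:
      position: {x: 2.0, y: 1.0, z: 0.0}
      orientation: {w: 1.0}"
# ...then, in separate terminals, monitor the action:
rostopic echo /move_base/status
rostopic echo /move_base/result
```

For programmatic use, an actionlib SimpleActionClient is preferable, since it provides proper feedback, result and cancellation handling.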

  • /poi_navigation_server/go_to_poi

Action to send the robot to an existing Point Of Interest (POI) by providing its identifier. POIs can be set using the RViz Map Editor; see section Laser-based SLAM and path planning on the robot.

  • /pal_waypoint/navigate

Action to make the robot visit all the POIs of a given group, or a subset of them. POIs and POI groups can be defined using the RViz Map Editor; see section Laser-based SLAM and path planning on the robot.

Service interfaces

  • /pal_navigation_sm

Service to switch the navigation system between mapping mode and localization mode.

To set mapping mode:

rosservice call /pal_navigation_sm "input: 'MAP'"

To set localization mode:

rosservice call /pal_navigation_sm "input: 'LOC'"

In localization mode the robot can plan paths to any valid point of the map.

  • /pal_map_manager/save_map

Service to save the map with a given name. Example:

rosservice call /pal_map_manager/save_map "directory: 'my_office_map'"

The directory argument is the name of the map. If empty, a timestamp will be used. The maps are stored in $HOME/.pal/ari_maps/configurations.

  • /pal_map_manager/change_map

Service to choose the active map. Example:

rosservice call /pal_map_manager/change_map "input: 'my_office_map'"
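Before switching, the available maps and the currently active one can be queried with the companion pal_map_manager services from this SDK’s service reference (assumed here to take an empty request):

```shell
# List the stored maps, then show which one is active
rosservice call /pal_map_manager/list_maps
rosservice call /pal_map_manager/current_map
```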

Copyright © 2023, PAL Robotics