PAL SDK 23.1 documentation

Hardware-related API

ARI hardware - reference

Cameras

The following Figure illustrates the cameras mounted on the robot:

../_images/ari_cameras.png

Torso front camera

Stereo RGB-D camera: mounted on the front of the torso, below the touchscreen. It provides RGB images along with a depth image obtained using an IR projector and an IR camera; the depth image is used to compute a point cloud of the scene.

Torso front camera-related topics:

  • /torso_front_camera/aligned_depth_to_color/camera_info
  • /torso_front_camera/aligned_depth_to_color/image_raw/*
  • /torso_front_camera/color/camera_info
  • /torso_front_camera/color/image_raw/*
  • /torso_front_camera/depth/camera_info
  • /torso_front_camera/depth/color/points
  • /torso_front_camera/depth/image_rect_raw/*
  • /torso_front_camera/infra1/camera_info
  • /torso_front_camera/infra1/image_rect_raw/*
  • /torso_front_camera/infra2/image_rect_raw/compressed
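The aligned depth stream shares the colour camera's intrinsics, so a depth pixel can be deprojected into a 3D point with the standard pinhole model. The helper below is a minimal, ROS-independent sketch: in practice `fx`, `fy`, `cx`, `cy` come from the `K` matrix of `/torso_front_camera/aligned_depth_to_color/camera_info` (`K[0]`, `K[4]`, `K[2]`, `K[5]`) and the depth value from the matching `image_raw` message, converted to metres. The intrinsics used in the example are made up for illustration.

```python
def deproject_pixel(u, v, depth_m, fx, fy, cx, cy):
    """Deproject pixel (u, v) with depth depth_m (metres) into a 3D point
    (x, y, z) in the camera's optical frame, using the pinhole model."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

# With made-up intrinsics: a pixel at the principal point deprojects onto
# the optical axis, so x and y are zero.
point = deproject_pixel(320, 240, 1.0, fx=615.0, fy=615.0, cx=320.0, cy=240.0)
```

Note that `/torso_front_camera/depth/color/points` already publishes the fully deprojected scene as a point cloud, so this computation is only needed when working from individual depth pixels.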

Torso back camera

Stereo fisheye camera: mounted on the back of the torso, just below the emergency stop button. It provides a pair of black-and-white fisheye images that can be used for stereo processing.

Torso back camera-related topics:

  • /torso_back_camera/fisheye1/camera_info
  • /torso_back_camera/fisheye1/image_raw/*
  • /torso_back_camera/fisheye2/camera_info
  • /torso_back_camera/fisheye2/image_raw/*

Head camera

One of the following cameras is mounted inside ARI’s head, depending on the robot’s configuration:

RGB camera: provides RGB images

RGB-D camera: provides RGB and depth images

Head camera-related topics:

  • /head_front_camera/color/camera_info
  • /head_front_camera/color/image_raw/*
  • /head_front_camera/image_throttle/compressed

Optional cameras

The touchscreen can include up to three more cameras:

  • thermal camera

  • RGB camera

  • RGB-D camera

To learn how to access the cameras, please refer to Accessing ARI sensors.

LEDs

See LEDs API.

Animated eyes

ARI has LCD eyes that display a collection of eye expressions. These can be used, together with head and arm movements, to support engaging interactions.

See How-to: Control ARI’s expressions for the list of available expressions, the API, and code samples.

Speakers and microphones

ARI has an array of four microphones that can be used to record audio and process it, for instance to perform speech recognition. The microphone array is located in the circular opening on the torso, with two hi-fi full-range speakers just below it.

The array is a ReSpeaker Mic Array v2.0 with 4 microphones (https://www.seeedstudio.com/ReSpeaker-Mic-Array-v2-0.html). See ARI microphone array and audio recording for details.

To learn more about how to process speech in ARI, refer to Dialogue management.

ReSpeaker topics:

  • /audio/channel0
  • /audio/channel1
  • /audio/channel2
  • /audio/channel3
  • /audio/channel4
  • /audio/channel5
  • /audio/raw
  • /audio/sound_direction
  • /audio/sound_localization
  • /audio/speech
  • /audio/status_led
  • /audio/voice_detected
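The sound-direction topic reports where a detected sound comes from. Assuming the message carries the direction of arrival as an angle in degrees in the horizontal plane (with 0° at the robot's front and counter-clockwise positive — a frame convention assumed here, to be checked against the topic's actual documentation), a small helper can turn it into a unit vector usable as a gaze or navigation target:

```python
import math

def doa_to_vector(angle_deg):
    """Convert a direction-of-arrival angle (degrees, 0 = front,
    counter-clockwise positive) into a unit vector (x, y) in the
    horizontal plane of the robot."""
    a = math.radians(angle_deg)
    return (math.cos(a), math.sin(a))
```

For example, a reading of 90° maps to a vector pointing to the robot's left under this convention.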

Joints

Base

The base joints of ARI are the front and back wheels of the robot. Take care when manually wheeling ARI around, especially with the smaller back wheels.

../_images/ari_wheels.png

The wheels can be controlled through a ROS topic by specifying the desired linear and angular velocity of the robot. Linear velocity is specified in metres per second and angular velocity in radians per second; both are translated internally into wheel angular velocities.

Drive wheels topics:

  • /mobile_base_controller/cmd_vel
  • /mobile_base_controller/odom
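The internal translation from body velocities to wheel velocities is the standard differential-drive kinematics. The sketch below is purely illustrative (the robot performs this conversion itself, and the `wheel_radius` and `wheel_separation` values in the example are placeholders, not ARI's real geometry). In practice, you only publish a `geometry_msgs/Twist` on `/mobile_base_controller/cmd_vel` with `linear.x` and `angular.z` set.

```python
def diffdrive_wheel_speeds(v, w, wheel_radius, wheel_separation):
    """Translate a body linear velocity v (m/s) and angular velocity w (rad/s)
    into (left, right) wheel angular velocities (rad/s) for a
    differential-drive base."""
    v_left = v - w * wheel_separation / 2.0   # rim speed of the left wheel
    v_right = v + w * wheel_separation / 2.0  # rim speed of the right wheel
    return (v_left / wheel_radius, v_right / wheel_radius)

# Driving straight: both wheels spin at the same rate.
straight = diffdrive_wheel_speeds(0.5, 0.0, wheel_radius=0.1, wheel_separation=0.4)
# Turning in place: the wheels spin in opposite directions.
spin = diffdrive_wheel_speeds(0.0, 1.0, wheel_radius=0.1, wheel_separation=0.4)
```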

For more information on how to move ARI around, please refer to Navigation.

Arms

ARI’s joints and arm movements are illustrated below:

../_images/ari_arms_joints.jpg

Head joints

The joint movements for ARI’s head are shown below:

../_images/ari_head_joints.jpg

Please refer to the following entries to learn how to move ARI’s upper body:

  • Upper body motion and play_motion
  • play_motion: How to play a pre-recorded motion
  • whole_body_motion_control
  • move_it
  • play_motion
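As a hypothetical illustration of how a goal for `/head_controller/command` (a `trajectory_msgs/JointTrajectory`) might be prepared, the helper below linearly interpolates between a start and a goal posture, producing `(positions, time_from_start)` waypoint pairs. The joint names and postures one would pair with it (e.g. two head joints moving to look left and slightly down) are assumptions for the sketch, not values taken from this page.

```python
def interpolate_waypoints(start, goal, steps, duration):
    """Linearly interpolate joint positions from start to goal, returning a
    list of (positions, time_from_start_seconds) pairs, one per waypoint,
    suitable for filling a JointTrajectory message."""
    waypoints = []
    for i in range(1, steps + 1):
        t = i / steps  # interpolation parameter in (0, 1]
        positions = [s + (g - s) * t for s, g in zip(start, goal)]
        waypoints.append((positions, duration * t))
    return waypoints

# Two assumed head joints moving from neutral to a new posture over 2 s.
points = interpolate_waypoints([0.0, 0.0], [1.0, -0.5], steps=2, duration=2.0)
```

Each pair would then become one `JointTrajectoryPoint`, with `positions` and `time_from_start` filled in accordingly.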

LIDAR

LIDAR topics:

  • /scan
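The `/scan` topic publishes `sensor_msgs/LaserScan` messages, in which the `ranges` array maps index `i` to the beam angle `angle_min + i * angle_increment`. A minimal, ROS-independent sketch of an obstacle check over an angular sector (the sector bounds in the example are arbitrary):

```python
import math

def min_distance_in_sector(ranges, angle_min, angle_increment,
                           sector_min, sector_max):
    """Return the smallest valid range (metres) whose beam angle lies in
    [sector_min, sector_max] radians, or None if no valid reading exists.
    Invalid readings (inf, nan, non-positive) are skipped."""
    best = None
    for i, r in enumerate(ranges):
        angle = angle_min + i * angle_increment
        if sector_min <= angle <= sector_max and math.isfinite(r) and r > 0.0:
            if best is None or r < best:
                best = r
    return best

# A tiny fabricated scan: four beams, one of them an invalid inf reading.
closest = min_distance_in_sector([1.0, 0.5, 2.0, float("inf")],
                                 angle_min=-0.3, angle_increment=0.2,
                                 sector_min=-0.1, sector_max=0.3)
```

In a subscriber callback, the same function would be called with `msg.ranges`, `msg.angle_min` and `msg.angle_increment`.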
Copyright © 2023, PAL Robotics