PAL SDK 23.1 documentation

Hardware-related API


Cameras

The following Figure illustrates the cameras mounted on the robot:

../_images/ari_cameras.png

Torso front camera

Stereo RGB-D camera: This camera is mounted on the front side of the torso, below the touchscreen, and provides RGB images along with a depth image obtained using an IR projector and an IR camera. The depth image is used to compute a point cloud of the scene.

Torso front camera-related topics:

  • /torso_front_camera/aligned_depth_to_color/camera_info
  • /torso_front_camera/aligned_depth_to_color/image_raw/*
  • /torso_front_camera/color/camera_info
  • /torso_front_camera/color/image_raw/*
  • /torso_front_camera/depth/camera_info
  • /torso_front_camera/depth/color/points
  • /torso_front_camera/depth/image_rect_raw/*
  • /torso_front_camera/infra1/camera_info
  • /torso_front_camera/infra1/image_rect_raw/*
  • /torso_front_camera/infra2/image_rect_raw/compressed
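The depth and aligned-depth topics publish 16-bit images whose pixel values encode distance. As a sketch, assuming the common convention of one unit per millimetre (verify the actual scale against the camera driver on your unit), raw depth values convert to meters like this:

```python
def depth_to_meters(raw_value, depth_scale=0.001):
    """Convert a raw 16-bit depth pixel to meters.

    depth_scale=0.001 (i.e. raw values in millimeters) is an assumption:
    it is the common default for this kind of RGB-D camera, but the actual
    scale should be read from the camera driver before relying on it.
    """
    if raw_value == 0:
        return None  # a value of 0 conventionally means "no depth measured"
    return raw_value * depth_scale
```

For example, a raw pixel value of 1500 would map to 1.5 m under this assumed scale.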

Torso back camera

Stereo fisheye camera: This camera is mounted on the back side of the torso, right below the emergency button, and provides a stereo pair of black-and-white fisheye images.

Torso back camera-related topics:

  • /torso_back_camera/fisheye1/camera_info
  • /torso_back_camera/fisheye1/image_raw/*
  • /torso_back_camera/fisheye2/camera_info
  • /torso_back_camera/fisheye2/image_raw/*

Head camera

Depending on the robot configuration, one of the following cameras is located inside ARI’s head:

RGB camera: provides RGB images

RGB-D camera: provides RGB and depth images

Head camera-related topics:

  • /head_front_camera/color/camera_info
  • /head_front_camera/color/image_raw/*
  • /head_front_camera/image_throttle/compressed

Optional cameras

The touchscreen can include up to three more cameras:

  • thermal camera

  • RGB camera

  • RGB-D camera

To learn how to access the cameras, refer to Accessing ARI sensors.

LEDs

../_images/ari_leds.svg

ARI has four LED devices that can be controlled via our ROS interface:

  • Back ring: a LED ring at the back, below the emergency button, with a variety of colour options.

  • Ear rings: a LED ring in each of its ears, with a variety of colour options.

  • Respeaker: a LED ring at the speaker.

Device      ID    #LEDs   Available Effects
---------   ---   -----   -----------------
Back        0     40      All
Ear Left    1     16      All
Ear Right   2     16      All
Respeaker   4     12      Fixed Color

Different devices have different capabilities, and may only be able to show a subset of the effects.

All LED devices are controlled through the same ROS interface, provided by the PAL LED Manager. This interface lets clients send an effect to one or more LED devices, with a duration and a priority. When a device has more than one active effect, it displays the one with the highest priority until its duration ends, and then moves on to the next effect with the highest priority. There is a default effect, with unlimited duration and the lowest priority, that displays a fixed colour.

Interfaces

The LED interface is an action server: /pal_led_manager/do_effect

A goal consists of:

  • devices: a list of devices the goal applies to.

  • params: the effect type, and the parameters for the selected effect type.

  • effectDuration: duration of the effect; when the time is over, the previous effect is restored. A value of 0 makes the effect display forever.

  • priority: priority of the effect, where 0 is no priority and 255 is the maximum priority.

To learn more on how to access the LEDs, refer to Meaning of ARI LEDs colors and patterns and Tutorial: Creating expressions with LEDs.
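A minimal sketch of assembling the goal fields above as plain data (the field names follow the goal description; the concrete ROS action type, and the exact layout of `params`, should be checked against your SDK version before wiring this into an actionlib client):

```python
def make_led_goal(devices, duration_s, priority, rgb=(0.0, 1.0, 0.0)):
    """Assemble the fields of a /pal_led_manager/do_effect goal as a dict.

    The `params` layout (effectType / color) is hypothetical and shown
    only to illustrate the shape of the goal; consult the action
    definition on the robot for the real message fields.
    """
    if not 0 <= priority <= 255:
        raise ValueError("priority must be in 0 (none) .. 255 (maximum)")
    return {
        "devices": list(devices),       # e.g. [0] targets the back ring
        "params": {"effectType": "FIXED_COLOR", "color": rgb},
        "effectDuration": duration_s,   # 0 = display forever
        "priority": priority,
    }


# Fixed green on the back ring (device ID 0) for 5 s at priority 100:
goal = make_led_goal([0], 5.0, 100)
```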

Animated eyes

ARI has LCD eyes that provide a collection of eye expressions. These can be used to support engaging interactions, along with head and arm movements.

The available eye expressions are:

  • neutral
  • angry
  • sad
  • happy
  • surprised
  • disgusted
  • scared
  • pleading
  • vulnerable
  • despaired
  • guilty
  • disappointed
  • embarrassed
  • horrified
  • skeptical
  • annoyed
  • furious
  • suspicious
  • rejected
  • bored
  • tired
  • asleep
  • confused
  • amazed
  • excited

Eye expression topics:

  • /eyes/expression

To learn more about ARI’s expressive eyes, refer to How-to: Control ARI’s eyes.
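Before publishing an expression name to /eyes/expression, it is worth validating it against the list above (the message type is not shown here and should be taken from the topic's documentation; the validation helper itself is plain Python):

```python
# The 25 expression names listed in this section.
VALID_EXPRESSIONS = {
    "neutral", "angry", "sad", "happy", "surprised", "disgusted", "scared",
    "pleading", "vulnerable", "despaired", "guilty", "disappointed",
    "embarrassed", "horrified", "skeptical", "annoyed", "furious",
    "suspicious", "rejected", "bored", "tired", "asleep", "confused",
    "amazed", "excited",
}


def check_expression(name):
    """Return `name` unchanged if it is a known eye expression,
    otherwise raise ValueError instead of silently publishing a
    string the robot will not understand."""
    if name not in VALID_EXPRESSIONS:
        raise ValueError(f"unknown eye expression: {name!r}")
    return name
```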

Speakers and microphones

ARI has an array of four microphones that can be used to record audio and process it for tasks such as speech recognition. The microphone array is located in the circular gap of the torso. There are two hi-fi full-range speakers just below it.

The ReSpeaker Mic Array V2.0 consists of 4 microphones (https://www.seeedstudio.com/ReSpeaker-Mic-Array-v2-0.html). See ARI microphone array and audio recording for details.

To learn more about how to process speech in ARI, refer to Dialogue management.

ReSpeaker topics:

  • /sound_direction
  • /sound_localization
  • /is_speeching
  • /audio
  • /audio/channel0
  • /speech_audio
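The /sound_direction topic reports a direction-of-arrival angle (in degrees on the ReSpeaker hardware; check the topic's message definition on your robot). As a sketch, such an angle can be turned into a unit vector in the horizontal plane, e.g. to orient the head toward a speaker:

```python
import math


def doa_to_unit_vector(angle_deg):
    """Convert a direction-of-arrival angle in degrees to an (x, y)
    unit vector in the horizontal plane.

    The frame convention (0 degrees = +x, counter-clockwise positive)
    is an assumption for illustration and must be matched to the
    robot's actual reference frame.
    """
    a = math.radians(angle_deg)
    return (math.cos(a), math.sin(a))
```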

Joints

Base

The base joints of ARI are the front and back wheels of the robot. Take care when wheeling ARI, especially with the smaller back wheels.

../_images/ari_wheels.png

The wheels can be controlled using a ROS topic, by specifying the desired linear and angular velocity of the robot. The linear velocity is specified in meters per second and the angular velocity in radians per second; both are translated to individual wheel angular velocities internally.

Drive wheels topics:

  • /mobile_base_controller/cmd_vel
  • /mobile_base_controller/odom

To get more information on how to move around ARI, please refer to Navigation.
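A common pattern when publishing velocity commands to /mobile_base_controller/cmd_vel is to clamp them to safe limits first (the limits below are illustrative defaults, not ARI's actual ones; the publish itself would use the topic's Twist-style message):

```python
def clamp_cmd(linear, angular, max_linear=0.5, max_angular=1.0):
    """Clamp a (linear m/s, angular rad/s) command to symmetric limits.

    max_linear and max_angular are hypothetical example values; use
    limits appropriate for your robot and environment.
    """
    def clamp(v, lim):
        return max(-lim, min(lim, v))

    return clamp(linear, max_linear), clamp(angular, max_angular)
```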

Arms

ARI’s joints and arm movements are illustrated below:

../_images/ari_arms_joints.jpg

Head joints

The joint movements for ARI’s head are shown below:

../_images/ari_head_joints.jpg

Please refer to the following entries to learn how to move ARI’s upper body:

  • upper_body_motion
  • 🚧 How-to: How to ask the robot to move during a script
  • whole_body_motion_control
  • move_it
  • play_motion

LIDAR

LIDAR topics:

  • /scan
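The /scan topic publishes a standard laser-scan message; a frequent first use is finding the closest obstacle in a sector ahead of the robot. A plain-Python sketch over the message's fields (the names mirror sensor_msgs/LaserScan; the ±30° sector is an arbitrary example):

```python
import math


def min_range_in_sector(ranges, angle_min, angle_increment,
                        sector=(-math.pi / 6, math.pi / 6)):
    """Return the smallest valid range whose beam angle falls in `sector`.

    `ranges`, `angle_min` and `angle_increment` mirror the fields of
    sensor_msgs/LaserScan; non-finite or zero readings are skipped.
    Returns None if no valid reading falls inside the sector.
    """
    lo, hi = sector
    best = None
    for i, r in enumerate(ranges):
        angle = angle_min + i * angle_increment
        if lo <= angle <= hi and math.isfinite(r) and r > 0.0:
            best = r if best is None else min(best, r)
    return best
```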
Copyright © 2023, PAL Robotics