Frequently Asked Questions#

Developing applications#

❓ How to wrap my code in a ROS Noetic container and upload it to the robot

First of all, create a Dockerfile with the dependencies that you need, using the ros:noetic Docker image as a base.
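As a minimal sketch (the package name and workspace path are placeholders for your own dependencies), such a Dockerfile could look like this:

```dockerfile
# Start from the official ROS Noetic base image
FROM ros:noetic

# Install any additional ROS packages your application needs
# (ros-noetic-example-pkg is a placeholder -- replace it with your dependencies)
RUN apt-get update && \
    apt-get install -y ros-noetic-example-pkg && \
    rm -rf /var/lib/apt/lists/*

# Copy your workspace into the image (path is a placeholder)
COPY ./my_ws /my_ws

# Source the ROS environment for every shell started in the container
CMD ["bash", "-c", "source /opt/ros/noetic/setup.bash && bash"]
```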

Once you have a Dockerfile, build your Docker image (the -t flag sets the image name/tag, and the final . is the build context):

docker build -t <IMAGE NAME> .

Run the Docker image and add the changes/packages that you need. Test that it works properly, e.g. using an ARI Docker image for simulation if needed.

Push the docker to a docker registry accessible from the robot.

For instance:

$ docker login <REGISTRY>
$ docker tag <IMAGE NAME> <REGISTRY>/<IMAGE NAME>
$ docker push <REGISTRY>/<IMAGE NAME>

Once the image is in your registry, pull it from ARI:

$ ssh pal@ari-Xc
$ docker login <REGISTRY>
$ docker pull <REGISTRY>/<IMAGE NAME>

You can then run the Docker image on the robot just as you would from your laptop/PC.

❓ How do I communicate between containers?

All communication between containers is done through regular ROS msgs, topics, actions and services that have been defined for each module. All containers will be using ROS, and will be able to see and directly communicate with other containers (and with the Nvidia Jetson if you are using it).

Make sure, however, that you start the Docker image with the --net host option, so that the container can see the ROS master.
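As a sketch, assuming a hypothetical image name, the container could be started on the robot like this:

```shell
# Hypothetical image name -- replace with the image you built or pulled.
IMAGE="my-ros-app:noetic"

# --net host shares the host's network namespace with the container,
# so nodes inside it can reach the robot's ROS master directly.
docker run -it --rm --net host "$IMAGE"
```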

Links to ROS examples, depending on whether you use C++ or Python:



It is recommended to complete all the ROS tutorials available in the coding language you use.

❓ What method should I use to debug inside a container?

The recommended approach is to access the container (e.g. with docker exec -it <container name> bash) and use gdb or similar tools from inside it, as you would on a standard computer.
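For example (the container and process names below are placeholders):

```shell
# List running containers to find yours
docker ps

# Open a shell inside the container (my_app_container is a placeholder)
docker exec -it my_app_container bash

# From inside the container, attach gdb to a running node
# (my_node is a placeholder process name)
gdb -p "$(pgrep -f my_node)"
```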

❓ How can I automatically launch an application at start up?

Applications can be automatically launched at start-up. See this documentation section.

ARI hardware#

❓ How can I record audio data using ROS?

ARI publishes the audio on the /audio/raw topic. You can use rosbag to record any kind of ROS topic. Enter the robot and, from the desired directory, run:

ssh pal@ari-Xc

rosbag record -O audio.bag /audio/raw

where audio.bag is the name of the bag that will be stored in the directory where the command is run.
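Afterwards, the recording can be inspected or replayed with the standard rosbag tools:

```shell
# Show topics, message counts and duration of the recording
rosbag info audio.bag

# Replay the recorded messages onto the live ROS graph
rosbag play audio.bag
```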

❓ How to change the frame rate of the fisheye cameras?

The fisheye cameras are connected to the Nvidia Jetson and each runs in its own Docker container (one per camera). In order to make them visible to ARI’s main PC and externally, we have a ROS bridge, corresponding to the following launch file:

<node name="fisheye_rear_relay" type="throttle" pkg="topic_tools" args="messages /rear_fisheye_camera/image_raw/compressed  5 /rear_camera/fisheye/image_raw/compressed" />

As you can see, the input topics from the Jetson are remapped and the frame rate is specified, in this case 5 Hz. You can pull this package, adjust the frame rate and deploy it on the robot (the package would then be in deployed_ws/share/ari_rgbd_sensors), or alternatively adjust the permanent partition of the robot (/opt/pal/ferrum/share/ari_rgbd_sensors).
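After adjusting the throttle value, you can verify the effective rate directly on the robot:

```shell
# Measure the actual publishing rate of the throttled topic
rostopic hz /rear_camera/fisheye/image_raw/compressed
```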

❓ What do the LED colors/patterns mean?

Check the Meaning of ARI LEDs colors and patterns page for explanations about the meaning of the different LEDs.

Robot management#

❓ What would be a typical usage length before the batteries run out of power?

ARI has a battery life of approximately 8-12 hours, and needs about 5 hours to be fully charged. Note that the autonomy is significantly reduced if the robot needs to move frequently.

❓ Can we change ARI’s name once set?

In theory no; however, if this is a feature that you would be interested in, the team at PAL Robotics could look into it. Contact support.

❓ Can we work on the robot while the red emergency button is pressed, for example, for remote work?

Yes, you will be able to connect to its cameras and work with perception, for example. However, ARI won’t produce any speech output, nor will it move its joints (arms, head or base).

We suggest, however, always leaving the emergency button released when working with ARI.

❓ How can I find the IP address of ARI?

By default, all ARIs are configured with a static IP address. You can connect by either specifying ARI’s IP address or its hostname ari-0c.

See ARI networking for details.

❓ What networking ports are used/required by ARI?

❓ How can I see the list, and stop/start the robot’s background services?

The running robot background services (also called startups) are listed, and can be started/stopped, in the WebCommander tab. You can also use command-line tools or even the ROS API.

❓ How to update or upgrade my robot?

Gestures and motions#

❓ Is there some kind of ‘self collision avoidance’ check so we can’t make the robot ‘hurt’ itself with its arms?

ARI features a whole body motion controller that protects against self-collisions when automatically generating gestures like pointing. See whole_body_motion_control for details.

However, if you create your own pre-recorded motions using e.g. the motion builder, you are responsible for avoiding self-collisions as the robot will ‘blindly’ execute the pre-recorded motion.

❓ How can I control ARI’s head orientation with absolute values?

You can publish a target position for the robot to look at on the /look_at topic.

See How to set the gaze direction? for details and examples.
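Assuming /look_at accepts a geometry_msgs/PointStamped message (verify with rostopic info), a one-off command could look like this (the frame and coordinates are placeholders):

```shell
# Check the exact message type of the topic first
rostopic info /look_at

# Publish a single target point for the robot to look at
rostopic pub --once /look_at geometry_msgs/PointStamped \
  '{header: {frame_id: "base_link"}, point: {x: 1.0, y: 0.0, z: 1.5}}'
```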

❓ Is it possible to control the arm/head joints in velocity?

The Dynamixel motors do have a velocity control mode; however, it is not implemented in ActuatorsMgr, as its lowest layer does not support it.

❓ How can I stop the default idle motion of the arms?

The default idle motion of the arms can be stopped from the WebCommander tool startup panel, by stopping the interaction_profile service.

❓ How can I stop the default gaze behaviour?

You can call the /attention_manager/set_policy ROS service with the policy DISABLED. See Controlling the attention and gaze of the robot for details.
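The request format below is an assumption; check the actual service signature with rosservice info before calling it:

```shell
# Inspect the service type and request fields
rosservice info /attention_manager/set_policy

# Disable the default gaze behaviour (the field name 'policy' is an assumption)
rosservice call /attention_manager/set_policy "policy: 'DISABLED'"
```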

❓ How can I control the gaze direction of the robot?

You can publish a target position for the robot to look at on the /look_at topic.

See How to set the gaze direction? for details and examples.


❓ How to adjust ARI navigation parameters (e.g. backward navigation)?

You can find the move_base parameters at /opt/pal/gallium/share/pal_navigation_cfg_ari, for example:

  • base/teb/local_planner.yaml

    • Backward velocity

    • Forward velocity

    • etc.


PAL Robotics recommends being careful when adjusting these parameters, as they have been tuned to match ARI’s current navigation capabilities.

❓ Can I lock ARI in position to avoid any movement?

Usually ARI is always ‘locked’ in position (e.g. the robot will not move if pushed), it is only when the robot battery is low or the emergency button is pressed that you can move ARI manually by pushing it.

To quickly stop the robot during autonomous navigation, you should press the emergency button.

❓ How robust is the Visual SLAM?

The localization is done using ORB-SLAM and the occupancy grid is used for motion planning (with obstacle avoidance).

The robustness to lighting condition changes is that offered by the adopted Visual SLAM system. We are working on LIDAR-camera integration in order to improve the navigation capabilities.

❓ What is the minimum width the robot can operate in?

Taking into account the width of the robot plus a safety margin, corridors must be a minimum of 10 cm wide so the robot can pass through.

In order to rotate on the spot, the robot needs a radius of approximately 60 cm.

❓ Can the robot move backwards?

This depends on the particular planner that is used. TEB offers this feature.

❓ How to create custom Points of Interest (POI) or Zones of Interests (ZOI)?

❓ What is the maximum speed at which the robot moves?

ARI can move at up to 1.5 m/s. However, we recommend using the robot at lower speeds for safety reasons.

❓ Is it possible to map/navigate without a laser scan (lidar)?

Yes, you can do vision-based mapping and navigation on ARI, as explained here: vision-based-navigation. However, the navigation will not be as accurate as with a laser scanner, and the robot will get lost more often.

Speech and language processing#

❓ How to change ARI’s language?

You can set the default language of your robot from the WebCommander Settings tab. You can also set the /pal/language parameter to change the currently active language. See Internationalisation and language support for details.
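For example, from a shell on the robot (the language code en_GB is a placeholder; use one of the values listed in /tts/supported_languages):

```shell
# List the languages currently installed on the robot
rosparam get /tts/supported_languages

# Switch the currently active language (the value is a placeholder)
rosparam set /pal/language en_GB
```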

❓ What languages are supported by ARI?

You can access the list of languages currently available on your robot from the WebCommander Settings tab. Alternatively, you can read the ROS parameter /tts/supported_languages. If you wish to add additional languages, contact us.

❓ How to change the ‘wake-up’ (or ‘sleep’) keywords?

You can easily set a custom regular expression as wake-up/sleep keyword. See How to change the wake-up/sleep keywords? for details.

❓ What is the default wake-up word?

Check Wake-up word detector for the default wake-up/sleep keywords.


❓ How can I create ARI touch-screen content so that I can output my application’s information, e.g. ROS image topics, text, custom messages?

You can use ARI’s rrlib.js and pallib.js modules, which enable calling ARI and ROS functions from Javascript. See the robot’s developer manual for more information.

❓ How do I get the default touch-screen back?

Edit .pal/www/webapps/page_cfg/config.json to point to any of the JSON files in .pal/www/webapps/pages/. The default page is default_page.json:

{"default_page": "default_page.json"}

❓ How does ARI’s touchscreen work?

ARI’s touchscreen displays a Chrome webpage. Specifically, the robot listens on a topic called /web/go_to where you specify the URL you want the browser to navigate to.

It uses Chrome’s Selenium driver.

See using-the-touchscreen for details.
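You can check the exact message type used by the topic directly on the robot before publishing to it:

```shell
# Find the message type of the touchscreen topic and inspect its fields
rostopic info /web/go_to
rostopic type /web/go_to | xargs rosmsg show
```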

❓ How can I visualize what is displayed on the robot’s screen?

ARI’s touchscreen is running the Chrome browser in ‘kiosk’ mode. You can reproduce this setup on your own machine by launching Chrome with the parameter --kiosk. ARI is using Chrome version 63.0.3239.132-1 with no specific extra plugins or software enabled.

In addition, you can connect to http://ari-0c:11011/webapps/pages/pal_touch_web_display/ with your browser to load the same infrastructure that is shown on the chest touchscreen, and have it interact with the rest of the robot infrastructure.