Frequently Asked Questions
Developing applications
❓ Can I compile my nodes in a ROS Docker container, and run it on the robot?
Yes, Docker is installed on your robot and you can run any image you wish.
Once you have the Docker image in your repository, pull it on your robot. For instance:
$ ssh pal@<robot>-Xc
$ docker login registry.gitlab.com
$ docker pull registry.gitlab.com/myuser/mydocker
You can then run the container as you would from your laptop/PC.
❓ How do I communicate between containers?
All communication between containers is done through regular ROS msgs, topics, actions and services that have been defined for each module. All containers will be using ROS, and will be able to see and directly communicate with other containers (and with the Nvidia Jetson if you are using it).
Make sure, however, that you start the Docker image with the
--net host
option, so that the container can see the ROS master.
Links to ROS examples, depending on whether you use C++ or Python:
C++: http://wiki.ros.org/ROS/Tutorials/WritingPublisherSubscriber%28c%2B%2B%29
Python: http://wiki.ros.org/ROS/Tutorials/WritingPublisherSubscriber%28python%29
It is recommended to complete all the ROS tutorials available in the coding language you use.
❓ What method should I use to debug inside a container?
It is recommended to access the container (e.g. with
docker exec -it <container name> bash
) and use
gdb
or similar tools from inside the container, as if it were a standard computer.
❓ How can I automatically launch an application at start up?
Applications can be automatically launched at start-up. See this documentation section.
Robot hardware
❓ How can I record audio data using ROS?
ARI publishes audio on the following topic: /audio/raw. You can use rosbag to record any kind of ROS topic: http://wiki.ros.org/rosbag. Namely, log into the robot and, from the desired directory, run:
ssh pal@<robot>-0c
rosbag record -O audio.bag /audio/raw
where
audio.bag
is the name of the bag that will be stored in the directory where the command is run.
❓ How to change the frame rate of the fisheye cameras?
The fisheye cameras are connected to the Nvidia Jetson and run in Docker containers (one per camera). To make them visible to ARI’s main PC and externally, we have a ROS bridge, corresponding to the following launch file: https://github.com/pal-robotics/ari_navigation/blob/melodic-devel/ari_rgbd_sensors/launch/elp_cameras_relay.launch
As you can see there, we relay the input topics from the Jetson and also specify the frame rate, in this case 5 Hz. You can pull this package, adjust the frame rate and deploy it on the robot (the package would then be in
deployed_ws/share/ari_rgbd_sensors
), or alternatively adjust the permanent partition of the robot (
/opt/pal/ferrum/share/ari_rgbd_sensors
).
<node name="fisheye_rear_relay" type="throttle" pkg="topic_tools" args="messages /rear_fisheye_camera/image_raw/compressed 5 /rear_camera/fisheye/image_raw/compressed" />
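The relay above uses the topic_tools throttle node, which lets messages through at no more than the requested rate. A minimal sketch of that rate-limiting logic in plain Python (an illustration only, not the actual topic_tools implementation):

```python
class Throttle:
    """Pass messages through at no more than max_rate_hz, mimicking
    what topic_tools/throttle does for the camera topics."""

    def __init__(self, max_rate_hz):
        self.min_period = 1.0 / max_rate_hz
        self.last_stamp = None

    def accept(self, stamp):
        """Return True if a message stamped `stamp` (in seconds) may pass."""
        if self.last_stamp is None or stamp - self.last_stamp >= self.min_period:
            self.last_stamp = stamp
            return True
        return False


# A ~30 Hz camera stream (one frame every 33 ms) throttled to 5 Hz:
throttle = Throttle(5.0)
stamps_ms = range(0, 1000, 33)
kept = [t for t in stamps_ms if throttle.accept(t / 1000.0)]
# Only about 5 of the ~30 frames survive one second of input.
```

Changing the `5` in the launch file's args attribute corresponds to changing max_rate_hz here.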
❓ What do the LED colors/patterns mean?
Check the LEDs page for explanations about the meaning of the different LEDs.
Robot management
❓ What would be a typical usage length before the batteries run out of power?
ARI has a battery life of approximately 8-12 hours, and needs about 5 hours to be fully charged. Note that the autonomy is significantly reduced if the robot needs to move frequently.
❓ Can we change ARI’s name once set?
In theory, no. However, if this is a feature you would be interested in, the team at PAL Robotics could look into it. Contact the support.
❓ Can we work on the robot while the red emergency button is pressed, for example, for remote work?
Yes, you will be able to connect to its cameras and work with perception, for example. However, ARI won’t produce any speech output, nor will it move its joints (arms, head or base).
We nevertheless suggest keeping the emergency button released when working with ARI.
❓ How can I find the IP address of ARI?
By default, all ARIs are configured with the static IP address
10.68.0.1
. You can reach the robot by specifying either ARI’s IP address or its hostname
<robot>-0c
. See Network configuration for details.
❓ What networking ports are used/required by ARI?
Check the Data management, security, data privacy page.
❓ How can I see the list, and stop/start the robot’s background services?
The running robot background services (also called startups) are listed (and can be started/stopped) in their WebCommander tab. You can also use command-line tools or even the ROS API.
❓ How to update or upgrade my robot?
See System upgrade.
❓ Can I add a new ROS workspace to the robot environment?
In cases where the workspace resolution process needs to be changed, the file
/usr/bin/init_pal_env.sh
can be modified to adapt the environment of the startup process.
Gestures and motions
❓ Is there some kind of ‘self collision avoidance’ check so we can’t make the robot ‘hurt’ itself with its arms?
ARI features a whole body motion controller that protects against self-collisions when automatically generating gestures like pointing. See whole_body_motion_control for details.
However, if you create your own pre-recorded motions using e.g. the motion builder, you are responsible for avoiding self-collisions as the robot will ‘blindly’ execute the pre-recorded motion.
❓ How can I control ARI’s head orientation with absolute values?
You can publish a target position for the robot to look at on the /look_at topic.
See How to set the gaze direction? for details and examples.
❓ Is it possible to control the arm/head joints in velocity?
The Dynamixel motors do have a velocity control mode; however, it is not implemented in ActuatorsMgr, as its lowest layer does not support it.
❓ How can I stop the default idle motion of the arms?
The default idle motion of the arms can be stopped from the WebCommander startup panel, by stopping the
interaction_profile
service.
❓ How can I stop the default gaze behaviour?
You can call the /attention_manager/set_policy ROS service with the policy
DISABLED
. See Controlling the attention and gaze of the robot for details.
❓ How can I control the gaze direction of the robot?
You can publish a target position for the robot to look at on the /look_at topic.
See How to set the gaze direction? for details and examples.
Speech and language processing
❓ How to change ARI’s language?
You can set the default language on your robot from the WebCommander Settings tab. You can also set the
/pal/language
parameter to change the currently active language. See Internationalisation and language support for details.
❓ What languages are supported by ARI?
You can access the list of languages currently available on your robot from the WebCommander Settings tab. Alternatively, you can read the ROS parameter /tts/supported_languages. If you wish to add additional languages, contact us.
❓ How to change the ‘wake-up’ (or ‘sleep’) keywords?
You can easily set a custom regular expression as wake-up/sleep keyword. See How to change the wake-up/sleep keywords? for details.
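Because the wake-up keyword is configured as a regular expression, its matching behaviour can be sketched with Python's re module. The pattern and phrases below are purely illustrative, not the robot's defaults:

```python
import re

# Hypothetical wake-up pattern: accepts "hey ARI" or "hello ARI",
# case-insensitively, anywhere in the transcript.
WAKE_PATTERN = re.compile(r"\b(hey|hello)\s+ari\b", re.IGNORECASE)

def is_wake_phrase(transcript):
    """Return True if the transcript contains the wake-up phrase."""
    return WAKE_PATTERN.search(transcript) is not None
```

Any pattern accepted by a standard regular-expression engine can be used in the same way, e.g. alternations to accept several trigger phrases at once.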
❓ What is the default wake-up word?
Check Wake-up word detector for the default wake-up/sleep keywords.
Touchscreen
❓ How can I create ARI touch-screen content so that I can display my application’s information, e.g. ROS image topics, text, custom messages?
You can use ARI’s
rrlib.js
and
pallib.js
modules, which enable calling ARI and ROS functions from JavaScript. See the robot’s developer manual for more information.
❓ How do I get the default touch-screen back?
Edit
.pal/www/webapps/page_cfg/config.json
to point to any of the JSON files in
.pal/www/webapps/pages/
. The default page is
default_page.json
:
{"default_page": "default_page.json"}
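The restore step can also be scripted. The helper below is hypothetical (not part of the robot's software) and simply writes the same one-line JSON shown above; it is demonstrated on a temporary file rather than the robot's real config path:

```python
import json
import tempfile
from pathlib import Path

def set_default_page(cfg_path, page="default_page.json"):
    """Point the touchscreen configuration at the given page file."""
    cfg = Path(cfg_path)
    cfg.write_text(json.dumps({"default_page": page}))
    # Read it back so the caller can confirm what was written.
    return json.loads(cfg.read_text())

# Demo on a temporary file; on the robot you would edit
# .pal/www/webapps/page_cfg/config.json instead.
demo_path = Path(tempfile.mkdtemp()) / "config.json"
restored = set_default_page(demo_path)
```

Passing a different file name from .pal/www/webapps/pages/ as the `page` argument would select another page in the same way.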
❓ How does ARI’s touchscreen work?
PAL robots’ touchscreen displays a Chrome webpage. Specifically, the robot exposes a topic called /web/go_to on which you publish the URL you want the browser to open.
It uses Chrome’s Selenium driver: https://chromedriver.chromium.org/getting-started
See using-the-touchscreen for details.
❓ How can I visualize what is displayed on the robot’s screen?
PAL robots’ touchscreen runs the Chrome browser in ‘kiosk’ mode. You can reproduce this setup on your own machine by launching Chrome with the parameter
--kiosk
. ARI uses Chrome version 63.0.3239.132-1 with no specific extra plugins or software enabled.
In addition, you can connect to
http://<robot>-0c:11011/webapps/pages/pal_touch_web_display/
with your own browser, so that you load the same pages, with the same infrastructure, as the chest touchscreen and can interact with the rest of the robot infrastructure.