Interaction skills

This page lists the system skills (i.e., installed by default) related to Interaction. You can use these skills directly in your tasks and mission controller.

ask_human_for_help

Ask a human for help

Input parameters

  • question_to_human string, required

The question to ask the human.

  • person_id string array

The preferred person IDs to ask for help. These humans are prioritized when asking for help. If left empty, all tracked humans are considered.
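As a sketch of how these two fields fit together (plain Python for illustration, not the actual interaction_skills/action/AskHumanForHelp goal type), a helper that assembles and validates the goal fields might look like this:

```python
def make_ask_human_goal(question_to_human, person_id=None):
    """Assemble the goal fields for ask_human_for_help.

    question_to_human: the question to ask the human (required).
    person_id: preferred person IDs; an empty list means all
    tracked humans are considered.
    """
    if not question_to_human:
        raise ValueError("question_to_human is required")
    return {
        "question_to_human": question_to_human,
        "person_id": list(person_id or []),
    }

# With no person_id, any tracked human may be asked.
goal = make_ask_human_goal("Could you open the door, please?")
```

In real code you would set these same fields on the action goal message before sending it to the `/skill/ask_human_for_help` action server.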

Quick snippets

Call the skill from the command-line
$ ros2 action send_goal /skill/ask_human_for_help interaction_skills/action/AskHumanForHelp # then press Tab to complete the message prototype

How to use in your code

See the code samples for the /skill/ask_human_for_help action.

do_led_effect

Perform light effects using the robot’s LEDs.

Input parameters

  • groups string array, default: []

The LED groups to use for the effect (e.g. ear_leds, back_leds). An empty list means the effect is applied to all LED groups.

  • effect string, default: solid_color

    The selected LED effect, one of:

    • solid_color: Applies up to two solid colors side by side to the specified LED groups: the primary color on the first partition of each group, and the secondary color on the second partition.

    • rainbow: Lights the LED groups in a rainbow effect, which moves through the LED group cyclically.

    • fade: Fades between two colors cyclically. It starts with the primary color, fades to the secondary color over the first partition, then fades back to the primary color over the second partition.

    • blink: Blinks the LED groups between two colors. It applies a solid primary color in the first partition of a cycle, then applies a solid secondary color in the second partition.

    • flow: Displays a loading-like effect: a band of LEDs, sized by the partition ratio and colored with the primary color, moves through the LED group at a constant speed, traversing the entire group in one cycle, while the secondary color fills the rest of the group.

  • duration float, default: 0.0

Total duration of the effect before the action completes. A zero or negative value means the effect continues indefinitely until the action is canceled.

  • color interaction_skills/msg/LedColor

    The primary color for the effect. Ignored for the rainbow effect.

  • secondary_color interaction_skills/msg/LedColor

    The secondary color for the effect. Ignored for the rainbow effect.

  • cycle float, default: 1.0

    Duration in seconds of a single cycle of the effect. Ignored for the solid_color effect.

  • partition float, default: 1.0

For effects with two phases per cycle (or two spatially separated partitions, as in solid_color), the proportion of the first one, in the range [0., 1.]. Ignored for the rainbow effect.
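To make the cycle/partition semantics concrete, here is a small pure-Python model of the documented blink behavior (an illustration only, not the skill's implementation): within each cycle, the primary color fills the first partition fraction of the time, and the secondary color fills the rest.

```python
def blink_color(t, cycle=1.0, partition=0.5):
    """Return which color the blink effect shows at time t (seconds).

    A cycle lasts `cycle` seconds; the primary color occupies the
    first `partition` fraction of the cycle, the secondary color
    the remaining fraction.
    """
    phase = (t % cycle) / cycle  # position within the cycle, in [0, 1)
    return "primary" if phase < partition else "secondary"

blink_color(0.2)  # first half of a 1 s cycle -> "primary"
blink_color(0.7)  # second half -> "secondary"
```

The same partition logic applies spatially for solid_color (fraction of LEDs showing the primary color) and temporally for fade (fraction of the cycle spent fading toward the secondary color).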

Quick snippets

Call the skill from the command-line
$ ros2 action send_goal /skill/do_led_effect interaction_skills/action/DoLedEffect # then press Tab to complete the message prototype

How to use in your code

See the code samples for the /skill/do_led_effect action.

look_for_human

Search for and localize specific humans

Input parameters

  • patterns string array, default: []

    List of RDF patterns that identify the human to look for. Exactly one variable is expected in the patterns. If left empty, all visible humans are returned.
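The "exactly one variable" constraint can be checked before sending the goal. Assuming SPARQL-style `?variable` syntax for the RDF patterns (an assumption; check your knowledge-base documentation for the exact syntax), a quick validation might look like this:

```python
def pattern_variables(patterns):
    """Collect the distinct ?variables used across a list of RDF patterns.

    Assumes SPARQL-style syntax where variables are whitespace-separated
    tokens starting with '?' (illustrative assumption).
    """
    variables = set()
    for pattern in patterns:
        for token in pattern.split():
            if token.startswith("?"):
                variables.add(token)
    return variables

# One shared variable across both patterns: valid input for the skill.
patterns = ["?person rdf:type Human", "?person age 30"]
assert len(pattern_variables(patterns)) == 1
```

The predicate and class names above are hypothetical; the point is that every pattern should constrain the same single variable, which the skill then binds to matching humans.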

Quick snippets

Call the skill from the command-line
$ ros2 action send_goal /skill/look_for_human interaction_skills/action/LookFor # then press Tab to complete the message prototype

How to use in your code

See the code samples for the /skill/look_for_human action.

look_for_object

Search for and localize specific objects

Input parameters

  • patterns string array, default: []

    List of RDF patterns that identify the objects to look for. Exactly one variable is expected in the patterns.

Quick snippets

Call the skill from the command-line
$ ros2 action send_goal /skill/look_for_object interaction_skills/action/LookFor # then press Tab to complete the message prototype

How to use in your code

See the code samples for the /skill/look_for_object action.

set_expression

Sets the expression of the robot. This might include changing the robot’s face, body posture, or other expressive features.

You can set the expression either by name, or by specifying the valence and arousal of the expression.

Input parameters

  • expression.expression string

    Name of the expression. See List of expressions for details.

    One of:

    • neutral

    • angry

    • sad

    • happy

    • surprised

    • disgusted

    • scared

    • pleading

    • vulnerable

    • despaired

    • guilty

    • disappointed

    • embarrassed

    • horrified

    • skeptical

    • annoyed

    • furious

    • suspicious

    • rejected

    • bored

    • tired

    • asleep

    • confused

    • amazed

    • excited

  • expression.valence float

    The desired valence of the expression, ranging from -1 (very negative) to 1 (very positive).

  • expression.arousal float

    The desired arousal of the expression, ranging from -1 (very calm) to 1 (very excited).
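One way to picture the valence/arousal interface is to place each named expression in the 2D valence–arousal plane and map a (valence, arousal) request to the nearest one. The coordinates below are illustrative guesses, not the robot's actual mapping:

```python
import math

# Hypothetical valence/arousal coordinates for a few named expressions,
# purely for illustration (the robot's real mapping may differ).
EXPRESSIONS = {
    "neutral": (0.0, 0.0),
    "happy":   (0.8, 0.4),
    "excited": (0.8, 0.9),
    "sad":     (-0.7, -0.4),
    "angry":   (-0.7, 0.8),
    "tired":   (-0.2, -0.8),
}

def nearest_expression(valence, arousal):
    """Clamp (valence, arousal) to [-1, 1] and return the closest named expression."""
    v = max(-1.0, min(1.0, valence))
    a = max(-1.0, min(1.0, arousal))
    return min(EXPRESSIONS, key=lambda name: math.dist((v, a), EXPRESSIONS[name]))
```

For example, a very positive, very aroused request lands near "excited", while (0, 0) is "neutral". Out-of-range values are clamped to the documented [-1, 1] range first.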

Quick snippets

Trigger the skill from the command-line
$ ros2 topic pub /skill/set_expression interaction_skills/msg/SetExpression # then press Tab to complete the message prototype

How to use in your code

See the code samples for the /skill/set_expression topic.

look_at

Defines the gazing direction of the robot. This skill can be used to either look at a specific point in space (a ROS tf frame), or to set a generic gaze policy, such as looking at people around the robot.

Using the glance policy, you can also use this skill to briefly look at a specific point in space before returning to the previous gaze policy.

Input parameters

  • policy string

One of the policies below, or an empty string (in that case, the robot looks at the point in space specified with the target parameter).

    Available policies:

    • random: randomly look around, with short fixations

    • social: look around for faces, with fixations on detected faces

    • glance: glance at a specific point in space, then return to the previous gaze policy. target must be specified.

    • auto: automatic gaze policy, implementation-dependent. On PAL interactive robots, equivalent to social.

    • reset: reset the robot's gaze to looking straight ahead.

  • target geometry_msgs/msg/PointStamped

The tf frame to track. Ignored unless the policy parameter is empty or set to glance.
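Putting the policy descriptions together: target must be provided for the glance policy and for an empty policy string, and is ignored otherwise. A small helper capturing that rule (a sketch of the documented behavior, not part of the skill's API):

```python
# Policies for which the `target` parameter must be provided: the `glance`
# policy, and the empty policy string (look at an explicit point in space).
def target_required(policy):
    """Return True if the look_at skill needs `target` for this policy."""
    return policy in ("", "glance")

target_required("")        # -> True: look at the point given by `target`
target_required("glance")  # -> True: glance at `target`, then resume previous policy
target_required("social")  # -> False: `target` is ignored
```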

Quick snippets

Call the skill from the command-line
$ ros2 action send_goal /skill/look_at interaction_skills/action/LookAt # then press Tab to complete the message prototype

How to use in your code

See the code samples for the /skill/look_at action.