Interaction skills¶
This page lists the system skills (i.e., installed by default) related to 😄 Interaction. You can use these skills directly in your tasks and mission controller.
ask_human_for_help¶
Interface: /skill/ask_human_for_help
Message type: interaction_skills/action/AskHumanForHelp
Ask a human for help
Input parameters¶
question_to_human
string, required
The question to ask the human.
person_id
string array
The preferred person IDs to ask for help. Those humans will be prioritized when asking for help. If left empty, all tracked humans are considered.
Quick snippets¶
$ ros2 action send_goal /skill/ask_human_for_help interaction_skills/action/AskHumanForHelp # then press Tab to complete the message prototype
How to use in your code¶
See code samples for the corresponding /skill/ask_human_for_help action: code samples
To be added soon
You can call this skill from QML using the following code snippet. See Building a touchscreen interface via ROS and QML to learn more.
import Ros 2.0

// ...

AskHumanForHelpSkill {
    id: mySkill

    onResult: {
        console.log("Skill result: " + result);
    }

    onFeedback: {
        console.log("Skill feedback: " + feedback);
    }
}

// Call the skill
mySkill.ask_human_for_help(question_to_human);
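You can also call the action from a ROS 2 Python node. The following is a minimal rclpy sketch; it assumes the generated Python module mirrors the interface name shown above (interaction_skills.action.AskHumanForHelp) and that the goal fields match the documented input parameters. Check the exact field names with ros2 interface show interaction_skills/action/AskHumanForHelp before relying on it.

import rclpy
from rclpy.action import ActionClient
from rclpy.node import Node

# Assumption: the Python module path mirrors the interface name
# interaction_skills/action/AskHumanForHelp documented above.
from interaction_skills.action import AskHumanForHelp


class AskHumanForHelpClient(Node):
    def __init__(self):
        super().__init__('ask_human_for_help_client')
        self._client = ActionClient(self, AskHumanForHelp, '/skill/ask_human_for_help')

    def send_question(self, question):
        goal = AskHumanForHelp.Goal()
        # Field names follow the documented input parameters.
        goal.question_to_human = question
        goal.person_id = []  # empty: any tracked human can be asked
        self._client.wait_for_server()
        return self._client.send_goal_async(goal)


def main():
    rclpy.init()
    node = AskHumanForHelpClient()
    future = node.send_question('Could you open the door for me?')
    rclpy.spin_until_future_complete(node, future)
    node.destroy_node()
    rclpy.shutdown()


if __name__ == '__main__':
    main()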
look_for_human¶
Interface: /skill/look_for_human
Message type: interaction_skills/action/LookFor
Search and localize specific humans
Input parameters¶
patterns
string array, default: []
List of RDF patterns that identify the human to look for. Exactly one variable is expected in the patterns. If left empty, all visible humans are returned.
Quick snippets¶
$ ros2 action send_goal /skill/look_for_human interaction_skills/action/LookFor # then press Tab to complete the message prototype
How to use in your code¶
See code samples for the corresponding /skill/look_for_human action: code samples
To be added soon
You can call this skill from QML using the following code snippet. See Building a touchscreen interface via ROS and QML to learn more.
import Ros 2.0

// ...

LookForHumanSkill {
    id: mySkill

    onResult: {
        console.log("Skill result: " + result);
    }

    onFeedback: {
        console.log("Skill feedback: " + feedback);
    }
}

// Call the skill
mySkill.look_for_human();
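The same action can be called from Python with rclpy. The sketch below sends an empty pattern list, which, per the parameter description above, returns all currently visible humans. The module path and field names are assumptions based on the documented interface; verify them with ros2 interface show interaction_skills/action/LookFor.

import rclpy
from rclpy.action import ActionClient

# Assumption: the Python module path mirrors the interface name
# interaction_skills/action/LookFor documented above.
from interaction_skills.action import LookFor


def main():
    rclpy.init()
    node = rclpy.create_node('look_for_human_client')
    client = ActionClient(node, LookFor, '/skill/look_for_human')

    goal = LookFor.Goal()
    # Leaving the patterns empty returns all currently visible humans;
    # the RDF pattern syntax depends on the robot's knowledge base.
    goal.patterns = []

    client.wait_for_server()
    future = client.send_goal_async(goal)
    rclpy.spin_until_future_complete(node, future)

    node.destroy_node()
    rclpy.shutdown()


if __name__ == '__main__':
    main()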
look_for_object¶
Interface: /skill/look_for_object
Message type: interaction_skills/action/LookFor
Search and localize specific objects
Input parameters¶
patterns
string array, default: []
List of RDF patterns that identify the objects to look for. Exactly one variable is expected in the patterns.
Quick snippets¶
$ ros2 action send_goal /skill/look_for_object interaction_skills/action/LookFor # then press Tab to complete the message prototype
How to use in your code¶
See code samples for the corresponding /skill/look_for_object action: code samples
To be added soon
You can call this skill from QML using the following code snippet. See Building a touchscreen interface via ROS and QML to learn more.
import Ros 2.0

// ...

LookForObjectSkill {
    id: mySkill

    onResult: {
        console.log("Skill result: " + result);
    }

    onFeedback: {
        console.log("Skill feedback: " + feedback);
    }
}

// Call the skill
mySkill.look_for_object();
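Below is a comparable rclpy sketch for the object variant, this time also retrieving the action result. The RDF pattern shown is purely illustrative; the actual vocabulary depends on the robot's knowledge base, and the module path and field names are assumptions to check with ros2 interface show interaction_skills/action/LookFor.

import rclpy
from rclpy.action import ActionClient

# Assumption: the Python module path mirrors the interface name
# interaction_skills/action/LookFor (shared with look_for_human).
from interaction_skills.action import LookFor


def main():
    rclpy.init()
    node = rclpy.create_node('look_for_object_client')
    client = ActionClient(node, LookFor, '/skill/look_for_object')

    goal = LookFor.Goal()
    # Illustrative pattern only: the exact RDF vocabulary depends on the
    # robot's knowledge base. Exactly one variable (here ?obj) is expected.
    goal.patterns = ['?obj rdf:type Cup']

    client.wait_for_server()
    goal_future = client.send_goal_async(goal)
    rclpy.spin_until_future_complete(node, goal_future)
    goal_handle = goal_future.result()

    if goal_handle.accepted:
        result_future = goal_handle.get_result_async()
        rclpy.spin_until_future_complete(node, result_future)
        node.get_logger().info('Result: %s' % result_future.result().result)

    node.destroy_node()
    rclpy.shutdown()


if __name__ == '__main__':
    main()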
set_expression¶
Interface: /skill/set_expression
Message type: interaction_skills/msg/SetExpression
Sets the expression of the robot. This might include changing the robot’s face, body posture, or other expressive features.
You can set the expression either by name or by specifying its valence and arousal.
Input parameters¶
expression.expression
string
Name of the expression. See List of expressions for details.
One of:
neutral
angry
sad
happy
surprised
disgusted
scared
pleading
vulnerable
despaired
guilty
disappointed
embarrassed
horrified
skeptical
annoyed
furious
suspicious
rejected
bored
tired
asleep
confused
amazed
excited
expression.valence
float
The desired valence of the expression, ranging from -1 (very negative) to 1 (very positive).
expression.arousal
float
The desired arousal of the expression, ranging from -1 (very calm) to 1 (very excited).
Quick snippets¶
$ ros2 topic pub /skill/set_expression interaction_skills/msg/SetExpression # then press Tab to complete the message prototype
How to use in your code¶
See code samples for the corresponding /skill/set_expression topic: code samples
To be added soon
You can call this skill from QML using the following code snippet. See Building a touchscreen interface via ROS and QML to learn more.
import Ros 2.0

// ...

SetExpressionSkill {
    id: mySkill
}

// Call the skill
mySkill.set_expression();
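Since set_expression is a topic rather than an action, a Python client only needs a publisher. The sketch below publishes a single message that sets the expression by name; the nested field layout (expression.expression, expression.valence, expression.arousal) follows the parameter list above, but verify it with ros2 interface show interaction_skills/msg/SetExpression.

import time

import rclpy

# Assumption: the Python module path mirrors the interface name
# interaction_skills/msg/SetExpression documented above.
from interaction_skills.msg import SetExpression


def main():
    rclpy.init()
    node = rclpy.create_node('set_expression_publisher')
    pub = node.create_publisher(SetExpression, '/skill/set_expression', 1)

    msg = SetExpression()
    # Set the expression by name; alternatively, leave the name empty and
    # fill expression.valence / expression.arousal instead.
    msg.expression.expression = 'happy'

    # Give discovery a moment to match the skill's subscriber before the
    # one-shot publish.
    time.sleep(0.5)
    pub.publish(msg)

    node.destroy_node()
    rclpy.shutdown()


if __name__ == '__main__':
    main()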