Communication skills¶
This page lists the system skills (i.e., installed by default) related to 💬 Communication. You can use these skills directly in your tasks and in the mission controller.
ask_human_for_help¶
Interface: /skill/ask_human_for_help
Message type: interaction_skills/action/AskHumanForHelp
Ask a human for help.
Input parameters¶
question_to_human (string, required): The question to ask the human.
person_id (string array): The preferred person IDs to ask for help. These humans will be prioritized when asking for help. If left empty, all tracked humans are considered.
Quick snippets¶
$ ros2 action send_goal /skill/ask_human_for_help interaction_skills/action/AskHumanForHelp # then press Tab to complete the message prototype
How to use in your code¶
See the code samples for the corresponding /skill/ask_human_for_help action: code samples
To be added soon
You can call this skill from QML using the following code snippet. See Building a touchscreen interface via ROS and QML to learn more.
import Ros 2.0
// ...
AskHumanForHelpSkill {
    id: mySkill

    onResult: {
        console.log("Skill result: " + result);
    }

    onFeedback: {
        console.log("Skill feedback: " + feedback);
    }
}
// Call the skill
mySkill.ask_human_for_help(question_to_human);
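The skill is exposed as a standard ROS 2 action, so it can also be called from Python with rclpy. The sketch below is only illustrative: the generated Python module name (interaction_skills.action) and the goal field names are inferred from the interface and parameter list above, so verify them against the installed action definition before relying on them.

import rclpy
from rclpy.action import ActionClient
from rclpy.node import Node

from interaction_skills.action import AskHumanForHelp  # assumed generated Python module


class AskHumanForHelpClient(Node):
    def __init__(self):
        super().__init__("ask_human_for_help_client")
        # Action client on the skill interface listed above
        self._client = ActionClient(self, AskHumanForHelp, "/skill/ask_human_for_help")

    def send_question(self, question: str):
        goal = AskHumanForHelp.Goal()
        goal.question_to_human = question   # required
        # goal.person_id = ["person_1"]     # optional: prioritize specific tracked humans

        self._client.wait_for_server()
        return self._client.send_goal_async(goal)


def main():
    rclpy.init()
    node = AskHumanForHelpClient()
    future = node.send_question("Could you open the door for me, please?")
    rclpy.spin_until_future_complete(node, future)
    node.destroy_node()
    rclpy.shutdown()


if __name__ == "__main__":
    main()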
ask¶
Interface: /skill/ask
Message type: communication_skills/action/Ask
A specialization of the chat skill where the role is predefined and the dialogue is always initiated by the robot.
See How-to: Dialogue management for details.
Input parameters¶
question (string, required): The question to ask the user. It is used to generate the initial utterance of the dialogue.
answers_schema (string, required): The serialized JSON object describing the pieces of information to retrieve through the chat. The keys correspond to the required pieces of information, while the values follow the JSON Schema format for object properties.
For example, if the required information is the person's age, the dictionary could be the following:
{"age": {"type": "integer", "minimum": 0}}
For a simple yes/no question, you could have instead:
{"response": {"type": "boolean"}}
person_id (string, default: ""): If targeting a specific person, the ID of the person.
group_id (string, default: ""): If targeting a group of people, the ID of the group (currently not supported).
meta.priority (integer, default: 128): Between 0 and 255. A higher value means this skill invocation has higher priority.
Output fields¶
answers (object): The serialized JSON object containing the pieces of information retrieved from the chat, according to answers_schema.
For the age example above, the result might be:
{"age": 42}
Quick snippets¶
$ ros2 action send_goal /skill/ask communication_skills/action/Ask # then press Tab to complete the message prototype
How to use in your code¶
See the code samples for the corresponding /skill/ask action: code samples
To be added soon
You can call this skill from QML using the following code snippet. See Building a touchscreen interface via ROS and QML to learn more.
import Ros 2.0
// ...
AskSkill {
    id: mySkill

    onResult: {
        console.log("Skill result: " + result);
    }

    onFeedback: {
        console.log("Skill feedback: " + feedback);
    }
}
// Call the skill
mySkill.ask(question, answers_schema);
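Following the same rclpy pattern shown for ask_human_for_help above, an Ask goal could be filled in as follows. This is a sketch only: the Python module name (communication_skills.action) and the field names are inferred from the action interface and parameter list above.

import json

from communication_skills.action import Ask  # assumed generated Python module

goal = Ask.Goal()
goal.question = "How old are you?"
goal.answers_schema = json.dumps({"age": {"type": "integer", "minimum": 0}})
goal.person_id = ""  # empty string: no specific person targeted

# Send `goal` with an ActionClient on /skill/ask, then parse the result
# with json.loads(result.answers).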
chat¶
Interface: /skill/chat
Message type: communication_skills/action/Chat
Start a dialogue with a defined purpose.
See How-to: Dialogue management for details.
Input parameters¶
role (string, required, default: "__default__"): The name and configuration for the dialogue purpose (chatbot-dependent).
initiate (boolean, default: false): If true, the dialogue is opened and the robot initiates the conversation.
initial_input (string, default: ""): Optionally, the utterance the chatbot initiates the conversation with. Otherwise, the chatbot may generate one based on the role. It is ignored if initiate is false.
person_id (string, default: ""): If targeting a specific person, the ID of the person.
group_id (string, default: ""): If targeting a group of people, the ID of the group (currently not supported).
meta.priority (integer, default: 128): Between 0 and 255. A higher value means this skill invocation has higher priority.
Output fields¶
role_result (string): A machine-readable serialized JSON object containing the result, depending on the role.
Quick snippets¶
$ ros2 action send_goal /skill/chat communication_skills/action/Chat # then press Tab to complete the message prototype
How to use in your code¶
See the code samples for the corresponding /skill/chat action: code samples
To be added soon
You can call this skill from QML using the following code snippet. See Building a touchscreen interface via ROS and QML to learn more.
import Ros 2.0
// ...
ChatSkill {
    id: mySkill

    onResult: {
        console.log("Skill result: " + result);
    }

    onFeedback: {
        console.log("Skill feedback: " + feedback);
    }
}
// Call the skill
mySkill.chat(role);
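Likewise, a Chat goal can be built with the same rclpy pattern; a sketch with field names taken from the parameter list above (the Python module name is an assumption):

from communication_skills.action import Chat  # assumed generated Python module

goal = Chat.Goal()
goal.role = "__default__"  # dialogue purpose (chatbot-dependent)
goal.initiate = True       # the robot opens the conversation
goal.initial_input = "Hi there! Can I help you with anything?"  # only used when initiate is true

# Send `goal` with an ActionClient on /skill/chat; the result's role_result
# field is a serialized JSON string whose content depends on the role.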
say¶
Interface: /skill/say
Message type: communication_skills/action/Say
Speak out the provided input text (also executing any additional markup action).
See How-to: Speech synthesis (TTS) for details.
Input parameters¶
input (string, required): The multi-modal expression to be read out.
person_id (string, default: ""): If targeting a specific person, the ID of the person.
group_id (string, default: ""): If targeting a group of people, the ID of the group (currently not supported).
meta.priority (integer, default: 128): Between 0 and 255. A higher value means this skill invocation has higher priority.
Quick snippets¶
$ ros2 action send_goal /skill/say communication_skills/action/Say # then press Tab to complete the message prototype
How to use in your code¶
See the code samples for the corresponding /skill/say action: code samples
To be added soon
You can call this skill from QML using the following code snippet. See Building a touchscreen interface via ROS and QML to learn more.
import Ros 2.0
// ...
SaySkill {
    id: mySkill

    onResult: {
        console.log("Skill result: " + result);
    }

    onFeedback: {
        console.log("Skill feedback: " + feedback);
    }
}
// Call the skill
mySkill.say(input);
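And for say, a minimal goal sketch following the same rclpy pattern (the module name, and the nesting of meta.priority, are assumptions based on the interface and parameter list above):

from communication_skills.action import Say  # assumed generated Python module

goal = Say.Goal()
goal.input = "Hello! Nice to meet you."  # multi-modal expression to read out
goal.meta.priority = 128                 # default priority (0-255), assuming `meta` is a nested message

# Send `goal` with an ActionClient on /skill/say.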