
How to access information about a face, a skeleton or a person?#

On this page, you will learn how to use the ROS 2 command-line tools to read the information published by your robot’s social perception pipeline.

Note

These instructions are compatible with any social perception pipeline following the ROS4HRI standard.

Prerequisites#

  • The following instructions require you to be familiar with the ROS 2 command-line tools.

  • This how-to assumes that you already have an up-and-running, ROS4HRI-compatible face detection pipeline. If you do not have one available, you can start with hri_face_detect.

  • This how-to assumes that you already have an up-and-running, ROS4HRI-compatible body detection pipeline. If you do not have one available, you can start with hri_fullbody.

  • This how-to assumes that you already have an up-and-running, ROS4HRI-compatible person manager handling the combination of faces, bodies and voices into persons. If you do not have one available, you can start with hri_person_manager.

Note

Your robot already comes with hri_face_detect, hri_fullbody and hri_person_manager installed. If you want to test the pipeline on your own computer, follow the instructions in How to launch the different components for person detection.

Faces#

The following instructions will help you understand more about the faces detected by the social perception pipeline.

How do I check how many faces are currently detected?#

Any ROS4HRI-compatible node detecting faces publishes the list of detected face ids on the /humans/faces/tracked topic.

$ ros2 topic echo /humans/faces/tracked

Here is an example output for this command:

$ ros2 topic echo /humans/faces/tracked

---
header:
  stamp:
    sec: 1674655540
    nanosec: 786450287
  frame_id: "usb_cam"
ids:
  - dldve
  - ugfwo
  - hcpwl
---

The message published on the topic has type hri_msgs/IdsList, which contains a std_msgs/Header field, header, and an ids field listing the ids of the currently detected faces.
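
As a sketch of how this message can be consumed in code, the helper below extracts the id list from any IdsList-like object. The SimpleNamespace stand-in only mimics the ids field described above, so the snippet runs without a live ROS 2 system; the values are the ones from the example output.

```python
from types import SimpleNamespace

def tracked_ids(msg):
    """Return the list of currently tracked ids from an IdsList-like message."""
    return list(msg.ids)

# Stand-in for a live hri_msgs/IdsList message (values from the example above):
msg = SimpleNamespace(ids=["dldve", "ugfwo", "hcpwl"])
print(len(tracked_ids(msg)))  # → 3
```

In a real node, the same logic would run inside the subscription callback for /humans/faces/tracked.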

How do I check the coordinates of a face’s bounding box?#

Once you know the id of the face you are interested in:

$ ros2 topic echo /humans/faces/<face_id>/roi

The message published on the /humans/faces/*/roi topic has type sensor_msgs/RegionOfInterest.

Next you can see an execution example for this command:

$ ros2 topic echo /humans/faces/ugfwo/roi

---
x_offset: 283
y_offset: 235
height: 194
width: 154
do_rectify: true
---

Where:

  • x_offset, y_offset are respectively the x and y coordinates of the top-left corner pixel of the bounding box.

  • height, width define the bounding box size.
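
Putting these four fields together, a small helper can recover the pixel coordinates of the bounding box centre. This is a minimal sketch that operates on any object exposing the RegionOfInterest fields above; a SimpleNamespace stands in for a live message, using the values from the example output.

```python
from types import SimpleNamespace

def roi_center(roi):
    """Pixel coordinates (x, y) of the centre of a RegionOfInterest-like box."""
    return (roi.x_offset + roi.width // 2, roi.y_offset + roi.height // 2)

# Values taken from the example output above:
roi = SimpleNamespace(x_offset=283, y_offset=235, width=154, height=194)
print(roi_center(roi))  # → (360, 332)
```

The same helper works unchanged for body bounding boxes, since they use the same message type.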

Note

Face bounding boxes can also be visualised using the Humans plugin in RViz. For more information, check the ROS4HRI tooling and debugging section.

What other information can I expect from the face detector?#

Other face-specific topics are:

  • /humans/faces/*/landmarks: here you can find the facial landmark coordinates. The message published here has type hri_msgs/FacialLandmarks and contains 69 facial landmarks, following the dlib convention.

  • /humans/faces/*/cropped: here you can find the cropped image of the detected face. You can use RViz or rqt image_view to visualise it.

  • /humans/faces/*/aligned: here you can find the cropped and aligned image of the detected face. You can use RViz or rqt image_view to visualise it. The cropped and aligned image of the face is useful for recognition purposes.

Note

Facial landmarks can be visualised using the Humans plugin in RViz or rqt image_view. For more information, check the ROS4HRI tooling and debugging section.

Bodies#

The following instructions will help you learn more about the bodies detected by the social perception pipeline.

How do I check how many bodies are currently detected?#

Any ROS4HRI-compatible node detecting bodies publishes the list of detected body ids on the /humans/bodies/tracked topic.

$ ros2 topic echo /humans/bodies/tracked

Here is an example output:

$ ros2 topic echo /humans/bodies/tracked

---
header:
  stamp:
    sec: 1674656320
    nanosec: 837982903
  frame_id: "usb_cam"
ids:
  - kaobg
  - banyw
---

The message published on the topic has type hri_msgs/IdsList, which contains a std_msgs/Header field, header, and an ids field listing the ids of the currently detected bodies.

How do I check the coordinates of a body’s bounding box?#

Once you know the id of the body you are interested in, you can get its bounding box as follows:

$ ros2 topic echo /humans/bodies/<body_id>/roi

The message published on the /humans/bodies/*/roi topic has type sensor_msgs/RegionOfInterest.

You can see an execution example for this command below:

$ ros2 topic echo /humans/bodies/<body_id>/roi

---
x_offset: 516
y_offset: 248
height: 103
width: 110
do_rectify: false
---

Where:

  • x_offset, y_offset are respectively the x and y coordinates of the top-left corner pixel of the bounding box.

  • height, width define the bounding box size.

Note

Body bounding boxes can also be visualised using the Humans plugin in RViz. For more information, check the ROS4HRI tooling and debugging section.

How do I check the current position of a body?#

Once you know the id of the body you are interested in, you can access its position as follows:

$ ros2 topic echo /humans/bodies/<body_id>/position

The message published on the /humans/bodies/*/position topic has type geometry_msgs/PointStamped.

Here is an example output:

$ ros2 topic echo /humans/bodies/ynwzg/position

---
header:
  stamp:
    sec: 1674662668
    nanosec: 363162279
  frame_id: "camera_color_optical_frame"
point:
  x: 1.0094782819513077
  y: 0.0
  z: 2.7442159928511325
---

The position is expressed in frame_id coordinates and refers to the middle point between the body’s left and right hips.
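
Since the position is a 3D point expressed in the frame_id frame (a camera frame in the example above), the straight-line distance from that frame’s origin to the body is just the Euclidean norm of the point. A minimal sketch, using the example values above and a SimpleNamespace in place of a live message:

```python
import math
from types import SimpleNamespace

def distance_from_origin(point):
    """Euclidean distance of a Point-like (x, y, z) from the frame origin, in metres."""
    return math.sqrt(point.x ** 2 + point.y ** 2 + point.z ** 2)

# Values taken from the example output above:
point = SimpleNamespace(x=1.0094782819513077, y=0.0, z=2.7442159928511325)
print(round(distance_from_origin(point), 2))  # → 2.92
```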

How do I check the current velocity of a body?#

You can check the velocity of a given body id as follows:

$ ros2 topic echo /humans/bodies/<body_id>/velocity

The message published on the /humans/bodies/*/velocity topic has type geometry_msgs/TwistStamped.

An execution example for this command:

$ ros2 topic echo /humans/bodies/ynwzg/velocity

---
header:
  stamp:
    sec: 1674663064
    nanosec: 820648202
  frame_id: "body_ynwzg"
twist:
  linear:
    x: 0.09314650347944815
    y: 0.04139227172094225
    z: 0.0
  angular:
    x: 0.0
    y: 0.0
    z: 0.0
---

The velocity is expressed in frame_id coordinates.
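
The linear part of the twist is the body’s instantaneous velocity vector, so its magnitude gives the body’s speed. A minimal sketch using the example values above, with a SimpleNamespace standing in for a live message:

```python
import math
from types import SimpleNamespace

def speed(twist):
    """Magnitude of the linear velocity of a Twist-like message, in m/s."""
    return math.sqrt(twist.linear.x ** 2 + twist.linear.y ** 2 + twist.linear.z ** 2)

# Values taken from the example output above:
twist = SimpleNamespace(
    linear=SimpleNamespace(x=0.09314650347944815, y=0.04139227172094225, z=0.0))
print(round(speed(twist), 3))  # → 0.102
```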

What other information can I expect from the body detector?#

Other body-specific topics are:

  • /humans/bodies/*/joint_states: describes the joint states for the body kinematic model. The message published in this topic has type sensor_msgs/JointState.

  • /humans/bodies/*/skeleton2d: describes the normalised keypoint coordinates of the detected 2D skeleton. Each keypoint comes with a confidence value. The topic messages have type hri_msgs/Skeleton2D and the keypoints are stored in an array of hri_msgs/NormalizedPointOfInterest2D. The 2D skeleton follows the OpenPose notation.
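
Because the skeleton keypoints are normalised to the [0, 1] range, they must be scaled by the image resolution before they can be drawn on the camera image. A minimal sketch, assuming a 640×480 image, the x/y field names of hri_msgs/NormalizedPointOfInterest2D, and an illustrative keypoint value (not taken from a real message):

```python
from types import SimpleNamespace

def to_pixels(keypoint, img_width, img_height):
    """Convert a normalised keypoint (x, y in [0, 1]) to integer pixel coordinates."""
    return (int(keypoint.x * img_width), int(keypoint.y * img_height))

# Illustrative normalised keypoint, e.g. a wrist (hypothetical values):
kp = SimpleNamespace(x=0.5, y=0.25)
print(to_pixels(kp, 640, 480))  # → (320, 120)
```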

Note

The kinematic model for the detected bodies can be visualised using the Skeletons 3D plugin in RViz. For more information, check the ROS4HRI tooling and debugging section.

Note

Body 2D skeletons can be visualised using the Humans plugin in RViz. For more information, check the ROS4HRI tooling and debugging section.

Persons#

The following instructions will help you understand more about the persons detected by the social perception pipeline.

How do I check how many persons are currently detected?#

Any ROS4HRI-compatible node detecting persons publishes the list of person ids on the /humans/persons/tracked topic.

$ ros2 topic echo /humans/persons/tracked

This is an output example:

$ ros2 topic echo /humans/persons/tracked

---
header:
  stamp:
    sec: 1674727046
    nanosec: 105304003
  frame_id: ''
ids:
  - fjplc
---

The message published on the topic has type hri_msgs/IdsList, which contains a std_msgs/Header field, header, and an ids field listing the ids of the currently detected persons.

Note

For a better understanding of the frame_id field semantics, we invite you to check its definition in the REP 155.

How do I find out which face is associated with a person?#

Once you know the id of the person you are interested in, you can view its associated face id as follows:

$ ros2 topic echo /humans/persons/<person_id>/face_id

An execution example for this command:

$ ros2 topic echo /humans/persons/fjplc/face_id

---
data: "yjksh"
---

The messages published on the topic /humans/persons/*/face_id have type std_msgs/String.

How do I find out which body is associated with a person?#

Once you know the id of the person you are interested in, you can view its associated body id as follows:

$ ros2 topic echo /humans/persons/<person_id>/body_id

An execution example for this command:

$ ros2 topic echo /humans/persons/fjplc/body_id

---
data: "bsoek"
---

The messages published on the topic /humans/persons/*/body_id have type std_msgs/String.

How do I find out whether a person is anonymous or not?#

You can check if a person is anonymous or not as follows:

$ ros2 topic echo /humans/persons/<person_id>/anonymous

An execution example for this command:

$ ros2 topic echo /humans/persons/fjplc/anonymous

---
data: false
---

The messages published on the topic /humans/persons/*/anonymous have type std_msgs/Bool.

How do I know the location confidence for a person?#

The location confidence of a given person id can be checked as follows:

$ ros2 topic echo /humans/persons/<person_id>/location_confidence

An execution example for this command:

$ ros2 topic echo /humans/persons/fjplc/location_confidence

---
data: 1.0
---

The messages published on the topic /humans/persons/*/location_confidence have type std_msgs/Float32.

Note

For a better understanding of what a person’s location_confidence represents, we invite you to check its definition in the REP 155.

How do I check a person’s engagement status?#

A person’s engagement status can be checked as follows:

$ ros2 topic echo /humans/persons/<person_id>/engagement_status

Here is an example output:

$ ros2 topic echo /humans/persons/fjplc/engagement_status

---
header:
  stamp:
    sec: 1674742450
    nanosec: 89568377
  frame_id: ''
level: 1
---

The messages published on the topic /humans/persons/*/engagement_status have type hri_msgs/EngagementLevel, where level represents one of the five possible states for engagement:

  • ENGAGING

  • ENGAGED

  • DISENGAGING

  • DISENGAGED

  • UNKNOWN

Each one of these states is associated with an integer value of type uint8.
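
To turn the raw level integer into a readable label, you can compare it against the constants defined in hri_msgs/EngagementLevel. The mapping below reflects the message definition at the time of writing; the authoritative values live in EngagementLevel.msg, so verify them against the version installed on your system:

```python
# Assumed constants from hri_msgs/EngagementLevel.msg (verify on your system):
ENGAGEMENT_LEVELS = {
    0: "UNKNOWN",
    1: "DISENGAGED",
    2: "ENGAGING",
    3: "ENGAGED",
    4: "DISENGAGING",
}

def engagement_label(level):
    """Human-readable name for an EngagementLevel value."""
    return ENGAGEMENT_LEVELS.get(level, "UNKNOWN")

# The example output above reported level: 1
print(engagement_label(1))  # → DISENGAGED
```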

See also#

  • If you have any doubts about ROS4HRI-related concepts, check the ROS4HRI introduction or the full standard definition in REP 155.

  • If you want to know more about how to use the information processed by the social perception pipeline directly in your Python code, check the pyhri tutorial.

  • If you want to know more about how to use the information processed by the social perception pipeline directly in your C++ code, check the libhri tutorial.