Applies to: TIAGo, ARI

How-to: How to detect a face, a skeleton or a person?#

On this page, you will learn how to use the ROS tools to read the information published by the social perception pipeline of PAL’s robots.

Note

These instructions are compatible with any social perception pipeline following the ROS4HRI standard.

Prerequisites#

  • The following instructions require you to be familiar with the ROS command-line tools.

  • This how-to assumes that you already have an up-and-running face detection pipeline. The pipeline must be ROS4HRI compatible. If you do not have one available, you can start using hri_face_detect.

  • This how-to assumes that you already have an up-and-running body detection pipeline. The pipeline must be ROS4HRI compatible. If you do not have one available, you can start using hri_fullbody.

  • This how-to assumes that you already have an up-and-running person manager, handling the combination of faces, bodies and voices into persons. The person manager must be ROS4HRI compatible. If you do not have one available, you can start using hri_person_manager.

Note

PAL’s robots already come with hri_face_detect, hri_fullbody and hri_person_manager installed.

Faces#

The following instructions will help you understand more about the faces detected by the social perception pipeline.

How do I check how many faces are currently detected?#

Any ROS4HRI-compatible node detecting faces will publish the list of detected face IDs on the /humans/faces/tracked topic.

$ rostopic echo /humans/faces/tracked

Here is an execution example for this command:

$ rostopic echo /humans/faces/tracked

---
header:
  seq: 50
  stamp:
    secs: 1674655540
    nsecs: 786450287
  frame_id: "usb_cam"
ids:
  - dldve
  - ugfwo
  - hcpwl
---

The message published on the topic has type hri_msgs/IdsList: it contains a std_msgs/Header (header) and an ids field, the list of IDs of the currently detected faces.
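If you prefer to read this information from code rather than from the command line, a minimal rospy sketch could look like the one below (the pyhri tutorial linked at the end of this page covers a higher-level API):

#!/usr/bin/env python
# Minimal sketch: print how many faces are currently tracked.
import rospy
from hri_msgs.msg import IdsList

def on_tracked_faces(msg):
    # msg.ids is the list of face IDs currently detected
    rospy.loginfo("%d face(s) tracked: %s", len(msg.ids), ", ".join(msg.ids))

if __name__ == "__main__":
    rospy.init_node("face_counter")
    rospy.Subscriber("/humans/faces/tracked", IdsList, on_tracked_faces)
    rospy.spin()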

How do I check the coordinates of the bounding box of a face?#

Once you know the ID of the face you are interested in, run:

$ rostopic echo /humans/faces/<face_id>/roi

The message published on the /humans/faces/*/roi topic has type sensor_msgs/RegionOfInterest.

Here is an execution example for this command:

$ rostopic echo /humans/faces/ugfwo/roi

---
x_offset: 283
y_offset: 235
height: 194
width: 154
do_rectify: True
---

Where:

  • x_offset, y_offset are respectively the x and y coordinates of the top-left corner pixel of the bounding box.

  • height, width define the bounding box size.

Note

Face bounding boxes can also be visualised using the Humans plugin in rviz. For more information, check the ROS4HRI tooling and debugging section.
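For reference, here is a minimal rospy sketch that reads this bounding box from code. The face ID used below is just the placeholder from the example above; on your robot it will differ and should be read from /humans/faces/tracked first:

#!/usr/bin/env python
# Minimal sketch: print the bounding box of one specific face.
import rospy
from sensor_msgs.msg import RegionOfInterest

FACE_ID = "ugfwo"  # placeholder ID taken from the example above

def on_roi(roi):
    rospy.loginfo("face %s: top-left=(%d, %d), size=%dx%d",
                  FACE_ID, roi.x_offset, roi.y_offset, roi.width, roi.height)

if __name__ == "__main__":
    rospy.init_node("face_roi_listener")
    rospy.Subscriber("/humans/faces/%s/roi" % FACE_ID, RegionOfInterest, on_roi)
    rospy.spin()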

What other information can I expect from the face detector?#

Other face-specific topics are:

  • /humans/faces/*/landmarks: here you can find the facial landmark coordinates. The message published here has type hri_msgs/FacialLandmarks and contains 69 facial landmarks, following the dlib convention.

  • /humans/faces/*/cropped: here you can find the cropped image of the detected face. You can use rviz or rqt image_view to visualise it, or read it from code as sketched below.

  • /humans/faces/*/aligned: here you can find the cropped and aligned image of the detected face. You can use rviz or rqt image_view to visualise it. The cropped and aligned image of the face is useful for recognition purposes.

Note

Facial landmarks can be visualised using the Humans plugin in rviz. For more information, check the ROS4HRI tooling and debugging section.
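As an illustration of the cropped face topic mentioned above, here is a minimal rospy sketch that saves the cropped face image to disk, assuming cv_bridge and OpenCV are available; the face ID is again a placeholder:

#!/usr/bin/env python
# Minimal sketch: save the cropped face image to disk.
import cv2
import rospy
from cv_bridge import CvBridge
from sensor_msgs.msg import Image

FACE_ID = "ugfwo"  # placeholder ID
bridge = CvBridge()

def on_cropped(msg):
    # Convert the ROS image to an OpenCV BGR image and save it
    face = bridge.imgmsg_to_cv2(msg, desired_encoding="bgr8")
    cv2.imwrite("/tmp/face_%s.png" % FACE_ID, face)

if __name__ == "__main__":
    rospy.init_node("face_crop_saver")
    rospy.Subscriber("/humans/faces/%s/cropped" % FACE_ID, Image, on_cropped)
    rospy.spin()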

Bodies#

The following instructions will help you understand more about the bodies detected by the social perception pipeline.

How do I check how many bodies are currently detected?#

Any ROS4HRI-compatible node detecting bodies will publish the list of detected body IDs on the /humans/bodies/tracked topic.

$ rostopic echo /humans/bodies/tracked

Here is an output example:

$ rostopic echo /humans/bodies/tracked

---
header:
  seq: 50
  stamp:
    secs: 1674656320
    nsecs: 837982903
  frame_id: "usb_cam"
ids:
  - kaobg
  - banyw
---

The message published on the topic has type hri_msgs/IdsList: it contains a std_msgs/Header (header) and an ids field, the list of IDs of the currently detected bodies.
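Since body IDs appear and disappear at runtime, a common pattern is to subscribe to per-body topics dynamically, based on the tracked list. Here is a minimal rospy sketch of this pattern, using the per-body ROI topic described below:

#!/usr/bin/env python
# Minimal sketch: subscribe to each tracked body's ROI as bodies appear.
import rospy
from hri_msgs.msg import IdsList
from sensor_msgs.msg import RegionOfInterest

subscribers = {}  # body ID -> rospy.Subscriber

def on_roi(roi, body_id):
    rospy.loginfo("body %s at (%d, %d)", body_id, roi.x_offset, roi.y_offset)

def on_tracked_bodies(msg):
    # Create a subscriber for every newly tracked body...
    for body_id in msg.ids:
        if body_id not in subscribers:
            subscribers[body_id] = rospy.Subscriber(
                "/humans/bodies/%s/roi" % body_id,
                RegionOfInterest, on_roi, callback_args=body_id)
    # ...and drop the subscribers of bodies that are no longer tracked
    for body_id in list(subscribers):
        if body_id not in msg.ids:
            subscribers.pop(body_id).unregister()

if __name__ == "__main__":
    rospy.init_node("body_tracker")
    rospy.Subscriber("/humans/bodies/tracked", IdsList, on_tracked_bodies)
    rospy.spin()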

How do I check the coordinates of the bounding box of a body?#

Once you know the ID of the body you are interested in, run:

$ rostopic echo /humans/bodies/<body_id>/roi

The message published on the /humans/bodies/*/roi topic has type sensor_msgs/RegionOfInterest.

Here is an execution example for this command:

$ rostopic echo /humans/bodies/<body_id>/roi

---
x_offset: 516
y_offset: 248
height: 103
width: 110
do_rectify: False
---

Where:

  • x_offset, y_offset are respectively the x and y coordinates of the top-left corner pixel of the bounding box.

  • height, width define the bounding box size.

Note

Body bounding boxes can also be visualised using the Humans plugin in rviz. For more information, check the ROS4HRI tooling and debugging section.

How do I check the current position of a body?#

Once you know the ID of the body you are interested in, run:

$ rostopic echo /humans/bodies/<body_id>/position

The message published on the /humans/bodies/*/position topic has type geometry_msgs/PointStamped.

Here is an execution example for this command:

$ rostopic echo /humans/bodies/ynwzg/position

---
header:
  seq: 98
  stamp:
    secs: 1674662668
    nsecs: 363162279
  frame_id: "camera_color_optical_frame"
point:
  x: 1.0094782819513077
  y: 0.0
  z: 2.7442159928511325
---

The position is expressed in frame_id coordinates and refers to the middle point between the body’s left and right hips.
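As an example of how this can be used from code, the following rospy sketch reads a single position message and computes the distance between the sensor frame and the body; the body ID is the placeholder from the example above:

#!/usr/bin/env python
# Minimal sketch: read one position message and print the distance to the body.
import math
import rospy
from geometry_msgs.msg import PointStamped

if __name__ == "__main__":
    rospy.init_node("body_distance")
    msg = rospy.wait_for_message("/humans/bodies/ynwzg/position",
                                 PointStamped, timeout=5.0)
    p = msg.point
    # Euclidean distance from the sensor frame origin to the body
    distance = math.sqrt(p.x ** 2 + p.y ** 2 + p.z ** 2)
    rospy.loginfo("body is %.2f m away (in frame %s)",
                  distance, msg.header.frame_id)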

How do I check the current velocity of a body?#

Once you know the ID of the body you are interested in, run:

$ rostopic echo /humans/bodies/<body_id>/velocity

The message published on the /humans/bodies/*/velocity topic has type geometry_msgs/TwistStamped.

Here is an execution example for this command:

$ rostopic echo /humans/bodies/ynwzg/velocity

---
header:
  seq: 88
  stamp:
    secs: 1674663064
    nsecs: 820648202
  frame_id: "body_ynwzg"
twist:
  linear:
    x: 0.09314650347944815
    y: 0.04139227172094225
    z: 0.0
  angular:
    x: 0.0
    y: 0.0
    z: 0.0
---

The velocity is expressed in frame_id coordinates.
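Similarly, here is a minimal rospy sketch that turns this twist into a planar speed (again with a placeholder body ID):

#!/usr/bin/env python
# Minimal sketch: print the planar speed of one body.
import math
import rospy
from geometry_msgs.msg import TwistStamped

def on_velocity(msg):
    v = msg.twist.linear
    speed = math.hypot(v.x, v.y)  # planar speed, in m/s
    rospy.loginfo("body speed: %.2f m/s", speed)

if __name__ == "__main__":
    rospy.init_node("body_speed")
    rospy.Subscriber("/humans/bodies/ynwzg/velocity", TwistStamped, on_velocity)
    rospy.spin()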

What other information can I expect from the body detector?#

Other body-specific topics are:

  • /humans/bodies/*/joint_states: here you can find the joint states for the body kinematic model. The message published on this topic has type sensor_msgs/JointState.

  • /humans/bodies/*/skeleton2d: here you can find the normalised keypoint coordinates of the detected 2D skeleton. Each keypoint comes with a confidence value. The messages on this topic have type hri_msgs/Skeleton2D and the keypoints are stored in an array of hri_msgs/NormalizedPointOfInterest2D. The 2D skeleton follows the OpenPose notation.

Note

The kinematic model for the detected bodies can be visualised using the Skeletons 3D plugin in rviz. For more information, check the ROS4HRI tooling and debugging section.

Note

The 2D skeletons of detected bodies can be visualised using the Humans plugin in rviz. For more information, check the ROS4HRI tooling and debugging section.

Persons#

The following instructions will help you understand more about the persons detected by the social perception pipeline.

How do I check how many persons are currently detected?#

Any ROS4HRI-compatible node tracking persons will publish the list of detected person IDs on the /humans/persons/tracked topic.

$ rostopic echo /humans/persons/tracked

Here is an output example:

$ rostopic echo /humans/persons/tracked

---
header:
  seq:
  stamp:
    secs: 1674727046
    nsecs: 105304003
  frame_id: ''
ids:
  - fjplc
---

The message published on the topic has type hri_msgs/IdsList: it contains a std_msgs/Header (header) and an ids field, the list of IDs of the currently detected persons.

Note

For a better understanding of the frame_id field semantics, we invite you to check its definition in the REP 155.

How do I find the face associated with a person?#

Once you know the ID of the person you are interested in, run:

$ rostopic echo /humans/persons/<person_id>/face_id

Here is an execution example for this command:

$ rostopic echo /humans/persons/fjplc/face_id

---
data: "yjksh"
---

The messages published on the topic /humans/persons/*/face_id have type std_msgs/String.
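Since this topic only gives you an ID, it is typically chained with the face topics described earlier. Here is a minimal rospy sketch, using placeholder IDs from the examples above:

#!/usr/bin/env python
# Minimal sketch: look up the face currently associated with a person,
# then fetch that face's bounding box.
import rospy
from std_msgs.msg import String
from sensor_msgs.msg import RegionOfInterest

if __name__ == "__main__":
    rospy.init_node("person_face_lookup")
    # "fjplc" is a placeholder person ID
    face_id = rospy.wait_for_message(
        "/humans/persons/fjplc/face_id", String, timeout=5.0).data
    if face_id:
        roi = rospy.wait_for_message(
            "/humans/faces/%s/roi" % face_id, RegionOfInterest, timeout=5.0)
        rospy.loginfo("person fjplc has face %s at (%d, %d)",
                      face_id, roi.x_offset, roi.y_offset)
    else:
        rospy.loginfo("no face currently associated with person fjplc")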

How do I find the body associated with a person?#

Once you know the ID of the person you are interested in, run:

$ rostopic echo /humans/persons/<person_id>/body_id

Here is an execution example for this command:

$ rostopic echo /humans/persons/fjplc/body_id

---
data: "bsoek"
---

The messages published on the topic /humans/persons/*/body_id have type std_msgs/String.

How do I find out whether a person is anonymous?#

Once you know the ID of the person you are interested in, run:

$ rostopic echo /humans/persons/<person_id>/anonymous

Here is an execution example for this command:

$ rostopic echo /humans/persons/fjplc/anonymous

---
data: False
---

The messages published on the topic /humans/persons/*/anonymous have type std_msgs/Bool.

How do I find the location confidence for a person?#

Once you know the ID of the person you are interested in, run:

$ rostopic echo /humans/persons/<person_id>/location_confidence

Here is an execution example for this command:

$ rostopic echo /humans/persons/fjplc/location_confidence

---
data: 1.0
---

The messages published on the topic /humans/persons/*/location_confidence have type std_msgs/Float32.

Note

For a better understanding of what a person’s location_confidence represents, we invite you to check its definition in the REP 155.

How do I find a person’s engagement status?#

Once you know the ID of the person you are interested in, run:

$ rostopic echo /humans/persons/<person_id>/engagement_status

Here is an execution example for this command:

$ rostopic echo /humans/persons/fjplc/engagement_status

---
header:
    seq:
    stamp:
        secs: 1674742450
        nsecs:  89568377
    frame_id: ''
level: 1
---

The messages published on the topic /humans/persons/*/engagement_status have type hri_msgs/EngagementLevel. Here, level represents one of five possible engagement states: ENGAGING, ENGAGED, DISENGAGING, DISENGAGED and UNKNOWN. Each of these states is associated with an integer value, and level has type uint8.
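To avoid hard-coding the integer values, you can compare level against the constants exposed by the message class. Here is a minimal rospy sketch with a placeholder person ID; the constant names below are assumed to match those defined in hri_msgs/EngagementLevel:

#!/usr/bin/env python
# Minimal sketch: print a human-readable engagement state for one person.
import rospy
from hri_msgs.msg import EngagementLevel

# Constant names assumed to be defined in hri_msgs/EngagementLevel
LEVEL_NAMES = {
    EngagementLevel.UNKNOWN: "UNKNOWN",
    EngagementLevel.DISENGAGED: "DISENGAGED",
    EngagementLevel.ENGAGING: "ENGAGING",
    EngagementLevel.ENGAGED: "ENGAGED",
    EngagementLevel.DISENGAGING: "DISENGAGING",
}

def on_engagement(msg):
    rospy.loginfo("engagement: %s", LEVEL_NAMES.get(msg.level, "?"))

if __name__ == "__main__":
    rospy.init_node("engagement_listener")
    rospy.Subscriber("/humans/persons/fjplc/engagement_status",
                     EngagementLevel, on_engagement)
    rospy.spin()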

See also#

  • If you have any doubt regarding some of the ROS4HRI-related concepts, check the ROS4HRI introduction or the whole standard definition in the REP 155.

  • If you want to know more about how to use the information processed by the social perception pipeline directly in your Python code, check the pyhri tutorial.

  • If you want to know more about how to use the information processed by the social perception pipeline directly in your C++ code, check the libhri tutorial.