hri_emotion_recognizer

Caution

This documentation page has been auto-generated.

It may be missing some details.

hri_emotion_recognizer Quick Facts

Category

👥 Social perception


License

Apache-2.0

Source code

https://github.com/ros4hri/hri_emotion_recognizer

Recognises basic emotions from detected faces.

Overview

The hri_emotion_recognizer node performs emotion recognition using ONNX (Open Neural Network Exchange) models. It integrates with the ROS4HRI framework by analyzing the faces detected by the hri_face_detect node to identify their emotional states.

For more information about the ONNX FER+ models, see the ONNX Model Zoo repository on GitHub.

Preparation

The emotion recognizer node relies on DNN models (currently, ONNX models are supported), which are collected in the hri_emotion_models repository.

Resources

  • dnn_models.emotion_recognition: All the available emotion models are installed under this resource. It expects the model name as saved in the hri_emotion_models/model/ folder. Note that not all models may be supported at this time.

ROS API

Topics

Parameters

  • model: This parameter specifies the ONNX model used for emotion recognition. It is a string and defaults to emotion-ferplus-8.onnx. Other ONNX models can be used, but their compatibility with this node should be verified, as not all models are supported. Models should be placed in the hri_emotion_models/models directory.

Launch

ros2 launch hri_emotion_recognizer emotion_recognizer.launch.py

The emotion_recognizer.launch.py launch file accepts the parameters defined above as launch arguments and configures the node accordingly. It also automatically transitions the node to the active lifecycle state.
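For example, to select a specific model file (assuming the launch argument name matches the model parameter, and that the file is available in hri_emotion_models):

ros2 launch hri_emotion_recognizer emotion_recognizer.launch.py model:=emotion-ferplus-8.onnx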

Example

To test the package using the system default webcam:

  1. Install the usb_cam package:

     sudo apt install ros-humble-usb-cam

  2. Launch the usb_cam package:

     ros2 run usb_cam usb_cam_node_exe

  3. In a new terminal, install and launch the hri_face_detect package:

     sudo apt install ros-humble-hri-face-detect
     ros2 launch hri_face_detect face_detect.launch.py rgb_camera:=<input camera namespace>

  4. In a new terminal, launch the hri_emotion_recognizer package:

     ros2 launch hri_emotion_recognizer emotion_recognizer.launch.py

  5. Check the tracked faces and their corresponding expressions (replace wstw with one of the face IDs listed on /humans/faces/tracked):

     ros2 topic echo /humans/faces/tracked
     ros2 topic echo /humans/faces/wstw/expression

In RViz, add the Humans plugin to see the recognised expressions displayed together with the detected faces.
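RViz can be started from a terminal; the Humans display is then added through its Add dialog:

ros2 run rviz2 rviz2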

Node management

How to check the status of the node?

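A quick command-line check, assuming the node runs under its default name /hri_emotion_recognizer (adjust the name if your deployment remaps or namespaces it):

# check that the node is running and inspect its interfaces
ros2 node list | grep emotion_recognizer
ros2 node info /hri_emotion_recognizer

# the node is a lifecycle node: it should report 'active' once launched
ros2 lifecycle get /hri_emotion_recognizer

# verify that faces are currently being tracked
ros2 topic echo /humans/faces/tracked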

How to access the node’s logs?

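Logs are printed on the terminal running the launch file and are also written under ~/.ros/log/, the default ROS 2 log directory. They can be browsed graphically with rqt_console:

ros2 run rqt_console rqt_console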

How to start/stop/restart the node

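A sketch of the usual commands, again assuming the default node name /hri_emotion_recognizer:

# start: the launch file configures and activates the lifecycle node
ros2 launch hri_emotion_recognizer emotion_recognizer.launch.py

# pause / resume without killing the process
ros2 lifecycle set /hri_emotion_recognizer deactivate
ros2 lifecycle set /hri_emotion_recognizer activate

# stop: Ctrl-C in the terminal running the launch file, or shut the node down
ros2 lifecycle set /hri_emotion_recognizer shutdown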

Using in your code/application

Access via the robot’s GUI

[insert screenshots here]

Access via ROS standard tools

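The recognised expressions are published on standard ROS topics. The face ID wstw below is only an example; use one of the IDs listed on /humans/faces/tracked:

# list the currently tracked faces
ros2 topic echo /humans/faces/tracked

# echo the recognised expression of a given face
ros2 topic echo /humans/faces/wstw/expression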

[insert screenshots here]

Using in Python

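A minimal sketch using plain rclpy. It assumes the ROS4HRI topic layout shown above: the tracked face IDs are published on /humans/faces/tracked as an hri_msgs/IdsList, and each face's expression on /humans/faces/<face_id>/expression as an hri_msgs/Expression.

import rclpy
from rclpy.node import Node

# hri_msgs provides the ROS4HRI message definitions (assumed installed)
from hri_msgs.msg import Expression, IdsList


class EmotionListener(Node):
    def __init__(self):
        super().__init__("emotion_listener")
        self._expression_subs = {}
        # /humans/faces/tracked carries the list of currently tracked face IDs
        self.create_subscription(
            IdsList, "/humans/faces/tracked", self._on_tracked_faces, 10)

    def _on_tracked_faces(self, msg):
        # subscribe to the expression topic of every newly tracked face
        for face_id in msg.ids:
            if face_id not in self._expression_subs:
                topic = f"/humans/faces/{face_id}/expression"
                self._expression_subs[face_id] = self.create_subscription(
                    Expression, topic,
                    lambda m, fid=face_id: self._on_expression(fid, m), 10)

    def _on_expression(self, face_id, msg):
        # log the raw Expression message published by hri_emotion_recognizer
        self.get_logger().info(f"face {face_id}: {msg}")


def main():
    rclpy.init()
    rclpy.spin(EmotionListener())


if __name__ == "__main__":
    main()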

Using in C++

The same topics can be consumed from C++ with rclcpp and the hri_msgs message definitions, following the same pattern as the Python example above.

Reference

Subscribed topics

Published topics

Action servers

Action clients

Services