How-to: Implement a gaze-following behaviour#
This section describes how to test and implement gaze-following behaviors using the gaze manager.
Check for concurrent nodes#
At system startup, a few nodes spin up that might send gaze commands to the gaze manager. By design, the gaze manager implements no prioritization of goals: it simply generates and sends trajectories to look at the newest goal it has received. Since other nodes' commands can interfere with the commands we send, we may need to kill those nodes.
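Because the newest goal always wins, this behavior is easy to observe: publishing two targets in quick succession makes the robot settle on the second one. A minimal sketch from Python (the node name newest_goal_demo is made up for this illustration):
import rospy
from geometry_msgs.msg import Point, PointStamped

rospy.init_node("newest_goal_demo")
pub = rospy.Publisher("/look_at", PointStamped, queue_size=1)
rospy.sleep(1.0)  # give the subscriber connection time to establish

left = PointStamped(point=Point(x=1.0, y=1.0, z=1.0))
left.header.frame_id = "base_link"
right = PointStamped(point=Point(x=1.0, y=-1.0, z=1.0))
right.header.frame_id = "base_link"

pub.publish(left)
rospy.sleep(0.2)
pub.publish(right)  # the newest goal supersedes the previous one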
First, we check which nodes are potentially sending commands to the gaze manager on the /look_at topic:
rostopic info /look_at
Type: geometry_msgs/PointStamped
Publishers:
* /gm_face (http://ari<SN>c:39283/)
* /eyes_demo (http://ari<SN>c:40155/)
Subscribers:
* /gaze_manager_node (http://ari<SN>c:39395/)
Any node listed under Publishers: can potentially send goals that compete with ours. If you are not sure whether a node might interfere, it is best to kill it:
rosnode kill /gm_face /eyes_demo
Note
If the eyes are still moving after these nodes are killed, check whether anything other than the gaze manager is sending targets to the eyes directly at /robot_face/look_at or /robot_face/trajectory/goal.
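If you need to clean up publishers from a script instead of the terminal, a minimal sketch using the rosgraph and rosnode Python APIs could look like the following (the /gaze_cleanup caller ID and the empty keep-list are placeholders; add your own node's name if it also publishes on /look_at):
# Sketch: programmatically find and kill other publishers on /look_at.
# Requires a running ROS master; rosgraph and rosnode ship with ROS 1.
import rosgraph
import rosnode

KEEP = []  # node names allowed to keep publishing, e.g. our own node

master = rosgraph.Master("/gaze_cleanup")  # arbitrary caller ID
publishers, _, _ = master.getSystemState()

# publishers is a list of [topic, [node_name, ...]] pairs
to_kill = [node
           for topic, nodes in publishers if topic == "/look_at"
           for node in nodes if node not in KEEP]

if to_kill:
    killed, failed = rosnode.kill_nodes(to_kill)
    print(f"killed: {killed}, failed to kill: {failed}")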
Manually testing gaze#
When debugging, it may be helpful to test the gaze manager from the terminal to rule out problems with the components under development. To have ARI look at a point, run:
rostopic pub /look_at geometry_msgs/PointStamped "
header:
  seq: 0
  stamp:
    secs: 0
    nsecs: 0
  frame_id: 'base_link'
point:
  x: 1.0
  y: 1.0
  z: 1.0"
The gaze manager also offers a way to select a specific gaze style when looking at a target (/look_at_with_style):
rostopic pub /look_at_with_style hri_actions_msgs/LookAtWithStyle "
style: 0
target:
  header:
    seq: 0
    stamp:
      secs: 0
      nsecs: 0
    frame_id: 'base_link'
  point:
    x: 1.0
    y: -0.8
    z: 1.5"
where
* style: 0 - default style (head + eyes)
* style: 1 - eyes-only
* style: 2 - head-only
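The same message can also be sent from Python; a minimal sketch, assuming the style and target field names shown in the YAML above (the node name styled_gaze_test is made up):
import rospy
from geometry_msgs.msg import Point, PointStamped
from hri_actions_msgs.msg import LookAtWithStyle

rospy.init_node("styled_gaze_test")
pub = rospy.Publisher("/look_at_with_style", LookAtWithStyle, queue_size=1)
rospy.sleep(1.0)  # give the publisher time to connect

msg = LookAtWithStyle()
msg.style = 1  # eyes-only, per the list above
msg.target = PointStamped()
msg.target.header.frame_id = "base_link"
msg.target.point = Point(x=1.0, y=-0.8, z=1.5)
pub.publish(msg)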
Note
Both topics support points in any frame_id as long as a transform to the eye frames exists.
Implementing gaze behaviors#
We use the face_tracker_minimal.py example from the gaze manager package to show how gaze following can be implemented in Python. The code polls pyhri continuously to check whether any faces are detected. When there is at least one detection, we command ARI to look at the face's position. In case there are multiple detections, we simply choose the first face in the list as the target.
The complete face tracker is implemented as follows:
import rospy

from hri import HRIListener
from geometry_msgs.msg import Point, PointStamped

rospy.init_node("face_gaze_tracker")

# we use pyhri to get faces to look at
hri = HRIListener()

# create a publisher for look-at targets
look_at_pub = rospy.Publisher("/look_at", PointStamped, queue_size=1)

r = rospy.Rate(10)
while not rospy.is_shutdown():
    if len(hri.faces) > 0:  # are there faces detected?
        face = list(hri.faces.values())[0]  # look at first face in the list
        print(f"looking at {face.id}")

        # extract the point stamped to look at from the face's transform
        T = face.transform()
        trans = T.transform.translation
        target = PointStamped(point=Point(x=trans.x, y=trans.y, z=trans.z), header=T.header)
        look_at_pub.publish(target)
    else:
        print("no faces detected")
    r.sleep()  # throttle the loop to the 10 Hz rate declared above
Next, we will explain the most important parts of the script in more detail.
hri = HRIListener()
look_at_pub = rospy.Publisher("/look_at", PointStamped, queue_size=1)
HRIListener serves as an interface to the ROS4HRI ecosystem. In this example we solely use the face detections; however, the code can be easily adapted to target other percepts, as sketched below. look_at_pub is used to send targets to the gaze manager.
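For instance, the loop body could be adapted to track body detections instead; a sketch, assuming pyhri exposes them via hri.bodies with the same dict-of-trackables interface as hri.faces (check your pyhri version):
# Sketch: same pattern, different percept. Assumes hri.bodies behaves
# like hri.faces (a dict of trackables with a transform() method).
if len(hri.bodies) > 0:
    body = list(hri.bodies.values())[0]
    T = body.transform()
    trans = T.transform.translation
    target = PointStamped(point=Point(x=trans.x, y=trans.y, z=trans.z), header=T.header)
    look_at_pub.publish(target)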
if len(hri.faces) > 0:
    face = list(hri.faces.values())[0]

    T = face.transform()
    trans = T.transform.translation
    target = PointStamped(point=Point(x=trans.x, y=trans.y, z=trans.z), header=T.header)
In the first line we check whether any faces are detected. If so, we simply extract the first face in the list of detections. One could also implement a more complex behavior here that prefers the face ID we were looking at before and only falls back to the first face when that face is lost (see the sketch below). The last three lines extract the face's transform and create a PointStamped from its translation, which is ready to be sent as a gaze target.
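A sketch of such a "sticky" selection, assuming hri.faces is keyed by face ID as in pyhri (pick_face and current_id are hypothetical names introduced here, not part of the example script):
# Sketch: prefer the face we were already looking at; fall back to the
# first detection only when the previous target disappears.
def pick_face(faces, current_id):
    if current_id is not None and current_id in faces:
        return faces[current_id]      # previous target still visible
    return list(faces.values())[0]    # fallback: first detection

# inside the tracking loop, replacing the plain "first face" selection:
#   face = pick_face(hri.faces, current_id)
#   current_id = face.id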