
Commit f569b58

Add hand-eye calibration tutorial (#205)
Adds a comprehensive tutorial for hand-eye calibration using ROS2 and ArUco markers. Includes scene setup, data capture, and computation details.

Co-authored-by: Nevin Valsaraj <nevin.valsaraj32@gmail.com>
1 parent 3f05538 commit f569b58

3 files changed

Lines changed: 57 additions & 1 deletion

File tree

_data/navigation.yml

Lines changed: 2 additions & 0 deletions
```diff
@@ -125,6 +125,8 @@ wiki:
     url: /wiki/sensing/stag/
   - title: Camera Calibration
     url: /wiki/sensing/camera-calibration/
+  - title: Hand-Eye Calibration
+    url: /wiki/sensing/handeye-calibration/
   - title: Computer Vision Consideration
     url: /wiki/sensing/computer-vision-considerations/
   - title: Delphi ESR Radar
```
Lines changed: 51 additions & 0 deletions
---
date: 2025-04-26
title: Hand-Eye Calibration
---

**This is a tutorial for estimating the frame transformation between an image frame and an operating frame by using a third reference frame. One application is estimating the transformation from pixel coordinates to end-effector coordinates, using an ArUco marker pose as the reference. ROS was chosen as the framework for this process because it facilitates synchronized, parallel communication. While most existing packages use ROS1, this tutorial uses ROS2. The entire workflow, from scene setup and data capture to computation and integration, is covered in this tutorial.**
## Hand-Eye Calibration

### Different Frames

1. Image Frame (pixel space)
2. Target Frame (operation space, e.g., the base frame of the manipulator or the end-effector frame)
3. World Frame (global frame, usually set as the operating frame)
### The Algorithm

This package uses the method introduced by Tsai and Lenz in 1989. It is a data-driven method, and it was observed that around thirty images are required for it to work reliably.

- [Documentation](https://docs.opencv.org/3.4/d9/d0c/group__calib3d.html#gad10a5ef12ee3499a0774c7904a801b99)
- [Original Research Paper](https://ieeexplore.ieee.org/document/34770)
- [GitHub Package](https://github.com/SNAAK-CMU/handeye_calibration_ros2)
### This Setup

For this tutorial, ROS2 is used as the environment because its transformation (TF) tree makes it easy to define frames and transformations.

1. Image Frame: RealSense camera frame (ROS TF frame: `camera_color_optical_frame`)
2. Target Frame: base frame of the manipulator (ROS TF frames: `base_link: "panda_link_0"`; `ee_link: "panda_hand"`)
3. World Frame: ArUco marker pose (from ArUco marker detection)
### This Process

The GitHub package has detailed instructions on installation and setup. The parameters in the file `handeye_realsense/config.yaml` need to be rewritten with the ROS2 topic and frame names of your system.
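The exact keys in `config.yaml` depend on the package version, so treat the fragment below as a hypothetical illustration of the kinds of values to substitute (the topic and frame names match the setup described above); consult the package's own `config.yaml` for the real schema.

```yaml
# Hypothetical config values -- key names are illustrative only
image_topic: /camera/color/image_raw
camera_info_topic: /camera/color/camera_info
base_frame: panda_link_0
ee_frame: panda_hand
camera_frame: camera_color_optical_frame
marker_size: 0.10   # ArUco side length in meters
num_samples: 30     # ~30 poses for a reliable Tsai-Lenz solve
```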
## Summary

1. Keep in mind that the manipulator's poses must be as different as possible when sampling data in order to get a generalized result. If possible, put your manipulator in guide mode and move it to the poses yourself, as this repository does not include a random pose generator.
2. Not moving the ArUco marker's position yields better results.
3. Configuring a random pose generator would require defining your workspace in a planning framework such as MoveIt! and generating random, collision-free poses in which the ArUco marker is in the camera's field of view.
4. Ensure that the `child frame` specified in the config is the frame in which images are published. If not, set the child frame to the camera frame and chain an intrinsic transformation to the image frame together with the extrinsic transform to the target frame obtained from this process. This is described in detail in the README of the repository.
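The transform chaining mentioned in tip 4 can be sketched with plain NumPy: compose the extrinsic base-to-camera transform with the camera intrinsics to map a 3D point in the robot base frame to pixel coordinates. All matrices below are placeholder values for illustration, not outputs of a real calibration.

```python
import numpy as np

# Placeholder pinhole intrinsics (fx, fy, cx, cy) for a 640x480 image
K = np.array([[600.0,   0.0, 320.0],
              [  0.0, 600.0, 240.0],
              [  0.0,   0.0,   1.0]])

# Placeholder extrinsic: camera pose in the base frame (identity rotation,
# camera 0.4 m forward and 0.5 m up from the base origin)
T_base_cam = np.eye(4)
T_base_cam[:3, 3] = [0.4, 0.0, 0.5]

def base_point_to_pixel(p_base):
    """Map a 3D point in the base frame to (u, v) pixel coordinates."""
    T_cam_base = np.linalg.inv(T_base_cam)   # invert to go base -> camera
    p_cam = T_cam_base @ np.append(p_base, 1.0)
    uvw = K @ p_cam[:3]                      # pinhole projection
    return uvw[:2] / uvw[2]

uv = base_point_to_pixel(np.array([0.4, 0.1, 1.0]))
# With these placeholder matrices the point lands at (320.0, 360.0)
```

The same composition, run in the opposite direction with a known depth, converts pixel detections into end-effector targets, which is the application described in the introduction.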
## See Also

- [Camera Calibration](/wiki/sensing/camera-calibration/)

## Further Reading

- [Original GitHub Repository](https://github.com/shengyangzhuang/handeye_calibration_ros2)

## References

1. https://github.com/shengyangzhuang/handeye_calibration_ros2
2. https://docs.opencv.org/3.4/d9/d0c/group__calib3d.html#gad10a5ef12ee3499a0774c7904a801b99
3. https://docs.opencv.org/3.4/d0/de3/citelist.html#CITEREF_Tsai89
4. R. Y. Tsai and R. K. Lenz, "A new technique for fully autonomous and efficient 3D robotics hand/eye calibration," IEEE Transactions on Robotics and Automation, vol. 5, no. 3, pp. 345-358, June 1989, doi: 10.1109/70.34770.

wiki/sensing/index.md

Lines changed: 4 additions & 1 deletion
```diff
@@ -35,9 +35,12 @@ This section dives into various sensing modalities such as GPS modules, fiducial
 - **[Delphi ESR Radar](/wiki/sensing/delphi-esr-radar/):**
   Provides an overview of Delphi's ESR radar for detecting objects and estimating their range, speed, and position.

-- **[Comparison of Fiducial Markers](/wiki/sensing/fiducial-markers/):**
+- **[Fiducial Markers](/wiki/sensing/fiducial-markers/):**
   Compares various fiducial marker systems like ArUco, AprilTags, and STag, listing their pros, cons, and ideal use cases.

+- **[Hand-Eye Calibration](/wiki/sensing/handeye-calibration/):**
+  Provides a tutorial for estimating the frame transformation between an image frame and an operating frame using ROS2 and ArUco markers.
+
 - **[Using an RTK GPS](/wiki/sensing/gps/):**
   Explains how to achieve centimeter-level accuracy using RTK GPS systems, along with practical lessons and setup guidance.
```
