The Robot Tracking Benchmark (RTB) is a synthetic dataset that facilitates the quantitative evaluation of 3D tracking algorithms for multi-body objects. It was created using the procedural rendering pipeline BlenderProc. The dataset contains photo-realistic sequences with HDRi lighting and physically-based materials. Perfect ground-truth annotations for camera and robot trajectories are provided in the BOP format. Many physical effects, such as motion blur, rolling shutter, and camera shaking, are accurately modeled to reflect real-world conditions.

For each frame, four depth qualities exist to simulate sensors with different characteristics. The first quality provides perfect ground truth, while the second considers measurements with the distance-dependent noise characteristics of the Azure Kinect time-of-flight sensor. For the third and fourth qualities, two stereo RGB images were rendered, with and without a pattern from a simulated dot projector; depth images were then reconstructed using Semi-Global Matching (SGM).

The benchmark features six robotic systems with different kinematics, ranging from simple open-chain and tree topologies to structures with complex closed kinematics. For each robotic system, three difficulty levels are provided: easy, medium, and hard. In all sequences, the kinematic system is in motion. While for easy sequences the camera is mostly static with respect to the robot, medium and hard sequences feature faster and shakier motions for both the robot and the camera. Consequently, motion blur increases, which also reduces the quality of stereo matching.

Finally, for each object, difficulty level, and depth image quality, 10 sequences with 150 frames each are rendered. In total, this results in 108,000 frames that feature different kinematic structures, motion patterns, depth measurements, scenes, and lighting conditions.
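In the BOP convention, per-scene ground truth is stored in `scene_gt.json` files that map each frame index to a list of object annotations, each holding a row-major 3x3 rotation (`cam_R_m2c`) and a translation in millimetres (`cam_t_m2c`). A minimal parsing sketch follows; the inline JSON snippet is illustrative, not taken from the dataset:

```python
import json

# Illustrative scene_gt.json snippet in the BOP convention (not from RTB):
# each frame id maps to a list of annotations with a row-major 3x3
# rotation cam_R_m2c and a translation cam_t_m2c in millimetres.
SCENE_GT = """
{
  "0": [
    {"obj_id": 1,
     "cam_R_m2c": [1, 0, 0, 0, 1, 0, 0, 0, 1],
     "cam_t_m2c": [0.0, 0.0, 1000.0]}
  ]
}
"""

def load_scene_gt(text):
    """Parse BOP-style ground truth into {frame: [(obj_id, R, t), ...]}."""
    raw = json.loads(text)
    scene = {}
    for frame, anns in raw.items():
        poses = []
        for ann in anns:
            r = ann["cam_R_m2c"]
            R = [r[0:3], r[3:6], r[6:9]]  # reshape flat list to 3x3 rows
            t = ann["cam_t_m2c"]          # translation in millimetres
            poses.append((ann["obj_id"], R, t))
        scene[int(frame)] = poses
    return scene

gt = load_scene_gt(SCENE_GT)
obj_id, R, t = gt[0][0]
```

The same per-frame structure holds for every sequence, so a tracker can iterate frames and compare its estimated poses against these annotations.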
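Distance-dependent noise of time-of-flight sensors such as the Azure Kinect is commonly modeled with a standard deviation that grows with the squared distance. The sketch below applies such a model to a perfect depth value; the coefficients are placeholder assumptions, not the benchmark's calibrated values:

```python
import random

def axial_noise_std(depth_m, a=0.0012, b=0.0019):
    """Illustrative ToF noise model: sigma grows quadratically with distance.
    Coefficients a, b are placeholders, not the benchmark's calibration."""
    return a + b * depth_m ** 2

def add_depth_noise(depth_m, rng):
    """Perturb a perfect depth value with distance-dependent Gaussian noise."""
    return depth_m + rng.gauss(0.0, axial_noise_std(depth_m))

rng = random.Random(0)           # fixed seed for reproducibility
noisy = add_depth_noise(2.0, rng)
```

Applied per pixel, this turns the first (perfect) depth quality into something resembling the second, with errors that grow toward the far end of the working range.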
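For the two stereo qualities, depth follows from the SGM disparity map via the standard pinhole relation depth = f * B / d. A minimal sketch with an assumed focal length and baseline (illustrative values, not the dataset's camera parameters):

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Standard stereo relation: depth = f * B / d.
    Non-positive disparities mark failed matches (treated as invalid)."""
    if disparity_px <= 0:
        return float("inf")  # no match / invalid disparity
    return focal_px * baseline_m / disparity_px

# Assumed camera: 600 px focal length, 5 cm baseline (illustrative only).
d = depth_from_disparity(30.0, focal_px=600.0, baseline_m=0.05)  # 1.0 m
```

This relation also explains why motion blur hurts the stereo qualities: blurred images yield noisier or missing disparities, which propagate directly into the reconstructed depth.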
In summary, the Robot Tracking Benchmark makes it possible to extensively measure, compare, and ablate the performance of multi-body tracking algorithms, which is essential for further progress in the field.
3D Object Tracking, Pose Estimation, Robotics, Multi-body, Articulated Objects
| Indicator | Description | Value |
| --- | --- | --- |
| Selected citations | Citations derived from selected sources; an alternative to the "Influence" indicator. | 0 |
| Popularity | Reflects the "current" impact/attention (the "hype") of an article in the research community at large, based on the underlying citation network. | Average |
| Influence | Reflects the overall/total impact of an article in the research community at large, based on the underlying citation network (diachronically). | Average |
| Impulse | Reflects the initial momentum of an article directly after its publication, based on the underlying citation network. | Average |
| Views | | 169 |
| Downloads | | 153 |

Views and downloads provided by UsageCounts.