Robust Real Time Color Tracking
-
Authors: Mark Simon, Sven Behnke, and Raul Rojas
-
In Proceedings of The Fourth International Workshop on RoboCup, Melbourne,
Australia, pp. 62-71, 2000.
Revised version in P. Stone, T. Balch, and G. Kraetzschmar (editors):
RoboCup-2000: Robot Soccer World Cup IV, LNAI 2019, pp. 239-248, Springer,
2001.
-
Abstract:
This paper describes the vision system that was developed for the RoboCup
F180 team FU-Fighters.
The system analyzes the video stream captured from a camera mounted
above the field. It localizes the robots and the ball by predicting their
positions in the next video frame and processing only small windows around
the predicted positions. Several mechanisms were implemented to make this
tracking robust. First, the size of the search windows is adjusted dynamically.
Next, the quality of the detected objects is evaluated, and further
analysis is carried out until it is satisfactory. The system not only tracks
the position of the objects, but also adapts their colors and sizes. If
tracking fails, e.g. due to occlusions, we start a global search module
that localizes the lost objects again. The pixel coordinates of the objects
found are mapped to a Cartesian coordinate system using a non-linear transformation
that takes into account the distortions of the camera. To make tracking
more robust against inhomogeneous lighting, we modeled the appearance of
colors as a function of location using color grids. Finally, we added
a module for automatic identification of our robots.
The system analyzes 30 frames per second on a standard PC, causing
only a light computational load in almost all situations.
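-
The local tracking strategy sketched in the abstract (search a small window around the predicted position, enlarge it when the match quality is poor, and fall back to a global search when tracking is lost) can be illustrated as follows. This is a minimal sketch, not the authors' implementation: the color-distance matching, the quality measure, and all window sizes and thresholds here are illustrative assumptions.

```python
import numpy as np

def find_blob(frame, window, color, tol=30.0):
    """Return (position, quality) of the best color match inside `window`.

    `window` is (x0, y0, x1, y1). `quality` is the fraction of window
    pixels within Euclidean distance `tol` of the target color -- a crude
    stand-in for the paper's quality evaluation.
    """
    x0, y0, x1, y1 = window
    patch = frame[y0:y1, x0:x1].astype(float)
    dist = np.linalg.norm(patch - color, axis=-1)
    mask = dist < tol
    if not mask.any():
        return None, 0.0
    ys, xs = np.nonzero(mask)
    # Centroid of the matching pixels, in frame coordinates.
    pos = (x0 + xs.mean(), y0 + ys.mean())
    return pos, mask.mean()

def track(frame, predicted, color, base=8, max_half=64, min_quality=0.05):
    """Search growing windows around `predicted` until the match is good enough."""
    h, w = frame.shape[:2]
    half = base
    while half <= max_half:
        x, y = predicted
        window = (max(0, int(x - half)), max(0, int(y - half)),
                  min(w, int(x + half)), min(h, int(y + half)))
        pos, quality = find_blob(frame, window, color)
        if pos is not None and quality >= min_quality:
            return pos
        half *= 2   # match unsatisfactory: enlarge the window and analyze again
    return None     # object lost: caller would trigger the global search module
```

On a synthetic frame with a single red blob, `track` recovers the blob centroid even when the prediction is a few pixels off; returning `None` models the hand-off to global search after an occlusion.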
-
Full paper: RoboCup00.pdf