
A Benchmark for Out-of-Distribution Tracking

Obstacle Sequence Challenge

In open-world scenarios such as automated driving, the presence of objects not seen during training can lead to inaccurate predictions and safety-critical situations, in particular when such unknown objects appear on the road. Since in many applications images do not come as single frames but are embedded in video sequences, tracking these objects is the logical next step beyond detection. Our benchmark addresses two tasks: the detection and segmentation of previously unseen objects, and the tracking of these unknown objects over subsequent frames. We provide two corresponding datasets together with a test suite for an in-depth method analysis.

Subsequent frames of segmented video data

Datasets

Street Obstacle Sequences (SOS)

  • The scenes are shown from the perspective of a vehicle approaching objects placed on the street, starting at a distance of 20 meters from the obstacle.
  • Each of the 13 different objects corresponds to a class that is semantically out-of-distribution according to the Cityscapes labeling policy (e.g., bags, umbrellas, balls, toys, or scooters).
  • 20 real-world video sequences recorded at a rate of 25 frames per second.
  • Every eighth frame is labeled, yielding a total of 1,129 pixel-accurately labeled frames (a small frame-indexing sketch is shown below).
Example image from the SOS dataset
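
The fixed labeling stride can be handled with a few lines of code. The following is a minimal sketch, not official dataset code; it assumes that frame indexing within a sequence starts at 0 and that the first frame is labeled, which may differ from the actual naming scheme of the released data.

```python
# Minimal sketch (not official dataset code): enumerate labeled frames and their
# timestamps, assuming 25 fps recordings, a labeling stride of 8, and that frame
# indexing starts at 0 with the first frame labeled.
FPS = 25          # recording rate of the sequences
LABEL_STRIDE = 8  # every eighth frame is pixel-accurately labeled

def labeled_frames(num_frames: int) -> list[tuple[int, float]]:
    """Return (frame index, timestamp in seconds) for all labeled frames."""
    return [(i, i / FPS) for i in range(0, num_frames, LABEL_STRIDE)]

# Example: a 400-frame sequence yields 50 labeled frames, one every 0.32 s.
print(labeled_frames(400)[:3])  # [(0, 0.0), (8, 0.32), (16, 0.64)]
```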

Wuppertal Obstacle Sequences (WOS)

  • While the SOS dataset contains static OOD objects located on the street, WOS additionally provides moving OOD objects.
  • The moving objects are mostly dogs, rolling or bouncing balls, skateboards, or bags, and were captured with either a static or a moving camera from the viewpoint of a vehicle.
  • 44 real-world video sequences.
  • Every eighth frame is labeled, yielding a total of 938 pixel-accurately labeled frames.
Example image from the WOS dataset

Labeling Policy

The pixel-level annotations of both datasets include three classes:

  • street obstacle / OOD
  • street / not OOD
  • void

Image regions outside the drivable area are labeled as void and are ignored during the OOD segmentation evaluation. Furthermore, the OOD objects are annotated with tracking IDs that are consistent across consecutive frames (a small masking sketch is given below the illustration).

Illustration of the labeling policy
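
To make the policy concrete, here is a minimal sketch of how the evaluation masks can be derived from a label image. The numeric label IDs used below are hypothetical placeholders; the actual IDs are defined in the datasets' annotation files.

```python
# Minimal sketch with hypothetical label IDs (check the dataset documentation):
# void pixels lie outside the drivable area and are ignored during evaluation;
# the remaining pixels form a binary OOD-vs-street ground truth.
import numpy as np

STREET_ID, OOD_ID, VOID_ID = 0, 1, 255  # placeholder IDs, not the official ones

def evaluation_masks(label: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Return (valid, gt_ood): drivable-area mask and binary OOD ground truth."""
    valid = label != VOID_ID   # drop everything outside the drivable area
    gt_ood = label == OOD_ID   # street obstacle / OOD pixels
    return valid, gt_ood

# usage: restrict predictions to the drivable area, e.g. scores[valid], gt_ood[valid]
```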

Leaderboard

Evaluation Metrics

  • OOD Segmentation
    • \(\text{AUPRC}\) : pixel-wise area under the precision-recall curve
    • \(\text{FPR}_{95}\) : pixel-wise false positive rate at a true positive rate of 95%
    • \(\overline{\text{F}}_1\) : component-wise \(\text{F}_1\)-score averaged over different detection thresholds
  • OOD Tracking
    • \(\text{MOTA}\) : multiple object tracking accuracy
    • \(\text{MOTP}\) : multiple object tracking precision

For a more detailed explanation of the metrics, we refer to our paper; a brief sketch of the underlying definitions is given below.
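
The tracking metrics follow the naming of the CLEAR MOT framework; assuming the standard definitions (the exact matching criterion is described in the paper), they read

\[
\text{MOTA} = 1 - \frac{\sum_t \left(\text{FN}_t + \text{FP}_t + \text{IDSW}_t\right)}{\sum_t \text{GT}_t},
\qquad
\text{MOTP} = \frac{\sum_{i,t} d_{i,t}}{\sum_t c_t},
\]

where \(\text{FN}_t\), \(\text{FP}_t\), and \(\text{IDSW}_t\) denote the false negatives, false positives, and identity switches in frame \(t\), \(\text{GT}_t\) is the number of ground-truth objects in frame \(t\), \(d_{i,t}\) is the matching error of match \(i\) in frame \(t\), and \(c_t\) is the number of matches in frame \(t\).

For the pixel-wise segmentation metrics, a minimal sketch (not the official evaluation code) using scikit-learn could look as follows; it assumes higher heatmap scores indicate "more OOD" and that void pixels have already been removed:

```python
# Minimal sketch (not the official evaluation code): pixel-wise AUPRC and
# FPR at 95% TPR from an OOD heatmap and the binary ground truth.
import numpy as np
from sklearn.metrics import auc, precision_recall_curve, roc_curve

def pixel_metrics(scores: np.ndarray, gt_ood: np.ndarray) -> tuple[float, float]:
    """scores: OOD heatmap values, gt_ood: boolean ground truth,
    both flattened and restricted to the drivable area."""
    y_true = gt_ood.reshape(-1).astype(int)
    y_score = scores.reshape(-1)

    # AUPRC with OOD pixels as the positive class.
    precision, recall, _ = precision_recall_curve(y_true, y_score)
    auprc = float(auc(recall, precision))

    # False positive rate at the threshold where the TPR reaches 95%.
    fpr, tpr, _ = roc_curve(y_true, y_score)
    fpr95 = float(np.interp(0.95, tpr, fpr))

    return auprc, fpr95
```

The component-wise \(\overline{\text{F}}_1\) additionally requires matching predicted and ground-truth components over a range of detection thresholds and is therefore best computed with the provided evaluation code.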

Street Obstacle Sequences (SOS)

\(\text{MOTA}\) and \(\text{MOTP}\) evaluate OOD tracking; \(\text{AUPRC}\), \(\text{FPR}_{95}\), and \(\overline{\text{F}}_1\) evaluate OOD segmentation.

| Method | \(\text{MOTA}\ \uparrow\) | \(\text{MOTP}\ \downarrow\) | \(\text{AUPRC}\ \uparrow\) | \(\text{FPR}_{95}\ \downarrow\) | \(\overline{\text{F}}_1\ \uparrow\) |
| --- | --- | --- | --- | --- | --- |
| baseline | 0.32 | 12.45 | 85.20 | 1.30 | 50.40 |
| RbA | 0.36 | 5.93 | 89.47 | 0.33 | 53.58 |
| JSR-Net | -4.59 | 97.67 | 87.01 | 1.59 | 9.98 |
| DaCUP | -1.34 | 89.79 | 93.57 | 0.17 | 20.25 |
| PixOOD | 0.41 | 8.24 | 94.76 | 0.18 | 47.15 |
| UNO-SAM2 | 0.44 | 2.15 | 90.21 | 0.18 | 15.45 |
| GroundedSAM | 0.17 | 141.96 | 93.10 | 0.08 | 58.64 |

Wuppertal Obstacle Sequences (WOS)

\(\text{MOTA}\) and \(\text{MOTP}\) evaluate OOD tracking; \(\text{AUPRC}\), \(\text{FPR}_{95}\), and \(\overline{\text{F}}_1\) evaluate OOD segmentation.

| Method | \(\text{MOTA}\ \uparrow\) | \(\text{MOTP}\ \downarrow\) | \(\text{AUPRC}\ \uparrow\) | \(\text{FPR}_{95}\ \downarrow\) | \(\overline{\text{F}}_1\ \uparrow\) |
| --- | --- | --- | --- | --- | --- |
| baseline | 0.13 | 51.17 | 94.92 | 0.59 | 30.13 |
| RbA | 0.23 | 16.88 | 93.76 | 0.81 | 48.52 |
| JSR-Net | -19.30 | 301.52 | 59.66 | 19.49 | 2.11 |
| DaCUP | -2.26 | 280.74 | 83.13 | 3.77 | 12.93 |
| PixOOD | 0.00 | 14.05 | 97.57 | 0.21 | 43.36 |
| UNO-SAM2 | 0.37 | 3.98 | 88.10 | 0.22 | 26.57 |

Submission Guidelines

Evaluation of Your Method

In order to evaluate your method, we provide:

  • the labels of both datasets, SOS and WOS
  • code available on GitHub that computes the evaluation metrics given the OOD heatmaps and tracking IDs obtained by your method (a sketch of producing such per-frame outputs is shown below)
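
As an illustration, the sketch below stores one OOD heatmap and one tracking-ID map per frame. The file layout and naming are hypothetical; the format actually expected by the evaluation code is specified in the GitHub repository.

```python
# Minimal sketch (hypothetical file layout): save one OOD heatmap and one
# tracking-ID map per frame so that they can be fed to the evaluation code.
import numpy as np
from pathlib import Path

def save_frame_outputs(out_dir: str, sequence: str, frame_idx: int,
                       ood_heatmap: np.ndarray, track_ids: np.ndarray) -> None:
    """ood_heatmap: float (H, W), higher = more likely OOD.
    track_ids: int (H, W), 0 for background, IDs consistent across frames."""
    frame_dir = Path(out_dir) / sequence
    frame_dir.mkdir(parents=True, exist_ok=True)
    np.save(frame_dir / f"{frame_idx:06d}_heatmap.npy", ood_heatmap.astype(np.float32))
    np.save(frame_dir / f"{frame_idx:06d}_track_ids.npy", track_ids.astype(np.int32))
```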

Participate in the Challenge

Please send your results, including a link to your GitHub repository (and, if available, the corresponding paper), to the following e-mail address: rrow2024@gmail.com.

Important Dates & Deadlines

While preparing your challenge submission, please keep the following dates in mind.
All deadlines are Anywhere on Earth (UTC-12).

| Milestone | Date |
| --- | --- |
| Challenge submission deadline | 14 September 2024 |
| Challenge winner notification | 16 September 2024 |

This challenge is maintained by the Technical University of Berlin and will remain open after the submission deadline has expired.

If you want to participate in our Call for Papers, different deadlines apply, see here.