MissFormer: (In-)Attention-Based Handling of Missing Observations for Trajectory Filtering and Prediction

Document Type

Conference Proceeding

Publication Date

1-1-2021

Publication Title

Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

Publisher

Springer

Publisher Location

Midtown Manhattan, New York City

Volume

13017 LNCS

First page number:

521

Last page number:

533

Abstract

In applications such as object tracking, time-series data inevitably carry missing observations. Following the success of deep learning-based models for various sequence learning tasks, these models increasingly replace classic approaches in object tracking applications for inferring the objects’ motion states. While traditional tracking approaches can deal with missing observations, most of their deep counterparts are, by default, not suited for this. To this end, this paper introduces a transformer-based approach for handling missing observations in variable-length trajectory data. The model is built up indirectly by successively increasing the complexity of the demanded inference tasks. Starting from reproducing noise-free trajectories, the model then learns to infer trajectories from noisy inputs. By providing missing tokens, i.e., binary-encoded missing events, the model learns to in-attend to missing data and infers a complete trajectory conditioned on the remaining inputs. In the case of successive missing events, the model acts as a pure prediction model. The abilities of the approach are demonstrated on synthetic data and real-world data reflecting prototypical object tracking scenarios.
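
The following is a minimal, illustrative sketch (not the authors' implementation) of the core mechanism the abstract describes: each trajectory step is embedded together with a binary missing indicator, and an attention mask keeps the encoder from attending to missing observations while still producing an estimate for every time step. The class name, layer sizes, and the use of PyTorch's TransformerEncoder are assumptions made for illustration.

```python
# Hedged sketch of "in-attention" to missing observations in a transformer encoder.
# All names and hyperparameters are illustrative assumptions, not from the paper.
import torch
import torch.nn as nn


class MissingAwareEncoder(nn.Module):
    def __init__(self, obs_dim=2, d_model=64, nhead=4, num_layers=3):
        super().__init__()
        # Input = observed coordinates plus a binary "missing" flag per time step.
        self.input_proj = nn.Linear(obs_dim + 1, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        self.head = nn.Linear(d_model, obs_dim)  # per-step trajectory estimate

    def forward(self, obs, missing_mask):
        # obs: (batch, seq_len, obs_dim), noisy observations (zeros where missing)
        # missing_mask: (batch, seq_len) bool, True where the observation is missing
        x = torch.cat([obs, missing_mask.float().unsqueeze(-1)], dim=-1)
        x = self.input_proj(x)
        # Positions flagged in src_key_padding_mask are ignored as attention keys,
        # so the output at every step is conditioned only on the remaining inputs.
        h = self.encoder(x, src_key_padding_mask=missing_mask)
        return self.head(h)


if __name__ == "__main__":
    model = MissingAwareEncoder()
    obs = torch.randn(8, 20, 2)          # batch of 20-step 2D trajectories
    missing = torch.rand(8, 20) < 0.3    # roughly 30% of steps marked missing
    obs = obs.masked_fill(missing.unsqueeze(-1), 0.0)
    out = model(obs, missing)            # complete trajectory estimate per step
    print(out.shape)                     # torch.Size([8, 20, 2])
```

If an entire suffix of the sequence is marked missing, the same mechanism degrades gracefully into pure prediction, since the outputs for those steps are conditioned only on the earlier, observed steps.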

Keywords

Filtering; Missing Input Data; Missing observations; Trajectory data; Trajectory prediction; Transformer

Disciplines

Numerical Analysis and Scientific Computing | Programming Languages and Compilers
