Appearance-free Tripartite Matching for Multiple Object Tracking

9 Aug 2020  ·  Lijun Wang, Yanting Zhu, Jue Shi, Xiaodan Fan

Multiple Object Tracking (MOT) recovers the trajectories of multiple objects from an input video. It has become increasingly important in research and industry, for example in cell tracking for biomedical research and human tracking in video surveillance. Most existing algorithms depend on the uniqueness of each object's appearance, and the dominant bipartite matching scheme ignores velocity smoothness. Although several methods do incorporate velocity smoothness, they either fail to enforce globally smooth velocities or are easily trapped in local optima. We focus on the general MOT problem regardless of appearance and propose an appearance-free tripartite matching that avoids the irregular-velocity problem of bipartite matching. The tripartite matching is formulated as maximizing the likelihood of state vectors composed of each object's position and velocity, which yields a chain-dependent structure. We use dynamic programming to find this maximum likelihood estimate. To overcome the high computational cost induced by the vast search space of dynamic programming when many objects are tracked, we decompose the space by the number of disappearing objects and propose a reduced-space approach by truncating the decomposition. Extensive simulations show the superiority and efficiency of the proposed method, and comparisons with top methods on the Cell Tracking Challenge demonstrate its competitiveness. We also apply the method to track the motion of natural killer cells around tumor cells in a cancer study.\footnote{The source code is available at \url{https://github.com/szcf-weiya/TriMatchMOT}.}
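To illustrate why matching over three consecutive frames can capture velocity smoothness while frame-to-frame bipartite matching cannot, here is a minimal sketch in Python. It is an assumption-laden toy, not the paper's dynamic-programming algorithm or its reduced-space decomposition: the function names, the squared velocity-change cost, the brute-force search over permutations, and the assumption of a fixed number of objects with no appearances or disappearances are all illustrative choices.

```python
import itertools
import numpy as np

def velocity_change_cost(p_prev, p_curr, p_next):
    """Penalize the change in velocity implied by three consecutive positions."""
    v1 = p_curr - p_prev   # velocity from frame t-1 to t
    v2 = p_next - p_curr   # velocity from frame t to t+1
    return np.sum((v2 - v1) ** 2)

def tripartite_match(frame1, frame2, frame3):
    """Brute-force tripartite matching over three frames of detections.

    frame1, frame2, frame3: (n, d) arrays of detected positions, assuming the
    same n objects appear in every frame (no births/deaths, toy setting only).
    Returns the permutations (pi2, pi3) minimizing the total velocity-change cost,
    where object i is matched to frame2[pi2[i]] and then to frame3[pi3[i]].
    """
    n = len(frame1)
    best_cost, best_assign = np.inf, None
    for pi2 in itertools.permutations(range(n)):
        for pi3 in itertools.permutations(range(n)):
            cost = sum(
                velocity_change_cost(frame1[i], frame2[pi2[i]], frame3[pi3[i]])
                for i in range(n)
            )
            if cost < best_cost:
                best_cost, best_assign = cost, (pi2, pi3)
    return best_assign, best_cost
```

A bipartite matcher scoring only frame-to-frame displacement cannot see the velocity term above. Note that this enumeration costs (n!)^2 per frame triple and is only feasible for a handful of objects; scaling to many objects is exactly what the paper's dynamic programming and truncated decomposition over disappearing objects address.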
