Learning mixture of neural temporal point processes for event sequence clustering

29 Sep 2021  ·  Yunhao Zhang, Junchi Yan, Zhenyu Ren, Jian Yin

Event sequence clustering applies to many scenarios, e.g., e-commerce and electronic health records. Traditional clustering models fail to characterize complex real-world processes due to their strong parametric assumptions, while Neural Temporal Point Processes (NTPPs) mainly focus on modeling similar sequences rather than clustering them. To fill this gap, we propose the Mixture of Neural Temporal Point Processes (NTPP-MIX), a general framework in which many existing NTPPs can be used for event sequence clustering. In NTPP-MIX, the prior distribution of the coefficients for cluster assignment is modeled by a Dirichlet distribution, and given an assignment, the conditional probability of a sequence is modeled by a mixture of NTPPs. We combine a variational EM algorithm with Stochastic Gradient Descent (SGD) to train the framework efficiently. To further improve its capability, we also propose a fully data-driven NTPP based on the attention mechanism, named the Fully Attentive Temporal Point Process (FATPP). Experiments on both synthetic and real-world datasets show the effectiveness of NTPP-MIX against state-of-the-art methods, especially when FATPP is used as the basic NTPP module.
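The mixture formulation above can be illustrated with a minimal sketch. The assumptions here are loud: homogeneous Poisson processes stand in for the neural TPP components (the paper's actual components are NTPPs such as FATPP), the mixing weights play the role of the cluster-assignment coefficients, and `e_step` computes only the E-step responsibilities of a standard EM loop, not the paper's variational EM with SGD. All function names are hypothetical, not from the paper.

```python
import math

def poisson_loglik(event_times, rate, horizon):
    # Log-likelihood of a homogeneous Poisson process with the given
    # rate observed on [0, horizon]: n * log(rate) - rate * horizon.
    n = len(event_times)
    return n * math.log(rate) - rate * horizon

def e_step(sequences, rates, weights, horizon):
    # For each sequence, compute the posterior probability (responsibility)
    # of each mixture component, using the log-sum-exp trick for stability.
    responsibilities = []
    for seq in sequences:
        logp = [math.log(w) + poisson_loglik(seq, r, horizon)
                for w, r in zip(weights, rates)]
        m = max(logp)
        unnorm = [math.exp(lp - m) for lp in logp]
        total = sum(unnorm)
        responsibilities.append([u / total for u in unnorm])
    return responsibilities

# A dense sequence should be assigned to the high-rate component,
# a sparse one to the low-rate component.
seqs = [[0.1 * i for i in range(1, 50)], [1.0, 3.0]]
resps = e_step(seqs, rates=[10.0, 0.5], weights=[0.5, 0.5], horizon=5.0)
```

In the full framework, the M-step (here it would re-estimate `rates` and `weights` from `resps`) is replaced by SGD updates of the NTPP parameters, and the Dirichlet prior regularizes the mixing weights.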
