Speeding up Word Mover's Distance and its variants via properties of distances between embeddings

1 Dec 2019 · Matheus Werner, Eduardo Laber

The Word Mover's Distance (WMD) proposed by Kusner et al. is a distance between documents that takes advantage of semantic relations among words captured by their embeddings. This distance has proved to be quite effective, obtaining state-of-the-art error rates for classification tasks, but it is impractical for large collections/documents due to its computational complexity. To circumvent this problem, variants of WMD have been proposed. Among them, the Relaxed Word Mover's Distance (RWMD) is one of the most successful due to its simplicity, its effectiveness, and its fast implementations. Relying on assumptions that are supported by empirical properties of the distances between embeddings, we propose an approach to speed up both WMD and RWMD. Experiments over 10 datasets suggest that our approach leads to a significant speed-up in document classification tasks while maintaining the same error rates.
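For concreteness, the sketch below (not the authors' implementation) illustrates the RWMD lower bound of Kusner et al. in plain NumPy: one of the two transportation constraints of the WMD problem is dropped, so each word of one document ships all of its normalized bag-of-words mass to its nearest word in the other document, and the tighter of the two relaxed values is kept. The names rwmd, d1, d2, X1 and X2 are hypothetical, chosen for illustration; exact WMD would instead require solving the full transportation problem with an optimal-transport solver.

```python
import numpy as np

def rwmd(d1, d2, X1, X2):
    """Relaxed Word Mover's Distance (a lower bound on WMD).

    d1, d2 : nBOW weight vectors (non-negative, summing to 1), one entry
             per unique word of each document
    X1, X2 : embedding matrices, one row per unique word of each document
    """
    # Pairwise Euclidean distances between the word embeddings of the two documents.
    C = np.linalg.norm(X1[:, None, :] - X2[None, :, :], axis=-1)

    # Relax one constraint at a time: every word sends all of its mass
    # to its closest word in the other document.
    l1 = np.dot(d1, C.min(axis=1))   # doc 1 -> doc 2
    l2 = np.dot(d2, C.min(axis=0))   # doc 2 -> doc 1

    # RWMD is the larger (tighter) of the two relaxed lower bounds.
    return max(l1, l2)

# Toy usage with random embeddings (hypothetical data, only to show shapes):
rng = np.random.default_rng(0)
X1, X2 = rng.normal(size=(4, 50)), rng.normal(size=(6, 50))
d1, d2 = np.full(4, 1 / 4), np.full(6, 1 / 6)   # uniform nBOW weights
print(rwmd(d1, d2, X1, X2))                      # lower bound on WMD(doc1, doc2)
```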

Results from the Paper


Task                     Dataset        Model          Metric    Value  Global Rank
Text Classification      20NEWS         REL-RWMD k-NN  Accuracy  74.78  #15
Document Classification  Amazon         REL-RWMD k-NN  Accuracy  93.03  #3
Document Classification  BBCSport       REL-RWMD k-NN  Accuracy  95.18  #4
Document Classification  Classic        REL-RWMD k-NN  Accuracy  96.85  #1
Text Classification      Ohsumed        REL-RWMD k-NN  Accuracy  58.74  #9
Document Classification  Recipe         REL-RWMD k-NN  Accuracy  56.80  #2
Document Classification  Reuters-21578  REL-RWMD k-NN  Accuracy  95.61  #2
Document Classification  Twitter        REL-RWMD k-NN  Accuracy  71.05  #2
