Few-shot Personalized Saliency Prediction Based on Inter-personnel Gaze Patterns

6 Jul 2023  ·  Yuya Moroto, Keisuke Maeda, Takahiro Ogawa, Miki Haseyama

This paper presents few-shot personalized saliency prediction based on inter-personnel gaze patterns. In contrast to general saliency maps, personalized saliency maps (PSMs) have great potential since they indicate person-specific visual attention, which is useful for understanding individual visual preferences. PSM prediction is required to obtain PSMs for unseen images, but it remains a challenging task due to the complexity of individual gaze patterns. Moreover, eye-tracking data obtained from each person are necessary to construct and predict PSMs, and it is difficult to acquire such data in large amounts. One solution for realizing PSM prediction from a limited amount of data is the effective use of eye-tracking data obtained from other persons. To exploit the PSMs of other persons efficiently, this paper focuses on the selection of images for which eye-tracking data are acquired and on the preservation of the structural information of the other persons' PSMs. In the proposed method, images are selected so that they elicit diverse gaze patterns across persons, and the structural information is preserved by adopting a tensor-based regression method. Experimental results demonstrate that these two points are beneficial for few-shot PSM prediction.
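The abstract describes predicting a target person's PSM for an unseen image by leveraging other persons' PSMs while keeping the maps' spatial structure. The following is a minimal illustrative sketch of that idea, not the authors' implementation: it assumes that for a few "few-shot" images we have the target person's PSMs alongside K other persons' PSMs, and it predicts the target PSM for an unseen image as a least-squares combination of the other persons' maps. The paper's actual tensor-based regression is more involved; all function names and array shapes below are illustrative assumptions.

```python
import numpy as np

# Hypothetical sketch of few-shot PSM prediction from other persons' PSMs.
# Shapes are assumptions: H x W saliency maps, K other persons, N few-shot images.

def fit_person_weights(other_psms, target_psms):
    """other_psms: (N, K, H, W) PSMs of K other persons for N few-shot images.
    target_psms: (N, H, W) PSMs of the target person for the same images.
    Returns a length-K weight vector learned by least squares."""
    n, k, h, w = other_psms.shape
    # Flattening is only used to set up the linear system; the learned weights
    # act on whole maps, so the spatial layout of each PSM is kept intact.
    A = other_psms.transpose(0, 2, 3, 1).reshape(-1, k)   # (N*H*W, K)
    b = target_psms.reshape(-1)                           # (N*H*W,)
    weights, *_ = np.linalg.lstsq(A, b, rcond=None)
    return weights

def predict_psm(other_psms_unseen, weights):
    """other_psms_unseen: (K, H, W) other persons' PSMs for an unseen image."""
    psm = np.tensordot(weights, other_psms_unseen, axes=1)  # (H, W)
    return np.clip(psm, 0.0, None)  # saliency values kept non-negative

# Toy usage with random arrays standing in for real eye-tracking-based PSMs.
rng = np.random.default_rng(0)
train_other = rng.random((5, 8, 64, 64))   # 5 few-shot images, 8 other persons
train_target = rng.random((5, 64, 64))
w = fit_person_weights(train_other, train_target)
pred = predict_psm(rng.random((8, 64, 64)), w)
print(pred.shape)  # (64, 64)
```

In the paper, the selection of the few-shot images themselves (choosing images that elicit diverse gaze patterns across persons) and the tensor-based regression are the key components; the sketch above only conveys the general data flow from other persons' PSMs to a target person's PSM.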
