Affinity Functions

Embedded Dot Product Affinity

Introduced by Wang et al. in Non-local Neural Networks

Embedded Dot Product Affinity is a type of affinity or self-similarity function between two points $\mathbf{x}_{i}$ and $\mathbf{x}_{j}$ that uses a dot product function in an embedding space:

$$ f\left(\mathbf{x}_{i}, \mathbf{x}_{j}\right) = \theta\left(\mathbf{x}_{i}\right)^{T}\phi\left(\mathbf{x}_{j}\right) $$

Here $\theta\left(\mathbf{x}_{i}\right) = W_{\theta}\mathbf{x}_{i}$ and $\phi\left(\mathbf{x}_{j}\right) = W_{\phi}\mathbf{x}_{j}$ are two embeddings.
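As a minimal NumPy sketch of the definition above (the function and weight names `w_theta`, `w_phi` are illustrative; in the paper these embeddings are implemented as 1×1 convolutions), the full pairwise affinity matrix over $N$ positions can be computed as:

```python
import numpy as np

def embedded_dot_product_affinity(x, w_theta, w_phi):
    """Pairwise affinities f(x_i, x_j) = theta(x_i)^T phi(x_j).

    x:       (N, C) feature vectors at N positions
    w_theta: (C, D) embedding weights for theta
    w_phi:   (C, D) embedding weights for phi
    returns: (N, N) matrix whose (i, j) entry is f(x_i, x_j)
    """
    theta = x @ w_theta   # (N, D) embedded representations theta(x_i)
    phi = x @ w_phi       # (N, D) embedded representations phi(x_j)
    return theta @ phi.T  # (N, N) dot products in the embedding space

# Toy usage with random features and weights
rng = np.random.default_rng(0)
x = rng.standard_normal((5, 8))        # 5 positions, 8 channels
w_theta = rng.standard_normal((8, 4))  # embed 8 channels into 4 dims
w_phi = rng.standard_normal((8, 4))
f = embedded_dot_product_affinity(x, w_theta, w_phi)
print(f.shape)  # (5, 5)
```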

The main difference between the dot product and embedded Gaussian affinity functions is the presence of softmax in the latter, which plays the role of an activation function.

Source: Non-local Neural Networks

Tasks


| Task | Papers | Share |
| --- | --- | --- |
| Action Classification | 1 | 14.29% |
| Action Recognition | 1 | 14.29% |
| Instance Segmentation | 1 | 14.29% |
| Keypoint Detection | 1 | 14.29% |
| Object Detection | 1 | 14.29% |
| Pose Estimation | 1 | 14.29% |
| Video Classification | 1 | 14.29% |
