12 Feb 2024 • Armin Gerami, Monte Hoover, Pranav S. Dulepet, Ramani Duraiswami
Motivated by the factorization inherent in the original fast multipole method and the improved fast Gauss transform, we introduce a factorable form of attention that operates efficiently in high dimensions.
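The central idea, computing attention through a factorization rather than materializing the full n×n score matrix, can be illustrated with a generic kernelized (linear) attention sketch. The feature map `phi` below is a placeholder assumption for illustration only, not the paper's actual factorization:

```python
import numpy as np

def factorized_attention(Q, K, V, phi):
    """Attention via a factorization attn ≈ phi(Q) @ (phi(K).T @ V) / norm.

    Associativity of matrix products avoids the n x n score matrix,
    reducing cost from O(n^2 d) to O(n d r), where r is the feature dim.
    """
    Qf = phi(Q)                      # (n, r) featurized queries
    Kf = phi(K)                      # (n, r) featurized keys
    KV = Kf.T @ V                    # (r, d) compact key-value summary
    norm = Qf @ Kf.sum(axis=0)       # (n,) per-query normalization
    return (Qf @ KV) / norm[:, None]

# Illustrative feature map (elu(x) + 1, as in generic linear attention);
# chosen only to keep features positive, not taken from this paper.
phi = lambda X: np.where(X > 0, X + 1.0, np.exp(X))

rng = np.random.default_rng(0)
n, d = 6, 4
Q, K, V = rng.standard_normal((3, n, d))
out = factorized_attention(Q, K, V, phi)
assert out.shape == (n, d)
```

Because the factored form is algebraically a regrouping of the same product, it matches the quadratic-cost computation exactly for this kernel; the savings come purely from the order of evaluation.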