1 code implementation • ICCV 2023 • Guhnoo Yun, Juhan Yoo, Kijung Kim, Jeongho Lee, Dong Hwan Kim
Recent studies show that self-attentions behave like low-pass filters (as opposed to convolutions) and enhancing their high-pass filtering capability improves model performance.
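The low-pass claim follows from softmax attention producing convex combinations of token features, which smooths variation along the sequence. Below is a minimal NumPy sketch of that effect (not the paper's method); the sequence length, feature dimension, and single-head Q=K=V simplification are illustrative assumptions.

```python
# Minimal sketch (assumptions, not the paper's method): softmax self-attention
# outputs are convex combinations of the value tokens, so they attenuate
# high-frequency components along the token axis. Comparing FFT magnitudes
# before and after attention makes this visible.
import numpy as np

rng = np.random.default_rng(0)
n, d = 64, 16                      # hypothetical sequence length / feature dim
x = rng.standard_normal((n, d))    # token features

# Single-head self-attention with Q = K = V = x (illustrative simplification).
scores = x @ x.T / np.sqrt(d)
attn = np.exp(scores - scores.max(axis=-1, keepdims=True))
attn /= attn.sum(axis=-1, keepdims=True)   # each row is a set of convex weights
y = attn @ x                               # attention output

def spectrum(z):
    # Average spectral magnitude across feature channels (FFT over tokens).
    return np.abs(np.fft.rfft(z, axis=0)).mean(axis=1)

sx, sy = spectrum(x), spectrum(y)
# High-frequency energy shrinks more than low-frequency energy after attention.
print("low-freq energy ratio :", sy[:4].sum() / sx[:4].sum())
print("high-freq energy ratio:", sy[-4:].sum() / sx[-4:].sum())
```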
no code implementations • 12 Aug 2020 • Myoungha Song, Jeongho Lee, Donghwan Kim
GAP is designed to attend to important geometric information, and CAP is designed to attend to important channel information.
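A hedged PyTorch sketch of the two pooling ideas described above follows; the module internals (layer sizes, softmax weighting, sigmoid gating) are assumptions for illustration, not the authors' exact design.

```python
# Hedged sketch of geometric (GAP) and channel (CAP) attentive pooling.
# Internals are assumed; only the high-level roles come from the abstract.
import torch
import torch.nn as nn

class GAP(nn.Module):
    """Geometric attentive pooling (assumed form): score each point from its
    coordinates, then pool features with the normalized scores."""
    def __init__(self, coord_dim: int = 3):
        super().__init__()
        self.score = nn.Sequential(nn.Linear(coord_dim, 16), nn.ReLU(),
                                   nn.Linear(16, 1))

    def forward(self, xyz: torch.Tensor, feats: torch.Tensor) -> torch.Tensor:
        # xyz: (B, N, 3) point coordinates; feats: (B, N, C) per-point features
        w = torch.softmax(self.score(xyz), dim=1)   # (B, N, 1) point weights
        return (w * feats).sum(dim=1)               # (B, C) pooled feature

class CAP(nn.Module):
    """Channel attentive pooling (assumed form): squeeze-and-excitation-style
    gate that reweights channels by importance."""
    def __init__(self, channels: int):
        super().__init__()
        self.gate = nn.Sequential(nn.Linear(channels, channels // 4), nn.ReLU(),
                                  nn.Linear(channels // 4, channels),
                                  nn.Sigmoid())

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        # feats: (B, N, C); gate from the mean feature, reweight per channel
        g = self.gate(feats.mean(dim=1))            # (B, C) channel weights
        return feats * g.unsqueeze(1)               # channel-reweighted features

# Usage sketch: pooled = GAP()(xyz, feats); reweighted = CAP(C)(feats)
```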