1 code implementation • 3 Mar 2024 • Ameen Ali, Itamar Zimerman, Lior Wolf
The Mamba layer offers an efficient selective state space model (SSM) that is highly effective in modeling multiple domains, including NLP, long-range sequence processing, and computer vision.
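The distinguishing feature of a selective SSM is that its transition and input coefficients depend on the current input, rather than being fixed as in a linear time-invariant SSM. The following is a minimal scalar sketch of that idea, not Mamba's actual parameterization; the weights `w_a`, `w_b`, and `c` are illustrative placeholders.

```python
import math

def selective_ssm(xs, w_a=1.0, w_b=1.0, c=1.0):
    """Toy scalar selective-SSM recurrence.

    Unlike a fixed (LTI) SSM, the decay a_t and input gate b_t are
    functions of the current input x_t, so the layer can choose
    per token what to remember or forget. Parameter names here are
    illustrative, not taken from the Mamba paper.
    """
    h, ys = 0.0, []
    for x in xs:
        a = 1.0 / (1.0 + math.exp(-w_a * x))  # input-dependent decay (sigmoid)
        b = 1.0 / (1.0 + math.exp(-w_b * x))  # input-dependent input gate
        h = a * h + b * x                     # state update
        ys.append(c * h)                      # linear readout
    return ys
```

Because the recurrence is sequential in `t` but linear in `h`, practical implementations replace this loop with a parallel scan for efficiency.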
no code implementations • 16 Dec 2023 • Ameen Ali, Hakan Cevikalp, Lior Wolf
Here, we propose a different approach that is based on a stratification of the graph nodes.
no code implementations • 2 Jun 2023 • Ameen Ali, Tomer Galanti, Lior Wolf
The self-attention mechanism in transformers and the message-passing mechanism in graph neural networks are repeatedly applied within deep learning architectures.
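The "repeated application" referred to above can be sketched concretely: a single unparameterized self-attention round (with Q = K = V = X, omitting the learned projections of a real transformer) applied several times in sequence, as the layers of a deep encoder would be.

```python
import math

def self_attention(X):
    """One round of unparameterized self-attention over token vectors X.

    Q = K = V = X for simplicity; real transformers use learned
    projections and multiple heads.
    """
    def dot(u, v):
        return sum(a * b for a, b in zip(u, v))

    out = []
    for q in X:
        scores = [dot(q, k) / math.sqrt(len(q)) for k in X]
        m = max(scores)                         # stabilized softmax
        w = [math.exp(s - m) for s in scores]
        z = sum(w)
        w = [wi / z for wi in w]
        # each output token is a convex combination of all value vectors
        out.append([sum(wi * v[j] for wi, v in zip(w, X))
                    for j in range(len(q))])
    return out

def stacked(X, depth=3):
    """Repeated application of the same mechanism, as in a deep encoder."""
    for _ in range(depth):
        X = self_attention(X)
    return X
```

Message passing in a graph neural network has the same shape, except the weighted sum runs only over a node's neighbors rather than over all tokens.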
1 code implementation • 15 Feb 2022 • Ameen Ali, Thomas Schnake, Oliver Eberle, Grégoire Montavon, Klaus-Robert Müller, Lior Wolf
Transformers have become an important workhorse of machine learning, with numerous applications.
1 code implementation • 21 Oct 2021 • Ameen Ali, Idan Schwartz, Tamir Hazan, Lior Wolf
Traditionally, video and text matching is done by learning a shared embedding space, where the encoding of one modality is independent of the other.
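The independent-encoding baseline described above can be sketched as nearest-neighbor retrieval in a shared space: each modality is embedded on its own, and matching reduces to cosine similarity between the resulting vectors. The embeddings below are toy placeholders standing in for the outputs of separately trained video and text encoders.

```python
import math

def cosine(u, v):
    """Cosine similarity between two vectors."""
    nu = math.sqrt(sum(x * x for x in u))
    nv = math.sqrt(sum(x * x for x in v))
    return sum(a * b for a, b in zip(u, v)) / (nu * nv)

def match(video_embs, text_embs):
    """Independent-encoding baseline: each video is matched to the text
    whose embedding is closest in the shared space; neither encoder
    ever conditions on the other modality."""
    return [max(range(len(text_embs)),
                key=lambda j: cosine(v, text_embs[j]))
            for v in video_embs]
```

Conditioning one encoder on the other modality, as the paper proposes, would break the independence assumption this baseline relies on.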
1 code implementation • 25 Mar 2021 • Ameen Ali, Tal Shaharabany, Lior Wolf
Radiologist examination of chest CT scans is an effective way to screen for COVID-19 cases.
no code implementations • 22 Mar 2021 • Ameen Ali, Tomer Galanti, Evgeniy Zheltonozhskiy, Chaim Baskin, Lior Wolf
We consider the problem of the extraction of semantic attributes, supervised only with classification labels.
1 code implementation • 3 Dec 2020 • Shir Gur, Ameen Ali, Lior Wolf
However, as we show, these methods are limited in their ability to identify the support for alternative classifications, an effect we name the saliency bias hypothesis.