Search Results for author: Mohammad Mahdi Derakhshani

Found 10 papers, 5 papers with code

Any-Shift Prompting for Generalization over Distributions

no code implementations · 15 Feb 2024 · Zehao Xiao, Jiayi Shen, Mohammad Mahdi Derakhshani, Shengcai Liao, Cees G. M. Snoek

To effectively encode the distributions and their relationships, we further introduce a transformer inference network with a pseudo-shift training mechanism.

Language Modelling

Unlocking Spatial Comprehension in Text-to-Image Diffusion Models

no code implementations · 28 Nov 2023 · Mohammad Mahdi Derakhshani, Menglin Xia, Harkirat Behl, Cees G. M. Snoek, Victor Rühle

We propose CompFuser, an image generation pipeline that enhances spatial comprehension and attribute assignment in text-to-image generative models.

Attribute · Image Generation +3

Self-Supervised Open-Ended Classification with Small Visual Language Models

no code implementations · 30 Sep 2023 · Mohammad Mahdi Derakhshani, Ivona Najdenkoska, Cees G. M. Snoek, Marcel Worring, Yuki M. Asano

We present Self-Context Adaptation (SeCAt), a self-supervised approach that unlocks few-shot abilities for open-ended classification with small visual language models.

Few-Shot Learning · Image Captioning

LifeLonger: A Benchmark for Continual Disease Classification

1 code implementation · 12 Apr 2022 · Mohammad Mahdi Derakhshani, Ivona Najdenkoska, Tom van Sonsbeek, XianTong Zhen, Dwarikanath Mahapatra, Marcel Worring, Cees G. M. Snoek

Task- and class-incremental learning of diseases address the problem of classifying new samples without retraining the models from scratch, while cross-domain incremental learning addresses handling datasets originating from different institutions while retaining the previously obtained knowledge.

Classification · Class Incremental Learning +1
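The distinction between the incremental-learning settings above can be illustrated with a toy sketch (hypothetical illustration, not the LifeLonger benchmark code): in task-incremental evaluation the task identity is known at test time, so only that task's classes compete, whereas in class-incremental evaluation all classes seen so far compete.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two "tasks", each introducing two disease classes (ids 0-1 and 2-3).
tasks = {0: [0, 1], 1: [2, 3]}

# Toy nearest-prototype classifier: one random prototype vector per class.
prototypes = {c: rng.normal(size=4) for t in tasks for c in tasks[t]}

def predict(x, candidate_classes):
    # Predict the candidate class whose prototype is closest to x.
    return min(candidate_classes, key=lambda c: np.linalg.norm(x - prototypes[c]))

x = rng.normal(size=4)

# Task-incremental: the task id (here, task 1) is given, so only its classes compete.
pred_task_incremental = predict(x, tasks[1])

# Class-incremental: no task id, so all classes observed so far compete.
pred_class_incremental = predict(x, [c for cs in tasks.values() for c in cs])

print(pred_task_incremental, pred_class_incremental)
```

The class-incremental setting is strictly harder: the same classifier must discriminate between classes that were never seen together during training.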

Generative Kernel Continual Learning

no code implementations · 26 Dec 2021 · Mohammad Mahdi Derakhshani, XianTong Zhen, Ling Shao, Cees G. M. Snoek

Kernel continual learning (Derakhshani et al., 2021) has recently emerged as a strong continual learner due to its non-parametric ability to tackle task interference and catastrophic forgetting.

Continual Learning

Kernel Continual Learning

1 code implementation · 12 Jul 2021 · Mohammad Mahdi Derakhshani, XianTong Zhen, Ling Shao, Cees G. M. Snoek

We further introduce variational random features to learn a data-driven kernel for each task.

Continual Learning · Variational Inference
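A common way to obtain a data-driven kernel from random features is the random Fourier features construction, where sampled frequencies define an approximate RBF kernel; making those frequencies task-specific gives, in spirit, one kernel per task. The sketch below is a hypothetical illustration of that building block, not the paper's implementation (there, the frequencies would be sampled from a learned, task-conditioned variational distribution rather than a fixed Gaussian).

```python
import numpy as np

rng = np.random.default_rng(0)

def random_fourier_features(X, omega, b):
    # phi(x) = sqrt(2/D) * cos(x^T omega + b) approximates an RBF kernel:
    # E[phi(x) . phi(y)] = exp(-||x - y||^2 / 2) for omega ~ N(0, I).
    D = omega.shape[1]
    return np.sqrt(2.0 / D) * np.cos(X @ omega + b)

d, D = 5, 2000  # input dimension, number of random features

# One draw of frequencies and phases; a per-task draw yields a per-task kernel.
omega = rng.normal(size=(d, D))
b = rng.uniform(0.0, 2 * np.pi, size=D)

X = rng.normal(size=(10, d))
phi = random_fourier_features(X, omega, b)

K_approx = phi @ phi.T  # approximate kernel matrix from random features
K_exact = np.exp(-0.5 * ((X[:, None] - X[None]) ** 2).sum(-1))  # exact RBF kernel

print(np.abs(K_approx - K_exact).max())  # small for large D
```

The approximation error shrinks as the number of random features D grows, which is what makes the finite feature map a practical stand-in for the full kernel in a continual-learning setting.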
