Search Results for author: Mozhgan PourKeshavarz

Found 6 papers, 0 papers with code

TrACT: A Training Dynamics Aware Contrastive Learning Framework for Long-tail Trajectory Prediction

no code implementations · 18 Apr 2024 · JunRui Zhang, Mozhgan PourKeshavarz, Amir Rasouli

In this paper, we propose to incorporate richer training dynamics information into a prototypical contrastive learning framework.
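The paper's exact objective is not shown in this snippet; as a rough illustration of the underlying framework, a generic prototypical contrastive loss can be sketched as follows (the prototype assignments and the temperature `tau` are illustrative, not taken from the paper):

```python
import numpy as np

def prototypical_contrastive_loss(z, prototypes, assignments, tau=0.1):
    """Generic prototypical contrastive loss: pull each embedding toward
    its assigned cluster prototype, push it away from the other prototypes.

    z           : (N, d) sample embeddings
    prototypes  : (K, d) cluster prototypes
    assignments : (N,) index of each sample's prototype
    tau         : temperature (sharpness of the softmax) -- illustrative value
    """
    # L2-normalize so the dot product is a cosine similarity
    z = z / np.linalg.norm(z, axis=1, keepdims=True)
    c = prototypes / np.linalg.norm(prototypes, axis=1, keepdims=True)

    logits = z @ c.T / tau                       # (N, K) similarity to every prototype
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability for the softmax
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))

    # negative log-likelihood of each sample's assigned prototype
    return -log_probs[np.arange(len(z)), assignments].mean()
```

Embeddings that sit on their assigned prototypes yield a near-zero loss, while misaligned embeddings are penalized; TrACT's contribution, per the abstract, is weighting such a framework with training-dynamics information.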

Tasks: Autonomous Driving, Contrastive Learning (+2)

Adversarial Backdoor Attack by Naturalistic Data Poisoning on Trajectory Prediction in Autonomous Driving

no code implementations · 27 Jun 2023 · Mozhgan PourKeshavarz, Mohammad Sabokrou, Amir Rasouli

In autonomous driving, behavior prediction is fundamental for safe motion planning, hence the security and robustness of prediction models against adversarial attacks are of paramount importance.

Tasks: Autonomous Driving, Backdoor Attack (+3)

Stacked Cross-modal Feature Consolidation Attention Networks for Image Captioning

no code implementations · 8 Feb 2023 · Mozhgan PourKeshavarz, Shahabedin Nabavi, Mohsen Ebrahimi Moghaddam, Mehrnoush Shamsfard

Thus, we propose a stacked cross-modal feature consolidation (SCFC) attention network for image captioning in which we simultaneously consolidate cross-modal features through a novel compounding function in a multi-step reasoning fashion.

Tasks: Caption Generation, Decoder (+1)

Learn TAROT with MENTOR: A Meta-Learned Self-Supervised Approach for Trajectory Prediction

no code implementations · ICCV 2023 · Mozhgan PourKeshavarz, Changhe Chen, Amir Rasouli

More specifically, 1) we define TAROT prediction as a novel self-supervised proxy task to identify the complex heterogeneous structure of the map.

Tasks: Meta-Learning, Trajectory Prediction

ZS-IL: Looking Back on Learned Experiences for Zero-Shot Incremental Learning

no code implementations · 22 Mar 2021 · Mozhgan PourKeshavarz, Mohammad Sabokrou

In this paper, we shed light on an on-call transfer set to provide past experiences whenever a new class arises in the data stream.

Tasks: Incremental Learning
