Search Results for author: Sharan Narasimhan

Found 3 papers, 2 papers with code

On Text Style Transfer via Style Masked Language Models

no code implementations · 12 Oct 2022 · Sharan Narasimhan, Pooja Shekar, Suvodip Dey, Maunendra Sankar Desarkar

Text Style Transfer (TST) can be performed through approaches such as latent space disentanglement, cycle-consistency losses, and prototype editing.

Tasks: Disentanglement · Language Modelling · +3

Towards Robust and Semantically Organised Latent Representations for Unsupervised Text Style Transfer

1 code implementation · NAACL 2022 · Sharan Narasimhan, Suvodip Dey, Maunendra Sankar Desarkar

We empirically show that this (a) produces a better-organised latent space that clusters stylistically similar sentences together, (b) performs better than similar denoising-inspired baselines on a diverse set of text style transfer tasks, and (c) is capable of fine-grained control of style transfer strength.

Tasks: Denoising · Sentence · +3

Towards Transparent and Explainable Attention Models

2 code implementations · ACL 2020 · Akash Kumar Mohankumar, Preksha Nema, Sharan Narasimhan, Mitesh M. Khapra, Balaji Vasan Srinivasan, Balaraman Ravindran

To make attention mechanisms more faithful and plausible, we propose a modified LSTM cell with a diversity-driven training objective that ensures that the hidden representations learned at different time steps are diverse.

Tasks: Attribute
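The diversity-driven objective described above can be sketched as an auxiliary penalty added to the task loss. The version below is a minimal illustration, not the paper's exact formulation: it penalises the mean pairwise cosine similarity of the hidden states produced at different time steps, so that minimising it pushes the representations apart. The function name and the specific penalty form are assumptions for illustration.

```python
import numpy as np

def diversity_penalty(hidden_states: np.ndarray) -> float:
    """Illustrative diversity term: mean pairwise cosine similarity
    of hidden states across time steps.

    hidden_states: array of shape (seq_len, hidden_dim), one row per
    time step. A value near 1 means the states point in nearly the
    same direction; a value near 0 means they are mutually orthogonal.
    """
    # Normalise each time step's hidden state to unit length.
    h = hidden_states / np.linalg.norm(hidden_states, axis=1, keepdims=True)
    sim = h @ h.T                      # (seq_len, seq_len) cosine similarities
    t = sim.shape[0]
    # Average over off-diagonal pairs only (exclude self-similarity).
    return float((sim.sum() - t) / (t * (t - 1)))

# Hypothetical usage: total_loss = task_loss + lam * diversity_penalty(h_states)
```

In training, this penalty (weighted by a coefficient) would be added to the downstream task loss, discouraging the recurrent states from collapsing into a narrow cone and thereby making the attention distribution over them more meaningful.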
