1 code implementation • 12 Oct 2022 • Yatin Chaudhary, Pranav Rai, Matthias Schubert, Hinrich Schütze, Pankaj Gupta
The objective of Federated Continual Learning (FCL) is to improve deep learning models over their lifetime at each client through (relevant and efficient) knowledge transfer, without sharing data.
1 code implementation • NAACL 2021 • Pankaj Gupta, Yatin Chaudhary, Hinrich Schütze
Though word embeddings and topics are complementary representations, several past works have only used pretrained word embeddings in (neural) topic modeling to address data sparsity in short texts or small collections of documents.
1 code implementation • Findings of the Association for Computational Linguistics 2020 • Yatin Chaudhary, Pankaj Gupta, Khushbu Saxena, Vivek Kulkarni, Thomas Runkler, Hinrich Schütze
Our work thus focuses on optimizing the computational cost of fine-tuning for document classification.
1 code implementation • ICML 2020 • Pankaj Gupta, Yatin Chaudhary, Thomas Runkler, Hinrich Schütze
To address the problem, we propose a lifelong learning framework for neural topic modeling that can continuously process streams of document collections, accumulate topics and guide future topic modeling tasks by knowledge transfer from several sources to better deal with the sparse data.
1 code implementation • ICML 2020 • Yatin Chaudhary, Hinrich Schütze, Pankaj Gupta
Marrying topic models and language models exposes language understanding to a broader source of document-level context beyond sentences via topics.
1 code implementation • WS 2019 • Yatin Chaudhary, Pankaj Gupta, Hinrich Schütze
This paper presents our system details and results of participation in the RDoC Tasks of BioNLP-OST 2019.
no code implementations • 29 Sep 2019 • Yatin Chaudhary, Pankaj Gupta, Thomas Runkler
in topic modeling, (2) a novel lifelong learning mechanism within a neural topic modeling framework to demonstrate continuous learning over sequential document collections while minimizing catastrophic forgetting.
no code implementations • 25 Sep 2019 • Pankaj Gupta, Yatin Chaudhary, Hinrich Schütze
Though word embeddings and topics are complementary representations, several past works have only used pretrained word embeddings in (neural) topic modeling to address the data sparsity problem in short texts or small collections of documents.
no code implementations • 14 Sep 2019 • Pankaj Gupta, Yatin Chaudhary, Hinrich Schütze
Though word embeddings and topics are complementary representations, several past works have only used pre-trained word embeddings in (neural) topic modeling to address the data sparsity problem in short texts or small collections of documents.
1 code implementation • ICLR 2019 • Pankaj Gupta, Yatin Chaudhary, Florian Buettner, Hinrich Schütze
We address two challenges of probabilistic topic modeling in order to better estimate the probability of a word in a given context, i.e., P(word|context): (1) No Language Structure in Context: probabilistic topic models ignore word order by summarizing a given context as a "bag-of-words", and consequently the semantics of words in the context are lost.
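The word-order limitation described in this abstract can be illustrated with a minimal sketch (the function and sentences below are hypothetical examples, not from the paper): two contexts with opposite meanings collapse to the identical bag-of-words representation.

```python
from collections import Counter

def bag_of_words(tokens):
    """Summarize a context as an unordered multiset of words, discarding order."""
    return Counter(tokens)

s1 = "the movie was good not bad".split()
s2 = "the movie was bad not good".split()

# Opposite meanings, identical bag-of-words context
print(bag_of_words(s1) == bag_of_words(s2))  # True
```

Because the two bags are equal, any model that conditions only on the bag-of-words assigns both contexts the same P(word|context), which is the structural gap the paper targets.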
1 code implementation • 15 Sep 2018 • Pankaj Gupta, Yatin Chaudhary, Florian Buettner, Hinrich Schütze
Here, we extend a neural autoregressive topic model to exploit the full context information around words in a document in a language modeling fashion.
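The "language modeling fashion" mentioned here refers to an autoregressive factorization of a document's likelihood, log p(doc) = Σ_i log p(w_i | w_1..w_{i-1}). The sketch below shows only that factorization with a placeholder uniform conditional (a toy stand-in, not the paper's neural model); in the actual model each conditional is computed by a neural network over the preceding context.

```python
import math

# Toy vocabulary; a real model would use the corpus vocabulary.
VOCAB = {"topic", "model", "language", "context", "word"}

def toy_conditional(word, history):
    """Placeholder for p(word | history); uniform here, neural in the paper."""
    return 1.0 / len(VOCAB)

def doc_log_likelihood(doc):
    """Autoregressive log-likelihood: sum of per-word conditional log-probs."""
    return sum(math.log(toy_conditional(w, doc[:i])) for i, w in enumerate(doc))

doc = ["topic", "model", "context"]
print(doc_log_likelihood(doc))  # 3 * log(1/5)
```

Exploiting the full context around each word, rather than only the preceding bag of words, is the extension the abstract describes.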