Search Results for author: Adithya Avvaru

Found 5 papers, 0 papers with code

BERT at SemEval-2020 Task 8: Using BERT to Analyse Meme Emotions

no code implementations SEMEVAL 2020 Adithya Avvaru, Sanath Vobilisetty

Our system, built using the state-of-the-art pre-trained Transformer model BERT (Bidirectional Encoder Representations from Transformers), performed better than the baseline models for tasks A and C and performed close to the baseline model for task B.

Sentiment Analysis
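
As a rough illustration of the approach described in the entry above, the sketch below fine-tunes a pre-trained BERT classifier on short meme texts with the Hugging Face transformers library. The three-way label set, the placeholder captions, and the hyperparameters are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch: fine-tuning BERT for meme sentiment classification.
# The 3-way label set and the sample captions are placeholders, not the
# paper's actual task setup.
import torch
from transformers import BertTokenizerFast, BertForSequenceClassification

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=3)

texts = ["sample meme caption one", "sample meme caption two"]   # placeholder data
labels = torch.tensor([0, 2])                                    # placeholder labels

batch = tokenizer(texts, padding=True, truncation=True, max_length=128, return_tensors="pt")
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

model.train()
outputs = model(**batch, labels=labels)   # cross-entropy loss is computed internally
outputs.loss.backward()
optimizer.step()
```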

Detecting Sarcasm in Conversation Context Using Transformer-Based Models

no code implementations WS 2020 Adithya Avvaru, Sanath Vobilisetty, Radhika Mamidi

Sarcasm detection, regarded as one of the sub-problems of sentiment analysis, is a particularly tricky task because the introduction of sarcastic words can flip the sentiment of the sentence itself.

Sarcasm Detection Sentence +1
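
One common way to give a Transformer classifier access to conversation context, in the spirit of the entry above, is to encode the context and the response as a sentence pair. The sketch below shows this with a BERT model from Hugging Face transformers; how the paper actually assembles the context (number of turns, separators, choice of model) is not stated in the snippet and is assumed here.

```python
# Sketch: encoding (conversation context, response) as a sentence pair for a
# Transformer sarcasm classifier. Joining prior turns with a space is an
# assumption, not necessarily the paper's context construction.
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

context_turns = ["What a great idea.", "Nothing could possibly go wrong."]  # placeholder dialogue
response = "Oh sure, because that always works out."                        # placeholder response

context = " ".join(context_turns)
inputs = tokenizer(context, response, truncation=True, max_length=256, return_tensors="pt")
logits = model(**inputs).logits   # [not sarcastic, sarcastic] scores (head is untrained here)
```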

CodeForTheChange at SemEval-2019 Task 8: Skip-Thoughts for Fact Checking in Community Question Answering

no code implementations SEMEVAL 2019 Adithya Avvaru, Anupam Pandey

The strengths of the scalable gradient tree boosting algorithm XGBoost and the distributed sentence encoder Skip-Thought Vectors have not yet been explored by the cQA research community.

Community Question Answering Fact Checking +2
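
The entry above combines Skip-Thought sentence vectors with XGBoost. A minimal sketch of that pipeline is given below, assuming the 4800-dimensional Skip-Thought embeddings have already been produced by an external encoder (the reference skip-thoughts implementation is not a pip-installable library); random vectors stand in for them purely so the example runs.

```python
# Sketch: classifying cQA answers with XGBoost over Skip-Thought sentence vectors.
# The embeddings and labels below are random placeholders for precomputed data.
import numpy as np
from xgboost import XGBClassifier

rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 4800))   # combine-skip Skip-Thought vectors are 4800-dimensional
y_train = rng.integers(0, 2, size=200)   # factual (1) vs. non-factual (0) labels (placeholder)

clf = XGBClassifier(n_estimators=100, max_depth=4, learning_rate=0.1)
clf.fit(X_train, y_train)

X_test = rng.normal(size=(5, 4800))
print(clf.predict(X_test))
```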

Affect in Tweets Using Experts Model

no code implementations PACLIC 2018 Subba Reddy Oota, Adithya Avvaru, Mounika Marreddy, Radhika Mamidi

We compared the results of our Experts Model with both the baseline results and the top five performers of SemEval-2018 Task 1, Affect in Tweets (AIT).

Sentiment Analysis

Mixture of Regression Experts in fMRI Encoding

no code implementations 26 Nov 2018 Subba Reddy Oota, Adithya Avvaru, Naresh Manwani, Raju S. Bapi

We argue that each expert learns a certain region of brain activations corresponding to its category of words, which solves the problem of identifying the regions with a simple encoding model.

regression
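
As a rough sketch of the mixture-of-regression-experts idea described above, the code below trains one ridge-regression expert per word category, each mapping word features to voxel activations. The hard category assignment stands in for a learned soft gating, and all shapes and data are placeholders, not the paper's setup.

```python
# Sketch: a hard-assignment simplification of a mixture of regression experts
# for fMRI encoding. Each expert is a ridge regression from word features to
# voxel activations, trained only on the words assigned to its category.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
n_words, n_features, n_voxels, n_experts = 60, 25, 100, 3

X = rng.normal(size=(n_words, n_features))            # word feature vectors (placeholder)
Y = rng.normal(size=(n_words, n_voxels))              # voxel activations (placeholder)
category = rng.integers(0, n_experts, size=n_words)   # word-category assignment (placeholder)

experts = []
for k in range(n_experts):
    mask = category == k
    experts.append(Ridge(alpha=1.0).fit(X[mask], Y[mask]))

# Predict a new word's activation pattern with the expert for its category.
x_new, k_new = rng.normal(size=(1, n_features)), 1
print(experts[k_new].predict(x_new).shape)   # (1, n_voxels)
```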
