SARCASM DETECTION USING MULTI-HEAD ATTENTION BASED BIDIRECTIONAL LSTM

Sarcasm is often used on social media to express a negative opinion through positive or intensified positive words. This intentional ambiguity makes sarcasm detection an important task in sentiment analysis. Sarcasm detection is typically framed as a binary classification problem, for which both feature-rich traditional models and deep learning models have been successfully built to predict sarcastic comments. In previous research, models have been built using lexical, semantic, and pragmatic features.

We extract the most significant of these features and build a feature-rich SVM that outperforms those models. In this paper, we introduce a multi-head attention-based bidirectional long short-term memory (MHA-BiLSTM) network to detect sarcastic comments in a given corpus. The experimental results reveal that the multi-head attention mechanism enhances the performance of the BiLSTM, and that it performs better than feature-rich SVM models.
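To make the architecture concrete, here is a minimal NumPy sketch of the core idea: several attention heads each compute their own weighted pooling over the BiLSTM's per-token hidden states, the pooled vectors are concatenated into a sentence representation, and a sigmoid produces the binary sarcasm score. This is an illustrative sketch, not the authors' code; the head count, dimensions, and function names are assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention_pool(H, W):
    """Pool BiLSTM states with one attention distribution per head.

    H: (seq_len, hidden)   -- per-token BiLSTM hidden states
    W: (num_heads, hidden) -- one learned query vector per head (assumed form)
    Returns the concatenated per-head summaries: (num_heads * hidden,)
    """
    scores = W @ H.T                   # (num_heads, seq_len) relevance scores
    alphas = softmax(scores, axis=-1)  # each head's weights sum to 1 over tokens
    pooled = alphas @ H                # (num_heads, hidden) weighted averages
    return pooled.reshape(-1)

# Toy forward pass with random stand-ins for learned parameters
rng = np.random.default_rng(0)
seq_len, hidden, heads = 10, 8, 4
H = rng.standard_normal((seq_len, hidden))   # stand-in for BiLSTM outputs
W = rng.standard_normal((heads, hidden))     # stand-in for head queries
sent_vec = multi_head_attention_pool(H, W)   # sentence representation
w_out = rng.standard_normal(heads * hidden)  # stand-in classifier weights
prob = 1.0 / (1.0 + np.exp(-(sent_vec @ w_out)))  # sigmoid -> P(sarcastic)
```

Using several heads lets each attend to a different part of the comment (for instance, the positive wording and the negated context separately), which is the intuition behind the gain over a single attention distribution.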
