# BERT-based Emotion Classification Model
This model is a fine-tuned version of BERT for emotion classification. It predicts one of six emotion categories from a given English text input.
## Model Details
- Architecture: BertForSequenceClassification
- Base Model: bert-base-uncased
- Labels:
  - 0: sadness
  - 1: joy
  - 2: love
  - 3: anger
  - 4: fear
  - 5: surprise
- Problem Type: Single-label classification
- Hidden Size: 768
- Max Sequence Length: 512
- Number of Layers: 12
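The details above match the standard bert-base-uncased configuration. As a sketch of how a model with this exact setup could be instantiated for fine-tuning (configuration only, randomly initialized weights, nothing downloaded):

```python
from transformers import BertConfig, BertForSequenceClassification

# Label mapping as listed in the model details above
id2label = {0: "sadness", 1: "joy", 2: "love", 3: "anger", 4: "fear", 5: "surprise"}

# BertConfig defaults already match bert-base-uncased:
# hidden_size=768, num_hidden_layers=12, max_position_embeddings=512
config = BertConfig(
    id2label=id2label,
    label2id={label: idx for idx, label in id2label.items()},
    problem_type="single_label_classification",
)

# Encoder and 6-way classification head, randomly initialized
model = BertForSequenceClassification(config)
```

In practice the fine-tuned weights would be loaded with `from_pretrained` instead; this snippet only illustrates how the listed hyperparameters and label set fit together.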
## How to Use
```python
from transformers import pipeline

classifier = pipeline("text-classification", model="AbhishekBhavnani/TweetClassification")
result = classifier("I'm feeling so happy today!")
print(result)
```
### Example

Input:

```
I can't stop smiling, this movie is too funny!
```

Output:

```
[{'label': 'joy', 'score': 0.9821}]
```
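If you need more control than the pipeline offers (for example, to inspect all six class probabilities), the same prediction can be made with the tokenizer and model directly. A minimal sketch, with the label order taken from the mapping above:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_ID = "AbhishekBhavnani/TweetClassification"

# Label order from the model card's label mapping
ID2LABEL = {0: "sadness", 1: "joy", 2: "love", 3: "anger", 4: "fear", 5: "surprise"}

def classify(text: str) -> dict:
    """Return the top emotion label and its softmax probability."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID)
    # Truncate to the model's 512-token maximum sequence length
    inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
    with torch.no_grad():
        logits = model(**inputs).logits
    probs = torch.softmax(logits, dim=-1)[0]
    idx = int(probs.argmax())
    return {"label": ID2LABEL[idx], "score": float(probs[idx])}
```

This reproduces what the pipeline does internally: tokenize, run the forward pass, apply softmax, and pick the highest-scoring label.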