
Emotion Classification with DistilBERT

This repository contains a fine-tuned variant of distilbert-base-uncased for emotion classification. The model was fine-tuned on the emotion dataset and achieves strong results on its evaluation set.

Model Details

  • Model Name: ale-dp/distilbert-base-uncased-finetuned-emotion
  • Performance: Check the Hugging Face Model Page for detailed performance metrics and evaluation results.

Usage

To use the pre-trained model in your Python script or Jupyter Notebook, follow the steps in emotion_classification_demo.ipynb.
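As a quick alternative to the notebook, the model can be loaded directly with the Hugging Face transformers pipeline. This is a minimal sketch, assuming the transformers library is installed and the model is reachable on the Hub under the name listed above; the exact label names returned depend on the model's configuration.

```python
# Minimal sketch: load the fine-tuned model from the Hugging Face Hub
# and classify a sentence. Downloads the model weights on first run.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="ale-dp/distilbert-base-uncased-finetuned-emotion",
)

# Returns a list of {'label': ..., 'score': ...} dicts,
# one per input sentence.
result = classifier("I can't wait to see you this weekend!")
print(result)
```

Passing `top_k=None` to the pipeline call returns scores for every emotion class instead of only the top prediction.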