distilbert-imdb

Maintained by: lvwerra

Property            Value
License             Apache 2.0
Base Model          distilbert-base-uncased
Accuracy            92.8%
Training Framework  PyTorch 1.10.0

What is distilbert-imdb?

distilbert-imdb is a fine-tuned version of DistilBERT optimized for sentiment analysis on the IMDB movie-review dataset. Created by lvwerra, the model reaches 92.8% accuracy on the IMDB evaluation set while retaining the lightweight DistilBERT architecture, making it a compact but capable binary sentiment classifier.
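As a quick sketch of how such a checkpoint is typically loaded, the `transformers` pipeline API is enough for experimentation. This assumes the model is published on the Hugging Face Hub under the id `lvwerra/distilbert-imdb`; adjust the id if your copy lives elsewhere.

```python
from transformers import pipeline

# "lvwerra/distilbert-imdb" is the assumed Hub id for this checkpoint.
classifier = pipeline("text-classification", model="lvwerra/distilbert-imdb")

# Returns a list with one dict per input: {"label": ..., "score": ...}
result = classifier("This movie was a masterpiece from start to finish.")
print(result)
```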

Implementation Details

The model was trained with a learning rate of 5e-05 and the Adam optimizer with betas=(0.9, 0.999), using a batch size of 16 over a single epoch and a linear learning rate scheduler.

  • Built on Transformers 4.15.0 framework
  • Implemented with PyTorch 1.10.0+cu111
  • Uses Datasets 1.17.0 and Tokenizers 0.10.3
  • Achieves a validation loss of 0.1903

Core Capabilities

  • Sentiment analysis on movie reviews
  • Binary classification tasks
  • Efficient inference with distilled architecture
  • Training metrics logged and viewable via TensorBoard
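For latency-sensitive settings, the model can also be run directly without the pipeline wrapper, keeping full control over batching and post-processing. The sketch below again assumes the checkpoint is hosted as `lvwerra/distilbert-imdb`.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "lvwerra/distilbert-imdb"  # assumed Hub id for this checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

inputs = tokenizer("An unwatchable, tedious mess.",
                   return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

# Convert logits to class probabilities and map to the label names
# stored in the model config.
probs = torch.softmax(logits, dim=-1)[0]
label = model.config.id2label[int(probs.argmax())]
print(label, float(probs.max()))
```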

Frequently Asked Questions

Q: What makes this model unique?

This model combines the efficiency of DistilBERT with high accuracy for sentiment analysis, making it particularly suitable for production environments where both performance and computational efficiency are crucial.

Q: What are the recommended use cases?

The model is ideal for sentiment analysis of movie reviews and similar text classification tasks. It's particularly well-suited for applications requiring quick inference times while maintaining high accuracy.
