roberta-base-sst2

WillHeld

RoBERTa-based text classification model fine-tuned on SST2 dataset, achieving 93.23% accuracy for sentiment analysis. MIT licensed with PyTorch implementation.

License: MIT
Framework: PyTorch 1.7.1
Accuracy: 93.23%
Base Model: RoBERTa-base

What is roberta-base-sst2?

roberta-base-sst2 is a fine-tuned version of the RoBERTa base model, optimized for sentiment analysis on the SST2 (Stanford Sentiment Treebank) dataset. The model reaches 93.23% accuracy and a loss of 0.1952 on the evaluation set.
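The model can be loaded with the Transformers library for quick experiments. A minimal sketch, assuming the checkpoint is published on the Hugging Face Hub under the id "WillHeld/roberta-base-sst2" (adjust the id if your copy lives elsewhere); the `softmax` helper is included so the label-mapping logic is visible without any heavy dependencies:

```python
from math import exp

# SST2 is a binary task: index 0 = negative, index 1 = positive.
LABELS = ["NEGATIVE", "POSITIVE"]

def softmax(logits):
    """Convert raw two-class logits into probabilities."""
    m = max(logits)
    exps = [exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def predict(texts):
    """Classify a batch of English sentences (downloads the model on first use)."""
    # Heavy imports kept local so the pure helpers above stay dependency-free.
    import torch
    from transformers import AutoTokenizer, AutoModelForSequenceClassification

    model_id = "WillHeld/roberta-base-sst2"  # assumed Hub id
    tok = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForSequenceClassification.from_pretrained(model_id)
    model.eval()
    with torch.no_grad():
        batch = tok(texts, padding=True, truncation=True, return_tensors="pt")
        logits = model(**batch).logits.tolist()
    return [(LABELS[max(range(2), key=lambda i: row[i])], softmax(row))
            for row in logits]
```

Calling `predict(["a gripping, well-acted thriller"])` would return the predicted label alongside the class probabilities for each input sentence.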

Implementation Details

The model was trained with a learning rate of 2e-05, batch sizes of 16 for training and 8 for evaluation, and the Adam optimizer. Training ran for 10 epochs with a linear learning-rate scheduler and a 6% warmup ratio.

  • Built on Transformers 4.21.3 framework
  • Runs on PyTorch 1.7.1
  • Uses Datasets 1.18.3 and Tokenizers 0.11.6
  • Trained with seed 42 for reproducibility
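The reported hyperparameters can be collected in one place for reproduction. The key names below follow Hugging Face `TrainingArguments` conventions as an assumption; the original author's exact configuration file is not shown in this card:

```python
# Hyperparameters reported in the model card, keyed with names that map
# onto transformers.TrainingArguments (the naming is an assumption).
hyperparameters = {
    "learning_rate": 2e-5,
    "per_device_train_batch_size": 16,
    "per_device_eval_batch_size": 8,
    "optim": "adam",              # Adam optimizer
    "num_train_epochs": 10,
    "lr_scheduler_type": "linear",
    "warmup_ratio": 0.06,         # 6% warmup
    "seed": 42,                   # fixed seed for reproducibility
}
```

Passing these values to `transformers.TrainingArguments` (with Transformers 4.21.3, as listed above) should approximate the original training setup.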

Core Capabilities

  • High-accuracy sentiment classification (93.23%)
  • Optimized for English language text
  • Suitable for production deployment via inference endpoints
  • Efficient text classification with transformer architecture
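For production use behind a hosted inference endpoint, requests are typically sent as JSON. A minimal sketch of the request shape; the URL and header are placeholders, and the exact payload schema your endpoint expects may differ:

```python
import json

# Placeholder endpoint; substitute your own deployed URL and access token.
API_URL = "https://example.com/your-inference-endpoint"

def build_request(text):
    """Build a JSON body in the common {"inputs": ...} text-classification shape."""
    return json.dumps({"inputs": text})

# To actually send it (requires the `requests` package and a live endpoint):
# import requests
# resp = requests.post(API_URL,
#                      headers={"Authorization": "Bearer <token>"},
#                      data=build_request("a gripping, well-acted thriller"))
# print(resp.json())
```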

Frequently Asked Questions

Q: What makes this model unique?

This model stands out for its high accuracy on sentiment analysis tasks, achieved through careful fine-tuning of the RoBERTa architecture on the SST2 dataset. Training showed consistent improvement across epochs, reaching its best evaluation performance with minimal overfitting.

Q: What are the recommended use cases?

The model is ideal for sentiment analysis tasks in English text, particularly for binary classification scenarios. It's well-suited for production environments needing reliable sentiment detection with high accuracy.
