e5-small

Maintained by: intfloat


Property      Value
Parameters    33.4M
Paper         Text Embeddings by Weakly-Supervised Contrastive Pre-training
License       MIT
Architecture  12-layer Transformer with 384-dimensional embeddings

What is e5-small?

E5-small is a compact but powerful text embedding model developed through weakly-supervised contrastive pre-training. It's designed to generate high-quality semantic embeddings for text retrieval, similarity comparison, and classification tasks. The model has shown strong performance on the MTEB benchmark while maintaining a relatively small parameter footprint.

Implementation Details

The model uses a 12-layer Transformer that produces 384-dimensional embeddings. For best results, every input text must carry a task prefix ("query:" for search queries, "passage:" for documents to be retrieved), and the model can be run through either plain PyTorch/transformers or sentence-transformers. Embeddings are obtained by average pooling the final-layer token representations (with padding positions masked out) and are then L2-normalized; see the usage sketch after the list below.

  • Efficient 33.4M-parameter architecture
  • Supports both symmetric and asymmetric tasks
  • Maximum input length of 512 tokens
  • Contrastive pre-training with a low-temperature InfoNCE loss (a minimal sketch follows below)
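A minimal usage sketch with Hugging Face transformers follows, assuming the checkpoint is published as intfloat/e5-small. It illustrates the three points above: the "query:"/"passage:" prefixes, masked average pooling, and L2 normalization. Treat it as an illustration rather than the definitive reference implementation.

```python
import torch.nn.functional as F
from torch import Tensor
from transformers import AutoTokenizer, AutoModel

def average_pool(last_hidden: Tensor, attention_mask: Tensor) -> Tensor:
    # Zero out padding positions, then average the remaining token vectors
    masked = last_hidden.masked_fill(~attention_mask[..., None].bool(), 0.0)
    return masked.sum(dim=1) / attention_mask.sum(dim=1)[..., None]

tokenizer = AutoTokenizer.from_pretrained("intfloat/e5-small")
model = AutoModel.from_pretrained("intfloat/e5-small")

# E5 expects task prefixes: "query:" for questions, "passage:" for documents
texts = [
    "query: how much protein should a female eat",
    "passage: The recommended daily protein intake for adult women is about 46 grams.",
    "passage: Transformers are a neural network architecture based on attention.",
]
batch = tokenizer(texts, max_length=512, padding=True, truncation=True, return_tensors="pt")
outputs = model(**batch)
embeddings = average_pool(outputs.last_hidden_state, batch["attention_mask"])
embeddings = F.normalize(embeddings, p=2, dim=1)  # unit-length vectors

# Cosine similarity between the query and each passage
scores = embeddings[:1] @ embeddings[1:].T
print(scores)
```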
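For intuition about the pre-training objective, here is a minimal sketch of an InfoNCE loss with in-batch negatives. It is not the authors' training code, and the temperature value of 0.01 is illustrative of "low temperature" rather than quoted from this page.

```python
import torch
import torch.nn.functional as F

def info_nce_loss(query_emb: torch.Tensor, passage_emb: torch.Tensor,
                  temperature: float = 0.01) -> torch.Tensor:
    """InfoNCE with in-batch negatives: passage i is the positive for query i."""
    q = F.normalize(query_emb, dim=-1)
    p = F.normalize(passage_emb, dim=-1)
    logits = q @ p.T / temperature  # similarity matrix, sharpened by the low temperature
    labels = torch.arange(q.size(0), device=q.device)  # positives lie on the diagonal
    return F.cross_entropy(logits, labels)
```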

Core Capabilities

  • Text retrieval and semantic search
  • Semantic similarity assessment
  • Classification tasks through embedding features
  • Clustering and paraphrase detection
  • Cross-document similarity analysis

Frequently Asked Questions

Q: What makes this model unique?

E5-small combines efficiency with strong performance through its weakly-supervised contrastive pre-training approach. Despite its compact size, it achieves competitive results on various MTEB benchmark tasks while requiring minimal computational resources.

Q: What are the recommended use cases?

The model excels at text retrieval, semantic similarity, and classification tasks. It is particularly suitable for applications that need efficient text embeddings without sacrificing accuracy, such as search systems, content recommendation, and document clustering; a short retrieval sketch follows.
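As a sketch of such a search setup via the sentence-transformers path mentioned above (again assuming the intfloat/e5-small model id, with the same prefix convention):

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("intfloat/e5-small")

queries = ["query: what is contrastive learning"]
passages = [
    "passage: Contrastive learning trains embeddings by pulling positive pairs together.",
    "passage: The Eiffel Tower is located in Paris.",
]

# normalize_embeddings=True yields unit vectors, so dot product == cosine similarity
q_emb = model.encode(queries, normalize_embeddings=True)
p_emb = model.encode(passages, normalize_embeddings=True)
scores = q_emb @ p_emb.T
best = scores.argmax(axis=1)  # index of the best-matching passage per query
print(scores, best)
```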
