stsb-bert-tiny-openvino

Maintained by: sentence-transformers-testing

Property              Value
Parameter Count       4.39M
Tensor Type           F32
Embedding Dimension   128
Framework             OpenVINO

What is stsb-bert-tiny-openvino?

stsb-bert-tiny-openvino is a lightweight sentence transformer model optimized for OpenVINO that maps sentences and paragraphs to a 128-dimensional dense vector space. This compact model is designed for efficient semantic similarity tasks and can be deployed in production environments with limited computational resources.
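
The snippet below is a minimal usage sketch, assuming the model is published under the maintainer shown above (repository id sentence-transformers-testing/stsb-bert-tiny-openvino) and that sentence-transformers 3.2+ with OpenVINO support is installed:

```python
# Minimal sketch: encode sentences into 128-dimensional embeddings.
# The repository id is assumed from the maintainer/model name on this card.
from sentence_transformers import SentenceTransformer

model = SentenceTransformer(
    "sentence-transformers-testing/stsb-bert-tiny-openvino",
    backend="openvino",  # run inference through OpenVINO
)

sentences = ["The weather is lovely today.", "It's so sunny outside!"]
embeddings = model.encode(sentences)
print(embeddings.shape)  # (2, 128)
```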

Implementation Details

The model uses a compact BERT architecture with mean pooling and was trained with CosineSimilarityLoss. Training used the AdamW optimizer with a learning rate of 8e-05 for 10 epochs and 36 warmup steps.

  • Supports maximum sequence length of 512 tokens
  • Implements a mean pooling strategy for sentence embeddings (see the sketch after this list)
  • Optimized for OpenVINO inference
  • Uses F32 tensor type for computations
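
As a reference for the pooling step listed above, the function below is an illustrative re-implementation of masked mean pooling over token embeddings; it mirrors the strategy described on this card rather than the model's actual source:

```python
# Illustrative mean pooling: average token embeddings, ignoring padding tokens.
import torch

def mean_pooling(token_embeddings: torch.Tensor, attention_mask: torch.Tensor) -> torch.Tensor:
    # token_embeddings: (batch, seq_len, 128); attention_mask: (batch, seq_len)
    mask = attention_mask.unsqueeze(-1).to(token_embeddings.dtype)
    summed = (token_embeddings * mask).sum(dim=1)      # sum over real tokens only
    counts = mask.sum(dim=1).clamp(min=1e-9)           # number of real tokens per sentence
    return summed / counts                             # (batch, 128) sentence embeddings
```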

Core Capabilities

  • Sentence and paragraph embedding generation
  • Semantic similarity computation (see the search sketch after this list)
  • Clustering and semantic search applications
  • Efficient inference with OpenVINO optimization
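
A hedged sketch of semantic similarity and search with this model; the corpus, query, and repository id are illustrative, and util.cos_sim comes from the sentence-transformers utilities:

```python
# Sketch: rank a small corpus against a query by cosine similarity.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer(
    "sentence-transformers-testing/stsb-bert-tiny-openvino",  # assumed repository id
    backend="openvino",
)

corpus = ["A man is eating food.", "A monkey is playing drums.", "A cheetah chases its prey."]
query = "Someone is having a meal."

corpus_embeddings = model.encode(corpus)
query_embedding = model.encode(query)

scores = util.cos_sim(query_embedding, corpus_embeddings)  # shape (1, len(corpus))
best = scores.argmax().item()
print(f"Best match: {corpus[best]} (score={scores[0, best].item():.3f})")
```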

Frequently Asked Questions

Q: What makes this model unique?

The model combines a compact BERT architecture with OpenVINO optimization, making it well suited to production deployments where computational resources are limited while still delivering good performance on semantic similarity tasks.

Q: What are the recommended use cases?

The model is well suited to semantic similarity matching, document clustering, and semantic search, particularly in resource-constrained environments or wherever fast inference is critical.
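
For the clustering use case, a minimal sketch assuming scikit-learn is available; the documents, cluster count, and repository id are illustrative:

```python
# Sketch: group short documents by topic using KMeans over the embeddings.
from sentence_transformers import SentenceTransformer
from sklearn.cluster import KMeans

model = SentenceTransformer(
    "sentence-transformers-testing/stsb-bert-tiny-openvino",  # assumed repository id
    backend="openvino",
)

docs = [
    "How do I reset my password?",
    "Password recovery steps",
    "Best hiking trails near Denver",
    "Top mountain hikes in Colorado",
]
embeddings = model.encode(docs)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(embeddings)
print(labels)  # documents on the same topic should share a cluster label
```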
