hebert-finetuned-hebrew-metaphor
Property | Value
---|---
Base Model | avichr/heBERT |
Framework | PyTorch 2.0.1 |
Accuracy | 95.10% |
Downloads | 29,253 |
What is hebert-finetuned-hebrew-metaphor?
This is a specialized NLP model for detecting metaphorical usage of verbs in Hebrew text. Built on the heBERT architecture, it has been fine-tuned to distinguish between literal and metaphorical uses of 20 common Hebrew verbs, reaching 95.10% accuracy on its evaluation set, which makes it a reliable tool for Hebrew metaphor analysis.
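A minimal inference sketch is shown below, assuming the fine-tuned checkpoint is published on the Hugging Face Hub and loads with the standard transformers text-classification pipeline; the MODEL_ID placeholder, the example sentences, and the printed label names are illustrative rather than taken from the model card.

```python
# Minimal inference sketch. MODEL_ID is a placeholder: substitute the full
# Hugging Face Hub repo id of the published checkpoint.
from transformers import pipeline

MODEL_ID = "hebert-finetuned-hebrew-metaphor"  # placeholder repo id

classifier = pipeline("text-classification", model=MODEL_ID)

# Illustrative pair built around the verb "שבר" ("broke"):
# literal ("he broke the glass") vs. metaphorical ("he broke the silence").
sentences = ["הוא שבר את הכוס", "הוא שבר את השתיקה"]

for sentence, prediction in zip(sentences, classifier(sentences)):
    # Each prediction is a dict like {"label": ..., "score": ...}; the label
    # strings depend on the checkpoint's id2label mapping.
    print(sentence, "->", prediction["label"], round(prediction["score"], 3))
```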
Implementation Details
The model was fine-tuned with a learning rate of 2e-05 and the Adam optimizer. Training ran for 15 epochs with a batch size of 16, using a linear learning rate schedule. A hedged sketch of this configuration appears after the list below.
- Trained on the HebrewMetaphors dataset
- Implements the Transformer architecture with a BERT-based encoder
- Optimized for Hebrew language processing
- Relies on heBERT's Hebrew tokenizer for text preprocessing
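The stated hyperparameters map directly onto the Hugging Face TrainingArguments API. The sketch below is a hypothetical reconstruction of that configuration, not the authors' training script; the output directory name is illustrative, and dataset preparation and the Trainer call are omitted.

```python
# Hypothetical reconstruction of the stated fine-tuning configuration.
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    TrainingArguments,
)

# Start from the heBERT base checkpoint with a freshly initialized 2-label head.
tokenizer = AutoTokenizer.from_pretrained("avichr/heBERT")
model = AutoModelForSequenceClassification.from_pretrained("avichr/heBERT", num_labels=2)

training_args = TrainingArguments(
    output_dir="hebert-finetuned-hebrew-metaphor",  # illustrative output path
    learning_rate=2e-05,                 # learning rate reported in the card
    per_device_train_batch_size=16,      # batch size reported in the card
    per_device_eval_batch_size=16,
    num_train_epochs=15,                 # 15 training epochs
    lr_scheduler_type="linear",          # linear learning rate schedule
    optim="adamw_torch",                 # Adam-family optimizer
)
```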
Core Capabilities
- Metaphor detection for 20 specific Hebrew verbs
- Binary classification (metaphorical vs. literal usage; see the sketch after this list)
- High-accuracy predictions (95.10%)
- Efficient processing of Hebrew text
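For the binary decision itself, the hedged sketch below scores both classes explicitly with softmax probabilities; the MODEL_ID placeholder, the Hebrew example sentence, and the index-to-label mapping are assumptions rather than details from the model card.

```python
# Sketch of scoring both classes for a single sentence. MODEL_ID is a
# placeholder for the published checkpoint's repo id.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

MODEL_ID = "hebert-finetuned-hebrew-metaphor"  # placeholder repo id

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID)
model.eval()

# Example sentence: "הוא שבר שיא עולמי" ("he broke a world record"),
# a metaphorical use of the verb "broke".
inputs = tokenizer("הוא שבר שיא עולמי", return_tensors="pt")
with torch.no_grad():
    probs = model(**inputs).logits.softmax(dim=-1).squeeze()

# Print the probability assigned to each class, using the names stored in the
# model's own configuration (e.g. literal vs. metaphorical).
for idx, prob in enumerate(probs):
    print(model.config.id2label[idx], round(prob.item(), 3))
```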
Frequently Asked Questions
Q: What makes this model unique?
This model specializes in Hebrew metaphor detection, a niche but crucial aspect of natural language understanding. It's particularly noteworthy for its high accuracy and focus on specific verbs commonly used in metaphorical contexts in Hebrew.
Q: What are the recommended use cases?
The model is ideal for linguistic analysis, literary studies, and automated text understanding systems working with Hebrew content. It's particularly useful for applications requiring distinction between literal and metaphorical usage of common Hebrew verbs.