knowledge-graph-nlp

Maintained by: vishnun

Property          Value
Parameter Count   66.4M
License           Apache 2.0
Base Model        DistilBERT-base-uncased
F1 Score          0.8849
Accuracy          0.9453

What is knowledge-graph-nlp?

knowledge-graph-nlp is a token classification model built on the DistilBERT architecture and fine-tuned for knowledge graph construction. On its evaluation dataset it reaches 89.88% precision and 87.15% recall.
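
The reported figures are internally consistent: the F1 score in the table above is the harmonic mean of this precision and recall, as the quick check below confirms.

```python
# Sanity check: F1 is the harmonic mean of precision and recall.
precision = 0.8988
recall = 0.8715

f1 = 2 * precision * recall / (precision + recall)
print(f"F1 = {f1:.4f}")  # prints F1 = 0.8849, matching the table
```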

Implementation Details

The model was trained with the Adam optimizer at a learning rate of 2e-05 over 4 epochs, using a batch size of 16 for both training and evaluation, and reached a final training loss of 0.1336. A fine-tuning sketch follows the list below.

  • Architecture: DistilBERT-based token classifier
  • Training Framework: PyTorch with Transformers library
  • Model Size: 66.4M parameters with F32 tensor type
  • Dataset: vishnun/NLP-KnowledgeGraph
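
Below is a minimal sketch of a comparable fine-tuning setup with the Transformers Trainer, using the hyperparameters reported above. The dataset column names ("tokens", "tags"), the split names, and the label count are assumptions; the card does not document the dataset schema.

```python
from datasets import load_dataset
from transformers import (
    AutoModelForTokenClassification,
    AutoTokenizer,
    DataCollatorForTokenClassification,
    Trainer,
    TrainingArguments,
)

# Dataset schema is an assumption: columns named "tokens" (pre-split
# words) and "tags" (integer labels per word) are the common convention.
dataset = load_dataset("vishnun/NLP-KnowledgeGraph")
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")

def tokenize_and_align_labels(examples):
    # Sub-word tokenize the pre-split words, assign each word's tag to
    # its first sub-token, and mask the remaining sub-tokens with -100
    # so the loss ignores them.
    tokenized = tokenizer(
        examples["tokens"], truncation=True, is_split_into_words=True
    )
    labels = []
    for i, tags in enumerate(examples["tags"]):
        word_ids = tokenized.word_ids(batch_index=i)
        previous, ids = None, []
        for word_id in word_ids:
            if word_id is None or word_id == previous:
                ids.append(-100)
            else:
                ids.append(tags[word_id])
            previous = word_id
        labels.append(ids)
    tokenized["labels"] = labels
    return tokenized

encoded = dataset.map(tokenize_and_align_labels, batched=True)

# num_labels is an assumption; the card does not list the tag set.
model = AutoModelForTokenClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=3
)

args = TrainingArguments(
    output_dir="knowledge-graph-nlp",
    learning_rate=2e-5,              # as reported
    num_train_epochs=4,              # as reported
    per_device_train_batch_size=16,  # as reported
    per_device_eval_batch_size=16,   # as reported
    lr_scheduler_type="linear",      # linear LR schedule per the card
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=encoded["train"],
    eval_dataset=encoded["test"],    # split name is an assumption
    data_collator=DataCollatorForTokenClassification(tokenizer),
    tokenizer=tokenizer,
)
trainer.train()
```

Masking continuation sub-tokens with -100 is the standard Transformers convention for token classification, so the loss is computed once per word rather than once per sub-token.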

Core Capabilities

  • Token Classification for Knowledge Graph Construction
  • High-Accuracy Entity Recognition (94.53%)
  • Optimized Performance with Linear Learning Rate Scheduling
  • Support for English Language Processing
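
As a usage sketch, the model can be served through the Transformers token-classification pipeline. The repository ID below is an assumption based on the card's name and maintainer; substitute the actual Hugging Face path.

```python
from transformers import pipeline

# Repository ID is assumed from the card's name and maintainer.
tagger = pipeline(
    "token-classification",
    model="vishnun/knowledge-graph-nlp",
    aggregation_strategy="simple",  # merge sub-word pieces into spans
)

for entity in tagger("Marie Curie discovered radium in Paris."):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```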

Frequently Asked Questions

Q: What makes this model unique?

The model pairs high precision (89.88%) with comparable recall (87.15%), which makes it effective for knowledge graph extraction while the DistilBERT backbone keeps inference computationally light.

Q: What are the recommended use cases?

This model is best suited for applications requiring token classification in knowledge graph construction, entity recognition, and text analysis tasks where high accuracy and balanced precision-recall metrics are crucial.
