CoEdIT-Large

Maintained by Grammarly

Property          Value
Parameter Count   783M
Base Model        google/flan-t5-large
License           CC-BY-NC-4.0
Paper             arXiv:2305.09857

What is CoEdIT-Large?

CoEdIT-Large is a text editing model from Grammarly built around task-specific instruction tuning. Fine-tuned from FLAN-T5-large, this 783M-parameter model handles a range of text revision tasks, including grammar correction, coherence improvement, simplification, and style adaptation.

Implementation Details

The model uses the T5 encoder-decoder architecture and was trained on seven datasets, including facebook/asset, wi_locness, and GEM/wiki_auto_asset_turk. Its weights are stored in F32, and it performs text-to-text generation with the PyTorch and Transformers libraries.

  • Built on FLAN-T5-large architecture
  • Supports text-generation-inference endpoints
  • Implements task-specific instruction tuning
  • Trained on multiple high-quality datasets
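Because the model follows the standard T5 text-to-text interface, it can be loaded with the Transformers library. Below is a minimal inference sketch; the `edit_text` helper is an illustrative wrapper (not part of the model's API), and loading the checkpoint downloads roughly 3 GB of weights, so the actual call is placed under a main guard.

```python
# Minimal sketch of running CoEdIT-Large with Hugging Face Transformers.
# Requires `pip install transformers torch`; first load downloads the weights.

def edit_text(model, tokenizer, instruction: str, max_length: int = 256) -> str:
    """Run one instruction-prefixed edit request through the model."""
    input_ids = tokenizer(instruction, return_tensors="pt").input_ids
    output_ids = model.generate(input_ids, max_length=max_length)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

if __name__ == "__main__":
    from transformers import AutoTokenizer, T5ForConditionalGeneration

    tokenizer = AutoTokenizer.from_pretrained("grammarly/coedit-large")
    model = T5ForConditionalGeneration.from_pretrained("grammarly/coedit-large")

    # The edit task is expressed as a natural-language instruction prefix.
    print(edit_text(
        model, tokenizer,
        "Fix grammatical errors in this sentence: "
        "When I grow up, I start to understand what he said is quite right.",
    ))
```

The same pattern works for any of the supported tasks; only the instruction prefix changes.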

Core Capabilities

  • Grammar error correction
  • Text coherence improvement
  • Content simplification
  • Paraphrasing
  • Formality adjustment
  • Neutralization of tone
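Each of the capabilities above is selected at inference time through a natural-language instruction prepended to the input. The mapping below is an illustrative sketch: the task keys and exact phrasings are my own examples, and the CoEdIT training data uses several paraphrased verbalizers per task.

```python
# Illustrative mapping from editing task to a CoEdIT-style instruction prefix.
# Task names and phrasings are examples, not an exhaustive or official list.
TASK_PREFIXES = {
    "gec":        "Fix grammatical errors in this sentence:",
    "coherence":  "Make the text more coherent:",
    "simplify":   "Rewrite to make this easier to understand:",
    "paraphrase": "Paraphrase this:",
    "formalize":  "Write this more formally:",
    "neutralize": "Write in a more neutral way:",
}

def build_instruction(task: str, text: str) -> str:
    """Prepend the task's instruction prefix to the input text."""
    return f"{TASK_PREFIXES[task]} {text}"

print(build_instruction("gec", "She go to school every day."))
# -> Fix grammatical errors in this sentence: She go to school every day.
```

The resulting string is what gets tokenized and passed to the model, so switching tasks requires no change to the inference code itself.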

Frequently Asked Questions

Q: What makes this model unique?

CoEdIT-Large stands out for its task-specific instruction tuning approach, allowing it to handle various text editing tasks through natural language instructions. It's particularly effective at understanding context and maintaining the original meaning while making necessary improvements.

Q: What are the recommended use cases?

The model is ideal for applications requiring sophisticated text editing, including academic writing assistance, content optimization, and professional documentation improvement. It excels at tasks ranging from basic grammar correction to complex style transformations.
