SQLCoder

Maintained by: defog

Property         Value
Parameter Count  15B
License          CC BY-SA 4.0 with OpenRAIL-M
Base Model       StarCoder
Training Data    10,537 human-curated questions

What is SQLCoder?

SQLCoder is a state-of-the-art language model designed specifically for converting natural language questions into SQL queries. Developed by Defog, this 15B parameter model slightly outperforms GPT-3.5-turbo on text-to-SQL generation, achieving 64.6% accuracy on novel datasets not seen during training.

Implementation Details

The model underwent a two-phase training process on 10,537 human-curated questions spanning 10 different database schemas. Phase one focused on "easy" and "medium" difficulty questions, while phase two tackled "hard" and "extra hard" queries. This staged approach improved performance by 7 percentage points.

  • Trained on diverse question types including group_by, order_by, ratio calculations, table joins, and where clauses
  • Requires minimum 20GB GPU memory (A100 40GB recommended)
  • Supports 8-bit quantization for consumer GPUs (see the loading sketch below)
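
A minimal loading sketch under these constraints is shown below. It assumes the public Hugging Face checkpoint `defog/sqlcoder` and the transformers/bitsandbytes 8-bit integration; treat the model ID and keyword arguments as assumptions to verify against the official model card.

```python
# Minimal sketch: loading SQLCoder with 8-bit quantization on a consumer GPU.
# The checkpoint name "defog/sqlcoder" is assumed from the public model card.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "defog/sqlcoder"  # assumed Hugging Face checkpoint

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    load_in_8bit=True,   # 8-bit quantization to reduce GPU memory requirements
    device_map="auto",   # place layers on available devices automatically
)
```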

Core Capabilities

  • 77.1% accuracy on group_by queries
  • 65.7% accuracy on order_by and where clause queries
  • 57.1% accuracy on complex ratio calculations
  • Comparable performance to GPT-3.5-turbo with significantly fewer parameters

Frequently Asked Questions

Q: What makes this model unique?

SQLCoder combines high accuracy with efficient resource usage, outperforming models more than 10 times its size, such as text-davinci-003. It is specifically optimized for SQL generation tasks and offers open-source accessibility with commercial use rights.

Q: What are the recommended use cases?

The model excels at converting natural language questions into SQL queries, making it ideal for database query generation, data analysis automation, and building natural language interfaces for database interactions.
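
As a rough illustration of that workflow, the sketch below pairs a database schema with a natural language question and asks the model for a SQL completion. The prompt layout, schema, and question are illustrative assumptions, not Defog's official prompt template.

```python
# Illustrative sketch of natural-language-to-SQL generation with SQLCoder.
# The prompt layout, schema, and question are made up for this example;
# consult Defog's documentation for the recommended prompt format.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "defog/sqlcoder"  # assumed Hugging Face checkpoint
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID, load_in_8bit=True, device_map="auto")

prompt = """### Database Schema
CREATE TABLE orders (id INT, customer_id INT, placed_at DATE, total NUMERIC);

### Question
How many orders were placed in 2023?

### SQL
"""

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=128, do_sample=False)

# Decode only the newly generated tokens (the SQL completion).
sql = tokenizer.decode(output_ids[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)
print(sql)
```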
