# Midnight-Miqu-70B-v1.0
| Property | Value |
|---|---|
| Parameter Count | 70B |
| Model Type | Language Model (SLERP Merge) |
| Context Length | 32K (up to 64K with alpha_rope) |
| License | Personal Use Only |
## What is Midnight-Miqu-70B-v1.0?
Midnight-Miqu-70B is a language model created through a SLERP merge of miqu-1-70b-sf and Midnight-Rose-70B. The merge balances the long-context capability of Miqu with the creative strengths of Midnight Rose, and is tuned specifically for roleplaying and storytelling applications.
## Implementation Details
The merge uses SLERP with tuned interpolation weights, preserving the first and last layers of Miqu while blending the middle layers with Midnight Rose. The model supports a 32K-token context with alpha_rope=1, and can potentially handle up to 64K tokens with alpha_rope=2.5.
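To make the merge technique concrete, here is a minimal sketch of spherical linear interpolation (SLERP) between two weight vectors in pure Python. This is an illustration of the math, not the actual merge tooling used to build this model; the function name and the LERP fallback threshold are this sketch's own choices.

```python
import math

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight vectors.

    t=0 returns v0, t=1 returns v1; intermediate t follows the arc on the
    hypersphere rather than the straight line a plain LERP would take,
    which tends to preserve the magnitude structure of the weights.
    """
    norm0 = math.sqrt(sum(x * x for x in v0))
    norm1 = math.sqrt(sum(x * x for x in v1))
    u0 = [x / norm0 for x in v0]
    u1 = [x / norm1 for x in v1]
    # Angle between the normalized vectors, clamped for numeric safety
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(u0, u1))))
    theta = math.acos(dot)
    if theta < eps:  # nearly parallel: fall back to plain LERP
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]
    s0 = math.sin((1 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return [s0 * a + s1 * b for a, b in zip(v0, v1)]

# t=0 recovers the first model's tensor; t=0.5 blends along the arc
print(slerp(0.0, [1.0, 0.0], [0.0, 1.0]))
print(slerp(0.5, [1.0, 0.0], [0.0, 1.0]))
```

In the real merge, a per-layer interpolation schedule controls `t`, which is how the first and last layers can stay pinned to Miqu while the middle layers blend.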
- Advanced context handling capabilities
- Optimized for creative writing and roleplay
- Supports multiple instruction formats (Vicuna, Mistral, ChatML)
- Available in various quantization formats (GGUF, GPTQ, Exllama2)
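Since the model accepts several instruction formats, here is a small sketch of rendering a chat as a Vicuna-style prompt string. The template follows the common Vicuna 1.1 convention (`USER:` / `ASSISTANT:` turns after a system line); the helper name is hypothetical, and the exact template your inference backend expects should be verified against its documentation.

```python
def vicuna_prompt(system, turns):
    """Render a chat as a Vicuna-style prompt string.

    `turns` is a list of (role, text) pairs with role "user" or
    "assistant". Template follows the common Vicuna 1.1 convention;
    confirm against your backend before relying on it.
    """
    parts = [system.strip()]
    for role, text in turns:
        tag = "USER" if role == "user" else "ASSISTANT"
        parts.append(f"{tag}: {text.strip()}")
    parts.append("ASSISTANT:")  # trailing cue for the model to respond
    return "\n".join(parts)

prompt = vicuna_prompt(
    "You are a creative storytelling assistant.",
    [("user", "Describe a moonlit harbor in two sentences.")],
)
print(prompt)
```

The same structure adapts to Mistral (`[INST] ... [/INST]`) or ChatML (`<|im_start|>role ... <|im_end|>`) by swapping the turn markers.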
## Core Capabilities
- Extended context length processing
- Detailed character interaction and roleplay
- Creative storytelling and narrative generation
- Customizable sampling parameters for optimal output
- Support for internal thought representation and spatial awareness
## Frequently Asked Questions
Q: What makes this model unique?
The model uniquely combines the long-context capabilities of Miqu with the creative strengths of Midnight Rose, offering an uncensored platform specifically optimized for roleplay and storytelling with extensive context understanding.
Q: What are the recommended use cases?
The model excels in roleplaying scenarios, creative writing, and storytelling applications. It's particularly well-suited for character-based interactions and narrative generation, with support for both brief and extended contexts.