Mythalion-13B
Property | Value
---|---
Parameter Count | 13B
License | LLaMA2
Format | FP16
Language | English
Training Datasets | 5 datasets including PIPPA, OpenOrca, Claude Multiround Chat
What is Mythalion-13B?
Mythalion-13B is a language model created by merging Pygmalion-2 13B and MythoMax L2 13B, both built on the LLaMA2 architecture. This collaboration between PygmalionAI and Gryphe produced a model optimized for roleplay and chat applications, and it is reported to outperform its parent models in those settings.
Implementation Details
The model supports two prompting formats, Alpaca and Pygmalion, and uses a three-token role scheme (<|system|>, <|user|>, and <|model|>) for structured interactions. Built with PyTorch on the Transformers architecture, it remains compatible with text-generation-inference systems.
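To illustrate the two formats, here is a minimal sketch of prompt builders. The role tokens come from the model card; the helper functions themselves are hypothetical, and the Alpaca template shown is the standard one, which the model card may vary slightly.

```python
# Hypothetical helpers illustrating the two prompt formats Mythalion-13B accepts.

def pygmalion_prompt(system: str, turns: list[tuple[str, str]]) -> str:
    """Build a Pygmalion-format prompt from <|system|>/<|user|>/<|model|> role tokens.

    `turns` is a list of (user_message, model_reply) pairs; leave the final
    model reply empty to cue the model's next response.
    """
    prompt = f"<|system|>{system}"
    for user_msg, model_msg in turns:
        prompt += f"<|user|>{user_msg}<|model|>{model_msg}"
    return prompt

def alpaca_prompt(instruction: str) -> str:
    """Build a single-turn prompt in the standard Alpaca instruction template."""
    return (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n### Response:\n"
    )
```

For example, `pygmalion_prompt("Enter RP mode.", [("Hello!", "")])` ends with a bare `<|model|>` token, so generation continues as the character's reply.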
- FP16 tensor format for efficient computation
- Trained on five diverse datasets for comprehensive knowledge
- Supports both instruction-following and roleplay scenarios
- Built with Axolotl framework
Core Capabilities
- Advanced roleplay and chat interactions
- Multi-turn conversation handling
- Flexible prompt formatting support
- Out-of-channel information injection via system prompts
- Long-form response generation
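To make the multi-turn handling and system-prompt injection concrete, here is a sketch of assembling a chat history where world-state information is injected out-of-channel via a <|system|> segment mid-conversation. The message structure and `render` helper are illustrative assumptions, not an official API.

```python
# Illustrative sketch: a multi-turn Pygmalion-format history with
# out-of-channel information injected through a <|system|> segment.

ROLE_TOKENS = {"system": "<|system|>", "user": "<|user|>", "model": "<|model|>"}

def render(history: list[tuple[str, str]]) -> str:
    """Flatten (role, text) pairs into a single prompt string."""
    return "".join(ROLE_TOKENS[role] + text for role, text in history)

history = [
    ("system", "Enter RP mode. You are Aria, a ship's navigator."),
    ("user", "Where are we headed?"),
    ("model", "We set course for the outer reef at dawn."),
    # Out-of-channel injection: update world state without a user turn.
    ("system", "A storm has just appeared on the horizon."),
    ("user", "What's that in the sky?"),
    ("model", ""),  # empty tail cues the model to respond next
]

prompt = render(history)
```

The empty final model turn leaves the prompt ending in `<|model|>`, the same cue used for single-turn prompts, so the pattern scales to arbitrarily long conversations.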
Frequently Asked Questions
Q: What makes this model unique?
The model combines the roleplay strengths of Pygmalion-2 with MythoMax's general text-generation ability, and its support for both prompting formats provides flexibility across use cases.
Q: What are the recommended use cases?
The model is designed for fictional writing and entertainment purposes, excelling in roleplay and chat scenarios. Note that it is not fine-tuned for safety or factual accuracy, and it should not be used for critical applications requiring verified information.