Favorite Uncensored Drivers
Collection
These models have no refusals and require no jailbreaks • 24 items
⚠️ Warning: This model can produce narratives and RP that contain violent and graphic erotic content. Adjust your system prompt accordingly, and use the Llama 3 chat template.
A custom-built Llama 3.1 8B MoE (Mixture of Experts) merge that combines Morpheus v1 with Assistant Pepe. The merge is surprisingly intelligent, detailed, and based. Scores ~15K at Q0 Bench. Fully uncensored and almost as fast as a dense 8B. It also appears to have strong context retrieval: asked for a summary at 16K context, it worked flawlessly (higher not tested yet).
If you want to merge custom Llama MoE you can add these scripts to your mergekit environment:
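The scripts themselves aren't reproduced here, but for orientation, a mergekit MoE merge is driven by a YAML config along these lines. This is a minimal sketch: the model paths, prompts, and expert count are placeholders, not the actual recipe used for this merge.

```yaml
# Sketch of a mergekit-moe config (paths and prompts are hypothetical)
base_model: meta-llama/Meta-Llama-3.1-8B-Instruct
gate_mode: hidden        # route tokens by hidden-state similarity to the prompts
dtype: bfloat16
experts:
  - source_model: path/to/morpheus-v1
    positive_prompts:
      - "write a story"
  - source_model: path/to/assistant-pepe
    positive_prompts:
      - "answer this question"
```

A config like this is typically run with something like `mergekit-moe config.yaml ./merged-model`; check the mergekit docs for the exact invocation your version supports.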
Then assign the num_experts_per_tok in config.json (or the config.yaml)
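Setting `num_experts_per_tok` is a one-line edit to the merged model's `config.json`. A minimal Python sketch (the path and the value 2 are assumptions; use your model directory and the expert count your merge was built for):

```python
import json
import os
import tempfile

# Stand-in for a merged model directory; point `path` at your real config.json.
path = os.path.join(tempfile.mkdtemp(), "config.json")
with open(path, "w") as f:
    json.dump({"model_type": "mixtral"}, f)  # placeholder config contents

# Load, set the number of experts activated per token, and write back.
with open(path) as f:
    cfg = json.load(f)
cfg["num_experts_per_tok"] = 2  # hypothetical value; match your merge
with open(path, "w") as f:
    json.dump(cfg, f, indent=2)
```

The same key can be set in the merge's `config.yaml` instead, if your workflow regenerates `config.json` from it.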
(bolded values are KoboldCpp non-defaults)