[Experimental model]

This model is an experiment using the frankenstein merge script from https://huggingface.co/chargoddard/llama2-22b, run with BLOCK_DIAGONAL = False.

Base model: https://huggingface.co/The-Face-Of-Goonery/Huginn-13b-FP16, with https://huggingface.co/upstage/llama-30b-instruct-2048 as the donor model.

Merging these models used 160 GB of system RAM; with enough RAM to avoid swapping, the merge runs quickly.
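The general idea of a frankenstein-style merge is to graft or blend weights from a donor model into a base model, layer by layer. A minimal sketch of that idea, with plain Python lists standing in for weight tensors (the function names, the `alpha` blend parameter, and the layer-range API are all illustrative assumptions, not the actual chargoddard script):

```python
# Hypothetical sketch of a frankenstein-style layer merge.
# This is NOT the actual merge script referenced above; weights are
# plain lists of floats standing in for real tensors.

def merge_layer(base, donor, alpha=0.5):
    """Linearly interpolate two equally-sized weight vectors."""
    return [(1 - alpha) * b + alpha * d for b, d in zip(base, donor)]

def frankenmerge(base_layers, donor_layers, graft_range, alpha=0.5):
    """Blend the donor model's layers into the base model for the
    layer indices in graft_range; other layers are kept as-is."""
    merged = list(base_layers)
    for i in graft_range:
        merged[i] = merge_layer(base_layers[i], donor_layers[i], alpha)
    return merged

# Toy example: blend only the middle layer of a 3-layer "model".
base = [[1.0, 1.0], [2.0, 2.0], [3.0, 3.0]]
donor = [[0.0, 0.0], [0.0, 0.0], [0.0, 0.0]]
result = frankenmerge(base, donor, range(1, 2))
```

In a real merge the tensors are large, which is why the process above needed on the order of 160 GB of RAM: both models' weights must be resident while the merged checkpoint is assembled.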

For the prompt template and further model information, see huginnV1.
