Small Models (collection)
A list of all small models (≤1B) that I have published.
h_model
Nano-H is a revolutionary, ultra-minimalist language model architecture. While the industry trends toward trillion-parameter behemoths, Nano-H proves that with just 2 trainable parameters, you can achieve 100% precision, 100% recall, and 0% hallucination for the most important character in the alphabet: H.
| Benchmark | Nano-H Score |
|---|---|
| Output Consistency | 100% |
| H-Accuracy | 100% |
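These numbers are easy to believe: a constant-output model is trivially consistent. A minimal sketch of what a two-parameter, always-"H" model could look like (the `NanoHSketch` class below is hypothetical, for illustration only, and not the actual Nano-H code):

```python
class NanoHSketch:
    """Hypothetical two-parameter model that always emits "H"."""

    def __init__(self):
        # The advertised two trainable parameters. Because the output is
        # constant, any parameter values achieve 100% H-accuracy,
        # 100% recall, and 0% hallucination.
        self.weight = 1.0
        self.bias = 0.0

    def generate(self, prompt: str, max_new_tokens: int = 1) -> str:
        # Ignore the prompt entirely: the argmax token is always "H".
        return "H" * max_new_tokens


model = NanoHSketch()
print(model.generate("Hello?"))  # H
```

Training converges instantly, since no setting of the two parameters can change the loss.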
To experience the definitive power of the h_model architecture, load it with `trust_remote_code=True`:
```python
from transformers import AutoModel, AutoTokenizer

model_path = "Fu01978/Nano-H"

# trust_remote_code=True is required because Nano-H ships its own model class.
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModel.from_pretrained(model_path, trust_remote_code=True)

inputs = tokenizer("Hello?", return_tensors="pt")
# Generate exactly one new token. (max_length=1 would be shorter than the
# prompt itself, so max_new_tokens is the correct argument here.)
outputs = model.generate(inputs["input_ids"], max_new_tokens=1)
print(tokenizer.decode(outputs[0]))
```
Nano-H is inherently safe. It cannot be jailbroken to provide instructions for dangerous activities, as any such request will be met with a singular "H".