molcrawl-genome-sequence-gpt2-large

Model Description

A GPT-2 large (774M-parameter) foundation model pre-trained on human genome DNA sequences from the GRCh38 reference assembly.

Datasets

Usage

from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

model = AutoModelForCausalLM.from_pretrained("kojima-lab/molcrawl-genome-sequence-gpt2-large")
tokenizer = AutoTokenizer.from_pretrained("kojima-lab/molcrawl-genome-sequence-gpt2-large")

# Generate DNA/genome sequence
prompt = "ATCGATCGATCGATCGATCGATCGATCGATCG"
inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad():
    output_ids = model.generate(
        **inputs,
        max_new_tokens=50,
        do_sample=True,
        temperature=0.8,
        eos_token_id=None,  # HF config.json has legacy eos_token_id=0; disable early stop
        pad_token_id=0,
    )
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
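Generated sequences can be sanity-checked with simple composition statistics before downstream use. A minimal sketch computing GC content (the fraction of G/C bases, a common quick check on DNA sequences); the example sequence here is an illustrative placeholder, not actual model output:

```python
def gc_content(seq: str) -> float:
    """Fraction of G/C bases in a DNA sequence (case-insensitive)."""
    seq = seq.upper()
    if not seq:
        return 0.0
    return sum(base in "GC" for base in seq) / len(seq)

# In practice, pass the decoded model output instead of this placeholder.
print(gc_content("ATCGATCGGGCC"))  # → 0.666...
```

The human genome averages roughly 41% GC, so generations that drift far from that range may indicate degenerate sampling (e.g., temperature set too low).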

Source Code

Training pipeline, configuration files, and data preparation scripts are available in the MolCrawl GitHub repository: https://github.com/mmai-framework-lab/MolCrawl

License

This model is released under the Apache-2.0 license.

Citation

If you use this model, please cite:

@misc{molcrawl_genome_sequence_gpt2_large,
  title={molcrawl-genome-sequence-gpt2-large},
  author={{RIKEN}},
  year={2026},
  publisher={{Hugging Face}},
  url={{https://huggingface.co/kojima-lab/molcrawl-genome-sequence-gpt2-large}}
}