MolCrawl/genome_sequence
A GPT-2 large (774M parameters) foundation model pre-trained on human genomic DNA sequences from the GRCh38 reference assembly.

Pre-training corpus: GRCh38 human genome reference assembly (https://www.ncbi.nlm.nih.gov/assembly/GCF_000001405.26/)
Model Type: gpt2
Data Type: DNA/Genome
Training Date: 2026-04-24
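Reference assemblies like GRCh38 are distributed as FASTA files, so a pre-training pipeline has to flatten FASTA records into plain sequence strings first. The sketch below is a minimal, hypothetical illustration of that step (pure Python, with an inline demo snippet rather than real GRCh38 data); it is not the MolCrawl repository's actual preparation script.

```python
def parse_fasta(lines):
    """Yield (header, sequence) pairs from FASTA-formatted lines."""
    header, chunks = None, []
    for line in lines:
        line = line.strip()
        if line.startswith(">"):
            if header is not None:
                yield header, "".join(chunks)
            header, chunks = line[1:], []
        elif line:
            chunks.append(line.upper())  # normalize soft-masked lowercase bases
    if header is not None:
        yield header, "".join(chunks)

# Demo with an inline FASTA snippet (not real GRCh38 data)
fasta = """>chr_demo description
atcgatcgat
CGATCGAT
>chr_demo2
GGGCCC
""".splitlines()

records = dict(parse_fasta(fasta))
print(records["chr_demo description"])  # ATCGATCGATCGATCGAT
```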
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

model = AutoModelForCausalLM.from_pretrained("kojima-lab/molcrawl-genome-sequence-gpt2-large")
tokenizer = AutoTokenizer.from_pretrained("kojima-lab/molcrawl-genome-sequence-gpt2-large")

# Generate a DNA/genome sequence continuation
prompt = "ATCGATCGATCGATCGATCGATCGATCGATCG"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    output_ids = model.generate(
        **inputs,
        max_new_tokens=50,
        do_sample=True,
        temperature=0.8,
        eos_token_id=None,  # HF config.json has legacy eos_token_id=0; disable early stop
        pad_token_id=0,
    )

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```
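Sampled output is not guaranteed to be a clean nucleotide string (decoding may reintroduce whitespace or non-canonical characters). The helpers below are a small post-processing sketch of my own, not part of the MolCrawl release: they filter generated text to canonical A/C/G/T bases and compute GC content as a quick sanity check.

```python
CANONICAL = set("ACGT")

def clean_dna(text: str) -> str:
    """Keep only canonical A/C/G/T characters, uppercased."""
    return "".join(c for c in text.upper() if c in CANONICAL)

def gc_content(seq: str) -> float:
    """Fraction of G/C bases in a cleaned sequence."""
    if not seq:
        return 0.0
    return (seq.count("G") + seq.count("C")) / len(seq)

# Demo on a deliberately noisy string (stands in for decoded model output)
sample = "ATCG atcgNNGC-"
cleaned = clean_dna(sample)
print(cleaned)                         # ATCGATCGGC
print(round(gc_content(cleaned), 2))   # 0.6
```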
Training pipeline, configuration files, and data preparation scripts are available in the MolCrawl GitHub repository: https://github.com/mmai-framework-lab/MolCrawl
This model is released under the Apache-2.0 license.
If you use this model, please cite:
```bibtex
@misc{molcrawl_genome_sequence_gpt2_large,
  title     = {molcrawl-genome-sequence-gpt2-large},
  author    = {{RIKEN}},
  year      = {2026},
  publisher = {Hugging Face},
  url       = {https://huggingface.co/kojima-lab/molcrawl-genome-sequence-gpt2-large}
}
```