molcrawl-genome-sequence-dnabert2-large

Model Description

A large DNABERT-2-style masked language foundation model (~0.6B parameters) pre-trained on human genome DNA sequences from the GRCh38 reference assembly.

Datasets

Usage

from transformers import AutoModelForMaskedLM, AutoTokenizer
import torch

model = AutoModelForMaskedLM.from_pretrained("kojima-lab/molcrawl-genome-sequence-dnabert2-large")
tokenizer = AutoTokenizer.from_pretrained("kojima-lab/molcrawl-genome-sequence-dnabert2-large")

# Predict masked DNA token
# Use tokenizer.mask_token instead of hardcoded "[MASK]":
# BERT-style tokenizers vary ("[MASK]", "<mask>", etc.)
if tokenizer.mask_token is None:
    raise ValueError("This tokenizer has no mask_token; masked LM inference is not supported.")
prompt = "ATCGATCG{MASK}ATCGATCG".replace("{MASK}", tokenizer.mask_token)
inputs = tokenizer(prompt, return_tensors="pt")
mask_index = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]

with torch.no_grad():
    outputs = model(**inputs)
logits = outputs.logits

predicted_token_id = logits[0, mask_index].argmax(dim=-1)
predicted_token = tokenizer.decode(predicted_token_id)
result = prompt.replace(tokenizer.mask_token, predicted_token)
print(f"Predicted: {result}")
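Beyond the single best token, it is often informative to rank the top-k candidates at the mask position. The sketch below shows the torch.topk pattern on a dummy vocabulary and random logits so it runs standalone; in practice you would apply the same calls to the `logits[0, mask_index]` tensor produced by the snippet above (the dummy `vocab` mapping here is an illustrative assumption, not the model's real vocabulary).

```python
import torch

# Dummy stand-ins for the real vocabulary and mask-position logits
# (in practice, use logits[0, mask_index] from the snippet above).
vocab = {0: "A", 1: "T", 2: "C", 3: "G", 4: "[MASK]"}
torch.manual_seed(0)
mask_logits = torch.randn(len(vocab))  # shape: (vocab_size,)

# Convert logits to probabilities and take the k most likely tokens.
probs = torch.softmax(mask_logits, dim=-1)
top = torch.topk(probs, k=3)

for p, idx in zip(top.values.tolist(), top.indices.tolist()):
    print(f"{vocab[idx]}: {p:.3f}")
```

With the real model, replace `mask_logits` with `logits[0, mask_index].squeeze(0)` and map indices back to strings with `tokenizer.decode`.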

Source Code

Training pipeline, configuration files, and data preparation scripts are available in the MolCrawl GitHub repository: https://github.com/mmai-framework-lab/MolCrawl

License

This model is released under the Apache 2.0 license.

Citation

If you use this model, please cite:

@misc{molcrawl_genome_sequence_dnabert2_large,
  title={molcrawl-genome-sequence-dnabert2-large},
  author={{RIKEN}},
  year={2026},
  publisher={{Hugging Face}},
  url={https://huggingface.co/kojima-lab/molcrawl-genome-sequence-dnabert2-large}
}