NGIML Model Card
Inference
NGIML performs single-image forgery localization from a pretrained checkpoint and an input RGB image.
Checkpoints
Pretrained checkpoints are hosted on Hugging Face:
CASIA2 / Extended Models
- CASIA2-EffNet-42.pt (110 MB)
- CASIA2-EffNet+Noise-42.pt (172 MB)
- CASIA2-EffNet+Swin-42.pt (593 MB)
- CASIA2-Full-42.pt (618 MB)
- CASIA2-Full-4.pt (618 MB)
- CASIA2-Full-420.pt (618 MB)
- CASIA2-Full(mbconv)-42.pt (611 MB)
- CASIA2-Swin-42.pt (495 MB)
- CASIA2-Swin+Noise-42.pt (556 MB)
Additional Datasets / Models
- CCC-Full-42.pt (618 MB)
- TampCOCO-Full-42.pt (618 MB)
Run Inference
Recommended: Google Colab
The easiest way to test the model is through Colab:
- Google Colab: Open infer.ipynb in Colab
This is the recommended path for quick testing because the notebook is already set up for checkpoint-based inference.
Local CLI
To run inference locally, clone the repository:
- GitHub: juhenes/ngiml-infer
Install the dependencies:

```
pip install -r requirements.txt
```
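Forgery-localization models generally produce a per-pixel score map that is thresholded into a binary mask. As a minimal sketch of that post-processing step (the actual NGIML inference entry point lives in the notebook and repository above; the function names and the assumption that the model emits per-pixel logits are mine, not from the source):

```python
import numpy as np


def logits_to_mask(logits: np.ndarray, threshold: float = 0.5) -> np.ndarray:
    """Convert per-pixel forgery logits (H, W) into a binary localization mask.

    Hypothetical helper: assumes the checkpoint's model outputs raw logits,
    which we squash with a sigmoid and threshold.
    """
    probs = 1.0 / (1.0 + np.exp(-logits))  # elementwise sigmoid
    return (probs >= threshold).astype(np.uint8)


def forged_fraction(mask: np.ndarray) -> float:
    """Fraction of pixels flagged as forged, a quick sanity check on a prediction."""
    return float(mask.mean())
```

For example, a 2x2 logit map with one strongly negative entry yields a mask with three forged pixels, so `forged_fraction` returns 0.75. The real pipeline would first load a checkpoint (e.g. with `torch.load`) and run the image through the network to obtain the logit map.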