NGIML Model Card

Inference

NGIML performs single-image forgery localization: given a pretrained checkpoint and an input RGB image, it predicts a mask marking the manipulated regions.
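The overall flow can be sketched as below. The network here is a stand-in, not NGIML's actual architecture, and the input size and preprocessing are assumptions; it only illustrates the image-in, per-pixel-mask-out pattern.

```python
# Illustrative sketch of single-image forgery localization (NOT NGIML's real
# network): a 3-channel RGB image goes in, a 1-channel tamper mask comes out.
import torch
import torch.nn as nn

class DummyLocalizer(nn.Module):
    """Stand-in for the NGIML model: maps an RGB image to a probability mask."""
    def __init__(self):
        super().__init__()
        self.head = nn.Conv2d(3, 1, kernel_size=3, padding=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Sigmoid gives a per-pixel tampering probability in [0, 1].
        return torch.sigmoid(self.head(x))

model = DummyLocalizer().eval()
image = torch.rand(1, 3, 256, 256)  # batch of one RGB image (assumed size)
with torch.no_grad():
    mask = model(image)            # shape (1, 1, 256, 256)
```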


Checkpoints

Pretrained checkpoints are hosted on Hugging Face:

juhenes/ngiml

CASIA2 / Extended Models

  • CASIA2-EffNet-42.pt (110 MB)
  • CASIA2-EffNet+Noise-42.pt (172 MB)
  • CASIA2-EffNet+Swin-42.pt (593 MB)
  • CASIA2-Full-42.pt (618 MB)
  • CASIA2-Full-4.pt (618 MB)
  • CASIA2-Full-420.pt (618 MB)
  • CASIA2-Full(mbconv)-42.pt (611 MB)
  • CASIA2-Swin-42.pt (495 MB)
  • CASIA2-Swin+Noise-42.pt (556 MB)

Additional Datasets / Models

  • CCC-Full-42.pt (618 MB)
  • TampCOCO-Full-42.pt (618 MB)
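Any of the checkpoints above can be fetched programmatically with the `huggingface_hub` package; the sketch below assumes it is installed and uses `CASIA2-Full-42.pt` only as an example filename from the list.

```python
# Sketch: fetch a pretrained NGIML checkpoint from the Hugging Face Hub.
from huggingface_hub import hf_hub_download

REPO_ID = "juhenes/ngiml"
CHECKPOINT = "CASIA2-Full-42.pt"  # any filename from the lists above

def fetch_checkpoint(filename: str = CHECKPOINT) -> str:
    """Download the checkpoint (or reuse the cached copy) and return its local path."""
    return hf_hub_download(repo_id=REPO_ID, filename=filename)

if __name__ == "__main__":
    print(fetch_checkpoint())
```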

Run Inference

Recommended: Google Colab

The easiest way to test the model is the provided Colab notebook. This is the recommended path for quick testing because the notebook is already set up for checkpoint-based inference.


Local CLI

To run the project locally, clone the repository and install the dependencies:

pip install -r requirements.txt
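With the dependencies installed, a downloaded `.pt` file can be loaded as a standard PyTorch checkpoint. This is a hedged sketch: the repository's actual entry point and checkpoint format may differ, and the commented filename is just an example from the list above.

```python
# Sketch: load a local NGIML checkpoint, assuming a standard PyTorch .pt file.
import torch

def load_checkpoint(path: str, device: str = "cpu"):
    # map_location lets CPU-only machines load weights trained on a GPU.
    return torch.load(path, map_location=device)

# usage (example filename from the checkpoint list):
# state = load_checkpoint("CASIA2-Full-42.pt")
```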