# CrisisPulse — BERT (fine-tuned)
A compact BERT model fine-tuned to classify tweets as Disaster or Not Disaster. It is a lightweight disaster-intent classifier optimized for short-form social media text, and designed for fast inference, containerized deployment, and reproducible research.
IMPORTANT NOTE: Class imbalance was INTENTIONALLY preserved to reflect real-world disaster distribution. See training notebook for detailed rationale.
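Because the imbalance is preserved, raw accuracy can look deceptively high; per-class (or macro) F1 is a more honest headline metric. A minimal illustration with toy labels (not the actual dataset), showing a majority-class classifier scoring 90% accuracy but a much lower macro F1:

```python
def f1(y_true, y_pred, positive):
    # Precision, recall, and F1 for one class, computed from scratch.
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0

# Toy imbalanced labels (90% Not Disaster) and a classifier that always predicts 0.
y_true = [0] * 9 + [1]
y_pred = [0] * 10

accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
macro_f1 = (f1(y_true, y_pred, 0) + f1(y_true, y_pred, 1)) / 2

print(accuracy)  # 0.9 — looks strong, yet the Disaster class is never found
print(macro_f1)  # ≈ 0.47 — exposes the missed minority class
```

In practice the same numbers come from `sklearn.metrics.f1_score(..., average='macro')`; the hand-rolled version above just makes the arithmetic explicit.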
- Model: `bert-base-uncased`
- Author: sakibalfahim
- Uploaded: 2025-12-22
## Label mapping

- 0 → Not Disaster
- 1 → Disaster
## How to use (example)
```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

repo_id = 'sakibalfahim/CrisisPulse'
token = 'hf_xxx'  # replace with a secure token, or authenticate via `huggingface-cli login`

tokenizer = AutoTokenizer.from_pretrained(repo_id, token=token)
model = AutoModelForSequenceClassification.from_pretrained(repo_id, token=token)
model.eval()

inputs = tokenizer('Massive flood reported in downtown area', return_tensors='pt', truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
pred = int(logits.argmax(-1)[0].item())
print('Prediction:', {0: 'Not Disaster', 1: 'Disaster'}[pred])
```
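The example above returns only the argmax label. For downstream filtering you may want a probability as well, e.g. to act only on high-confidence Disaster calls. In torch this is `logits.softmax(-1)`; a pure-Python sketch of the same step, using made-up logit values for one tweet:

```python
import math

def softmax(logits):
    # Numerically stable softmax over a list of raw scores.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical logits for one tweet, ordered [Not Disaster, Disaster].
probs = softmax([-1.2, 2.3])
THRESHOLD = 0.5  # tune on validation data for your own precision/recall trade-off
label = 'Disaster' if probs[1] >= THRESHOLD else 'Not Disaster'
print(label, round(probs[1], 3))
```

Raising the threshold above 0.5 trades recall for precision, which may be preferable when false alarms are costly.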
## Training summary
- Base model: `bert-base-uncased`
- Training environment: NVIDIA GPU (PyTorch, Transformers)
- Saved artifacts: uploaded to this repository
## Intended use & limitations
Intended for research and demo use. Validate on your own domain before any high-stakes deployment, and be cautious with domain shift, sarcasm, and non-English text.
## Reproducibility

See the training notebook for the exact preprocessing, hyperparameters, and random seed.
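If you rerun training yourself, seed every random-number generator your pipeline touches before any data shuffling or weight initialization. A minimal sketch of the idea using the standard library (the notebook's actual seed value is not stated here; a full training run would also seed `numpy` and `torch` the same way):

```python
import random

SEED = 42  # hypothetical value — see the notebook for the seed actually used

def set_seed(seed: int) -> None:
    # Seed the stdlib RNG; a real training script would additionally call
    # np.random.seed(seed) and torch.manual_seed(seed).
    random.seed(seed)

set_seed(SEED)
a = [random.random() for _ in range(3)]
set_seed(SEED)
b = [random.random() for _ in range(3)]
assert a == b  # re-seeding reproduces the exact same draws
```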
## License

Apache-2.0.
## Contact

Author: sakibalfahim — via Hugging Face profile.