---
language: en
license: mit
tags:
- federated-learning
- finance
- sentiment-analysis
- bert
- finbert
library_name: transformers
pipeline_tag: text-classification
authors:
- Harsh Prasad
- Sai Dhole
---

## FinBERT–AdaptiveFedAvg: Adaptive Federated Aggregation for Financial Sentiment Analysis

---

### 📌 Model Summary

This model is a **federated version of FinBERT** fine-tuned for **financial sentiment classification (Positive / Negative / Neutral)**. Training is performed across **three clients**:

* Financial Twitter posts
* Financial news headlines
* Financial reports & statements

Unlike standard FedAvg, this model uses an **Adaptive Aggregation strategy**, where client contributions are **weighted dynamically based on validation performance**, allowing stronger clients to influence the global model more (a minimal sketch of this weighting step is included at the end of this card).

This model is part of a research project on federated financial NLP comparing:

* FedAvg
* FedProx
* Adaptive Aggregation

---

### 🧠 Intended Use

Designed for:

* Financial sentiment research
* Risk & market analytics
* Academic exploration of federated learning

Not intended for automated trading without expert oversight.

---

### 🏗 Model Architecture

Base Model:

```
ProsusAI/finbert
```

Task:

```
Sequence classification (3 classes)
```

Training Setup:

```
3 federation clients
10 global rounds
3 local epochs
Adaptive weighted aggregation
```

---

### 📊 Client Data Sources

| Client   | Data Type         |
| -------- | ----------------- |
| Client-1 | Financial Twitter |
| Client-2 | Financial News    |
| Client-3 | Financial Reports |

No raw data is shared between clients.

---

### 🔐 Privacy Advantage

Only model updates are exchanged, never raw text data. This supports data governance and privacy-aware ML.

---

### 📈 Performance (Validation)

| Method          | Final Avg F1-Score |
| --------------- | ------------------ |
| Adaptive FedAvg | **0.823**          |

Adaptive aggregation showed **smooth convergence and stable performance** while preserving privacy.

---

### 🚀 Example Usage

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

model = AutoModelForSequenceClassification.from_pretrained(
    "harshprasad03/FinBERT-Adaptive"
)
tokenizer = AutoTokenizer.from_pretrained(
    "harshprasad03/FinBERT-Adaptive"
)

text = "Global markets improved after positive earnings reports."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

prob = torch.softmax(outputs.logits, dim=1)
print(prob)
```

---

### ⚠️ Limitations

* Trained only on finance-domain text
* Sentiment ≠ market prediction
* Model may inherit dataset biases
* Designed for research use

---

### 📚 Citation

```
Harsh Prasad, Sai Dhole (2025).
Adaptive Federated FinBERT for Financial Sentiment Analysis.
```

---

### 👨‍💻 Authors

**Harsh Prasad**
AI and ML Research

**Sai Dhole**
AI and ML Research

---
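
### 🧮 Adaptive Aggregation (Illustrative Sketch)

The exact aggregation code is not published with this card, so the snippet below is only a minimal sketch of performance-weighted averaging, assuming client updates arrive as PyTorch `state_dict`s and that each client's aggregation weight is its validation F1 score normalised to sum to 1. The function name `adaptive_aggregate` and its arguments are hypothetical.

```python
import torch


def adaptive_aggregate(client_state_dicts, client_val_f1):
    """Hypothetical helper: merge client model weights into a global model,
    weighting each client by its normalised validation F1 score."""
    # Turn raw F1 scores into aggregation weights that sum to 1.
    total = sum(client_val_f1)
    agg_weights = [f1 / total for f1 in client_val_f1]

    global_state = {}
    for key, ref in client_state_dicts[0].items():
        if not torch.is_floating_point(ref):
            # Integer buffers (e.g. position ids) are identical across clients; copy one.
            global_state[key] = ref.clone()
            continue
        # Weighted element-wise average of this parameter across all clients.
        global_state[key] = sum(
            w * sd[key] for w, sd in zip(agg_weights, client_state_dicts)
        )
    return global_state
```

In a setup like the one described above, the aggregated weights would be loaded back into the shared FinBERT model with `model.load_state_dict(...)` before the next of the 10 global rounds. The actual normalisation used in training (for example a softmax over F1 scores, or mixing F1 with client sample counts as in plain FedAvg) may differ from this simple proportional scheme.

---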