This model is a cross-encoder based on jhu-clsp/ettin-encoder-150m. It was trained on MS MARCO with the InfoNCE loss as part of a reproducibility paper on training cross-encoders, "Reproducing and Comparing Distillation Techniques for Cross-Encoders"; see the paper for more details.
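For reference, InfoNCE here denotes the standard contrastive objective; a sketch, assuming $s(q, d)$ is the cross-encoder logit for query $q$ and document $d$, $d^+$ is the relevant document, and $\mathcal{D}$ is the candidate set containing $d^+$ and its negatives (the exact negative sampling and any temperature are detailed in the paper):

$$
\mathcal{L}_{\text{InfoNCE}} = -\log \frac{\exp\big(s(q, d^+)\big)}{\sum_{d \in \mathcal{D}} \exp\big(s(q, d)\big)}
$$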
This model is intended for re-ranking the top results returned by a first-stage retrieval system (such as BM25, a bi-encoder, or SPLADE); a re-ranking sketch follows the quick start below.
Training can easily be reproduced using the associated repository. The exact training configuration used for this model is also detailed in config.yaml.
Quick Start:
```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

tokenizer = AutoTokenizer.from_pretrained("xpmir/cross-encoder-ettin-150m-infoNCE")
model = AutoModelForSequenceClassification.from_pretrained("xpmir/cross-encoder-ettin-150m-infoNCE")
model.eval()

# A cross-encoder scores the (query, document) pair jointly in a single forward pass
features = tokenizer(
    "What is experimaestro ?",
    "Experimaestro is a powerful framework for ML experiments management...",
    padding=True, truncation=True, return_tensors="pt",
)

with torch.no_grad():
    scores = model(**features).logits
print(scores)
```
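To use the model as a re-ranker, score every (query, candidate) pair and sort by descending score. A minimal sketch, assuming a hypothetical list of candidate passages (in practice, the top results from your first-stage retriever):

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

model_id = "xpmir/cross-encoder-ettin-150m-infoNCE"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

query = "What is experimaestro ?"
# Hypothetical candidates, standing in for first-stage retrieval results
candidates = [
    "Experimaestro is a powerful framework for ML experiments management...",
    "Maestro is an honorific title for a master musician.",
    "Experimental design is the process of planning a study.",
]

# Score all (query, candidate) pairs in one batch
features = tokenizer(
    [query] * len(candidates),
    candidates,
    padding=True, truncation=True, return_tensors="pt",
)
with torch.no_grad():
    # Assuming a single relevance logit per pair, as in the quick start
    scores = model(**features).logits.squeeze(-1)

# Re-rank candidates by descending relevance score
for passage, score in sorted(zip(candidates, scores.tolist()), key=lambda x: x[1], reverse=True):
    print(f"{score:.3f}  {passage}")
```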
We provide evaluations of this cross-encoder re-ranking the top-1000 documents retrieved by naver/splade-v3-distilbert. All metrics are reported ×100; a sketch of how such metrics can be computed follows the table.
| Dataset | RR@10 | nDCG@10 |
|---|---|---|
| msmarco_dev | 41.18 | 47.80 |
| trec2019 | 96.40 | 75.27 |
| trec2020 | 93.83 | 73.24 |
| fever | 81.69 | 81.36 |
| arguana | 18.21 | 27.48 |
| climate_fever | 30.59 | 22.75 |
| dbpedia | 76.20 | 46.31 |
| fiqa | 50.71 | 41.99 |
| hotpotqa | 89.99 | 73.59 |
| nfcorpus | 56.47 | 36.13 |
| nq | 54.85 | 59.89 |
| quora | 76.53 | 79.03 |
| scidocs | 29.55 | 17.22 |
| scifact | 57.77 | 63.58 |
| touche | 63.36 | 36.89 |
| trec_covid | 94.50 | 78.88 |
| robust04 | 66.74 | 47.46 |
| lotte_writing | 77.07 | 68.14 |
| lotte_recreation | 65.45 | 60.32 |
| lotte_science | 52.17 | 43.57 |
| lotte_technology | 60.63 | 51.70 |
| lotte_lifestyle | 75.67 | 66.26 |
| Mean In Domain | 77.14 | 65.44 |
| BEIR 13 | 60.03 | 51.16 |
| LoTTE (OOD) | 66.29 | 56.24 |
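The full evaluation pipeline lives in the associated repository. As an illustration of how such figures can be obtained, here is a hedged sketch using the ir_measures library on a TREC-format run produced by the re-ranker (the file names are hypothetical):

```python
import ir_measures
from ir_measures import RR, nDCG

# Hypothetical file names: relevance judgments and the re-ranked run, both in TREC format
qrels = ir_measures.read_trec_qrels("msmarco-dev.qrels")
run = ir_measures.read_trec_run("cross-encoder-reranked.run")

# Aggregate RR@10 and nDCG@10 over all queries (multiply by 100 to match the table)
print(ir_measures.calc_aggregate([RR @ 10, nDCG @ 10], qrels, run))
```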