This model was introduced in the paper Rank-DistiLLM: Closing the Effectiveness Gap Between Cross-Encoders and LLMs for Passage Re-Ranking.
For code, examples, and more, please visit https://github.com/webis-de/msmarco-llm-distillation.
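The model is a cross-encoder for passage re-ranking: it scores each (query, passage) pair and candidates are reordered by that score. Below is a minimal sketch of that re-ranking loop; `score_pair` is a hypothetical placeholder (a toy token-overlap heuristic), standing in for running the actual webis/monoelectra-base checkpoint on each pair.

```python
# Sketch of a cross-encoder re-ranking loop.
# `score_pair` is a toy stand-in for the model's relevance score over a
# (query, passage) pair; the real model would be run on both inputs jointly.
def score_pair(query: str, passage: str) -> float:
    # Placeholder heuristic: token-overlap ratio, NOT the model's score.
    q_tokens = set(query.lower().split())
    p_tokens = set(passage.lower().split())
    return len(q_tokens & p_tokens) / max(len(q_tokens), 1)

def rerank(query: str, passages: list[str]) -> list[str]:
    # Score every candidate passage against the query, then sort
    # descending so the most relevant passage comes first.
    scored = [(score_pair(query, p), p) for p in passages]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [p for _, p in scored]

passages = [
    "The Eiffel Tower is in Paris.",
    "Passage re-ranking orders candidates by relevance to a query.",
    "Bananas are rich in potassium.",
]
print(rerank("what is passage re-ranking", passages)[0])
```

Swap the placeholder scorer for the model's output to obtain the actual ranking; see the repository linked above for end-to-end usage.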
webis/monoelectra-base is fine-tuned from the base model google/electra-base-discriminator.