Span NLI BERT (large)

This is a BERT-large model (`bert-large-uncased-whole-word-masking`) fine-tuned on the ContractNLI dataset (non-disclosure agreements) using the Span NLI BERT architecture from *ContractNLI: A Dataset for Document-level Natural Language Inference for Contracts* (Koreeda and Manning, 2021).

Given a hypothesis and a document as the premise, Span NLI BERT predicts an NLI label and identifies the evidence spans in the document that support that label. The document's spans must be pre-annotated; evidence is always a full sentence or an item in an enumerated list.
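To make the span-based setup concrete, here is a minimal sketch of how the inputs can be framed, following the paper's description rather than this checkpoint's exact preprocessing: the hypothesis and document are packed into one sequence, a special `[SPAN]` marker is inserted before each pre-annotated span, and evidence is predicted at those marker positions (NLI classification happens on `[CLS]`). The whitespace tokenization below is a simplification of BERT's WordPiece tokenizer.

```python
def build_input(hypothesis: str, spans: list[str]) -> tuple[list[str], list[int]]:
    """Pack hypothesis and document spans into one sequence.

    Returns the token sequence and the indices of the [SPAN] markers,
    where evidence predictions would be made. Whitespace splitting is a
    stand-in for WordPiece tokenization.
    """
    tokens = ["[CLS]"] + hypothesis.split() + ["[SEP]"]
    span_positions = []
    for span in spans:
        span_positions.append(len(tokens))  # evidence is predicted at this position
        tokens.append("[SPAN]")
        tokens.extend(span.split())
    tokens.append("[SEP]")
    return tokens, span_positions

tokens, positions = build_input(
    "The receiving party shall not disclose confidential information.",
    ["1. Definitions.", "2. The Recipient shall keep the information secret."],
)
```

Each entry in `positions` points at a `[SPAN]` token, so a multi-label evidence head only needs to score those indices instead of every token in the document.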

For details of the architecture and how to use the training/evaluation scripts, see the paper and the authors' GitHub repo. This model was fine-tuned with the hyperparameters recommended in the Appendix of the paper, some of which differ from those in `data/conf_large.yml` in the repo.

arXiv: https://arxiv.org/abs/2110.01799
