Usage

Install the package:

$ pip install coref-onnx

Then resolve coreferences in Python:

from coref_onnx import CoreferenceResolver, decode_clusters

resolver = CoreferenceResolver.from_pretrained("talmago/allennlp-coref-onnx-mMiniLMv2-L12-H384-distilled-from-XLMR-Large")

sentences = [
    ["Barack", "Obama", "was", "the", "44th", "President", "of", "the", "United", "States", "."],
    ["He", "was", "born", "in", "Hawaii", "."]
]

pred = resolver(sentences)

print("Clusters:", pred["clusters"][0])
print("Decoded clusters:", decode_clusters(sentences, pred["clusters"][0]))

Output:

Clusters: [[(0, 1), (11, 11)]]
Decoded clusters: [['Barack Obama', 'He']]
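
The spans in pred["clusters"] are inclusive (start, end) token indices into the flattened document, which decode_clusters resolves back to surface strings. A minimal sketch of that decoding, assuming the packaged helper behaves this way (the shipped implementation may differ):

from itertools import chain

def decode_clusters_sketch(sentences, clusters):
    # Flatten the sentences into one token list; spans index into it.
    tokens = list(chain.from_iterable(sentences))
    # Spans are inclusive on both ends, hence the end + 1.
    return [[" ".join(tokens[start:end + 1]) for start, end in cluster]
            for cluster in clusters]

# decode_clusters_sketch(sentences, [[(0, 1), (11, 11)]])
# -> [['Barack Obama', 'He']]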

ONNX

To reproduce the ONNX export, first download the MiniLM model archive:

$ mkdir -p models/minillm
$ wget -P models/minillm https://storage.googleapis.com/pandora-intelligence/models/crosslingual-coreference/minilm/model.tar.gz

Run a Docker container with the model directory mounted:

$ docker run -it --platform linux/amd64 --entrypoint /bin/bash -v $(pwd)/models/minillm:/models/minillm allennlp/allennlp:latest

Install allennlp_models:

$ pip install allennlp_models

From another terminal tab, copy the patched source files and the export script into the container:

$ docker cp allennlp_models/coref/models/coref.py <CONTAINER_ID>:/opt/conda/lib/python3.8/site-packages/allennlp_models/coref/models/coref.py
$ docker cp allennlp/nn/util.py <CONTAINER_ID>:/stage/allennlp/allennlp/nn/util.py
$ docker cp export_onnx.py <CONTAINER_ID>:/app/export_onnx.py

In the container, clone the base transformer model:

$ mkdir nreimers
$ git clone https://huggingface.co/nreimers/mMiniLMv2-L12-H384-distilled-from-XLMR-Large nreimers

And then run the export script:

$ python export_onnx.py
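
For reference, the heart of such a script usually wraps the archived AllenNLP model so that torch.onnx.export sees plain tensors. The sketch below is only an assumption about what export_onnx.py does; the wrapper, the tensor key names, and the dummy shapes are illustrative, not the script shipped with this repo:

import torch
from allennlp.models.archival import load_archive

archive = load_archive("/models/minillm/model.tar.gz")
model = archive.model.eval()

class CorefWrapper(torch.nn.Module):
    # Rebuilds the TextFieldTensors dict the coref model expects from
    # flat tensors (key names are an assumption based on the
    # pretrained_transformer_mismatched indexer).
    def __init__(self, model):
        super().__init__()
        self.model = model

    def forward(self, token_ids, wordpiece_mask, offsets, mask, spans):
        text = {"tokens": {"token_ids": token_ids,
                           "wordpiece_mask": wordpiece_mask,
                           "offsets": offsets,
                           "mask": mask}}
        out = self.model(text=text, spans=spans)
        return out["top_spans"], out["antecedent_indices"], out["predicted_antecedents"]

# Hypothetical dummy inputs: 16 wordpieces mapping onto 12 tokens.
num_wp, num_tok = 16, 12
dummy = (
    torch.randint(1, 1000, (1, num_wp)),                        # wordpiece ids
    torch.ones(1, num_wp, dtype=torch.bool),                    # mask over wordpieces
    torch.stack([torch.arange(num_tok)] * 2, -1).unsqueeze(0),  # token -> wordpiece offsets
    torch.ones(1, num_tok, dtype=torch.bool),                   # mask over tokens
    torch.tensor([[(i, i) for i in range(num_tok)]]),           # candidate spans
)
torch.onnx.export(CorefWrapper(model), dummy, "/models/minillm/model.onnx", opset_version=13)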

Model Optimization

Run onnxsim (pip install onnxsim) to simplify the exported graph:

$ python -m onnxsim models/minillm/model.onnx optimized_model.onnx
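
As a quick sanity check on the simplified graph, you can run ONNX's structural validator (this assumes the onnx Python package is installed):

import onnx

model = onnx.load("optimized_model.onnx")
onnx.checker.check_model(model)  # raises if the graph is malformed
print("opset:", model.opset_import[0].version)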