---
license: apache-2.0
language:
- es
base_model:
- PlanTL-GOB-ES/roberta-base-biomedical-clinical-es
tags:
- medical
- spanish
- bi-encoder
- entity-linking
- sapbert
- umls
- snomed-ct
---

# **ClinLinker-KB-GP**

## Model Description
ClinLinker-KB-GP is a state-of-the-art bi-encoder model for medical entity linking (MEL) in Spanish, optimized for clinical domain tasks. It enriches concept representations by incorporating not only synonyms but also hierarchical relationships (parents and grandparents) from the UMLS and SNOMED-CT ontologies. The model was trained with a contrastive-learning strategy using hard negative mining and multi-similarity loss.
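
For intuition, the sketch below implements a plain multi-similarity loss over in-batch pairs in PyTorch. It is a minimal illustration of the objective, not the authors' training code: the hyperparameter values are common defaults chosen for the example, and the hard-negative mining stage is omitted.

```python
import torch

def multi_similarity_loss(embeddings, labels, alpha=2.0, beta=50.0, base=0.5):
    """Multi-similarity loss over a batch of L2-normalized embeddings.

    embeddings: (B, d) tensor, assumed L2-normalized.
    labels: (B,) tensor of concept/group IDs; terms sharing an ID (e.g. a
        concept's synonyms) are treated as positives.
    alpha, beta, base: illustrative hyperparameters, not the values used
        to train ClinLinker-KB-GP.
    """
    sim = embeddings @ embeddings.T                    # pairwise cosine similarities
    same = labels.unsqueeze(0) == labels.unsqueeze(1)  # same-concept mask
    eye = torch.eye(len(labels), dtype=torch.bool, device=embeddings.device)
    pos_mask = same & ~eye
    neg_mask = ~same

    losses = []
    for i in range(len(labels)):
        pos = sim[i][pos_mask[i]]
        neg = sim[i][neg_mask[i]]
        if len(pos) == 0 or len(neg) == 0:
            continue
        # Pull positives above the margin, push negatives below it
        pos_term = torch.log1p(torch.exp(-alpha * (pos - base)).sum()) / alpha
        neg_term = torch.log1p(torch.exp(beta * (neg - base)).sum()) / beta
        losses.append(pos_term + neg_term)
    return torch.stack(losses).mean() if losses else embeddings.new_zeros(())
```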

## 💡 Intended Use
- **Domain**: Spanish Clinical NLP
- **Tasks**: Entity linking (diseases, symptoms, procedures) to SNOMED-CT
- **Evaluated On**: DisTEMIST, MedProcNER, SympTEMIST
- **Users**: Researchers and practitioners working in clinical NLP

## 📈 Performance Summary (Top-25 Accuracy)

| Model               | DisTEMIST | MedProcNER | SympTEMIST |
|--------------------|-----------|------------|------------|
| ClinLinker         | 0.845     | 0.898      | 0.909      |
| ClinLinker-KB-P    | 0.853     | 0.891      | 0.918      |
| **ClinLinker-KB-GP** | **0.864** | **0.901**  | **0.922**  |
| SapBERT-XLM-R-large| 0.800     | 0.850      | 0.827      |
| RoBERTa biomedical | 0.600     | 0.668      | 0.609      |

*Results are reported on the cleaned gold standard, with mentions annotated as "NO CODE" or "COMPOSITE" excluded.*

## 🧪 Usage

```python
from transformers import AutoModel, AutoTokenizer
import torch

# Load the bi-encoder and its tokenizer from the Hugging Face Hub
model = AutoModel.from_pretrained("ICB-UMA/ClinLinker-KB-GP")
tokenizer = AutoTokenizer.from_pretrained("ICB-UMA/ClinLinker-KB-GP")

# Encode a clinical mention and take the [CLS] token as its embedding
mention = "insuficiencia renal aguda"
inputs = tokenizer(mention, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)
embedding = outputs.last_hidden_state[:, 0, :]
print(embedding.shape)  # (1, hidden_size)
```

For scalable retrieval, use [Faiss](https://github.com/facebookresearch/faiss) or the [`FaissEncoder`](https://github.com/ICB-UMA/KnowledgeGraph) class.
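
As a minimal, self-contained sketch of that retrieval step with plain Faiss (not the `FaissEncoder` class): the candidate terms and SNOMED-CT codes below are illustrative placeholders, and in practice the index would hold the full term inventory.

```python
import faiss
import torch
from transformers import AutoModel, AutoTokenizer

model = AutoModel.from_pretrained("ICB-UMA/ClinLinker-KB-GP")
tokenizer = AutoTokenizer.from_pretrained("ICB-UMA/ClinLinker-KB-GP")

def encode(texts):
    # [CLS] embeddings, L2-normalized so inner product equals cosine similarity
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        cls = model(**batch).last_hidden_state[:, 0, :]
    return torch.nn.functional.normalize(cls, dim=-1).numpy()

# Toy candidate dictionary (code -> term), for illustration only
candidates = {
    "14669001": "insuficiencia renal aguda",
    "90708001": "enfermedad renal",
    "38341003": "hipertensión arterial",
}
codes = list(candidates.keys())
cand_emb = encode(list(candidates.values()))

index = faiss.IndexFlatIP(cand_emb.shape[1])  # exact inner-product index
index.add(cand_emb)

# Retrieve the top-k candidate concepts for a new mention
scores, idx = index.search(encode(["fallo renal agudo"]), k=3)
for rank, (score, j) in enumerate(zip(scores[0], idx[0]), start=1):
    print(rank, codes[j], candidates[codes[j]], round(float(score), 3))
```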

## ⚠️ Limitations
- The model is optimized for Spanish clinical data and may underperform outside this domain.
- Expert validation is advised in critical applications.

## 📚 Citation

> Gallego, Fernando; López-García, Guillermo; Gasco, Luis; Krallinger, Martin; Veredas, Francisco J. *ClinLinker-KB: Clinical Entity Linking in Spanish with Knowledge-Graph Enhanced Biencoders*. Available at SSRN: http://dx.doi.org/10.2139/ssrn.4939986

## Authors

Fernando Gallego, Guillermo López-García, Luis Gasco-Sánchez, Martin Krallinger, Francisco J. Veredas