Commit 54e927b
Parent(s): 4c53a51
Update README.md

README.md CHANGED
@@ -4,7 +4,7 @@ widget:
 - text: "Ḣ Q V Q [MASK] E"
 ---

-## AntiBERTa2 🧬
+## AntiBERTa2-CSSP 🧬

 AntiBERTa2 is an antibody-specific language model based on the [RoFormer model](https://arxiv.org/abs/2104.09864) - it is pre-trained using masked language modelling.
 We also provide a multimodal version of AntiBERTa2, AntiBERTa2-CSSP, that has been trained using a contrastive objective, similar to the [CLIP method](https://arxiv.org/abs/2103.00020).
@@ -26,8 +26,8 @@ non-commercial use. For any users seeking commercial use of our model and genera
     RoFormerTokenizer,
     RoFormerForSequenceClassification
 )
->>> tokenizer = RoFormerTokenizer.from_pretrained("alchemab/antiberta2")
->>> model = RoFormerModel.from_pretrained("alchemab/antiberta2")
+>>> tokenizer = RoFormerTokenizer.from_pretrained("alchemab/antiberta2-cssp")
+>>> model = RoFormerModel.from_pretrained("alchemab/antiberta2-cssp")
 >>> model(**tokenizer("Ḣ Q V Q ... T V S S", return_tensors='pt')).last_hidden_state... # etc

 >>> new_model = RoFormerForSequenceClassification.from_pretrained(
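The commit only swaps the repository id, so downstream usage is otherwise unchanged. As a minimal sketch (not part of this commit) of the masked-language-modelling use the widget text `"Ḣ Q V Q [MASK] E"` implies, the renamed checkpoint could be queried through the `fill-mask` pipeline, assuming `alchemab/antiberta2-cssp` ships a masked-LM head:

```python
# Sketch only, not part of this commit: predict a masked residue with the
# renamed checkpoint. Assumes the checkpoint includes a masked-LM head.
from transformers import RoFormerForMaskedLM, RoFormerTokenizer, pipeline

tokenizer = RoFormerTokenizer.from_pretrained("alchemab/antiberta2-cssp")
model = RoFormerForMaskedLM.from_pretrained("alchemab/antiberta2-cssp")

fill = pipeline("fill-mask", model=model, tokenizer=tokenizer)

# The widget text from the README front matter, with one residue masked
print(fill("Ḣ Q V Q [MASK] E"))
```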