line = "SC has first two presumptive cases of coronavirus , DHEC confirms HTTPURL via @USER :cry:"
input_ids = torch.tensor([tokenizer.encode(line)])
with torch.no_grad():
features = bertweet(input_ids) # Models outputs are now tuples
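Continuing from the snippet above, a quick way to inspect the result (assuming the usual transformers output convention, where the first element is the last hidden state, and the 768-dimensional hidden size of the base model):

last_hidden_state = features[0]   # token-level embeddings for the tweet
print(last_hidden_state.shape)    # expected: torch.Size([1, sequence_length, 768])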
With TensorFlow 2.0+:
from transformers import TFAutoModel
bertweet = TFAutoModel.from_pretrained("vinai/bertweet-base")
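A minimal TensorFlow 2.x sketch mirroring the PyTorch example above; the tokenizer call with return_tensors="tf" is standard transformers usage, not something specific to BERTweet:

from transformers import TFAutoModel, AutoTokenizer

bertweet = TFAutoModel.from_pretrained("vinai/bertweet-base")
tokenizer = AutoTokenizer.from_pretrained("vinai/bertweet-base")

line = "SC has first two presumptive cases of coronavirus , DHEC confirms HTTPURL via @USER :cry:"
inputs = tokenizer(line, return_tensors="tf")
features = bertweet(**inputs)  # features[0] is the last hidden state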
This implementation is the same as BERT, except for the tokenization method.
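Since the tokenizer is where BERTweet differs, here is a small sketch of tokenizing a raw tweet. The normalization=True flag is assumed from the BertweetTokenizer API (it maps user mentions to @USER and URLs to HTTPURL before BPE, and may require the emoji package to be installed); the example tweet and handle are made up for illustration:

from transformers import AutoTokenizer

# normalization=True asks the tokenizer to normalize the raw tweet itself
tokenizer = AutoTokenizer.from_pretrained("vinai/bertweet-base", normalization=True)

raw_tweet = "SC has first two presumptive cases of coronavirus, DHEC confirms https://example.com via @SCDHEC :cry:"
print(tokenizer.tokenize(raw_tweet))
# the mention and the URL should show up as the special @USER and HTTPURL tokens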