NLP tasks
First, let's set up the environment:

```bash
pip install -q transformers accelerate
```
Next, let's load the model with the appropriate pipeline ("text-generation"):

```python
from transformers import pipeline, AutoTokenizer
import torch

torch.manual_seed(0)

model = "tiiuae/falcon-7b-instruct"

tokenizer = AutoTokenizer.from_pretrained(model)
pipe = pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)
```

Note that Falcon models were trained in the bfloat16 datatype, so we recommend loading them with the same dtype, as in the `torch_dtype=torch.bfloat16` argument above.
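To see why the dtype choice matters, here is a quick sketch (not part of the original guide) comparing the numeric ranges of `bfloat16` and `float16` with `torch.finfo`. `bfloat16` keeps float32's 8 exponent bits, so it covers roughly the same range as float32, whereas `float16` overflows past 65504:

```python
import torch

# Inspect the representable range of each half-precision format.
bf16 = torch.finfo(torch.bfloat16)
fp16 = torch.finfo(torch.float16)

print(f"bfloat16 max: {bf16.max:.3e}")  # same order of magnitude as float32
print(f"float16 max:  {fp16.max:.3e}")  # 6.550e+04

# A value that is fine in bfloat16 overflows to inf in float16,
# which is why casting bfloat16-trained weights to float16 can
# silently produce inf activations.
x = torch.tensor(70000.0)
print(x.to(torch.float16))   # overflows to inf
print(x.to(torch.bfloat16))  # still finite
```

This is why simply swapping in `torch.float16` to save memory can degrade or break generation for models trained in `bfloat16`.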