The abstract from the paper is the following:
GPT-2 is a large transformer-based language model with 1.5 billion parameters, trained on a dataset[1] of 8 million web pages.
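
A minimal usage sketch, assuming the Hugging Face `transformers` library and the publicly released `gpt2` checkpoint (the small 124M-parameter variant; the 1.5 billion-parameter model described in the abstract is distributed as `gpt2-xl`):

```python
# Hedged example: load a released GPT-2 checkpoint and generate a short continuation.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")   # swap in "gpt2-xl" for the 1.5B model
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Encode a prompt and greedily generate 20 additional tokens.
inputs = tokenizer("GPT-2 is a large transformer-based language model", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```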