GPT-2 was trained with a causal language modeling (CLM) objective and is therefore powerful at predicting the next token in a sequence.
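
As a minimal sketch of what next-token prediction looks like in practice (assuming the `transformers` library and the public `gpt2` checkpoint are available), the snippet below feeds a prompt to the model and reads off the most likely next token from the final position's logits:

```python
import torch
from transformers import GPT2Tokenizer, GPT2LMHeadModel

# Load the tokenizer and the pretrained GPT-2 language model head.
tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

# Encode an example prompt (the prompt text is illustrative).
inputs = tokenizer("GPT-2 predicts the next", return_tensors="pt")

# Run a forward pass; the logits at the last position score every
# vocabulary token as a candidate continuation.
with torch.no_grad():
    logits = model(**inputs).logits

next_token_id = logits[0, -1].argmax()
print(tokenizer.decode(next_token_id))
```

Repeating this step, appending each predicted token to the input, is the basis of autoregressive text generation.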