## LlamaConfig

[[autodoc]] LlamaConfig
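
A minimal sketch of initializing a model from a configuration; all hyperparameters below take their default values:

```python
from transformers import LlamaConfig, LlamaModel

# Initialize a LLaMA-style configuration with default hyperparameters
configuration = LlamaConfig()

# Initialize a model (with random weights) from that configuration
model = LlamaModel(configuration)

# Access the model configuration
configuration = model.config
```
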
## LlamaTokenizer

[[autodoc]] LlamaTokenizer
    - build_inputs_with_special_tokens
    - get_special_tokens_mask
    - create_token_type_ids_from_sequences
    - save_vocabulary
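
A short usage sketch; `"path/to/llama-checkpoint"` is a placeholder for a converted LLaMA checkpoint you have locally or on the Hub:

```python
from transformers import LlamaTokenizer

# "path/to/llama-checkpoint" is a placeholder checkpoint path
tokenizer = LlamaTokenizer.from_pretrained("path/to/llama-checkpoint")

ids = tokenizer("Hello, world!").input_ids  # a BOS token is prepended by default
text = tokenizer.decode(ids)
```
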
## LlamaTokenizerFast

[[autodoc]] LlamaTokenizerFast
    - build_inputs_with_special_tokens
    - get_special_tokens_mask
    - create_token_type_ids_from_sequences
    - update_post_processor
    - save_vocabulary
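
The fast tokenizer is what [`AutoTokenizer`] resolves to by default; a brief sketch, again with a placeholder checkpoint path:

```python
from transformers import AutoTokenizer

# With use_fast=True (the default), AutoTokenizer returns a LlamaTokenizerFast
tokenizer = AutoTokenizer.from_pretrained("path/to/llama-checkpoint")

enc = tokenizer("Hello, world!", return_tensors="pt")
print(type(tokenizer).__name__)  # LlamaTokenizerFast
```
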
## LlamaModel

[[autodoc]] LlamaModel
    - forward
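
A sketch of extracting the final hidden states with the bare model (placeholder checkpoint path):

```python
import torch
from transformers import AutoTokenizer, LlamaModel

tokenizer = AutoTokenizer.from_pretrained("path/to/llama-checkpoint")
model = LlamaModel.from_pretrained("path/to/llama-checkpoint")

inputs = tokenizer("Hello, world!", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

last_hidden_state = outputs.last_hidden_state  # (batch, seq_len, hidden_size)
```
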
## LlamaForCausalLM

[[autodoc]] LlamaForCausalLM
    - forward
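
A generation sketch with the causal language modeling head; the checkpoint path is a placeholder and the generation settings are illustrative:

```python
from transformers import AutoTokenizer, LlamaForCausalLM

tokenizer = AutoTokenizer.from_pretrained("path/to/llama-checkpoint")
model = LlamaForCausalLM.from_pretrained("path/to/llama-checkpoint")

inputs = tokenizer("The capital of France is", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```
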
## LlamaForSequenceClassification

[[autodoc]] LlamaForSequenceClassification
    - forward
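
Like other causal models, the sequence-classification head classifies from the last non-padding token. A sketch with a placeholder checkpoint; the classification head is newly initialized unless the checkpoint already provides one, and `num_labels=2` is an illustrative choice:

```python
import torch
from transformers import AutoTokenizer, LlamaForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("path/to/llama-checkpoint")
# num_labels is illustrative; the score head is randomly initialized if absent from the checkpoint
model = LlamaForSequenceClassification.from_pretrained("path/to/llama-checkpoint", num_labels=2)

inputs = tokenizer("I really enjoyed this movie.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
predicted_class = logits.argmax(dim=-1).item()
```
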
## LlamaForQuestionAnswering

[[autodoc]] LlamaForQuestionAnswering
    - forward
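
An extractive question-answering sketch: the head predicts start and end logits over the input span. Placeholder checkpoint; the QA head is newly initialized unless the checkpoint provides one:

```python
import torch
from transformers import AutoTokenizer, LlamaForQuestionAnswering

tokenizer = AutoTokenizer.from_pretrained("path/to/llama-checkpoint")
model = LlamaForQuestionAnswering.from_pretrained("path/to/llama-checkpoint")

question, context = "Where do I live?", "My name is Wolfgang and I live in Berlin."
inputs = tokenizer(question, context, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

start = outputs.start_logits.argmax()
end = outputs.end_logits.argmax()
answer_ids = inputs.input_ids[0, start : end + 1]
print(tokenizer.decode(answer_ids))
```
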
## FlaxLlamaModel

[[autodoc]] FlaxLlamaModel
    - __call__
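
A Flax sketch of the bare model. The checkpoint path is a placeholder, and `from_pt=True` (which converts PyTorch weights on the fly) is an assumption about a checkpoint that ships no Flax weights:

```python
from transformers import AutoTokenizer, FlaxLlamaModel

tokenizer = AutoTokenizer.from_pretrained("path/to/llama-checkpoint")
# from_pt=True converts PyTorch weights if no Flax weights are available
model = FlaxLlamaModel.from_pretrained("path/to/llama-checkpoint", from_pt=True)

inputs = tokenizer("Hello, world!", return_tensors="np")
outputs = model(**inputs)
last_hidden_state = outputs.last_hidden_state  # (batch, seq_len, hidden_size)
```
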
## FlaxLlamaForCausalLM

[[autodoc]] FlaxLlamaForCausalLM
    - __call__
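
A matching Flax generation sketch, under the same placeholder-checkpoint and `from_pt` assumptions:

```python
from transformers import AutoTokenizer, FlaxLlamaForCausalLM

tokenizer = AutoTokenizer.from_pretrained("path/to/llama-checkpoint")
model = FlaxLlamaForCausalLM.from_pretrained("path/to/llama-checkpoint", from_pt=True)

inputs = tokenizer("The capital of France is", return_tensors="np")
output_ids = model.generate(**inputs, max_new_tokens=20).sequences
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```
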