shtras committed
Commit bc42d6b · verified · 1 Parent(s): 56ac3e1

Fix the value of model_max_length


Make the model_max_length value match that of the original model, meta-llama/Llama-3.2-90B-Vision-Instruct.

Files changed (1): tokenizer_config.json (+1 -1)
tokenizer_config.json CHANGED
@@ -2065,7 +2065,7 @@
     "input_ids",
     "attention_mask"
   ],
-  "model_max_length": 512,
+  "model_max_length": 131072,
   "pad_token": "<|eot_id|>",
   "padding_side": "left",
   "tokenizer_class": "PreTrainedTokenizerFast"