wrong max_position_embeddings

#8 opened by Djip007

Hello, thanks for this nice repo.

https://github.com/ollama/ollama/issues/5862#issuecomment-2719488696

It looks like there was an error in the first config.json upload. The current config.json:
https://huggingface.co/mistralai/Mistral-Nemo-Instruct-2407/blob/main/config.json

now reports:

```json
"max_position_embeddings": 131072,
```

But it looks like the GGUF still has the old, wrong value: `llama.context_length = 1024000`.
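
You can confirm this on a local copy. Here is a minimal check, assuming the `gguf` Python package that ships with llama.cpp (`pip install gguf`); the file name below is hypothetical:

```python
# Minimal check of the GGUF metadata, assuming the `gguf` package
# from llama.cpp. The local file name is hypothetical.
from gguf import GGUFReader

reader = GGUFReader("Mistral-Nemo-Instruct-2407.gguf")
field = reader.get_field("llama.context_length")
# For a scalar field, parts[data[0]] is a one-element array holding the value.
print(field.parts[field.data[0]][0])  # prints 1024000 on the affected files
```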

Is it possible to correct it?
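
In the meantime, a local copy can be patched without re-quantizing. Here is a minimal sketch under the same assumptions (the `gguf` package, hypothetical file name). The field is edited in place through the memory-mapped file, so back up the GGUF first:

```python
# Minimal in-place metadata patch, assuming the `gguf` package from
# llama.cpp. The file name is hypothetical; back up the file first,
# since the write goes straight through the memory map.
from gguf import GGUFReader

reader = GGUFReader("Mistral-Nemo-Instruct-2407.gguf", "r+")  # read-write
field = reader.get_field("llama.context_length")
if field is None:
    raise SystemExit("llama.context_length not found in GGUF metadata")

print("old:", field.parts[field.data[0]][0])  # expected: 1024000
field.parts[field.data[0]][0] = 131072        # value from the fixed config.json
print("new:", field.parts[field.data[0]][0])
```

If I recall the tooling correctly, llama.cpp's `gguf-py/scripts` directory also ships a `gguf_set_metadata.py` script that does essentially this from the command line.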
