
How is the model being loaded? I am unable to run this inference in offline mode

#6
by sanjoy2 - opened

Is this model not allowed to be used offline?
It seems the model is loaded using a custom loader.

I am unable to run this.

Please help.

The local model path is being treated as a repo id. I have tried updating the local_path, setting trust_remote_code, and every other option I could think of.

I am having the same problem.

I'm using model.safetensors and vocab.txt in ComfyUI, but I keep getting an error. What could be the reason?
(screenshot attached: Shot 0260.png)

I am also getting the same error when using model.safetensors and vocab.txt. Any solution?

I don't know if you are still stuck on this, but I was having the same issue. The fix is to not load the weights into the IndicF5 object itself; instead, load them into a sub-module of that class named ema_model. So if you are doing

model = load_model(...).load_state_dict(...)

instead of that, you have to do something like this

model = load_model(...)
model.ema_model.load_state_dict(...)
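
Here is a slightly fuller sketch of that workaround, assuming the weights sit in a local model.safetensors file; the path and the way the model object is constructed are placeholders, not the exact code from the repo:

# Minimal sketch of the workaround above; the path and load_model(...) are placeholders.
from safetensors.torch import load_file

model = load_model(...)  # however you construct the IndicF5 wrapper object

# Read the checkpoint from the local safetensors file ...
state_dict = load_file("/path/to/model.safetensors")

# ... and apply it to the inner EMA model, not to the wrapper itself.
model.ema_model.load_state_dict(state_dict)
model.eval()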

This is definitely a bit annoying, but if it works, it works. However, in my experience it's better to just use the AutoModel.from_pretrained method, and if you want the inner model you can get it like this:

main_indic_f5_model = AutoModel.from_pretrained(...)
model = main_indic_f5_model.ema_model
vocoder = main_indic_f5_model.vocoder
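
For the offline case asked about above, a rough sketch of that route, assuming the repo has already been downloaded to a local directory (the path is a placeholder):

# Rough sketch of the AutoModel route for fully offline use; the local path is a placeholder.
from transformers import AutoModel

local_path = "/path/to/local/IndicF5"  # directory holding the config, model.safetensors and custom code

main_indic_f5_model = AutoModel.from_pretrained(
    local_path,
    trust_remote_code=True,   # the repo ships a custom model class
    local_files_only=True,    # stay offline instead of contacting the Hub
)

model = main_indic_f5_model.ema_model
vocoder = main_indic_f5_model.vocoder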
