500 Internal Server Error when running inference on the model

#15
by Diaa-Essam - opened

This error keeps appearing when running inference through the Hugging Face Inference API:
500 Server Error: Internal Server Error for url: https://api-inference.huggingface.co/models/lmsys/fastchat-t5-3b-v1.0
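For context, the call is made roughly like this. This is a minimal sketch, assuming the standard `requests`-based Inference API usage; `HF_TOKEN` is a placeholder, and the retry loop and error-body printing are additions to help surface what the server actually returns (a 503 with "model is loading" vs. a genuine 500):

```python
import requests

API_URL = "https://api-inference.huggingface.co/models/lmsys/fastchat-t5-3b-v1.0"


def query(payload, token, retries=3):
    """Call the hosted Inference API and print the response body on failure,
    so the underlying cause (model loading, out of memory, etc.) is visible
    instead of just the bare '500 Server Error' message."""
    headers = {"Authorization": f"Bearer {token}"}  # token is a placeholder
    resp = None
    for attempt in range(retries):
        resp = requests.post(API_URL, headers=headers, json=payload, timeout=60)
        if resp.status_code == 200:
            return resp.json()
        # 503 usually means the model is still loading; 500 is a server-side failure.
        print(f"attempt {attempt + 1}: HTTP {resp.status_code}: {resp.text}")
    resp.raise_for_status()


if __name__ == "__main__":
    print(query({"inputs": "Hello, how are you?"}, token="HF_TOKEN"))
```

Printing `resp.text` on the failing attempts would show whether the 500 carries a more specific error message from the server.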
