Some HuggingFace models return 'Bad Request, status code 400' #1386
Unanswered
Captain-Ironbump asked this question in Q&A
Replies: 1 comment · 4 replies
-
Any chance you can add a sample I can run and see the issue in action?
-
Hello Everyone,
When I make a REST call to the LLM using the HuggingFace dependency of quarkus-langchain4j, some models only accept the 'inputs' field in the request body. Please correct me here if I'm wrong, but @RegisterAiService appends additional parameters ('parameters', 'options') to the body of the REST call, which makes some calls fail because those endpoints expect only the 'inputs' field.
Is there an option for disabling those parameters, or do I need to build a RestClient myself?
For example, the Inference Endpoint of the model 'google/flan-t5-large' expects a payload like this:
```bash
curl https://router.huggingface.co/hf-inference/models/google/flan-t5-large \
    -X POST \
    -H 'Authorization: Bearer hf_xxxxxxxxxxxxxxxxxxxxxxxx' \
    -H 'Content-Type: application/json' \
    -d '{
        "inputs": "\"The answer to the universe is\""
    }'
```
and this is the request created by the extension:
```
2025-03-26 09:15:41,396 INFO [io.qua.lan.hug.QuarkusHuggingFaceClientFactory$HuggingFaceClientLogger] (vert.x-eventloop-thread-1) Request:
{
  "inputs" : "This is a placeholder text",
  "parameters" : {
    "temperature" : 1.0,
    "return_full_text" : false
  },
  "options" : {
    "wait_for_model" : true
  }
}
```
Thanks in advance :)