fix!: Improve response handling from Toolbox server #69
Merged
Force-pushed from `c02780c` to `2858997`
Earlier, we made all fields optional because we wanted to keep some fields optional for the LLM. Since Toolbox did not support optional fields, there was no way to know which fields were optional, so as a temporary workaround the Toolbox SDK marked every field as optional in the generated schema. There is now some evidence that LLMs do not work well with optional parameters, so we have decided not to support optional fields for now, in either the Toolbox service or the SDK. This PR removes that temporary workaround of making all fields optional. It also removes the request-body augmentation that converted `None` values to empty strings (`''`): since the LLM now knows that no fields are optional, we can be sure that no `None` values will be passed as inputs to the tools, so the `_convert_none_to_empty_string` function is no longer required.
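A minimal sketch of the idea, using hypothetical parameter declarations and a hypothetical `build_schema` helper (the SDK's actual internals are not shown here): every parameter now lands in the schema's `required` list, so the LLM must supply each one and no `None` values can reach the tool.

```python
# Hypothetical Toolbox parameter declarations; names are illustrative only.
params = [
    {"name": "city", "type": "string", "description": "City to look up"},
    {"name": "units", "type": "string", "description": "Temperature units"},
]

def build_schema(params):
    """Build a JSON-Schema-style tool input schema.

    Every parameter is listed under "required": with the earlier
    all-fields-optional workaround removed, the LLM must provide each
    field, so no None values can reach the tool and no
    None -> "" conversion step is needed.
    """
    return {
        "type": "object",
        "properties": {
            p["name"]: {"type": p["type"], "description": p["description"]}
            for p in params
        },
        "required": [p["name"] for p in params],
    }
```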
Force-pushed from `13e0835` to `e33aecf`
* We remove `response.raise_for_status()` as it masks error reasons thrown by the Toolbox server.
* We return the value of the `result` key from the response body, so that it can be fed directly to LLMs.
  * This also prevents situations where the response is `{ "result": '{ "some": "value" }' }` and stringifying it for the LLM produces something like `"{ "result": '{ \"some\": \"value\" }' }"`, which is more cryptic for the LLM because of the extra `\` characters introduced by double stringification.
* We also check for `error` in the response and throw a `ToolException` with the response if applicable.
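The handling described above can be sketched as follows. This is an illustrative helper (`parse_toolbox_response` is a hypothetical name, not the SDK's actual function), with a stand-in `ToolException` so the sketch runs standalone when `langchain_core` is not installed:

```python
import json

try:
    from langchain_core.tools import ToolException
except ImportError:  # stand-in so the sketch is self-contained
    class ToolException(Exception):
        pass

def parse_toolbox_response(raw: str):
    """Parse a Toolbox server response body (hypothetical helper).

    The caller deliberately does not call response.raise_for_status()
    first, so the server's own error message in the body is preserved
    rather than masked by a bare HTTP status.
    """
    body = json.loads(raw)
    if "error" in body:
        # Surface the server-reported error to the caller.
        raise ToolException(body)
    # Return only the "result" value so it can be fed directly to the
    # LLM without double stringification of the envelope.
    return body["result"]
```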
Force-pushed from `2858997` to `293a055`
Force-pushed from `e33aecf` to `6085287`
Force-pushed from `6085287` to `c3fc9d7`
Force-pushed from `c3fc9d7` to `3ec2c8e`
twishabansal approved these changes on Mar 21, 2025.
kurtisvg approved these changes on Mar 21, 2025.