Using code in LLMGraphTransformer #29725
Replies: 3 comments
-
The error you're encountering indicates that the 'response_format' parameter set to 'json_schema' is not supported by the model you are using. To resolve this, you can either switch to a model that supports structured outputs (such as gpt-4o), or use a supported response format such as 'json_object'. By ensuring compatibility with the model and using a supported response format, you should be able to pass a script of code as text to the LLMGraphTransformer.
-
Can you give me an example of the second option, using 'json_object'?
-
I managed to solve the problem by changing the model to gpt-4o.
-
I am trying to pass a script of code as text to the LLMGraphTransformer, as in the tutorial, but I am getting the following error:
BadRequestError: Error code: 400 - {'error': {'message': "Invalid parameter: 'response_format' of type 'json_schema' is not supported with this model. Learn more about supported models at the Structured Outputs guide: https://platform.openai.com/docs/guides/structured-outputs", 'type': 'invalid_request_error', 'param': None, 'code': None}}
Is there any way of passing a script of code as text in order to make a graph?