
Commit 4a5439e

Author: Ikko Eltociear Ashimine

chore: update _extract.py (#70)
reponse -> response

1 parent bf598dc


src/raglite/_extract.py

Lines changed: 1 addition & 1 deletion
@@ -41,7 +41,7 @@ class MyNameResponse(BaseModel):
     system_prompt = getattr(return_type, "system_prompt", "").strip()
     if not llm_supports_response_format or config.llm.startswith("llama-cpp-python"):
         system_prompt += f"\n\nFormat your response according to this JSON schema:\n{return_type.model_json_schema()!s}"
-    # Constrain the reponse format to the JSON schema if it's supported by the LLM [1]. Strict mode
+    # Constrain the response format to the JSON schema if it's supported by the LLM [1]. Strict mode
     # is disabled by default because it only supports a subset of JSON schema features [2].
     # [1] https://docs.litellm.ai/docs/completion/json_mode
     # [2] https://platform.openai.com/docs/guides/structured-outputs#some-type-specific-keywords-are-not-yet-supported
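
For context on the comment being edited: it refers to LiteLLM's JSON-mode / structured-outputs support linked in [1]. Below is a minimal, hypothetical sketch of constraining a completion to a Pydantic model's JSON schema via litellm.completion; the model name and prompt are placeholders, and this is not the code from raglite's _extract.py.

# Hypothetical sketch based on the LiteLLM JSON-mode docs referenced in [1];
# not the raglite implementation. Model name and prompt are placeholders.
import litellm
from pydantic import BaseModel

class MyNameResponse(BaseModel):
    """An example response type with a single field."""
    my_name: str

response = litellm.completion(
    model="gpt-4o-mini",  # placeholder; any LLM that supports response_format
    messages=[{"role": "user", "content": "What is your name?"}],
    # Constrain the response to the model's JSON schema; strict mode is left off
    # because it supports only a subset of JSON schema features.
    response_format={
        "type": "json_schema",
        "json_schema": {
            "name": MyNameResponse.__name__,
            "schema": MyNameResponse.model_json_schema(),
            "strict": False,
        },
    },
)
my_name = MyNameResponse.model_validate_json(response.choices[0].message.content)

When the LLM does not support a JSON-schema response_format (for example the llama-cpp-python branch in the diff), the schema is instead appended to the system prompt, as the unchanged context lines above show.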
