Releases · jackmpcollins/magentic
v0.40.0
What's Changed
- Bump astral-sh/setup-uv from 5 to 6 by @dependabot in #443
- Add OpenRouter chat model by @piiq in #448
- Add reasoning_effort param to OpenaiChatModel by @jackmpcollins in #451
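The new `reasoning_effort` parameter is passed through to OpenAI's API for reasoning models. A minimal sketch, assuming the accepted values mirror OpenAI's `reasoning_effort` options ("low"/"medium"/"high"); the model name and prompt are illustrative:

```python
from magentic import OpenaiChatModel, prompt


@prompt(
    "Summarize the trade-offs of {topic} in two sentences.",
    model=OpenaiChatModel("o3-mini", reasoning_effort="low"),
)
def summarize(topic: str) -> str: ...


print(summarize("eventual consistency"))
```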
Full Changelog: v0.39.3...v0.40.0
v0.39.3
What's Changed
- Fix: function call parsing positional args ignoring arg defaults by @jackmpcollins in #439
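A rough sketch of the kind of case this fix covers: a tool function with a default argument, called via a prompt-function that returns a `FunctionCall`. The function and prompt here are illustrative, not from the PR:

```python
from magentic import FunctionCall, prompt


def search_web(query: str, max_results: int = 5) -> list[str]:
    """Return the titles of the top search results."""
    return [f"result {i} for {query}" for i in range(max_results)]


@prompt(
    "Find pages about {topic}",
    functions=[search_web],
)
def plan_search(topic: str) -> FunctionCall[list[str]]: ...


# If the model supplies only `query`, the default for `max_results`
# is now respected when the function call is parsed.
call = plan_search("LLM function calling")
print(call())
```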
Full Changelog: v0.39.2...v0.39.3
v0.39.2
What's Changed
- Add tests for Gemini via openai package by @jackmpcollins in #382
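A minimal sketch of using Gemini through `OpenaiChatModel`, assuming Google's OpenAI-compatible endpoint; the model name and API key placeholder are illustrative, and the docs cover the tested configuration:

```python
from magentic import OpenaiChatModel, prompt

# Gemini accessed through its OpenAI-compatible endpoint via the openai package
model = OpenaiChatModel(
    "gemini-2.0-flash",
    api_key="YOUR_GEMINI_API_KEY",
    base_url="https://generativelanguage.googleapis.com/v1beta/openai/",
)


@prompt("Name three moons of {planet}", model=model)
def list_moons(planet: str) -> list[str]: ...


print(list_moons("Jupiter"))
```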
Full Changelog: v0.39.1...v0.39.2
v0.39.1
What's Changed
- Add tests and docs for xAI / Grok via OpenaiChatModel by @jackmpcollins in #433
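A similar sketch for xAI / Grok via `OpenaiChatModel`, assuming xAI's OpenAI-compatible API; the model name and API key placeholder are illustrative:

```python
from magentic import OpenaiChatModel, prompt

# Grok accessed through xAI's OpenAI-compatible API
model = OpenaiChatModel(
    "grok-2-latest",
    api_key="YOUR_XAI_API_KEY",
    base_url="https://api.x.ai/v1",
)


@prompt("Write a one-line joke about {subject}", model=model)
def joke(subject: str) -> str: ...


print(joke("rockets"))
```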
Full Changelog: v0.39.0...v0.39.1
v0.39.0
What's Changed
- Use TypeVar default to remove overloads by @jackmpcollins in #411
- Add missing Field import in docs by @jackmpcollins in #428
- feat: support for passing extra_headers to LitellmChatModel by @ashwin153 in #426
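A brief sketch of passing `extra_headers` to `LitellmChatModel` per #426; the model string and header are illustrative, and the headers are forwarded on each request made through litellm:

```python
from magentic import prompt
from magentic.chat_model.litellm_chat_model import LitellmChatModel

# extra_headers is forwarded with each litellm request (illustrative header)
model = LitellmChatModel(
    "anthropic/claude-3-5-sonnet-20241022",
    extra_headers={"X-Request-Source": "magentic-demo"},
)


@prompt("Write a haiku about {subject}", model=model)
def haiku(subject: str) -> str: ...


print(haiku("release notes"))
```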
New Contributors
- @ashwin153 made their first contribution in #426
Full Changelog: v0.38.1...v0.39.0
v0.38.1
What's Changed
Full Changelog: v0.38.0...v0.38.1
v0.38.0
What's Changed
- Async streamed response to api message conversion by @ananis25 in #405
- Support AsyncParallelFunctionCall in message_to_X_message by @jackmpcollins in #406
Full Changelog: v0.37.1...v0.38.0
v0.37.1
What's Changed
Anthropic model message serialization now supports StreamedResponse in AssistantMessage. Thanks to @ananis25 🎉
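In practice this means a StreamedResponse returned on an earlier turn can be placed in an AssistantMessage and resent as chat history to Anthropic. A rough sketch, assuming Placeholder-based templating as used with @chatprompt; the model name, prompts, and function name are illustrative:

```python
from magentic import (
    AssistantMessage,
    Placeholder,
    StreamedResponse,
    UserMessage,
    chatprompt,
)
from magentic.chat_model.anthropic_chat_model import AnthropicChatModel


# `response` is assumed to be a StreamedResponse from an earlier Anthropic call
@chatprompt(
    UserMessage("Tell me a fun fact about {topic}"),
    AssistantMessage(Placeholder(StreamedResponse, "response")),
    UserMessage("Now repeat that fact as a limerick"),
    model=AnthropicChatModel("claude-3-5-sonnet-latest"),
)
def as_limerick(topic: str, response: StreamedResponse) -> str: ...
```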
Full Changelog: v0.37.0...v0.37.1
v0.37.0
What's Changed
The @prompt_chain decorator can now accept a sequence of Message objects as input, like @chatprompt.
```python
from magentic import prompt_chain, UserMessage


def get_current_weather(location, unit="fahrenheit"):
    """Get the current weather in a given location"""
    return {"temperature": "72", "forecast": ["sunny", "windy"]}


@prompt_chain(
    template=[UserMessage("What's the weather like in {city}?")],
    functions=[get_current_weather],
)
def describe_weather(city: str) -> str: ...


describe_weather("Boston")
# 'The weather in Boston is currently 72°F with sunny and windy conditions.'
```
PRs
- Allow Messages as input to prompt_chain by @jackmpcollins in #403
Full Changelog: v0.36.0...v0.37.0
v0.36.0
What's Changed
Document the Chat class and make it importable from the top level.
docs: https://magentic.dev/chat/
```python
from magentic import Chat, OpenaiChatModel, UserMessage

# Create a new Chat instance
chat = Chat(
    messages=[UserMessage("Say hello")],
    model=OpenaiChatModel("gpt-4o"),
)

# Append a new user message
chat = chat.add_user_message("Actually, say goodbye!")
print(chat.messages)
# [UserMessage('Say hello'), UserMessage('Actually, say goodbye!')]

# Submit the chat to the LLM to get a response
chat = chat.submit()
print(chat.last_message.content)
# 'Hello! Just kidding—goodbye!'
```
PRs
- Use public import for ChatCompletionStreamState by @jackmpcollins in #398
- Make Chat class public and add docs by @jackmpcollins in #401
- Remove unused content None from openai messages by @jackmpcollins in #402
Full Changelog: v0.35.0...v0.36.0