add thread_id to llm_events #99

Merged · 3 commits · Mar 12, 2025
6 changes: 6 additions & 0 deletions CHANGELOG.md
@@ -7,6 +7,12 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0

## [Unreleased]

### Added
- thread_id to track_llm_events

### Changed
- assistant_id now optional in track_llm_events

## [1.8.3] - 2025-02-26
### Changed
- Change logging level to debug for flushing events and adding event to queue
6 changes: 3 additions & 3 deletions README.md
@@ -40,10 +40,10 @@ client.track_llm(
assistant_id="gpt-4",
generation="The capital of France is Paris.",
properties={
"$thread_id": "your-thread-id", # special trubrics property to group an interaction by conversation thread
"model": "gpt-4"
"support_tier": "enterprise",
Comment on lines -44 to +43

Contributor

What's inside properties apart from support_tier?

Collaborator Author

Can be anything really. I removed model because that's typically in assistant_id.

},
latency=1.5 # seconds
latency=1.5, # seconds
thread_id="user123_thread_id"
)
```
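For illustration, the grouping behavior this PR adds can be sketched without the client. The helper below mirrors the shape of the llm_event_dict built in trubrics/main.py (a simplified sketch; `build_llm_event` is a hypothetical name, not part of the library) and shows how two turns of a conversation share one thread_id:

```python
from datetime import datetime, timezone

def build_llm_event(user_id, prompt, generation, assistant_id=None,
                    properties=None, timestamp=None, latency=None, thread_id=None):
    # Simplified mirror of the event dict assembled in track_llm.
    return {
        "user_id": user_id,
        "prompt": prompt,
        "generation": generation,
        "assistant_id": assistant_id,
        "properties": properties,
        "timestamp": timestamp or datetime.now(timezone.utc),
        "latency": latency,
        "thread_id": str(thread_id) if thread_id is not None else None,
    }

thread = "user123_thread_id"
turn1 = build_llm_event("user123", "What is the capital of France?",
                        "The capital of France is Paris.", thread_id=thread)
turn2 = build_llm_event("user123", "And its population?",
                        "About 2.1 million people live in Paris.", thread_id=thread)
assert turn1["thread_id"] == turn2["thread_id"]  # both turns land in one thread
```

Note that assistant_id can now be omitted entirely, matching the change in the track_llm signature.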

7 changes: 5 additions & 2 deletions trubrics/main.py
@@ -114,22 +114,24 @@ def track_llm(
self,
user_id: str,
prompt: str,
assistant_id: str,
generation: str,
assistant_id: str | None = None,
properties: dict | None = None,
timestamp: datetime | None = None,
latency: float | None = None,
thread_id: str | None = None,
):
"""
Track an LLM prompt and generation.
Args:
user_id (str): The ID of the user.
prompt (str): The prompt given to the LLM.
assistant_id (str): The ID of the assistant.
generation (str): The generated response from the LLM.
assistant_id (str | None): The ID of the assistant.
properties (dict | None): Additional properties to track.
timestamp (datetime | None): The timestamp of the generation event. If None, the current time in UTC is used.
latency (float | None): The latency in seconds between the prompt and the generation. Defaults to None.
thread_id (str | None): The ID of the thread, used to link messages in a conversation.
"""

llm_event_dict = {
@@ -145,6 +147,7 @@ def track_llm(
),
"latency": latency,
"event_type": EventTypes.llm_event,
"thread_id": str(thread_id) if thread_id is not None else None,
}

with self._lock:
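One subtlety worth noting: a bare `str(thread_id)` turns a missing thread into the literal string `"None"`. A guarded coercion (self-contained sketch, not the library's code) stringifies real ids while keeping optional threads as `None`:

```python
def coerce_thread_id(thread_id):
    # Stringify real ids (str, int, UUID, ...) but keep None as None.
    return str(thread_id) if thread_id is not None else None

assert coerce_thread_id("user123_thread_id") == "user123_thread_id"
assert coerce_thread_id(42) == "42"
assert coerce_thread_id(None) is None
assert str(None) == "None"  # the value a bare str() call would queue
```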