feat: update beeai-framework integration #1829

Open · wants to merge 9 commits into main from feat/beeai-framework-update

Conversation


@Tomas2D Tomas2D commented Jul 2, 2025

Completely rework the beeai-framework integration to be more general and easier to maintain.

@Tomas2D Tomas2D requested a review from a team as a code owner July 2, 2025 13:56
Contributor

github-actions bot commented Jul 2, 2025

CLA Assistant Lite bot: All contributors have signed the CLA ✍️ ✅

Contributor

@sourcery-ai sourcery-ai bot left a comment


Hey @Tomas2D - I've reviewed your changes - here's some feedback:

  • In on_success for RequirementAgent, you construct generated_message but never append or emit it—make sure to add it to the history or outgoing payload like with other agent types.
  • Filtering messages based only on m.text may still allow whitespace-only entries; consider stripping or normalizing the text before filtering to avoid empty messages.
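The second point can be sketched as follows; `Message` here is a stand-in dataclass, not the actual beeai-framework message type:

```python
# Hedged sketch: strip whitespace before filtering so that " " and ""
# are both treated as empty. `Message` is an illustrative stand-in.
from dataclasses import dataclass


@dataclass
class Message:
    text: str


def filter_nonempty(messages: list[Message]) -> list[Message]:
    # m.text alone would keep "   "; stripping normalizes it to empty.
    return [m for m in messages if m.text and m.text.strip()]


msgs = [Message("hello"), Message("   "), Message(""), Message("world")]
print([m.text for m in filter_nonempty(msgs)])  # → ['hello', 'world']
```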

@dosubot dosubot bot added the size:M This PR changes 30-99 lines, ignoring generated files. label Jul 2, 2025
@Tomas2D Tomas2D marked this pull request as draft July 2, 2025 16:26
@Tomas2D Tomas2D force-pushed the feat/beeai-framework-update branch from 67b07a7 to 3701254 on July 23, 2025 at 14:01
@Tomas2D Tomas2D force-pushed the feat/beeai-framework-update branch from 3701254 to fd52d5e on July 23, 2025 at 14:07
@Tomas2D Tomas2D marked this pull request as ready for review July 23, 2025 14:08
@dosubot dosubot bot added size:XXL This PR changes 1000+ lines, ignoring generated files. and removed size:M This PR changes 30-99 lines, ignoring generated files. labels Jul 23, 2025
@Tomas2D
Author

Tomas2D commented Jul 23, 2025

I have read the CLA Document and I hereby sign the CLA

github-actions bot added a commit that referenced this pull request Jul 23, 2025
cursor[bot] left five comments that were marked as outdated.

Contributor

@mikeldking mikeldking left a comment


Hey @Tomas2D thanks for maintaining this! Generally LGTM - left some minor comments but I probably lack the context you have so just ping me and I'll get this released. I'll trigger CI now.

Comment on lines 19 to 22
from beeai_framework.context import (
    RunContextFinishEvent,
)
from beeai_framework.emitter import Emitter, EmitterOptions, EventMeta
Contributor


With Otel instrumentation it's typical to lazily import so that the instrumentor doesn't cause an import error if you don't have the package. E.g. it can be bundled with other instrumentors and automatically be initialized when beeai gets added. Up to you ofc. but definitely a nice pattern.
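The lazy-import pattern described here can be sketched roughly as below; the class and method names are illustrative, not the actual instrumentor API:

```python
# Hedged sketch of the lazy-import pattern: defer the beeai_framework
# import until _instrument() runs, so installing this instrumentor
# alongside others does not fail when the framework is absent.
from typing import Any


class BeeAIInstrumentorSketch:
    def _instrument(self, **kwargs: Any) -> None:
        # Import inside the method, not at module top level.
        try:
            from beeai_framework.emitter import Emitter  # noqa: F401
        except ImportError:
            # Framework not installed; skip instrumentation gracefully.
            return
        # ... attach telemetry middleware to the emitter here ...
```

With this shape, merely importing the instrumentor module never triggers an ImportError.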

Author


I see. I updated all the imports to be lazy. All seems to be working.


if result.middleware:
    result.middleware(create_telemetry_middleware(self._tracer, span_kind.value))
self._uninstrument()
Contributor


intentional?

Author


The implementation should work fine with multiple instances; therefore, this is no longer needed. Removed.

@Tomas2D
Author

Tomas2D commented Jul 23, 2025

I addressed all the issues.


@cursor cursor bot left a comment


Bug: Decorator Fails to Return Function Results

The exception_handler decorator does not return the result of the wrapped function when no exception occurs, causing it to always return None.

python/instrumentation/openinference-instrumentation-beeai/src/openinference/instrumentation/beeai/_utils.py#L56-L65

def exception_handler(func: Callable[P, Awaitable[T]]) -> Callable[P, Awaitable[T | None]]:
    @functools.wraps(func)
    async def wrapped(*args: P.args, **kwargs: P.kwargs) -> T | None:
        try:
            await func(*args, **kwargs)
        except Exception as e:
            logger.error("Error has occurred in the telemetry package.", exc_info=e)
    return wrapped



Bug: Incorrect Version Retrieval in Instrumentation Package

The openinference-instrumentation-beeai package incorrectly retrieves the version of beeai-framework instead of its own version, leading to an incorrect version number for the instrumentation package.

python/instrumentation/openinference-instrumentation-beeai/src/openinference/instrumentation/beeai/__init__.py#L38-L39



Bug: Race Condition in Parent-Child Event Processing

A ValueError is raised in the _handler method when a parent run ID is referenced but not found in _processes. This occurs due to a race condition where child events may be processed before their parent events, causing instrumentation to fail or leading to crashes.

python/instrumentation/openinference-instrumentation-beeai/src/openinference/instrumentation/beeai/__init__.py#L114-L116

)
if event.trace.parent_run_id and not parent:
    raise ValueError(f"Parent run with ID {event.trace.parent_run_id} was not found!")




@mikeldking mikeldking force-pushed the feat/beeai-framework-update branch from b15224e to fcb5a9c on July 24, 2025 at 12:35
@mikeldking
Contributor

@Tomas2D I ran CI on beeai and it seems it's red. Let us know if you have any questions about the setup.
