LLM interface function calling and implementation for OpenAILLM #268


Closed
willtai wants to merge 11 commits from the LLMInterface-function-calling branch

Conversation

Contributor

@willtai willtai commented Feb 10, 2025

Description

This PR enables function calling for the OpenAILLM. It updates the OpenAILLM class and the LLMInterface type, and adds an example script.
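For context, function calling with OpenAI models means translating a tool definition into the JSON-schema payload accepted by the API's `tools` parameter. The `Tool` dataclass and `to_openai_tool` helper below are hypothetical names for illustration, not this PR's actual API; only the nested dict shape follows OpenAI's documented format:

```python
from dataclasses import dataclass, field

# Hypothetical tool description; the PR's real types may differ.
@dataclass
class Tool:
    name: str
    description: str
    parameters: dict = field(default_factory=dict)  # JSON Schema for the arguments


def to_openai_tool(tool: Tool) -> dict:
    """Convert a Tool into the dict shape expected by OpenAI's `tools` parameter."""
    return {
        "type": "function",
        "function": {
            "name": tool.name,
            "description": tool.description,
            "parameters": tool.parameters,
        },
    }


weather = Tool(
    name="get_weather",
    description="Return the current weather for a city.",
    parameters={
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    },
)
payload = to_openai_tool(weather)
print(payload["function"]["name"])  # get_weather
```

The conversion step is the core of any provider-specific implementation: the interface-level tool type stays provider-neutral, and each LLM subclass serializes it for its own API.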

Type of Change

  • New feature
  • Bug fix
  • Breaking change
  • Documentation update
  • Project configuration change

Complexity

Complexity: Low

How Has This Been Tested?

  • Unit tests
  • E2E tests
  • Manual tests

Checklist

The following requirements should have been met (depending on the changes in the branch):

  • Documentation has been updated
  • Unit tests have been updated
  • E2E tests have been updated
  • Examples have been updated
  • New files have copyright header
  • CLA (https://neo4j.com/developer/cla/) has been signed
  • CHANGELOG.md updated if appropriate

@willtai willtai requested a review from a team as a code owner February 10, 2025 17:08
@willtai willtai force-pushed the LLMInterface-function-calling branch 2 times, most recently from 255f481 to 0278184 Compare February 11, 2025 13:29
@willtai willtai changed the title LLM interface function calling LLM interface function calling and implementation for OpenAILLM Feb 11, 2025
@willtai willtai force-pushed the LLMInterface-function-calling branch from 1a9aa2d to bda358a Compare February 13, 2025 14:12
"""


rag = GraphRAG(retriever=retriever, llm=llm)
Contributor

Maybe this example needs to be updated to remove the GraphRAG part?

Contributor Author

I wanted to show an example of how the tool can be used after being called. I was thinking of keeping it, unless you'd prefer to remove it and just show the tool call?
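Showing "how the tool is used after being called" typically means dispatching the model's tool call to a local function and building the follow-up message that carries the result back. A minimal, library-free sketch of that dispatch step (the response dict below is a stub mimicking a simplified OpenAI chat message, not real API output):

```python
import json

# Stubbed assistant response containing one tool call (simplified shape).
response_message = {
    "tool_calls": [
        {
            "id": "call_1",
            "function": {"name": "get_weather", "arguments": '{"city": "Malmö"}'},
        }
    ]
}


def get_weather(city: str) -> str:
    return f"Sunny in {city}"  # placeholder implementation


TOOLS = {"get_weather": get_weather}


def dispatch(message: dict) -> list[dict]:
    """Execute each requested tool call and build the follow-up messages."""
    results = []
    for call in message.get("tool_calls", []):
        fn = TOOLS[call["function"]["name"]]
        args = json.loads(call["function"]["arguments"])
        results.append(
            {
                "role": "tool",
                "tool_call_id": call["id"],
                "content": fn(**args),
            }
        )
    return results


follow_up = dispatch(response_message)
print(follow_up[0]["content"])  # Sunny in Malmö
```

The `role: "tool"` messages would then be appended to the conversation and sent back to the model for a final answer.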

Contributor

I'm a bit uncomfortable with it at the moment, mainly because of our design: with the same llm object holding the tool definition (in model_params), we have to say "do not use any tool" in the prompt, which I don't feel is how the feature is intended to be used. We might have to reconsider this design before using the new feature in that way. WDYT?

@willtai willtai force-pushed the LLMInterface-function-calling branch from 1376ad7 to 4801ed0 Compare February 13, 2025 16:02
@willtai willtai requested a review from stellasia February 13, 2025 16:19
@willtai willtai closed this Feb 19, 2025