
Commit 81d1f31

More on LLM message history and system instruction (neo4j#240)
* Add examples for LLM message history and system instructions
* Fix for when system_message is None
* Ruff
* Better (?) system prompt for RAG
* Update system_instruction
* Mypy
* Mypy
* Fix ollama test
* Fix anthropic test
* Fix cohere test
* Fix vertexai test
* Fix mistralai test
* Fix graphrag test
* Ruff
* Mypy
* Variable is not used
* Ruff...
* Mypy + e2e tests
* Ruffffffff
* CHANGELOG
* Fix examples
* Remove useless commented api_key from examples
1 parent feeddbb commit 81d1f31

30 files changed: +381 −299 lines

CHANGELOG.md

Lines changed: 2 additions & 2 deletions
@@ -4,8 +4,8 @@
 
 ### Added
 - Support for conversations with message history, including a new `message_history` parameter for LLM interactions.
-- Ability to include system instructions and override them for specific invocations.
-- Summarization of chat history to enhance query embedding and context handling.
+- Ability to include system instructions in LLM invoke method.
+- Summarization of chat history to enhance query embedding and context handling in GraphRAG.
 
 ### Changed
 - Updated LLM implementations to handle message history consistently across providers.
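The `message_history` parameter mentioned in the changelog is, per the examples in this commit, a plain list of role/content dicts that grows by two entries per conversational turn. A minimal sketch of that shape (`append_turn` is a hypothetical helper for illustration, not part of the neo4j_graphrag package):

```python
# Illustrative sketch only: `append_turn` is a hypothetical helper, not part
# of neo4j_graphrag. It shows the shape of the message_history list used in
# this commit's examples: alternating user/assistant role-content dicts.

def append_turn(history: list[dict[str, str]], question: str, answer: str) -> None:
    """Record one user/assistant exchange in the running history."""
    history.append({"role": "user", "content": question})
    history.append({"role": "assistant", "content": answer})

history: list[dict[str, str]] = []
append_turn(history, "What are some movies Tom Hanks starred in?", "Forrest Gump, Cast Away, ...")
append_turn(history, "Is he also a director?", "Yes, e.g. That Thing You Do!")

print(len(history))  # 4
print(history[0]["role"])  # user
```

Each `invoke` call then receives the full list, so the model sees prior turns as context.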

examples/README.md

Lines changed: 4 additions & 0 deletions
@@ -51,6 +51,7 @@ are listed in [the last section of this file](#customize).
 ## Answer: GraphRAG
 
 - [End to end GraphRAG](./answer/graphrag.py)
+- [GraphRAG with message history](./question_answering/graphrag_with_message_history.py)
 
 
 ## Customize
@@ -73,6 +74,9 @@ are listed in [the last section of this file](#customize).
 - [Ollama](./customize/llms/ollama_llm.py)
 - [Custom LLM](./customize/llms/custom_llm.py)
 
+- [Message history](./customize/llms/llm_with_message_history.py)
+- [System Instruction](./customize/llms/llm_with_system_instructions.py)
+
 
 ### Prompts
 

examples/customize/embeddings/cohere_embeddings.py

Lines changed: 0 additions & 1 deletion
@@ -2,7 +2,6 @@
 
 # set api key here on in the CO_API_KEY env var
 api_key = None
-# api_key = "sk-..."
 
 embeder = CohereEmbeddings(
     model="embed-english-v3.0",

examples/customize/embeddings/mistalai_embeddings.py

Lines changed: 0 additions & 1 deletion
@@ -6,7 +6,6 @@
 
 # set api key here on in the MISTRAL_API_KEY env var
 api_key = None
-# api_key = "sk-..."
 
 embeder = MistralAIEmbeddings(model="mistral-embed", api_key=api_key)
 res = embeder.embed_query("my question")

examples/customize/embeddings/openai_embeddings.py

Lines changed: 0 additions & 1 deletion
@@ -6,7 +6,6 @@
 
 # set api key here on in the OPENAI_API_KEY env var
 api_key = None
-# api_key = "sk-..."
 
 embeder = OpenAIEmbeddings(model="text-embedding-ada-002", api_key=api_key)
 res = embeder.embed_query("my question")

examples/customize/llms/anthropic_llm.py

Lines changed: 0 additions & 2 deletions
@@ -2,8 +2,6 @@
 
 # set api key here on in the ANTHROPIC_API_KEY env var
 api_key = None
-# api_key = "sk-..."
-
 
 llm = AnthropicLLM(
     model_name="claude-3-opus-20240229",

examples/customize/llms/cohere_llm.py

Lines changed: 0 additions & 1 deletion
@@ -2,7 +2,6 @@
 
 # set api key here on in the CO_API_KEY env var
 api_key = None
-# api_key = "sk-..."
 
 llm = CohereLLM(
     model_name="command-r",
examples/customize/llms/llm_with_message_history.py (new file, per the link added in examples/README.md)

Lines changed: 42 additions & 0 deletions

"""This example illustrates the message_history feature
of the LLMInterface by mocking a conversation between a user
and an LLM about Tom Hanks.

OpenAILLM can be replaced by any supported LLM from this package.
"""

from neo4j_graphrag.llm import LLMResponse, OpenAILLM

# set api key here on in the OPENAI_API_KEY env var
api_key = None

llm = OpenAILLM(model_name="gpt-4o", api_key=api_key)

questions = [
    "What are some movies Tom Hanks starred in?",
    "Is he also a director?",
    "Wow, that's impressive. And what about his personal life, does he have children?",
]

history: list[dict[str, str]] = []
for question in questions:
    res: LLMResponse = llm.invoke(
        question,
        message_history=history,  # type: ignore
    )
    history.append(
        {
            "role": "user",
            "content": question,
        }
    )
    history.append(
        {
            "role": "assistant",
            "content": res.content,
        }
    )

    print("#" * 50, question)
    print(res.content)
    print("#" * 50)
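The `# type: ignore` on the `message_history` argument above suggests the parameter expects a more specific message type than `list[dict[str, str]]`. One way to keep the history statically typed on the caller side, shown as a standalone sketch (the `Message` TypedDict is illustrative, not a type defined by the package):

```python
from typing import TypedDict


class Message(TypedDict):
    # Shape of one chat turn, mirroring the dicts built in the example
    # above. Illustrative only; the package may define its own type.
    role: str
    content: str


def make_turns(question: str, answer: str) -> list[Message]:
    """Build the two messages produced by one question/answer exchange."""
    return [
        {"role": "user", "content": question},
        {"role": "assistant", "content": answer},
    ]


history: list[Message] = make_turns("Is he also a director?", "Yes.")
print(history[0]["content"])  # Is he also a director?
```

With a TypedDict, mypy can verify the `role`/`content` keys at each append site instead of being silenced.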
examples/customize/llms/llm_with_system_instructions.py (new file, per the link added in examples/README.md)

Lines changed: 22 additions & 0 deletions

"""This example illustrates how to set system instructions for LLM.

OpenAILLM can be replaced by any supported LLM from this package.
"""

from neo4j_graphrag.llm import LLMResponse, OpenAILLM

# set api key here on in the OPENAI_API_KEY env var
api_key = None

llm = OpenAILLM(
    model_name="gpt-4o",
    api_key=api_key,
)

question = "How fast is Santa Claus during the Christmas eve?"

res: LLMResponse = llm.invoke(
    question,
    system_instruction="Answer with a serious tone",
)
print(res.content)

examples/customize/llms/mistalai_llm.py

Lines changed: 0 additions & 2 deletions
@@ -2,8 +2,6 @@
 
 # set api key here on in the MISTRAL_API_KEY env var
 api_key = None
-# api_key = "sk-..."
-
 
 llm = MistralAILLM(
     model_name="mistral-small-latest",
