
Commit 56bc660

Add support for Anthropic LLM (#150)

* Add support for Anthropic LLM
* mypy, docs, changelog
* Fix test names

1 parent ff1c6ee commit 56bc660

File tree

11 files changed: +306 -15 lines changed


CHANGELOG.md

Lines changed: 1 addition & 0 deletions
@@ -14,6 +14,7 @@
 - Introduced Vertex AI LLM class for integrating Vertex AI models.
 - Added unit tests for the Vertex AI LLM class.
 - Added support for Cohere LLM and embeddings - added optional dependency to `cohere`.
+- Added support for Anthropic LLM - added optional dependency to `anthropic`.

 ### Fixed
 - Resolved import issue with the Vertex AI Embeddings class.

docs/source/api.rst

Lines changed: 19 additions & 1 deletion
@@ -172,8 +172,11 @@ CohereEmbeddings
 Generation
 **********

+LLM
+===
+
 LLMInterface
-============
+------------

 .. autoclass:: neo4j_graphrag.llm.LLMInterface
     :members:

@@ -187,12 +190,27 @@ OpenAILLM
     :undoc-members: get_messages, client_class, async_client_class


+AzureOpenAILLM
+--------------
+
+.. autoclass:: neo4j_graphrag.llm.openai_llm.AzureOpenAILLM
+    :members:
+    :undoc-members: get_messages, client_class, async_client_class
+
+
 VertexAILLM
 -----------

 .. autoclass:: neo4j_graphrag.llm.vertexai_llm.VertexAILLM
     :members:

+AnthropicLLM
+------------
+
+.. autoclass:: neo4j_graphrag.llm.anthropic_llm.AnthropicLLM
+    :members:
+
+
 CohereLLM
 ---------


docs/source/user_guide_rag.rst

Lines changed: 60 additions & 3 deletions
@@ -75,14 +75,15 @@ Using Another LLM Model

 If OpenAI cannot be used directly, there are a few available alternatives:

-- Use Azure OpenAI.
-- Use Google VertexAI.
+- Use Azure OpenAI (GPT...).
+- Use Google VertexAI (Gemini...).
+- Use Anthropic LLM (Claude...).
 - Use Cohere.
 - Use a local Ollama model.
 - Implement a custom interface.
 - Utilize any LangChain chat model.

-All options are illustrated below, using a local Ollama model as an example.
+All options are illustrated below.

 Using Azure Open AI LLM
 -----------------------

@@ -109,6 +110,9 @@ to learn more about the configuration.
 `pip install openai`


+See :ref:`azureopenaillm`.
+
+
 Using VertexAI LLM
 ------------------


@@ -132,6 +136,59 @@ To use VertexAI, instantiate the `VertexAILLM` class:
 `pip install google-cloud-aiplatform`


+See :ref:`vertexaillm`.
+
+
+Using Anthropic LLM
+-------------------
+
+To use Anthropic, instantiate the `AnthropicLLM` class:
+
+.. code:: python
+
+    from neo4j_graphrag.llm import AnthropicLLM
+
+    llm = AnthropicLLM(
+        model_name="claude-3-opus-20240229",
+        model_params={"max_tokens": 1000},  # max_tokens must be specified
+        api_key=api_key,  # can also set `ANTHROPIC_API_KEY` in env vars
+    )
+    llm.invoke("say something")
+
+
+.. note::
+
+    In order to run this code, the `anthropic` Python package needs to be installed:
+    `pip install anthropic`
+
+See :ref:`anthropicllm`.
+
+
+Using Cohere LLM
+----------------
+
+To use Cohere, instantiate the `CohereLLM` class:
+
+.. code:: python
+
+    from neo4j_graphrag.llm import CohereLLM
+
+    llm = CohereLLM(
+        model_name="command-r",
+        api_key=api_key,  # can also set `CO_API_KEY` in env vars
+    )
+    llm.invoke("say something")
+
+
+.. note::
+
+    In order to run this code, the `cohere` Python package needs to be installed:
+    `pip install cohere`
+
+See :ref:`coherellm`.
+
 Using a Local Model via Ollama
 -------------------------------

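Among the alternatives listed in this guide, "implement a custom interface" amounts to subclassing `LLMInterface` and providing `invoke`/`ainvoke`. A minimal sketch of that contract, with the base class and response type stubbed locally so it runs without `neo4j_graphrag` installed (`EchoLLM` is a hypothetical example, not part of the library):

```python
import asyncio
from abc import ABC, abstractmethod
from dataclasses import dataclass
from typing import Any, Optional


# Stubbed-down stand-ins for neo4j_graphrag.llm.types.LLMResponse and
# neo4j_graphrag.llm.base.LLMInterface, so the sketch is self-contained.
@dataclass
class LLMResponse:
    content: str


class LLMInterface(ABC):
    def __init__(self, model_name: str, model_params: Optional[dict[str, Any]] = None):
        self.model_name = model_name
        self.model_params = model_params or {}

    @abstractmethod
    def invoke(self, input: str) -> LLMResponse: ...

    @abstractmethod
    async def ainvoke(self, input: str) -> LLMResponse: ...


# A hypothetical custom LLM that just echoes its prompt.
class EchoLLM(LLMInterface):
    def invoke(self, input: str) -> LLMResponse:
        return LLMResponse(content=f"echo: {input}")

    async def ainvoke(self, input: str) -> LLMResponse:
        return self.invoke(input)


llm = EchoLLM(model_name="echo")
print(llm.invoke("say something").content)  # echo: say something
```

Any object with this shape can then be passed wherever the library expects an `LLMInterface`.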

poetry.lock

Lines changed: 27 additions & 2 deletions
Some generated files are not rendered by default.

pyproject.toml

Lines changed: 3 additions & 1 deletion
@@ -43,6 +43,7 @@ pygraphviz = [
 ]
 google-cloud-aiplatform = {version = "^1.66.0", optional = true}
 cohere = {version = "^5.9.0", optional = true}
+anthropic = { version = "^0.34.2", optional = true}

 [tool.poetry.group.dev.dependencies]
 pylint = "^3.1.0"

@@ -77,9 +78,10 @@ pygraphviz = [
 ]
 google-cloud-aiplatform = {version = "^1.66.0"}
 cohere = {version = "^5.9.0"}
+anthropic = { version = "^0.34.2"}

 [tool.poetry.extras]
-external_clients = ["weaviate-client", "pinecone-client", "google-cloud-aiplatform", "cohere"]
+external_clients = ["weaviate-client", "pinecone-client", "google-cloud-aiplatform", "cohere", "anthropic"]
 kg_creation_tools = ["pygraphviz"]

 [build-system]
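The extras declared above only matter at install time; at runtime, optional clients have to be probed, which is what the `try: import anthropic` guard elsewhere in this commit does. A small sketch of the same probe using only the standard library (`has_optional` is a hypothetical helper, not library API):

```python
import importlib.util


def has_optional(name: str) -> bool:
    """Return True if the optional dependency can be imported."""
    return importlib.util.find_spec(name) is not None


# "json" is stdlib and always present; "anthropic"/"cohere" depend on extras.
print({pkg: has_optional(pkg) for pkg in ("anthropic", "cohere", "json")})
```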

src/neo4j_graphrag/llm/__init__.py

Lines changed: 11 additions & 1 deletion
@@ -12,9 +12,19 @@
 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 # See the License for the specific language governing permissions and
 # limitations under the License.
+from .anthropic_llm import AnthropicLLM
 from .base import LLMInterface
+from .cohere_llm import CohereLLM
 from .openai_llm import AzureOpenAILLM, OpenAILLM
 from .types import LLMResponse
 from .vertexai_llm import VertexAILLM

-__all__ = ["LLMResponse", "LLMInterface", "OpenAILLM", "VertexAILLM", "AzureOpenAILLM"]
+__all__ = [
+    "AnthropicLLM",
+    "CohereLLM",
+    "LLMResponse",
+    "LLMInterface",
+    "OpenAILLM",
+    "VertexAILLM",
+    "AzureOpenAILLM",
+]
src/neo4j_graphrag/llm/anthropic_llm.py

Lines changed: 116 additions & 0 deletions
@@ -0,0 +1,116 @@
+# Neo4j Sweden AB [https://neo4j.com]
+# #
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+# #
+# https://www.apache.org/licenses/LICENSE-2.0
+# #
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+from __future__ import annotations
+
+from typing import Any, Optional
+
+from neo4j_graphrag.exceptions import LLMGenerationError
+from neo4j_graphrag.llm.base import LLMInterface
+from neo4j_graphrag.llm.types import LLMResponse
+
+try:
+    import anthropic
+    from anthropic import APIError
+except ImportError:
+    anthropic = None  # type: ignore
+    APIError = None  # type: ignore
+
+
+class AnthropicLLM(LLMInterface):
+    """Interface for large language models on Anthropic
+
+    Args:
+        model_name (str): Name of the LLM to use (e.g. "claude-3-opus-20240229").
+        model_params (Optional[dict], optional): Additional parameters passed to the model when text is sent to it. Defaults to None.
+        **kwargs (Any): Arguments passed to the client when the class is initialised.
+
+    Raises:
+        LLMGenerationError: If there's an error generating the response from the model.
+
+    Example:
+
+    .. code-block:: python
+
+        from neo4j_graphrag.llm import AnthropicLLM
+
+        llm = AnthropicLLM(
+            model_name="claude-3-opus-20240229",
+            model_params={"max_tokens": 1000},
+            api_key="sk...",  # can also be read from env vars
+        )
+        llm.invoke("Who is the mother of Paul Atreides?")
+    """
+
+    def __init__(
+        self,
+        model_name: str,
+        model_params: Optional[dict[str, Any]] = None,
+        **kwargs: Any,
+    ):
+        if anthropic is None:
+            raise ImportError(
+                "Could not import Anthropic Python client. "
+                "Please install it with `pip install anthropic`."
+            )
+        super().__init__(model_name, model_params)
+        self.client = anthropic.Anthropic(**kwargs)
+        self.async_client = anthropic.AsyncAnthropic(**kwargs)
+
+    def invoke(self, input: str) -> LLMResponse:
+        """Sends text to the LLM and returns a response.
+
+        Args:
+            input (str): The text to send to the LLM.
+
+        Returns:
+            LLMResponse: The response from the LLM.
+        """
+        try:
+            response = self.client.messages.create(
+                model=self.model_name,
+                messages=[
+                    {
+                        "role": "user",
+                        "content": input,
+                    }
+                ],
+                **self.model_params,
+            )
+            return LLMResponse(content=response.content)
+        except APIError as e:
+            raise LLMGenerationError(e)
+
+    async def ainvoke(self, input: str) -> LLMResponse:
+        """Asynchronously sends text to the LLM and returns a response.
+
+        Args:
+            input (str): The text to send to the LLM.
+
+        Returns:
+            LLMResponse: The response from the LLM.
+        """
+        try:
+            response = await self.async_client.messages.create(
+                model=self.model_name,
+                messages=[
+                    {
+                        "role": "user",
+                        "content": input,
+                    }
+                ],
+                **self.model_params,
+            )
+            return LLMResponse(content=response.content)
+        except APIError as e:
+            raise LLMGenerationError(e)
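The sync/async pair in this new class can be exercised without network access or the `anthropic` SDK by swapping in fake clients. The sketch below mirrors the wrapper's shape; `FakeClient`/`FakeAsyncClient` and `MiniLLM` are hypothetical stand-ins for `anthropic.Anthropic`/`anthropic.AsyncAnthropic` and `AnthropicLLM`, not real APIs:

```python
import asyncio
from dataclasses import dataclass


@dataclass
class LLMResponse:
    content: str


class _Msg:
    def __init__(self, content: str):
        self.content = content


class FakeMessages:
    def create(self, model, messages, **params):
        # Mimics client.messages.create: echo the first user message back.
        return _Msg(f"echo:{messages[0]['content']}")


class FakeAsyncMessages:
    async def create(self, model, messages, **params):
        return _Msg(f"echo:{messages[0]['content']}")


class FakeClient:
    def __init__(self):
        self.messages = FakeMessages()


class FakeAsyncClient:
    def __init__(self):
        self.messages = FakeAsyncMessages()


class MiniLLM:
    """Same shape as AnthropicLLM above, minus the real SDK."""

    def __init__(self, model_name, model_params=None):
        self.model_name = model_name
        self.model_params = model_params or {}
        self.client = FakeClient()
        self.async_client = FakeAsyncClient()

    def invoke(self, input: str) -> LLMResponse:
        response = self.client.messages.create(
            model=self.model_name,
            messages=[{"role": "user", "content": input}],
            **self.model_params,
        )
        return LLMResponse(content=response.content)

    async def ainvoke(self, input: str) -> LLMResponse:
        response = await self.async_client.messages.create(
            model=self.model_name,
            messages=[{"role": "user", "content": input}],
            **self.model_params,
        )
        return LLMResponse(content=response.content)


llm = MiniLLM("claude-3-opus-20240229", {"max_tokens": 1000})
print(llm.invoke("hi").content)                   # echo:hi
print(asyncio.run(llm.ainvoke("hello")).content)  # echo:hello
```

Holding two pre-built clients lets `invoke` stay synchronous while `ainvoke` awaits the async client, with identical message-building logic in both paths.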

src/neo4j_graphrag/llm/cohere_llm.py

Lines changed: 2 additions & 1 deletion
@@ -17,7 +17,8 @@
 from typing import Any, Optional

 from neo4j_graphrag.exceptions import LLMGenerationError
-from neo4j_graphrag.llm import LLMInterface, LLMResponse
+from neo4j_graphrag.llm.base import LLMInterface
+from neo4j_graphrag.llm.types import LLMResponse

 try:
     import cohere
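The import change in `cohere_llm.py` avoids a circular import: the package `__init__` imports the submodules, so a submodule importing names back from the package can run while `__init__` is only partially executed. A throwaway reproduction, written to a temp directory with hypothetical `llmpkg` module names, under the assumption that `__init__` imports the submodule before re-exporting `LLMResponse`:

```python
import os
import sys
import tempfile
import textwrap

tmp = tempfile.mkdtemp()
pkg_dir = os.path.join(tmp, "llmpkg")
os.makedirs(pkg_dir)

modules = {
    # __init__ imports cohere_llm *before* it re-exports LLMResponse ...
    "__init__.py": """
        from .cohere_llm import CohereLLM
        from .types import LLMResponse
    """,
    "types.py": """
        class LLMResponse:
            pass
    """,
    # ... so importing LLMResponse from the package here is circular.
    "cohere_llm.py": """
        from llmpkg import LLMResponse

        class CohereLLM:
            pass
    """,
}
for name, body in modules.items():
    with open(os.path.join(pkg_dir, name), "w") as f:
        f.write(textwrap.dedent(body))

sys.path.insert(0, tmp)
caught_import_error = False
try:
    import llmpkg  # noqa: F401
except ImportError as exc:
    caught_import_error = True
    print(exc)  # cannot import name 'LLMResponse' from partially initialized module ...
print("circular import detected:", caught_import_error)
```

Changing the toy `cohere_llm.py` to `from llmpkg.types import LLMResponse`, the same shape as this commit's fix, lets the import succeed because the submodule is pulled in directly instead of through the half-initialised `__init__`.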

0 commit comments