Need support for adding ChatPromptTemplate and chatmessagehistory #25585
veeramanikandanvv asked this question in Q&A (Unanswered)
Description
I am connecting to a local model with HMAC authentication using the code below.
I want to enhance this code to add ChatPromptTemplate and ChatMessageHistory functionality.
Code I am using:
```python
from typing import Any, Dict, List, Optional
import hmac
import hashlib
import base64
import json
import requests
import time
import uuid
from langchain_core.callbacks.manager import CallbackManagerForLLMRun
from langchain_core.language_models.llms import LLM

class HMACAuthenticatedLLM(LLM):
    """A custom LLM that uses HMAC authentication to connect to a model."""

    api_key: str
    secret_key: str
    endpoint: str

    @property
    def _llm_type(self) -> str:
        return "hmac_authenticated_llm"

    def _call(self, prompt: str, stop: Optional[List[str]] = None,
              run_manager: Optional[CallbackManagerForLLMRun] = None,
              **kwargs: Any) -> str:
        # Sign the request body with HMAC-SHA256 (header names and the exact
        # signing scheme depend on your API; adjust to match your server).
        body = json.dumps({"prompt": prompt})
        timestamp, nonce = str(int(time.time())), uuid.uuid4().hex
        digest = hmac.new(self.secret_key.encode(),
                          f"{timestamp}{nonce}{body}".encode(),
                          hashlib.sha256).digest()
        response = requests.post(self.endpoint, data=body, headers={
            "X-Api-Key": self.api_key, "X-Timestamp": timestamp, "X-Nonce": nonce,
            "X-Signature": base64.b64encode(digest).decode(),
            "Content-Type": "application/json"})
        response.raise_for_status()
        return response.json()["text"]

# Example usage
llm = HMACAuthenticatedLLM(api_key='your_api_key', secret_key='your_secret_key', endpoint='https://api.yourmodel.com')
print(llm.invoke("This is a test prompt"))
```