
Commit 980f72e

update version
1 parent 57f8207 commit 980f72e

File tree (3 files changed, +5 -5 lines):

- README.md
- llm_client/__init__.py
- llm_client/llm_api_client/base_llm_api_client.py

README.md

Lines changed: 3 additions & 3 deletions
@@ -13,8 +13,7 @@ more details below in Usage section.
 
 ## Base Interface
 The package exposes two simple interfaces for communicating with LLMs (In the future, we
-will expand the interface to support more tasks like embeddings, list models, edits, etc.
-and we will add a standardized for LLMs param like max_tokens, temperature, etc.):
+will expand the interface to support more tasks like list models, edits, etc.):
 ```python
 from abc import ABC, abstractmethod
 from dataclasses import dataclass, field
@@ -47,7 +46,7 @@ class BaseLLMAPIClient(BaseLLMClient, ABC):
 
     @abstractmethod
     async def text_completion(self, prompt: str, model: Optional[str] = None, max_tokens: int | None = None,
-                              temperature: Optional[float] = None, **kwargs) -> list[str]:
+                              temperature: Optional[float] = None, top_p: Optional[float] = None, **kwargs) -> list[str]:
         raise NotImplementedError()
 
     async def embedding(self, text: str, model: Optional[str] = None, **kwargs) -> list[float]:
@@ -200,6 +199,7 @@ Contributions are welcome! Please check out the todos below, and feel free to op
 - [x] Convert common models parameter
   - [x] temperature
   - [x] max_tokens
+  - [x] top_p
   - [ ] more
 
 ### Development
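
The visible effect of the README change is that top_p now sits alongside temperature and max_tokens in the documented text_completion signature. Below is a minimal, self-contained sketch of a class that satisfies that updated signature; DummyEchoClient is a hypothetical name and deliberately does not subclass the package's real BaseLLMAPIClient, so nothing beyond the signature shown above is assumed.

```python
# Illustrative sketch only: a stand-alone stub honoring the README's updated
# text_completion signature, including the new top_p parameter.
# DummyEchoClient is hypothetical and does not ship with llm_client.
from typing import Optional


class DummyEchoClient:
    async def text_completion(self, prompt: str, model: Optional[str] = None, max_tokens: Optional[int] = None,
                              temperature: Optional[float] = None, top_p: Optional[float] = None,
                              **kwargs) -> list[str]:
        # A real client would forward max_tokens, temperature and top_p to the
        # underlying LLM API; here the prompt is simply echoed back.
        return [f"echo({model or 'default'}): {prompt}"]

    async def embedding(self, text: str, model: Optional[str] = None, **kwargs) -> list[float]:
        # Placeholder embedding so the stub mirrors both interface methods.
        return [0.0, 0.0, 0.0]


if __name__ == "__main__":
    import asyncio

    print(asyncio.run(DummyEchoClient().text_completion("hi", top_p=0.9)))
```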

llm_client/__init__.py

Lines changed: 1 addition & 1 deletion
@@ -1,4 +1,4 @@
-__version__ = "0.6.2"
+__version__ = "0.7.0"
 
 from llm_client.base_llm_client import BaseLLMClient
 
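Since the version string lives in the package's top-level __init__.py, it can be read back at runtime. A quick sanity check, assuming llm_client 0.7.0 is installed:

```python
# Confirm the installed package carries the bumped version string.
import llm_client

print(llm_client.__version__)  # "0.7.0" as of this commit
```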
llm_client/llm_api_client/base_llm_api_client.py

Lines changed: 1 addition & 1 deletion
@@ -30,7 +30,7 @@ def __init__(self, config: LLMAPIClientConfig):
 
     @abstractmethod
     async def text_completion(self, prompt: str, model: Optional[str] = None, max_tokens: Optional[int] = None,
-                              temperature: Optional[float] = None,top_p : Optional[float] = None, **kwargs) -> list[str]:
+                              temperature: Optional[float] = None, top_p: Optional[float] = None, **kwargs) -> list[str]:
         raise NotImplementedError()
 
     async def embedding(self, text: str, model: Optional[str] = None, **kwargs) -> list[float]:
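
For reference, a hedged call-site sketch against the corrected signature. Obtaining `client` from a concrete BaseLLMAPIClient subclass is assumed and not shown in this commit; only the keyword arguments mirror the method above.

```python
# Hypothetical call-site: the keyword arguments (model, max_tokens,
# temperature, top_p) come from the abstract signature above; how `client`
# is constructed is an assumption left to a concrete subclass.
async def sample_completion(client) -> str:
    completions: list[str] = await client.text_completion(
        prompt="Write a haiku about version bumps.",
        model=None,        # default per the signature
        max_tokens=32,
        temperature=0.7,
        top_p=0.9,         # nucleus-sampling parameter
    )
    return completions[0]
```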
