Description
LM Studio version: 0.3.16 (Build 3)
Python package version: lmstudio==1.3.0
I've tried to write a custom model-loading script using the lmstudio-python API. One important requirement is to specify a custom gpuOffload.
I did it as documented here: https://lmstudio.ai/docs/python/llm-prediction/parameters
But it has no effect.
Moreover, the Pylance type checker reports an error:
```
Argument of type "dict[str, int | float | dict[str, float]]" cannot be assigned to parameter "config" of type "LlmLoadModelConfig | LlmLoadModelConfigDict | None" in function "llm"
  Type "dict[str, int | float | dict[str, float]]" is not assignable to type "LlmLoadModelConfig | LlmLoadModelConfigDict | None"
    "dict[str, int | float | dict[str, float]]" is not assignable to "LlmLoadModelConfig"
    "dict[str, int | float | dict[str, float]]" is not assignable to "LlmLoadModelConfigDict"
    "dict[str, int | float | dict[str, float]]" is not assignable to "None"
```
Pylance [reportArgumentType](https://github.com/microsoft/pylance-release/blob/main/docs/diagnostics/reportArgumentType.md)
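For reference, any plain dict whose value types match the reported `dict[str, int | float | dict[str, float]]` reproduces this diagnostic. The exact keys below are a reconstruction with placeholder values, not my original script:

```python
context_size = 8192  # placeholder
gpu_offload = 0.5    # placeholder

# Value types int, float, and dict[str, float] — exactly the union
# Pylance infers and then rejects for the `config` parameter.
load_config = {
    "contextLength": context_size,     # int
    "gpuOffload": gpu_offload,         # float
    "gpu": {"ratio": gpu_offload},     # dict[str, float]
}
# model = lms.llm(model_id, config=load_config)  # <- the line Pylance flags
```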
After some research, I found that a more correct way, according to the Python type hints, is the following:

```python
model = lms.llm(
    model_id,
    config={
        "contextLength": context_size,
        "gpu": {
            "ratio": gpu_offload,
            "mainGpu": 1,
            "splitStrategy": "favorMainGpu",
            "disabledGpus": [],
        },
    },
)
```
This satisfies the type checker, but it produces a bunch of runtime errors:
```
Field with key load.gpuSplitConfig does not satisfy the schema: [
  {
    "expected": "'evenly' | 'priorityOrder' | 'custom'",
    "received": "undefined",
    "code": "invalid_type",
    "path": ["strategy"],
    "message": "Required"
  },
  {
    "code": "invalid_type",
    "expected": "array",
    "received": "undefined",
    "path": ["priority"],
    "message": "Required"
  },
  {
    "code": "invalid_type",
    "expected": "array",
    "received": "undefined",
    "path": ["customRatio"],
    "message": "Required"
  }
]
```
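Judging from that validation error, the server-side `load.gpuSplitConfig` apparently requires three fields. The shape below is inferred purely from the error output (the field names and allowed `strategy` values come from it; the example values are guesses, and I don't know how the Python client is supposed to populate this):

```python
# Hypothetical gpuSplitConfig shape, inferred from the schema error above.
gpu_split_config = {
    "strategy": "priorityOrder",  # 'evenly' | 'priorityOrder' | 'custom'
    "priority": [1, 0],           # assumption: GPU indices in priority order
    "customRatio": [],            # assumption: per-GPU ratios for 'custom'
}
```

This might indicate that the client's `"gpu"` dict (with `splitStrategy`, `mainGpu`, etc.) is not being translated into the `gpuSplitConfig` structure the server expects before the load request is sent.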