
Commit c7f22b6

tidy(mm): remove extraneous docstring
It's inherited from the ABC.
1 parent: 9941325

File tree

1 file changed: +0 −17 lines changed

invokeai/app/services/model_load/model_load_default.py

Lines changed: 0 additions & 17 deletions
@@ -87,23 +87,6 @@ def load_model(self, model_config: AnyModelConfig, submodel_type: Optional[SubMo
     def load_model_from_path(
         self, model_path: Path, loader: Optional[Callable[[Path], Dict[str, Tensor]]] = None
     ) -> LoadedModel:
-        """
-        Load the checkpoint-format model file located at the indicated Path.
-
-        This will load an arbitrary model file into the RAM cache. If the optional loader
-        argument is provided, the loader will be invoked to load the model into
-        memory. Otherwise the method will call safetensors.torch.load_file() or
-        torch.load() as appropriate to the file suffix.
-
-        Be aware that the LoadedModel object will have a `config` attribute of None.
-
-        Args:
-            model_path: A pathlib.Path to a checkpoint-style models file
-            loader: A Callable that expects a Path and returns a Dict[str, Tensor]
-
-        Returns:
-            A LoadedModel object.
-        """
         cache_key = str(model_path)
         ram_cache = self.ram_cache
         try:
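The removed docstring (now inherited from the ABC) described the method's fallback behavior: a caller-supplied loader takes precedence; otherwise the file suffix selects between safetensors.torch.load_file() and torch.load(). A minimal sketch of that dispatch, as a hypothetical free function (the name `load_state_dict_by_suffix` and the `StateDict` alias are illustration-only assumptions, not part of the InvokeAI API):

```python
from pathlib import Path
from typing import Any, Callable, Dict, Optional

# Stand-in for Dict[str, torch.Tensor] so the sketch stays self-contained.
StateDict = Dict[str, Any]

def load_state_dict_by_suffix(
    model_path: Path,
    loader: Optional[Callable[[Path], StateDict]] = None,
) -> StateDict:
    # A caller-supplied loader always wins.
    if loader is not None:
        return loader(model_path)
    # Otherwise dispatch on the file suffix, as the docstring described.
    if model_path.suffix == ".safetensors":
        from safetensors.torch import load_file  # deferred heavy import
        return load_file(str(model_path))
    import torch  # deferred heavy import
    return torch.load(model_path, map_location="cpu")
```

With a custom loader the torch/safetensors imports are never reached, which is why the real method can accept arbitrary checkpoint formats:

```python
state = load_state_dict_by_suffix(Path("model.ckpt"), loader=lambda p: {"w": [1.0]})
```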
