I was able to run mlx-community/SmolVLM-Instruct-4bit on mlx-swift-examples (VLMEval) by simply updating the modelConfiguration, and it generated responses without any issues.
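For reference, the only change I made for the working model was swapping the model id in the configuration, roughly like this (a minimal sketch assuming MLXLMCommon's `ModelConfiguration(id:)` initializer; the exact property name used inside VLMEval may differ):

```swift
import MLXLMCommon

// Sketch (assumed API): point VLMEval at the SmolVLM checkpoint that works.
// ModelConfiguration(id:) takes the Hugging Face repo id of the converted model.
let modelConfiguration = ModelConfiguration(
    id: "mlx-community/SmolVLM-Instruct-4bit"
)
```

Swapping the id to the 256M or 500M variants below is the only difference when the crash occurs.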
However, when trying to run mlx-community/SmolVLM-256M-Instruct-{} or mlx-community/SmolVLM-500M-Instruct-{}, the inference crashes with the following MLX error:
MLX error: Shapes (1,576,768) and (1,1024,768) cannot be broadcast. at /Users/user/Library/Developer/Xcode/DerivedData/mlx-swift-examples-djkwqspwvzjyvodqeyxeahqxpsse/SourcePackages/checkouts/mlx-swift/Source/Cmlx/include/mlx/c/ops.cpp:31
Since these models are still based on Idefics3, I expected them to work similarly. This looks like a shape mismatch issue, which might be related to mlx-vlm or mlx-swift.