AI Copilot for Vim/NeoVim
Updated Feb 28, 2025 - Python
A high-performance API server that exposes OpenAI-compatible endpoints for MLX models. Built in Python on the FastAPI framework, it offers an efficient, scalable, and user-friendly way to run MLX-based vision and language models locally.
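Because such a server speaks the OpenAI API, any standard OpenAI client can talk to it. A minimal sketch, assuming the server listens on localhost:8000 and the model id shown is available (both are placeholders, not values from the project):

```python
# Minimal sketch: querying a local OpenAI-compatible MLX server with the
# official `openai` client. The base_url, api_key, and model id below are
# placeholders, not values taken from the project.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # assumed local server address
    api_key="not-needed",                 # local servers usually ignore the key
)

response = client.chat.completions.create(
    model="mlx-community/Mistral-7B-Instruct-v0.3-4bit",  # example model id
    messages=[{"role": "user", "content": "Summarize MLX in one sentence."}],
    max_tokens=128,
)
print(response.choices[0].message.content)
```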
Unified management and routing for llama.cpp, MLX and vLLM models with web dashboard.
Build an Autonomous Web3 AI Trading Agent (BASE + Uniswap V4 example)
Experimental: MLX model provider for Strands Agents - Build, train, and deploy AI agents on Apple Silicon.
Various LLM resources and experiments
Federated Fine-Tuning of LLMs on Apple Silicon with Flower.ai and MLX-LM
Add MLX support to Pydantic AI through LM Studio or mlx-lm, and run MLX-compatible Hugging Face models on Apple silicon.
Reinforcement learning for text generation on MLX (Apple Silicon)
A comprehensive toolkit for end-to-end continued pre-training, fine-tuning, monitoring, testing and publishing of language models with MLX-LM
LLM model inference on Apple Silicon Mac using the Apple MLX Framework.
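For context, local inference with the mlx-lm Python API generally comes down to a load-and-generate pair. A minimal sketch, assuming an MLX-converted model from the Hugging Face Hub (the repo id below is only an example):

```python
# Minimal sketch of local inference with the mlx-lm Python API on Apple
# silicon. The Hugging Face repo id is an example; any MLX-converted model
# should work.
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/Mistral-7B-Instruct-v0.3-4bit")
prompt = "Explain what the MLX framework is in two sentences."
text = generate(model, tokenizer, prompt=prompt, max_tokens=200)
print(text)
```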
📄 Generate text with and fine-tune large language models on Apple silicon effortlessly with MLX LM, integrating seamlessly with the Hugging Face Hub.
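Fine-tuning with MLX LM is typically driven through its LoRA entry point. A hedged sketch of launching a run from Python, where the model id, data path, and iteration count are assumptions (check `python -m mlx_lm.lora --help` on your install for the supported flags):

```python
# Hedged sketch: launching a LoRA fine-tuning run through the mlx_lm.lora
# entry point. Model id, data folder, and iteration count are assumptions;
# the data folder is expected to contain train.jsonl and valid.jsonl.
import subprocess

subprocess.run(
    [
        "python", "-m", "mlx_lm.lora",
        "--model", "mlx-community/Mistral-7B-Instruct-v0.3-4bit",  # example model
        "--train",
        "--data", "./data",
        "--iters", "600",
    ],
    check=True,
)
```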
MLX inference service compatible with the OpenAI API, built on MLX-LM and MLX-VLM.
OFX File Creator is a compact Python library/CLI that converts CSV/Excel bank exports into valid OFX statements. It normalizes vendor columns, parses dates and amounts, infers TRNTYPE via configurable YAML/JSON rules (optional mlx-lm enrichment), and includes examples, tests, and GitHub Actions CI.
Fine-tuning open-source LLMs for the Coreference Resolution task using mlx-lm