Helper plugin for installing remote assets into the public directory.
> **Note**
>
> This project is part of (and also associated with) Project AIRI, where we aim to build an LLM-driven VTuber like Neuro-sama (subscribe if you haven't already!). If you are interested, please give the live demo a try.
Pick the package manager of your choice:
```shell
ni @proj-airi/unplugin-fetch -D # from @antfu/ni, can be installed via `npm i -g @antfu/ni`
pnpm i @proj-airi/unplugin-fetch -D
yarn add @proj-airi/unplugin-fetch -D
npm i @proj-airi/unplugin-fetch -D
```
```ts
// vite.config.ts
import { defineConfig } from 'vite'
import { Download } from '@proj-airi/unplugin-fetch/vite'

export default defineConfig({
  plugins: [
    Download('https://dist.ayaka.moe/live2d-models/hiyori_free_zh.zip', 'hiyori_free_zh.zip', 'assets/live2d/models'),
    Download('https://dist.ayaka.moe/live2d-models/hiyori_pro_zh.zip', 'hiyori_pro_zh.zip', 'assets/live2d/models'),
  ],
})
```
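Each `Download(url, filename, destination)` call fetches the remote file and places it under the given destination inside the public directory. Below is a minimal sketch of consuming such an asset at runtime, assuming the destination above resolves to `public/assets/live2d/models` so that Vite serves it from the site root; the fetch path and the loading function here are illustrative assumptions, not part of this plugin's API:

```ts
// Hypothetical runtime usage (not part of this plugin's API):
// files under `public/` are served from the site root by Vite,
// so the archive downloaded above should be reachable at this URL.
async function loadHiyoriModel(): Promise<ArrayBuffer> {
  const res = await fetch('/assets/live2d/models/hiyori_free_zh.zip')
  if (!res.ok)
    throw new Error(`Failed to fetch model archive: ${res.status}`)

  // Hand the raw bytes to your Live2D loader or an unzip utility of your choice.
  return res.arrayBuffer()
}
```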
Other side projects born from Project AIRI:

- Awesome AI VTuber: A curated list of AI VTubers and related projects
- `unspeech`: Universal endpoint proxy server for `/audio/transcriptions` and `/audio/speech`, like LiteLLM but for any ASR and TTS
- `hfup`: tools to help on deploying, bundling to HuggingFace Spaces
- `xsai-transformers`: Experimental 🤗 Transformers.js provider for xsAI
- WebAI: Realtime Voice Chat: Full example of implementing ChatGPT's realtime voice from scratch with VAD + STT + LLM + TTS
- `@proj-airi/drizzle-duckdb-wasm`: Drizzle ORM driver for DuckDB WASM
- `@proj-airi/duckdb-wasm`: Easy to use wrapper for `@duckdb/duckdb-wasm`
- Airi Factorio: Allow Airi to play Factorio
- Factorio RCON API: RESTful API wrapper for Factorio headless server console
- `autorio`: Factorio automation library
- `tstl-plugin-reload-factorio-mod`: Reload Factorio mod when developing
- Velin: Use Vue SFC and Markdown to write easy to manage stateful prompts for LLM
- `demodel`: Easily boost the speed of pulling your models and datasets from various inference runtimes
- `inventory`: Centralized model catalog and default provider configurations backend service
- MCP Launcher: Easy to use MCP builder & launcher for all possible MCP servers, just like Ollama for models!
- 🥺 SAD: Documentation and notes for self-hosting and browser-running LLMs