A CLI tool for querying LLMs
```bash
go install github.com/mkozhukh/qq@latest
```
Set your API key and preferred model using environment variables:
```bash
export QQ_KEY="your-api-key"
export QQ_MODEL="provider/model-name"
export QQ_PROMPT="Optional system prompt"
```
Supported providers and example models:
- `openai/gpt-4.1-mini`
- `anthropic/claude-4-0-sonnet`
- `google/gemini-2.5-flash`
- `openrouter/moonshotai/kimi-k2`
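For example, to point `qq` at the Gemini model from the list above (the key value below is a placeholder; substitute your own):

```bash
export QQ_MODEL="google/gemini-2.5-flash"
export QQ_KEY="your-gemini-api-key"
```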
qq "What is the capital of France?"
When run without arguments, `qq` enters interactive mode:

```bash
qq
# Enter your message in the form that appears
```
Pipe input from other commands:

```bash
cat error.log | qq "What's causing these errors?"
git diff | qq "Summarize these changes"
```
Add context with system messages:
```bash
qq --system-message "You are a senior DevOps engineer" "How do I set up a CI/CD pipeline?"
```
Use echo-templates for advanced prompting:
```bash
# Use inline template
qq --prompt "You are {{role}}. {{user_query}}" "Explain Docker"

# Use template file
qq --prompt-file "templates/code-review.md" "Review this code"

# Use template folder
qq --prompt-folder "./templates" --prompt-file "analyze.md" "Analyze this log"
```
Use the `--markdown` flag to format output as Markdown:

```bash
qq --markdown "Explain the TCP/IP model"
```
The following environment variables are recognized:

- `QQ_MODEL`: Default model (e.g., `openai/gpt-4.1`)
- `QQ_KEY`: API key for the provider
- `QQ_PROMPT`: Default system message
If these variables are not defined, the tool falls back to the ones from mkozhukh/echo:

- `ECHO_MODEL`: model
- `ECHO_KEY`: API key
If no API key is found there either, the provider-specific variables are checked:

- `GEMINI_API_KEY`
- `OPENAI_API_KEY`
- `ANTHROPIC_API_KEY`
- `OPENROUTER_API_KEY`
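A sketch of how the fallback plays out, assuming the lookup order described above: with `QQ_KEY` and `ECHO_KEY` unset, an existing provider key is picked up automatically.

```bash
# QQ_KEY and ECHO_KEY are not set; the provider-specific key is used instead
export OPENAI_API_KEY="your-openai-api-key"
qq --model openai/gpt-4.1-mini "Explain the difference between TCP and UDP"
```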
The following command-line options are available:

- `--model`: Override the model for this query
- `--key`: Override the API key for this query
- `--system-message`: Add a system message for context
- `--prompt`: Prompt template (echo-templates format)
- `--prompt-file`: File containing a prompt template
- `--prompt-folder`: Folder containing prompt template files
- `--markdown`: Format output as Markdown
- `--debug`: Show debug information
- `--help`: Show usage information
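Options can be combined in a single call, assuming they compose as ordinary CLI flags; for example (using only the flags listed above):

```bash
qq --model anthropic/claude-4-0-sonnet --markdown --debug "Explain Go channels in one paragraph"
```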
MIT License - see LICENSE file for details