Gitai is an open-source CLI tool that helps developers generate high-quality git commit messages using AI. It inspects repository changes (diff + status) and provides concise, actionable suggestions via an interactive TUI.
Below is a quick animated demo of gitai running in a terminal:
The project supports multiple AI backends (OpenAI, Google Gemini via genai, and local models via Ollama) and is intended to be used as a developer helper (interactive CLI, pre-commit hooks, CI helpers).
- AI-generated commit message suggestions based on repo diffs
- Interactive TUI to select files and review suggestions
- Pluggable AI backends: OpenAI, Google GenAI, Ollama (local)
- Small single-binary distribution (Go)
 
- Go 1.20+ (Go modules are used; CONTRIBUTING recommends Go 1.24+ for development)
- One of the supported AI providers (optional):
  - OpenAI API key (OPENAI_API_KEY)
  - Google API key for genai (GOOGLE_API_KEY)
  - Ollama binary available and OLLAMA_API_PATH set (for local models)
  - Gemini CLI installed (for the Gemini CLI provider)
 
 
- Clone the repository and build:

```bash
git clone https://github.com/yourusername/gitai.git
cd gitai
make build
```

- Install (recommended):

```bash
make install
# or, to personalize the keywords for the safety check of your diff:
make install-personalized-keys "comma,separated,keys"
```

The `make install` target builds the `gitai` binary and moves it to `/usr/local/bin/` (it may prompt for sudo). Alternatively, copy `./bin/gitai` to a directory in your PATH.
Generate commit message suggestions using the interactive TUI:

```bash
gitai suggest
```

Selecting an AI provider (flag or env)
You can choose which AI backend to use with a flag or environment variable. The --provider flag overrides the env var for that run.

```bash
# use local Ollama via flag
gitai suggest --provider=ollama

# use OpenAI GPT
gitai suggest --provider=gpt

# use Gemini
gitai suggest --provider=gemini

# use the Gemini CLI
gitai suggest --provider=gemini_cli
```

gitai suggest will:
- list changed files (using git status --porcelain)
- allow selecting files via an interactive file selector
- fetch diffs for selected files and call the configured AI backend to produce suggestions

See internal/tui/suggest for the implementation of the flow.
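The porcelain-status step of this flow can be sketched as follows. This is an illustrative parser under the assumption that the output of `git status --porcelain` has already been captured; the `ChangedFile` type and `parseStatusPorcelain` name are hypothetical, not the helpers actually used in internal/git:

```go
package main

import (
	"fmt"
	"strings"
)

// ChangedFile is an illustrative struct, not gitai's actual type.
type ChangedFile struct {
	Status string // two-character status code, e.g. " M", "??", "A "
	Path   string
}

// parseStatusPorcelain parses `git status --porcelain` output: each line
// is a two-character status code, a space, and the file path.
func parseStatusPorcelain(out string) []ChangedFile {
	var files []ChangedFile
	for _, line := range strings.Split(out, "\n") {
		if len(line) < 4 {
			continue // skip blank or malformed lines
		}
		files = append(files, ChangedFile{
			Status: line[:2],
			Path:   strings.TrimSpace(line[3:]),
		})
	}
	return files
}

func main() {
	out := " M cmd/root.go\n?? internal/ai/new_adapter.go\n"
	for _, f := range parseStatusPorcelain(out) {
		fmt.Printf("%s %s\n", f.Status, f.Path)
	}
}
```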
Configuration is managed with Viper and can be provided from, in order of precedence (highest first):
- CLI flags
- Environment variables
- Config files
- Built-in defaults
 
You can mix and match; higher-precedence sources override lower ones.
Supported keys
- ai.provider: which backend to use. Options: gpt, gemini, ollama, geminicli
  - Flag: --provider or -p
  - Env: GITAI_AI_PROVIDER
  - Config key: ai.provider
- ai.api_key: API key for the chosen backend
  - Flag: --api_key or -k
  - Env: GITAI_AI_API_KEY or GITAI_API_KEY
  - Provider fallbacks (legacy):
    - OpenAI: OPENAI_API_KEY
    - Gemini: GOOGLE_API_KEY
- ollama.path: path to the Ollama binary when provider=ollama
  - Env: OLLAMA_API_PATH
  - Config key: ollama.path
 
 
Config files
- Base name: gitai (no extension in code). Viper will load any supported format found (e.g., gitai.yaml, gitai.yml, gitai.json, etc.).
- Search paths (in this order):
  - /etc/gitai/
  - $HOME/.config/gitai/
  - $HOME/.gitai/
  - the current Git root directory
  - the current working directory (.)
 
 
Example gitai.yaml

```yaml
ai:
  provider: gpt     # gpt | gemini | ollama | geminicli
  api_key: "sk-..." # Optional here; can be provided via env/flag

# Only needed if you use provider=ollama
ollama:
  path: "/usr/local/bin/ollama"
```

Example gitai.json

```json
{
  "ai": {
    "provider": "gpt",
    "api_key": "sk-..."
  },
  "ollama": {
    "path": "/usr/local/bin/ollama"
  }
}
```

Examples
- Use local Ollama via flag:

```bash
gitai suggest --provider=ollama
```

- Use OpenAI with an env var:

```bash
export GITAI_AI_API_KEY="sk-..."
gitai suggest --provider=gpt
```

- Use a config file only: create the gitai config file in any of the supported search paths, then run:

```bash
gitai suggest
```
Notes
- If multiple sources set the same key, flags win over env; env wins over config files.
- For CI, prefer environment variables (GITAI_AI_PROVIDER, GITAI_AI_API_KEY) to avoid committing secrets.
- OPENAI_API_KEY and GOOGLE_API_KEY are respected as fallbacks when using those providers.
 
Core components live under internal/:

- internal/ai: adapters for AI backends and the main prompt (GenerateCommitMessage)
- internal/git: helpers that run git commands and parse diffs/status (used by the TUI)
- internal/tui/suggest: the TUI flow (file selector → AI message view)

The entrypoint is main.go, which dispatches to the Cobra-based CLI under cmd/.
To run locally while developing:

- Ensure Go is installed and Go modules are enabled (this repo uses Go modules).
- Run the CLI directly from source:

```bash
go run ./main.go suggest
```

If tests are added, run them with:

```bash
go test ./...
```

To add a new AI backend:

- Add a new adapter under internal/ai that implements a function returning (string, error).
- Wire it into GenerateCommitMessage or create a configuration switch.
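A new adapter following the shape described above could look like this sketch. The `suggestFunc` type, the `backends` registry, and `myBackend` are all hypothetical; the real wiring lives in internal/ai and GenerateCommitMessage:

```go
package main

import (
	"fmt"
	"strings"
)

// suggestFunc matches the documented adapter shape: a function that
// returns a suggested commit message or an error. Illustrative only.
type suggestFunc func(diff string) (string, error)

// myBackend is a stub adapter; a real one would call an AI service.
func myBackend(diff string) (string, error) {
	if strings.TrimSpace(diff) == "" {
		return "", fmt.Errorf("empty diff: nothing to summarize")
	}
	return "feat: describe the change in " + firstFile(diff), nil
}

// firstFile pulls the first file path out of a unified diff.
func firstFile(diff string) string {
	for _, line := range strings.Split(diff, "\n") {
		if strings.HasPrefix(line, "+++ b/") {
			return strings.TrimPrefix(line, "+++ b/")
		}
	}
	return "the diff"
}

// backends is a hypothetical configuration switch keyed by provider name.
var backends = map[string]suggestFunc{
	"mybackend": myBackend,
}

func main() {
	msg, err := backends["mybackend"]("+++ b/cmd/root.go\n+added line\n")
	fmt.Println(msg, err)
}
```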
Contributions are welcome. Please follow the guidelines in CONTRIBUTING.md.
Suggested contribution workflow:
- Fork the repo and create a topic branch
 - Implement your feature or fix
 - Add/adjust tests where appropriate
 - Open a pull request describing the change and rationale
 
If you'd like help designing an enhancement (hooks, CI integrations, new backends), open an issue first to discuss.
- The tool may send diffs and repository content to third-party AI providers when generating messages; treat this like any other service that may upload code. Do not send secrets or sensitive data to remote AI providers.
- If you need an offline-only workflow, prefer running local models via Ollama and keep OLLAMA_API_PATH configured.
This project is released under the MIT License. See LICENSE for details.
Vusal Huseynov (original author)
