Unix process model for LLMs. Build with familiar abstractions.
Install with pip:
pip install llmproc
Or run without installing (requires uv):
uvx llmproc
Real-world automation using LLMProc in GitHub workflows, triggered by commenting the following commands on issues and pull requests:
@llmproc /resolve
Automatically resolve merge conflicts
Intelligently merges conflicting changes when possible
@llmproc /ask
Answer questions on issues/PRs
Get AI-powered answers with full context awareness
@llmproc /code
Implement features from comments
Turn feature requests into working code automatically
Quick Setup
# Install the workflows automatically (run this from your repository root):
uvx --from llmproc llmproc-install-actions
The installer will download workflow files and provide step-by-step instructions.
Required secrets in repository settings:
ANTHROPIC_API_KEY - API key for Claude
LLMPROC_WRITE_TOKEN - GitHub PAT with write permissions (contents, pull-requests)

No built-in system prompt overhead. Control every token sent to the LLM.
Register functions as tools, with schemas generated automatically from their type hints (see the sketch after this feature list).
Define LLM programs in YAML/TOML with complete customization control.
Built-in tools, MCP protocol support, tool aliases, and access control.
Prompt caching, automatic retries, callbacks, and robust error handling.
Native support for Anthropic Claude, OpenAI, and Google Gemini models.
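To make the type-hint feature concrete, the sketch below shows the general technique of deriving a JSON-schema-style tool description from a Python function's signature. It is an illustration of the idea only, not llmproc's actual implementation; the helper name schema_from_function, the type mapping, and the example function are assumptions.

import inspect
from typing import get_type_hints

# Map a few Python annotations to JSON Schema type names (illustrative subset).
_JSON_TYPES = {str: "string", int: "integer", float: "number", bool: "boolean"}

def schema_from_function(fn):
    """Build a JSON-schema-style tool description from a function's signature."""
    hints = get_type_hints(fn)
    hints.pop("return", None)
    properties = {}
    required = []
    for name, param in inspect.signature(fn).parameters.items():
        properties[name] = {"type": _JSON_TYPES.get(hints.get(name, str), "string")}
        if param.default is inspect.Parameter.empty:
            required.append(name)  # parameters without defaults are required
    return {
        "name": fn.__name__,
        "description": (fn.__doc__ or "").strip(),
        "input_schema": {"type": "object", "properties": properties, "required": required},
    }

def get_weather(city: str, units: str = "metric") -> str:
    """Return the current weather for a city."""
    return f"Sunny in {city} ({units})"

print(schema_from_function(get_weather))

Running this prints a name/description/input_schema record for get_weather, which is the general shape tool definitions take (Anthropic's tools API, for example, uses exactly these three fields).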
Production-ready LLM runtime. Complete control. Zero overhead.
Install via pip:
Basic installation (Anthropic support)
pip install llmproc
Run without installing (requires uv)
uvx llmproc --help
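For orientation, here is a minimal Python quick-start sketch of driving the runtime from code. The import path and the LLMProgram.from_toml / start / run / get_last_message names are assumptions made for illustration; consult the llmproc documentation for the actual interface.

import asyncio
from llmproc import LLMProgram  # assumed import path

async def main():
    # Load a program definition from a TOML file (file name is illustrative).
    program = LLMProgram.from_toml("program.toml")  # assumed constructor
    process = await program.start()    # assumed: spawn a process for the program
    await process.run("Summarize this repository in two sentences.")
    print(process.get_last_message())  # assumed accessor for the last reply

asyncio.run(main())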
0 Token Overhead
100% Open Source
∞ Customization